Xinchen Wan
Title · Cited by · Year
SRNIC: A scalable architecture for RDMA NICs
Z Wang, L Luo, Q Ning, C Zeng, W Li, X Wan, P Xie, T Feng, K Cheng, ...
20th USENIX Symposium on Networked Systems Design and Implementation (NSDI …, 2023
Cited by 26 · 2023
RAT: Resilient allreduce tree for distributed machine learning
X Wan, H Zhang, H Wang, S Hu, J Zhang, K Chen
Proceedings of the 4th Asia-Pacific Workshop on Networking, 52-57, 2020
Cited by 26 · 2020
TACC: A full-stack cloud computing infrastructure for machine learning tasks
K Xu, X Wan, H Wang, Z Ren, X Liao, D Sun, C Zeng, K Chen
arXiv preprint arXiv:2110.01556, 2021
Cited by 25 · 2021
Domain-specific communication optimization for distributed DNN training
H Wang, J Chen, X Wan, H Tian, J Xia, G Zeng, W Wang, K Chen, W Bai, ...
arXiv preprint arXiv:2008.08445, 2020
Cited by 18 · 2020
Scalable and efficient full-graph GNN training for large graphs
X Wan, K Xu, X Liao, Y Jin, K Chen, X Jin
Proceedings of the ACM on Management of Data 1 (2), 1-23, 2023
Cited by 10 · 2023
DGS: Communication-efficient graph sampling for distributed GNN training
X Wan, K Chen, Y Zhang
2022 IEEE 30th International Conference on Network Protocols (ICNP), 1-11, 2022
Cited by 4 · 2022
Towards fair and efficient learning-based congestion control
X Liao, H Tian, C Zeng, X Wan, K Chen
arXiv preprint arXiv:2403.01798, 2024
Cited by 1 · 2024
Towards Domain-Specific Network Transport for Distributed DNN Training
H Wang, H Tian, J Chen, X Wan, J Xia, G Zeng, W Bai, J Jiang, Y Wang, ...
Cited by 1
Astraea: Towards Fair and Efficient Learning-based Congestion Control
X Liao, H Tian, C Zeng, X Wan, K Chen
Proceedings of the Nineteenth European Conference on Computer Systems, 99-114, 2024
2024
Accelerating Neural Recommendation Training with Embedding Scheduling
C Zeng, X Liao, X Cheng, H Tian, X Wan, H Wang, K Chen
21st USENIX Symposium on Networked Systems Design and Implementation (NSDI …, 2024
2024
Accurate and Scalable Rate Limiter for RDMA NICs
Z Wang, X Wan, C Zeng, K Chen
Proceedings of the 7th Asia-Pacific Workshop on Networking, 15-20, 2023
2023