Shuxin Zheng
Principal Researcher, Microsoft Research
Verified email at microsoft.com - Homepage
Title · Cited by · Year
Do Transformers Really Perform Badly for Graph Representation?
C Ying, T Cai, S Luo, S Zheng, G Ke, D He, Y Shen, TY Liu
Thirty-Fifth Conference on Neural Information Processing Systems (NIPS), 2021
Cited by 1299 · 2021
On layer normalization in the transformer architecture
R Xiong, Y Yang, D He, K Zheng, S Zheng, C Xing, H Zhang, Y Lan, ...
Proceedings of the 37th International Conference on Machine Learning, 2020
Cited by 1033 · 2020
Asynchronous stochastic gradient descent with delay compensation
S Zheng, Q Meng, T Wang, W Chen, N Yu, ZM Ma, TY Liu
Proceedings of the 34th International Conference on Machine Learning, PMLR …, 2017
Cited by 369* · 2017
Invertible Image Rescaling
M Xiao, S Zheng, C Liu, Y Wang, D He, G Ke, J Bian, Z Lin, TY Liu
European Conference on Computer Vision (ECCV) 2020, 126-144, 2020
Cited by 259 · 2020
Cross-Iteration Batch Normalization
Z Yao, Y Cao, S Zheng, G Huang, S Lin
IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2021, 2020
Cited by 134 · 2020
One transformer can understand both 2d & 3d molecular data
S Luo, T Chen, Y Xu, S Zheng, TY Liu, L Wang, D He
The Eleventh International Conference on Learning Representations, 2022
Cited by 89 · 2022
Deep learning for prediction of the air quality response to emission changes
J Xing, S Zheng, D Ding, JT Kelly, S Wang, S Li, T Qin, M Ma, Z Dong, ...
Environmental science & technology 54 (14), 8589-8600, 2020
Cited by 85 · 2020
Predicting equilibrium distributions for molecular systems with deep learning
S Zheng, J He, C Liu, Y Shi, Z Lu, W Feng, F Ju, J Wang, J Zhu, Y Min, ...
Nature Machine Intelligence, 1-10, 2024
Cited by 73* · 2024
Benchmarking graphormer on large-scale molecular modeling datasets
Y Shi, S Zheng, G Ke, Y Shen, J You, J He, S Luo, C Liu, D He, TY Liu
arXiv preprint arXiv:2203.04810, 2022
Cited by 66 · 2022
How could Neural Networks understand Programs?
D Peng, S Zheng, Y Li, G Ke, D He, TY Liu
Proceedings of International Conference on Machine Learning (ICML), 2021 …, 2021
Cited by 61 · 2021
Your transformer may not be as powerful as you expect
S Luo, S Li, S Zheng, TY Liu, L Wang, D He
Advances in Neural Information Processing Systems 35, 4301-4315, 2022
Cited by 49 · 2022
Stable, Fast and Accurate: Kernelized Attention with Relative Positional Encoding
S Luo, S Li, T Cai, D He, D Peng, S Zheng, G Ke, L Wang, TY Liu
Advances in Neural Information Processing Systems (NeurIPS 2021), 2021
Cited by 48 · 2021
Molecule generation for target protein binding with structural motifs
Z Zhang, Y Min, S Zheng, Q Liu
The Eleventh International Conference on Learning Representations, 2023
Cited by 44 · 2023
The impact of large language models on scientific discovery: a preliminary study using gpt-4
Microsoft Research AI4Science, Microsoft Azure Quantum
arXiv preprint arXiv:2311.07361, 2023
Cited by 35 · 2023
G-SGD: Optimizing ReLU Neural Networks in its Positively Scale-Invariant Space
Q Meng, S Zheng, H Zhang, W Chen, Q Ye, ZM Ma, TY Liu
Proceedings of the 7th International Conference on Learning Representations …, 2018
Cited by 34 · 2018
Modeling Lost Information in Lossy Image Compression
Y Wang, M Xiao, C Liu, S Zheng, TY Liu
arXiv preprint arXiv:2006.11999, 2020
Cited by 27 · 2020
Capacity control of relu neural networks by basis-path norm
S Zheng, Q Meng, H Zhang, W Chen, N Yu, TY Liu
Proceedings of the 33rd AAAI Conference on Artificial Intelligence, 2019, 2018
Cited by 25 · 2018
Invertible rescaling network and its extensions
M Xiao, S Zheng, C Liu, Z Lin, TY Liu
International Journal of Computer Vision 131 (1), 134-159, 2023
Cited by 23 · 2023
Quantized training of gradient boosting decision trees
Y Shi, G Ke, Z Chen, S Zheng, TY Liu
Advances in neural information processing systems 35, 18822-18833, 2022
Cited by 20 · 2022
Mimicking atmospheric photochemical modeling with a deep neural network
J Xing, S Zheng, S Li, L Huang, X Wang, JT Kelly, S Wang, C Liu, C Jang, ...
Atmospheric research 265, 105919, 2022
Cited by 17 · 2022
Articles 1–20