Lin Zheng
Verified email at connect.hku.hk - Homepage
Title · Cited by · Year
Linear complexity randomized self-attention mechanism
L Zheng, C Wang, L Kong
International conference on machine learning, 27011-27041, 2022
Cited by 34 · 2022
A reparameterized discrete diffusion model for text generation
L Zheng, J Yuan, L Yu, L Kong
arXiv preprint arXiv:2302.05737, 2023
Cited by 29 · 2023
Efficient attention via control variates
L Zheng, J Yuan, C Wang, L Kong
arXiv preprint arXiv:2302.04542, 2023
Cited by 14 · 2023
Generative semantic hashing enhanced via Boltzmann machines
L Zheng, Q Su, D Shen, C Chen
arXiv preprint arXiv:2006.08858, 2020
Cited by 9 · 2020
Cab: comprehensive attention benchmarking on long sequence modeling
J Zhang, S Jiang, J Feng, L Zheng, L Kong
International Conference on Machine Learning, 41194-41218, 2023
Cited by 4 · 2023
Ripple attention for visual perception with sub-quadratic complexity
L Zheng, H Pan, L Kong
International Conference on Machine Learning, 26993-27010, 2022
Cited by 4 · 2022
Linear Attention via Orthogonal Memory
J Zhang, S Jiang, J Feng, L Zheng, L Kong
arXiv preprint arXiv:2312.11135, 2023
Cited by 2 · 2023
Cascaded head-colliding attention
L Zheng, Z Wu, L Kong
arXiv preprint arXiv:2105.14850, 2021
Cited by 2 · 2021
Diffusion of Thoughts: Chain-of-Thought Reasoning in Diffusion Language Models
J Ye, S Gong, L Chen, L Zheng, J Gao, H Shi, C Wu, Z Li, W Bi, L Kong
arXiv preprint arXiv:2402.07754, 2024
2024
Self-Infilling Code Generation
L Zheng, J Yuan, Z Zhang, H Yang, L Kong
Forty-first International Conference on Machine Learning, 2023
2023
Attentive Multi-Layer Perceptron for Non-autoregressive Generation
S Jiang, J Zhang, J Feng, L Zheng, L Kong
Joint European Conference on Machine Learning and Knowledge Discovery in …, 2023
2023
Attentive MLP for Non-Autoregressive Generation
S Jiang, J Zhang, J Feng, L Zheng, L Kong
Articles 1–12