Xiaohan Zhang
KoLA: Carefully benchmarking world knowledge of large language models
J Yu, X Wang, S Tu, S Cao, D Zhang-Li, X Lv, H Peng, Z Yao, X Zhang, ...
arXiv preprint arXiv:2306.09296, 2023
Free-Standing Two-Dimensional Gold Membranes Produced by Extreme Mechanical Thinning
Q Zhu, Y Hong, G Cao, Y Zhang, X Zhang, K Du, Z Zhang, T Zhu, J Wang
ACS nano 14 (12), 17091-17099, 2020
XDAI: A tuning-free framework for exploiting pre-trained language models in knowledge grounded dialogue generation
J Yu, X Zhang, Y Xu, X Lei, X Guan, J Zhang, L Hou, J Li, J Tang
Proceedings of the 28th ACM SIGKDD Conference on Knowledge Discovery and Data Mining, 2022
GLM-Dialog: Noise-tolerant pre-training for knowledge-grounded dialogue generation
J Zhang, X Zhang, D Zhang-Li, J Yu, Z Yao, Z Ma, Y Xu, H Wang, X Zhang, ...
Proceedings of the 29th ACM SIGKDD Conference on Knowledge Discovery and Data Mining, 2023