Zhaopeng Tu
Principal Researcher, Tencent AI Lab
Verified email at tencent.com - Homepage
Title · Cited by · Year
Modeling Coverage for Neural Machine Translation
Z Tu, Z Lu, Y Liu, X Liu, H Li
ACL, 2016
918 · 2016
Is ChatGPT a good translator? Yes with GPT-4 as the engine
W Jiao, W Wang, JT Huang, X Wang, Z Tu
arXiv preprint arXiv:2301.08745, 2023
465* · 2023
On the Localness of Software
Z Tu, Z Su, P Devanbu
FSE, 2014
341 · 2014
On the Naturalness of Buggy Code
B Ray, V Hellendoorn, S Godhane, Z Tu, A Bacchelli, P Devanbu
ICSE, 2016
321* · 2016
Exploiting Cross-Sentence Context for Neural Machine Translation
L Wang, Z Tu, A Way, Q Liu
EMNLP, 2017
224 · 2017
Neural Machine Translation with Reconstruction
Z Tu, Y Liu, L Shang, X Liu, H Li
AAAI, 2017
220 · 2017
Learning to Remember Translation History with a Continuous Cache
Z Tu, Y Liu, S Shi, T Zhang
TACL, 2018
195 · 2018
Modeling Localness for Self-Attention Networks
B Yang, Z Tu, DF Wong, F Meng, LS Chao, T Zhang
EMNLP, 2018
192 · 2018
Modeling Source Syntax for Neural Machine Translation
J Li, D Xiong, Z Tu, M Zhu, M Zhang, G Zhou
ACL, 2017
174 · 2017
Towards Robust Neural Machine Translation
Y Cheng, Z Tu, F Meng, J Zhai, Y Liu
ACL, 2018
172 · 2018
Multi-Head Attention with Disagreement Regularization
J Li, Z Tu, B Yang, MR Lyu, T Zhang
EMNLP, 2018
171 · 2018
Context Gates for Neural Machine Translation
Z Tu, Y Liu, Z Lu, X Liu, H Li
TACL, 2017
140 · 2017
Convolutional Self-Attention Network
B Yang, L Wang, DF Wong, LS Chao, Z Tu
NAACL, 2019
134 · 2019
EmpGAN: Multi-Resolution Interactive Empathetic Dialogue Generation
Q Li, H Chen, Z Ren, Z Chen, Z Tu, J Ma
COLING, 2020
121 · 2020
Context-Aware Self-Attention Networks
B Yang, J Li, DF Wong, LS Chao, X Wang, Z Tu
AAAI, 2019
119 · 2019
Neural Machine Translation Advised by Statistical Machine Translation
X Wang, Z Lu, Z Tu, H Li, D Xiong, M Zhang
AAAI, 2017
108 · 2017
Understanding and Improving Lexical Choice in Non-Autoregressive Translation
L Ding, L Wang, X Liu, DF Wong, D Tao, Z Tu
ICLR, 2021
106 · 2021
Encouraging divergent thinking in large language models through multi-agent debate
T Liang, Z He, W Jiao, X Wang, Y Wang, R Wang, Y Yang, Z Tu, S Shi
arXiv preprint arXiv:2305.19118, 2023
101 · 2023
Exploiting Deep Representations for Neural Machine Translation
ZY Dou, Z Tu, X Wang, S Shi, T Zhang
EMNLP, 2018
89 · 2018
Modeling Recurrence for Transformer
J Hao, X Wang, B Yang, L Wang, J Zhang, Z Tu
NAACL, 2019
88 · 2019
Articles 1–20