Ziqing Yang
iFLYTEK Research
Verified email at iflytek.com
Title · Cited by · Year
Pre-training with whole word masking for Chinese BERT
Y Cui, W Che, T Liu, B Qin, Z Yang
IEEE/ACM Transactions on Audio, Speech, and Language Processing 29, 3504-3514, 2021
Cited by 680 · 2021
TextBrewer: an open-source knowledge distillation toolkit for natural language processing
Z Yang, Y Cui, Z Chen, W Che, T Liu, S Wang, G Hu
arXiv preprint arXiv:2002.12620, 2020
Cited by 29 · 2020
Benchmarking robustness of machine reading comprehension models
C Si, Z Yang, Y Cui, W Ma, T Liu, S Wang
arXiv preprint arXiv:2004.14004, 2020
Cited by 21 · 2020
On the evaporation of solar dark matter: spin-independent effective operators
ZL Liang, YL Wu, ZQ Yang, YF Zhou
Journal of Cosmology and Astroparticle Physics 2016 (09), 018, 2016
Cited by 20 · 2016
Pre-training with whole word masking for Chinese BERT. arXiv 2019
Y Cui, W Che, T Liu, B Qin, Z Yang, S Wang, G Hu
arXiv preprint arXiv:1906.08101
Cited by 19
Improving machine reading comprehension via adversarial training
Z Yang, Y Cui, W Che, T Liu, S Wang, G Hu
arXiv preprint arXiv:1911.03614, 2019
Cited by 16 · 2019
PERT: pre-training BERT with permuted language model
Y Cui, Z Yang, T Liu
arXiv preprint arXiv:2203.06906, 2022
Cited by 15 · 2022
A sentence cloze dataset for Chinese machine reading comprehension
Y Cui, T Liu, Z Yang, Z Chen, W Ma, W Che, S Wang, G Hu
arXiv preprint arXiv:2004.03116, 2020
Cited by 10 · 2020
The leptophilic dark matter in the Sun: the minimum testable mass
ZL Liang, YL Tang, ZQ Yang
Journal of Cosmology and Astroparticle Physics 2018 (10), 035, 2018
Cited by 10 · 2018
Critical behaviors and universality classes of percolation phase transitions on two-dimensional square lattice
Y Zhu, ZQ Yang, X Zhang, XS Chen
Communications in Theoretical Physics 64 (2), 231, 2015
Cited by 9 · 2015
Pre-Training with Whole Word Masking for Chinese BERT. arXiv e-prints, art
Y Cui, W Che, T Liu, B Qin, Z Yang, S Wang, G Hu
arXiv preprint arXiv:1906.08101, 2019
Cited by 5 · 2019
CINO: A Chinese Minority Pre-trained Language Model
Z Yang, Z Xu, Y Cui, B Wang, M Lin, D Wu, Z Chen
arXiv preprint arXiv:2202.13558, 2022
Cited by 4 · 2022
Interactive gated decoder for machine reading comprehension
Y Cui, W Che, Z Yang, T Liu, B Qin, S Wang, G Hu
Transactions on Asian and Low-Resource Language Information Processing 21 (4 …, 2022
Cited by 4 · 2022
TextPruner: A model pruning toolkit for pre-trained language models
Z Yang, Y Cui, Z Chen
arXiv preprint arXiv:2203.15996, 2022
Cited by 3 · 2022
Cross-lingual text classification with multilingual distillation and zero-shot-aware training
Z Yang, Y Cui, Z Chen, S Wang
arXiv preprint arXiv:2202.13654, 2022
Cited by 2 · 2022
Criticality of networks with long-range connections
ZQ Yang, MX Liu, XS Chen
Science China Physics, Mechanics, and Astronomy 60 (2), 20521, 2017
Cited by 2 · 2017
HIT at SemEval-2022 Task 2: Pre-trained language model for idioms detection
Z Chu, Z Yang, Y Cui, Z Chen, M Liu
arXiv preprint arXiv:2204.06145, 2022
Cited by 1 · 2022
HFL at SemEval-2022 Task 8: A Linguistics-inspired Regression Model with Data Augmentation for Multilingual News Similarity
Z Xu, Z Yang, Y Cui, Z Chen
arXiv preprint arXiv:2204.04844, 2022
Cited by 1 · 2022
Adversarial training for machine reading comprehension with virtual embeddings
Z Yang, Y Cui, C Si, W Che, T Liu, S Wang, G Hu
arXiv preprint arXiv:2106.04437, 2021
Cited by 1 · 2021
Bilingual Alignment Pre-Training for Zero-Shot Cross-Lingual Transfer
Z Yang, W Ma, Y Cui, J Ye, W Che, S Wang
arXiv preprint arXiv:2106.01732, 2021
Cited by 1 · 2021
Articles 1–20