| Title | Authors | Venue | Cited by | Year |
|---|---|---|---|---|
| Pre-training with Whole Word Masking for Chinese BERT | Y Cui, W Che, T Liu, B Qin, Z Yang | IEEE/ACM Transactions on Audio, Speech, and Language Processing 29, 3504–3514 | 680 | 2021 |
| TextBrewer: an open-source knowledge distillation toolkit for natural language processing | Z Yang, Y Cui, Z Chen, W Che, T Liu, S Wang, G Hu | arXiv preprint arXiv:2002.12620 | 29 | 2020 |
| Benchmarking robustness of machine reading comprehension models | C Si, Z Yang, Y Cui, W Ma, T Liu, S Wang | arXiv preprint arXiv:2004.14004 | 21 | 2020 |
| On the evaporation of solar dark matter: spin-independent effective operators | ZL Liang, YL Wu, ZQ Yang, YF Zhou | Journal of Cosmology and Astroparticle Physics 2016 (09), 018 | 20 | 2016 |
| Pre-training with Whole Word Masking for Chinese BERT | Y Cui, W Che, T Liu, B Qin, Z Yang, S Wang, G Hu | arXiv preprint arXiv:1906.08101 | 19 | 2019 |
| Improving machine reading comprehension via adversarial training | Z Yang, Y Cui, W Che, T Liu, S Wang, G Hu | arXiv preprint arXiv:1911.03614 | 16 | 2019 |
| PERT: pre-training BERT with permuted language model | Y Cui, Z Yang, T Liu | arXiv preprint arXiv:2203.06906 | 15 | 2022 |
| A sentence cloze dataset for Chinese machine reading comprehension | Y Cui, T Liu, Z Yang, Z Chen, W Ma, W Che, S Wang, G Hu | arXiv preprint arXiv:2004.03116 | 10 | 2020 |
| The leptophilic dark matter in the Sun: the minimum testable mass | ZL Liang, YL Tang, ZQ Yang | Journal of Cosmology and Astroparticle Physics 2018 (10), 035 | 10 | 2018 |
| Critical behaviors and universality classes of percolation phase transitions on two-dimensional square lattice | Y Zhu, ZQ Yang, X Zhang, XS Chen | Communications in Theoretical Physics 64 (2), 231 | 9 | 2015 |
| CINO: A Chinese Minority Pre-trained Language Model | Z Yang, Z Xu, Y Cui, B Wang, M Lin, D Wu, Z Chen | arXiv preprint arXiv:2202.13558 | 4 | 2022 |
| Interactive gated decoder for machine reading comprehension | Y Cui, W Che, Z Yang, T Liu, B Qin, S Wang, G Hu | Transactions on Asian and Low-Resource Language Information Processing 21 (4 … | 4 | 2022 |
| TextPruner: A model pruning toolkit for pre-trained language models | Z Yang, Y Cui, Z Chen | arXiv preprint arXiv:2203.15996 | 3 | 2022 |
| Cross-lingual text classification with multilingual distillation and zero-shot-aware training | Z Yang, Y Cui, Z Chen, S Wang | arXiv preprint arXiv:2202.13654 | 2 | 2022 |
| Criticality of networks with long-range connections | ZQ Yang, MX Liu, XS Chen | Science China Physics, Mechanics, and Astronomy 60 (2), 20521 | 2 | 2017 |
| HIT at SemEval-2022 Task 2: Pre-trained language model for idioms detection | Z Chu, Z Yang, Y Cui, Z Chen, M Liu | arXiv preprint arXiv:2204.06145 | 1 | 2022 |
| HFL at SemEval-2022 Task 8: A Linguistics-inspired Regression Model with Data Augmentation for Multilingual News Similarity | Z Xu, Z Yang, Y Cui, Z Chen | arXiv preprint arXiv:2204.04844 | 1 | 2022 |
| Adversarial training for machine reading comprehension with virtual embeddings | Z Yang, Y Cui, C Si, W Che, T Liu, S Wang, G Hu | arXiv preprint arXiv:2106.04437 | 1 | 2021 |
| Bilingual Alignment Pre-Training for Zero-Shot Cross-Lingual Transfer | Z Yang, W Ma, Y Cui, J Ye, W Che, S Wang | arXiv preprint arXiv:2106.01732 | 1 | 2021 |