Yujia Qin
Verified email at mails.tsinghua.edu.cn - Homepage
Title · Cited by · Year
CPM: A large-scale generative Chinese pre-trained language model
Z Zhang, X Han, H Zhou, P Ke, Y Gu, D Ye, Y Qin, Y Su, H Ji, J Guan, F Qi, ...
AI Open 2, 93-99, 2021
Cited by 68 · 2021
ERICA: Improving Entity and Relation Understanding for Pre-trained Language Models via Contrastive Learning
Y Qin, Y Lin, R Takanobu, Z Liu, P Li, H Ji, M Huang, M Sun, J Zhou
ACL 2021, 2020
Cited by 67 · 2020
Delta tuning: A comprehensive study of parameter efficient methods for pre-trained language models
N Ding, Y Qin, G Yang, F Wei, Z Yang, Y Su, S Hu, Y Chen, CM Chan, ...
Nature Machine Intelligence, 2022
Cited by 64 · 2022
Learning from Explanations with Neural Execution Tree
Z Wang, Y Qin, W Zhou, J Yan, Q Ye, L Neves, Z Liu, X Ren
ICLR 2020, 2019
Cited by 33 · 2019
On transferability of prompt tuning for natural language processing
Y Su, X Wang, Y Qin, CM Chan, Y Lin, H Wang, K Wen, Z Liu, P Li, J Li, ...
Proceedings of the 2022 Conference of the North American Chapter of the …, 2022
Cited by 29 · 2022
Knowledge inheritance for pre-trained language models
Y Qin, Y Lin, J Yi, J Zhang, X Han, Z Zhang, Y Su, Z Liu, P Li, M Sun, ...
NAACL 2022, 2021
Cited by 20 · 2021
ELLE: Efficient Lifelong Pre-training for Emerging Data
Y Qin, J Zhang, Y Lin, Z Liu, P Li, M Sun, J Zhou
Findings of ACL 2022, 2022
Cited by 19 · 2022
Exploring low-dimensional intrinsic task subspace via prompt tuning
Y Qin, X Wang, Y Su, Y Lin, N Ding, Z Liu, J Li, L Hou, P Li, M Sun, J Zhou
Previously Accepted by Findings of ACL 2022 and EMNLP 2022, 2021
Cited by 17 · 2021
Improving sequence modeling ability of recurrent neural networks via sememes
Y Qin, F Qi, S Ouyang, Z Liu, C Yang, Y Wang, Q Liu, M Sun
IEEE/ACM Transactions on Audio, Speech, and Language Processing 28, 2364-2373, 2020
Cited by 15 · 2020
Tool learning with foundation models
Y Qin, S Hu, Y Lin, W Chen, N Ding, G Cui, Z Zeng, Y Huang, C Xiao, ...
arXiv preprint arXiv:2304.08354, 2023
Cited by 14 · 2023
bert2BERT: Towards Reusable Pretrained Language Models
C Chen, Y Yin, L Shang, X Jiang, Y Qin, F Wang, Z Wang, X Chen, Z Liu, ...
ACL 2022, 2021
Cited by 14 · 2021
ProQA: Structural Prompt-based Pre-training for Unified Question Answering
W Zhong, Y Gao, N Ding, Y Qin, Z Liu, M Zhou, J Wang, J Yin, N Duan
NAACL 2022, 2022
Cited by 13 · 2022
On Transferability of Prompt Tuning for Natural Language Understanding
Y Su, X Wang, Y Qin, CM Chan, Y Lin, Z Liu, P Li, J Li, L Hou, M Sun, ...
NAACL 2022, 2021
Cited by 13 · 2021
Parameter-efficient fine-tuning of large-scale pre-trained language models
N Ding, Y Qin, G Yang, F Wei, Z Yang, Y Su, S Hu, Y Chen, CM Chan, ...
Nature Machine Intelligence 5 (3), 220-235, 2023
Cited by 7 · 2023
Enhancing recurrent neural networks with sememes
Y Qin, F Qi, S Ouyang, Z Liu, C Yang, Y Wang, Q Liu, M Sun
IEEE/ACM Transactions on Audio, Speech, and Language Processing 28, 2364-2373, 2020
Cited by 6 · 2020
Delta tuning: A comprehensive study of parameter efficient methods for pre-trained language models, 2022
N Ding, Y Qin, G Yang, F Wei, Z Yang, Y Su, S Hu, Y Chen, CM Chan, ...
URL https://arxiv.org/abs/2203.06904
Cited by 5
Exploring Mode Connectivity for Pre-trained Language Models
Y Qin, C Qian, J Yi, W Chen, Y Lin, X Han, Z Liu, M Sun, J Zhou
EMNLP 2022, 2022
Cited by 2 · 2022
Different Tunes Played with Equal Skill: Exploring a Unified Optimization Subspace for Parameter-Efficient Tuning
J Yi, W Chen, Y Qin, Y Lin, N Ding, X Han, Z Liu, M Sun, J Zhou
Findings of the Association for Computational Linguistics: EMNLP 2022, 3348-3366, 2022
Cited by 1 · 2022
Pass off fish eyes for pearls: Attacking model selection of pre-trained models
B Zhu, Y Qin, F Qi, Y Deng, Z Liu, M Sun, M Gu
Proceedings of the 60th Annual Meeting of the Association for Computational …, 2022
Cited by 1 · 2022
Exploring Universal Intrinsic Task Subspace via Prompt Tuning
Y Qin, X Wang, Y Su, Y Lin, N Ding, J Yi, W Chen, Z Liu, J Li, L Hou, P Li, ...
arXiv e-prints, arXiv:2110.07867, 2021
Cited by 1 · 2021