Iulia Turc
Verified email at google.com
Title · Cited by · Year
Well-read students learn better: On the importance of pre-training compact models
I Turc, MW Chang, K Lee, K Toutanova
arXiv preprint arXiv:1908.08962, 2019
Cited by 174 · 2019
Well-read students learn better: The impact of student initialization on knowledge distillation
I Turc, MW Chang, K Lee, K Toutanova
arXiv preprint arXiv:1908.08962 13, 2019
Cited by 69 · 2019
CANINE: Pre-training an efficient tokenization-free encoder for language representation
JH Clark, D Garrette, I Turc, J Wieting
arXiv preprint arXiv:2103.06874, 2021
Cited by 18 · 2021
The MultiBERTs: BERT reproductions for robustness analysis
T Sellam, S Yadlowsky, J Wei, N Saphra, A D'Amour, T Linzen, J Bastings, ...
arXiv preprint arXiv:2106.16163, 2021
Cited by 5 · 2021
Revisiting the primacy of English in zero-shot cross-lingual transfer
I Turc, K Lee, J Eisenstein, MW Chang, K Toutanova
arXiv preprint arXiv:2106.16171, 2021
Cited by 2 · 2021
High Performance Natural Language Processing
G Ilharco, C Ilharco, I Turc, T Dettmers, F Ferreira, K Lee
Proceedings of the 2020 Conference on Empirical Methods in Natural Language …, 2020
Cited by 2 · 2020
Learning Task Sampling Policy for Multitask Learning
D Sundararaman, H Tsai, KH Lee, IR Turc, L Carin
2021