Transformers: State-of-the-Art Natural Language Processing T Wolf arXiv preprint arXiv:1910.03771, 2020 | 9426 | 2020 |
DistilBERT, a distilled version of BERT: smaller, faster, cheaper and lighter V Sanh arXiv preprint arXiv:1910.01108, 2019 | 8816 | 2019 |
Datasets: A community library for natural language processing Q Lhoest, AV Del Moral, Y Jernite, A Thakur, P Von Platen, S Patil, ... arXiv preprint arXiv:2109.02846, 2021 | 301 | 2021 |
PEFT: State-of-the-art parameter-efficient fine-tuning methods S Mangrulkar, S Gugger, L Debut, Y Belkada, S Paul, B Bossan URL: https://github.com/huggingface/peft, 2022 | 295 | 2022 |
DistilBERT, a distilled version of BERT: smaller, faster, cheaper and lighter. CoRR abs/1910.01108 (2019) V Sanh, L Debut, J Chaumond, T Wolf URL: http://arxiv.org/abs/1910.01108, 2019 | 121 | 2019 |
HuggingFace’s Transformers: State-of-the-art natural language processing. arXiv T Wolf, L Debut, V Sanh, J Chaumond, C Delangue, A Moi, P Cistac, ... arXiv preprint arXiv:1910.03771, 2019 | 92 | 2019 |
Accelerate: Training and inference at scale made simple, efficient and adaptable S Gugger, L Debut, T Wolf, P Schmid, Z Mueller, S Mangrulkar, M Sun, ... | 66 | 2022 |
Transformers: State-of-the-Art Natural Language Processing T Wolf, L Debut, V Sanh, J Chaumond, C Delangue, A Moi, P von Platen, C Ma, Y Jernite, J Plu, C Xu, TL Scao, S Gugger, ..., 2020 | 44 | 2020 |
HuggingFace’s Transformers: State-of-the-art natural language processing. CoRR abs/1910.03771 (2019) T Wolf, L Debut, V Sanh, J Chaumond, C Delangue, A Moi, P Cistac, ... URL: http://arxiv.org/abs/1910.03771, 2019 | 41 | 2019 |
PEFT: State-of-the-art parameter-efficient fine-tuning methods (2022) S Mangrulkar, S Gugger, L Debut, Y Belkada, S Paul, B Bossan URL https://github.com/huggingface/peft, 2023 | 30 | 2023 |
DistilBERT, a distilled version of BERT: smaller, faster, cheaper and lighter. arXiv 2019. doi: 10.48550/arXiv.1910.01108 V Sanh, L Debut, J Chaumond, T Wolf arXiv, 2019 | 30 | 2019 |
DistilBERT, a distilled version of BERT: Smaller, faster, cheaper and lighter (arXiv:1910.01108). arXiv V Sanh, L Debut, J Chaumond, T Wolf Retrieved 2023-01-22, from http://arxiv.org/abs/1910.01108 doi: 10.48550/arXiv.1910.01108, 2019 | 21 | 2019 |
HuggingFace’s Transformers: State-of-the-art Natural Language Processing. arXiv e-prints T Wolf, L Debut, V Sanh, J Chaumond, C Delangue, A Moi, P Cistac, ... arXiv preprint arXiv:1910.03771, 2019 | 19 | 2019 |
DistilBERT, a distilled version of BERT: smaller, faster, cheaper and lighter. CoRR abs/1910.01108 V Sanh, L Debut, J Chaumond, T Wolf | 18 | 2019 |
HuggingFace’s Transformers: State-of-the-Art Natural Language Processing. arXiv:1910.03771 [cs] T Wolf, L Debut, V Sanh, J Chaumond, C Delangue, A Moi, P Cistac, ... | 12 | 2020 |
DistilBERT, a distilled version of BERT: smaller, faster, cheaper and lighter. arXiv:1910.01108 V Sanh, L Debut, J Chaumond, T Wolf | 12 | 2019 |
HuggingFace’s Transformers: State-of-the-art natural language processing. arXiv preprint arXiv:1910.03771 T Wolf, L Debut, V Sanh, J Chaumond, C Delangue, A Moi, P Cistac, ... | 10 | 2019 |
Transformers: State-of-the-Art Natural Language Processing. arXiv. 2020; arXiv:1910.03771v5 T Wolf, L Debut, V Sanh, J Chaumond, C Delangue, A Moi doi: 10.48550/arXiv.1910.03771, 2020 | 9 | 2020 |
Transformers: State-of-the-art natural language processing. (2019) T Wolf, L Debut, V Sanh, J Chaumond, C Delangue, A Moi, ..., 2019 | 9 | 2019 |
Accelerate: Training and inference at scale made simple, efficient and adaptable S Gugger, L Debut, T Wolf, P Schmid, Z Mueller, S Mangrulkar | 6 | 2022 |