Lysandre Debut
Machine Learning Engineer, Hugging Face
Verified email at huggingface.co
Title · Cited by · Year
Transformers: State-of-the-Art Natural Language Processing
T Wolf
arXiv preprint arXiv:1910.03771, 2020
Cited by 9426 · 2020
DistilBERT, a distilled version of BERT: smaller, faster, cheaper and lighter
V Sanh
arXiv preprint arXiv:1910.01108, 2019
Cited by 8816 · 2019
Datasets: A community library for natural language processing
Q Lhoest, AV Del Moral, Y Jernite, A Thakur, P Von Platen, S Patil, ...
arXiv preprint arXiv:2109.02846, 2021
Cited by 301 · 2021
PEFT: State-of-the-art parameter-efficient fine-tuning methods
S Mangrulkar, S Gugger, L Debut, Y Belkada, S Paul, B Bossan
URL: https://github.com/huggingface/peft, 2022
Cited by 295 · 2022
DistilBERT, a distilled version of BERT: smaller, faster, cheaper and lighter. CoRR abs/1910.01108 (2019)
V Sanh, L Debut, J Chaumond, T Wolf
URL: http://arxiv.org/abs/1910.01108, 2019
Cited by 121 · 2019
HuggingFace’s Transformers: State-of-the-art Natural Language Processing. arXiv
T Wolf, L Debut, V Sanh, J Chaumond, C Delangue, A Moi, P Cistac, ...
arXiv preprint arXiv:1910.03771, 2019
Cited by 92 · 2019
Accelerate: Training and inference at scale made simple, efficient and adaptable
S Gugger, L Debut, T Wolf, P Schmid, Z Mueller, S Mangrulkar, M Sun, ...
Cited by 66 · 2022
Platen Patrick von
W Thomas, D Lysandre, S Victor, C Julien, D Clement, M Anthony, ...
Ma Clara, Jernite Yacine, Plu Julien, Xu Canwen, Scao Teven Le, Gugger …, 2020
Cited by 44 · 2020
HuggingFace’s Transformers: State-of-the-art Natural Language Processing. CoRR abs/1910.03771 (2019)
T Wolf, L Debut, V Sanh, J Chaumond, C Delangue, A Moi, P Cistac, ...
URL: http://arxiv.org/abs/1910.03771, 2019
Cited by 41 · 2019
PEFT: state-of-the-art parameter-efficient fine-tuning methods (2022)
S Mangrulkar, S Gugger, L Debut, Y Belkada, S Paul, B Bossan
URL: https://github.com/huggingface/peft, 2023
Cited by 30 · 2023
DistilBERT, a distilled version of BERT: smaller, faster, cheaper and lighter. arXiv 2019. doi: 10.48550/arXiv.1910.01108
V Sanh, L Debut, J Chaumond, T Wolf
arXiv, 2019
Cited by 30 · 2019
DistilBERT, a distilled version of BERT: Smaller, faster, cheaper and lighter (arXiv:1910.01108). arXiv
V Sanh, L Debut, J Chaumond, T Wolf
Retrieved 2023-01-22, from http://arxiv.org/abs/1910.01108. doi: 10.48550/arXiv.1910.01108, 2019
Cited by 21 · 2019
HuggingFace’s Transformers: State-of-the-art Natural Language Processing. arXiv e-prints
T Wolf, L Debut, V Sanh, J Chaumond, C Delangue, A Moi, P Cistac, ...
arXiv preprint arXiv:1910.03771, 2019
Cited by 19 · 2019
DistilBERT, a distilled version of BERT: smaller, faster, cheaper and lighter. CoRR abs/1910.01108
V Sanh, L Debut, J Chaumond, T Wolf
Cited by 18 · 2019
HuggingFace’s Transformers: State-Of-The-Art Natural Language Processing. arXiv:1910.03771 [cs]
T Wolf, L Debut, V Sanh, J Chaumond, C Delangue, A Moi, P Cistac, ...
Cited by 12 · 2020
DistilBERT, a distilled version of BERT: smaller, faster, cheaper and lighter. arXiv:1910.01108
V Sanh, L Debut, J Chaumond, T Wolf
Cited by 12 · 2019
HuggingFace’s Transformers: State-of-the-art Natural Language Processing. arXiv preprint arXiv:1910.03771
T Wolf, L Debut, V Sanh, J Chaumond, C Delangue, A Moi, P Cistac, ...
Cited by 10 · 2019
Transformers: State-of-the-Art Natural Language Processing. arXiv. 2020; arXiv:1910.03771v5
T Wolf, L Debut, V Sanh, J Chaumond, C Delangue, A Moi
doi: 10.48550/arXiv.1910.03771, 2020
Cited by 9 · 2020
Transformers: State-of-the-art natural language processing. (2019)
W Thomas, D Lysandre, S Victor, C Julien, D Clement, M Anthony, ...
2019
Cited by 9 · 2019
Accelerate: Training and inference at scale made simple, efficient and adaptable
G Sylvain, D Lysandre, W Thomas, S Philipp, M Zachary, M Sourab
Cited by 6 · 2022