Thibault Castells
Research engineer at Nota.ai
Verified email at nota.ai
Title | Cited by | Year
Superloss: A generic loss for robust curriculum learning
T Castells, P Weinzaepfel, J Revaud
Advances in Neural Information Processing Systems 33, 4308-4319, 2020
73 | 2020
On architectural compression of text-to-image diffusion models
BK Kim, HK Song, T Castells, S Choi
arXiv preprint arXiv:2305.15798, 2023
24 | 2023
BK-SDM: Architecturally compressed stable diffusion for efficient text-to-image generation
BK Kim, HK Song, T Castells, S Choi
Workshop on Efficient Systems for Foundation Models @ ICML 2023, 2023
9 | 2023
Automatic neural network pruning that efficiently preserves the model accuracy
T Castells, SK Yeom
arXiv preprint arXiv:2111.09635, 2021
5 | 2021
Superloss: A generic loss for robust curriculum learning
P Weinzaepfel, J Revaud, T Castells
US Patent App. 17/383,860, 2022
3 | 2022
Shortened LLaMA: A Simple Depth Pruning for Large Language Models
BK Kim, G Kim, TH Kim, T Castells, S Choi, J Shin, HK Song
arXiv preprint arXiv:2402.02834, 2024
2 | 2024
LD-Pruner: Efficient Pruning of Latent Diffusion Models using Task-Agnostic Insights
T Castells, HK Song, BK Kim, S Choi
arXiv preprint arXiv:2404.11936, 2024
2024
EdgeFusion: On-Device Text-to-Image Generation
T Castells, HK Song, T Piao, S Choi, BK Kim, H Yim, C Lee, JG Kim, ...
arXiv preprint arXiv:2404.11925, 2024
2024
Method and apparatus for information flow based automatic neural network compression that preserves the model accuracy
SK Yeom, T Castells
US Patent App. 18/056,644, 2023
2023
Supplementary Material for SuperLoss: A Generic Loss for Robust Curriculum Learning
T Castells, P Weinzaepfel, J Revaud