Advait Gadhikar
PhD student at CISPA
Verified email at cispa.de - Homepage
Title | Cited by | Year
A field guide to federated optimization
J Wang, Z Charles, Z Xu, G Joshi, HB McMahan, M Al-Shedivat, G Andrew, ...
arXiv preprint arXiv:2107.06917, 2021
Cited by 399 · 2021
Adaptive quantization of model updates for communication-efficient federated learning
D Jhunjhunwala, A Gadhikar, G Joshi, YC Eldar
ICASSP 2021-2021 IEEE International Conference on Acoustics, Speech and …, 2021
Cited by 129 · 2021
Why random pruning is all we need to start sparse
A Gadhikar, S Mukherjee, R Burkholz
arXiv preprint arXiv:2210.02412, 2022
Cited by 23 · 2022
Leveraging spatial and temporal correlations in sparsified mean estimation
D Jhunjhunwala, A Mallick, A Gadhikar, S Kadhe, G Joshi
Advances in Neural Information Processing Systems 34, 14280-14292, 2021
Cited by 18 · 2021
Masks, signs, and learning rate rewinding
A Gadhikar, R Burkholz
arXiv preprint arXiv:2402.19262, 2024
Cited by 6 · 2024
Lottery tickets with nonzero biases
J Fischer, A Gadhikar, R Burkholz
arXiv preprint arXiv:2110.11150, 2021
Cited by 4 · 2021
Dynamical isometry for residual networks
A Gadhikar, R Burkholz
CISPA, 2022
Cited by 2 · 2022
Attention Is All You Need For Mixture-of-Depths Routing
A Gadhikar, SK Majumdar, N Popp, P Saranrittichai, M Rapp, L Schott
arXiv preprint arXiv:2412.20875, 2024
· 2024
Cyclic Sparse Training: Is it Enough?
A Gadhikar, SH Nelaturu, R Burkholz
arXiv preprint arXiv:2406.02773, 2024
· 2024
PaI is getting competitive by training longer
A Gadhikar, SH Nelaturu, R Burkholz
Articles 1–10