| Title | Authors | Venue | Cited by | Year |
|---|---|---|---|---|
| Asymptotics of Ridge(less) Regression under General Source Condition | D Richards, J Mourtada, L Rosasco | International Conference on Artificial Intelligence and Statistics, 3889-3897 | 77 | 2021 |
| Graph-Dependent Implicit Regularisation for Distributed Stochastic Subgradient Descent | D Richards, P Rebeschini | Journal of Machine Learning Research 21 (34), 1-44 | 24 | 2020 |
| Decentralised Learning with Distributed Gradient Descent and Random Features | D Richards, P Rebeschini, L Rosasco | Proceedings of Machine Learning Research | 22* | 2020 |
| Optimal Statistical Rates for Decentralised Non-Parametric Regression with Linear Speed-Up | D Richards, P Rebeschini | NeurIPS 2019 | 18 | 2019 |
| Stability & Generalisation of Gradient Descent for Shallow Neural Networks without the Neural Tangent Kernel | D Richards, I Kuzborskij | Advances in Neural Information Processing Systems 34 | 16 | 2021 |
| Distributed Machine Learning with Sparse Heterogeneous Data | D Richards, S Negahban, P Rebeschini | Advances in Neural Information Processing Systems 34 | 10* | 2021 |
| Learning with Gradient Descent and Weakly Convex Losses | D Richards, M Rabbat | International Conference on Artificial Intelligence and Statistics, 1990-1998 | 10 | 2021 |
| Comparing Classes of Estimators: When does Gradient Descent Beat Ridge Regression in Linear Models? | D Richards, E Dobriban, P Rebeschini | arXiv preprint arXiv:2108.11872 | 1 | 2021 |