Filip Hanzely
Unknown affiliation
Verified email at kaust.edu.sa
Federated learning of a mixture of global and local models
F Hanzely, P Richtárik
arXiv preprint arXiv:2002.05516, 2020
Cited by 426 · 2020
A field guide to federated optimization
J Wang, Z Charles, Z Xu, G Joshi, HB McMahan, M Al-Shedivat, G Andrew, ...
arXiv preprint arXiv:2107.06917, 2021
Cited by 379 · 2021
Lower bounds and optimal algorithms for personalized federated learning
F Hanzely, S Hanzely, S Horváth, P Richtárik
NeurIPS, 2020
Cited by 191 · 2020
A unified theory of SGD: Variance reduction, sampling, quantization and coordinate descent
E Gorbunov, F Hanzely, P Richtárik
International Conference on Artificial Intelligence and Statistics, 680-690, 2020
Cited by 167 · 2020
Local SGD: Unified theory and new efficient methods
E Gorbunov, F Hanzely, P Richtárik
International Conference on Artificial Intelligence and Statistics, 3556-3564, 2021
Cited by 110 · 2021
Accelerated Bregman proximal gradient methods for relatively smooth convex optimization
F Hanzely, P Richtárik, L Xiao
Computational Optimization and Applications, 2018
Cited by 88 · 2018
SEGA: Variance reduction via gradient sketching
F Hanzely, K Mishchenko, P Richtárik
NeurIPS, 2018
Cited by 86 · 2018
Fastest rates for stochastic mirror descent methods
F Hanzely, P Richtárik
Computational Optimization and Applications 79, 717-766, 2021
Cited by 56 · 2021
Accelerated stochastic matrix inversion: general theory and speeding up BFGS rules for faster second-order optimization
RM Gower, F Hanzely, P Richtárik, S Stich
NeurIPS, 2018
Cited by 52 · 2018
Stochastic Subspace Cubic Newton Method
F Hanzely, N Doikov, P Richtárik, Y Nesterov
ICML, 2020
Cited by 51 · 2020
Accelerated coordinate descent with arbitrary sampling and best rates for minibatches
F Hanzely, P Richtárik
AISTATS, 2019
Cited by 48 · 2019
Personalized federated learning: A unified framework and universal optimization techniques
F Hanzely, B Zhao, M Kolar
arXiv preprint arXiv:2102.09743, 2021
Cited by 47 · 2021
Testing for causality in reconstructed state spaces by an optimized mixed prediction method
A Krakovská, F Hanzely
Physical Review E 94 (5), 052203, 2016
Cited by 36 · 2016
99% of worker-master communication in distributed optimization is not needed
K Mishchenko, F Hanzely, P Richtárik
Conference on Uncertainty in Artificial Intelligence, 979-988, 2020
Cited by 34* · 2020
One method to rule them all: Variance reduction for data, parameters and many new methods
F Hanzely, P Richtárik
arXiv preprint arXiv:1905.11266, 2019
Cited by 28 · 2019
Smoothness matrices beat smoothness constants: Better communication compression techniques for distributed optimization
M Safaryan, F Hanzely, P Richtárik
Advances in Neural Information Processing Systems 34, 25688-25702, 2021
Cited by 27 · 2021
Variance Reduced Coordinate Descent with Acceleration: New Method With a Surprising Application to Finite-Sum Problems
F Hanzely, D Kovalev, P Richtárik
ICML, 2020
Cited by 21 · 2020
Privacy preserving randomized gossip algorithms
F Hanzely, J Konečný, N Loizou, P Richtárik, D Grishchenko
arXiv preprint arXiv:1706.07636, 2017
Cited by 21 · 2017
A nonconvex projection method for robust PCA
A Dutta, F Hanzely, P Richtárik
Proceedings of the AAAI Conference on Artificial Intelligence 33 (01), 1468-1476, 2019
Cited by 20 · 2019
A privacy preserving randomized gossip algorithm via controlled noise insertion
F Hanzely, J Konečný, N Loizou, P Richtárik, D Grishchenko
NeurIPS PPML Workshop, 2018
Cited by 9 · 2018