Nikita Doikov
Title · Cited by · Year
Stochastic Subspace Cubic Newton Method
F Hanzely, N Doikov, P Richtárik, Y Nesterov
ICML 2020 (International Conference on Machine Learning), 2020
Cited by 53, 2020
Randomized Block Cubic Newton Method
N Doikov, P Richtárik
ICML 2018 (International Conference on Machine Learning), 2018
Cited by 43, 2018
Minimizing Uniformly Convex Functions by Cubic Regularization of Newton Method
N Doikov, Y Nesterov
Journal of Optimization Theory and Applications, 2021
Cited by 41, 2021
Gradient regularization of Newton method with Bregman distances
N Doikov, Y Nesterov
Mathematical Programming, 1-25, 2023
Cited by 37, 2023
Contracting Proximal Methods for Smooth Convex Optimization
N Doikov, Y Nesterov
SIAM Journal on Optimization 30 (4), 2020
Cited by 31, 2020
Local convergence of tensor methods
N Doikov, Y Nesterov
Mathematical Programming, 2021
Cited by 29, 2021
Super-Universal Regularized Newton Method
N Doikov, K Mishchenko, Y Nesterov
SIAM Journal on Optimization 34 (1), 27-56, 2024
Cited by 26, 2024
Inexact Tensor Methods with Dynamic Accuracies
N Doikov, Y Nesterov
ICML 2020 (International Conference on Machine Learning), 2020
Cited by 25, 2020
Second-order optimization with lazy Hessians
N Doikov, EM Chayti, M Jaggi
ICML 2023 (International Conference on Machine Learning), 2022
Cited by 16, 2022
Affine-invariant contracting-point methods for Convex Optimization
N Doikov, Y Nesterov
Mathematical Programming, 1-23, 2022
Cited by 13, 2022
High-Order Optimization Methods for Fully Composite Problems
N Doikov, Y Nesterov
SIAM Journal on Optimization 32 (3), 2402-2427, 2022
Cited by 9, 2022
Convex optimization based on global lower second-order models
N Doikov, Y Nesterov
NeurIPS 2020 (Advances in Neural Information Processing Systems 33), 2020
Cited by 9, 2020
New second-order and tensor methods in Convex Optimization
N Doikov
Université catholique de Louvain, 2021
Cited by 7, 2021
On Convergence of Incremental Gradient for Non-Convex Smooth Functions
A Koloskova, N Doikov, SU Stich, M Jaggi
ICML 2024 (International Conference on Machine Learning), 2023
Cited by 6*, 2023
First and zeroth-order implementations of the regularized Newton method with lazy approximated Hessians
N Doikov, GN Grapiglia
arXiv preprint arXiv:2309.02412, 2023
Cited by 4, 2023
Minimizing Quasi-Self-Concordant Functions by Gradient Regularization of Newton Method
N Doikov
arXiv preprint arXiv:2308.14742, 2023
Cited by 3, 2023
Multicriteria and multimodal probabilistic topic models of text document collections
KV Vorontsov, AA Potapenko, AI Frey, MA Apishev, NV Doikov, ...
10th International Conference on Intellectualization of Information Processing (IOI), 198, 2014
Cited by 3, 2014
Cubic regularized subspace Newton for non-convex optimization
J Zhao, A Lucchi, N Doikov
arXiv preprint arXiv:2406.16666, 2024
Cited by 2, 2024
Linearization Algorithms for Fully Composite Optimization
ML Vladarean, N Doikov, M Jaggi, N Flammarion
COLT 2023 (Conference on Learning Theory), 2023
Cited by 2, 2023
Unified Convergence Theory of Stochastic and Variance-Reduced Cubic Newton Methods
EM Chayti, N Doikov, M Jaggi
arXiv preprint arXiv:2302.11962, 2023
Cited by 2, 2023
Articles 1–20