Rahul Mazumder
Cited by
Spectral regularization algorithms for learning large incomplete matrices
R Mazumder, T Hastie, R Tibshirani
Journal of Machine Learning Research 11, 2287-2322, 2010
Best subset selection via a modern optimization lens
D Bertsimas, A King, R Mazumder
Annals of Statistics 44 (2), 813-852, 2016
Matrix completion and low-rank SVD via fast alternating least squares
T Hastie, R Mazumder, JD Lee, R Zadeh
Journal of Machine Learning Research 16 (1), 3367-3402, 2015
SparseNet: Coordinate Descent With Nonconvex Penalties
R Mazumder, JH Friedman, T Hastie
Journal of the American Statistical Association 106 (495), 1125-1138, 2011
The graphical lasso: New insights and alternatives
R Mazumder, T Hastie
Electronic Journal of Statistics 6, 2125, 2012
Exact covariance thresholding into connected components for large-scale graphical lasso
R Mazumder, T Hastie
Journal of Machine Learning Research 13 (1), 781-794, 2012
Fast best subset selection: Coordinate descent and local combinatorial optimization algorithms
H Hazimeh, R Mazumder
Operations Research 68 (5), 1517-1537, 2020
An extended Frank–Wolfe method with “in-face” directions, and its application to low-rank matrix completion
RM Freund, P Grigas, R Mazumder
SIAM Journal on Optimization 27 (1), 319-346, 2017
DSelect-k: Differentiable selection in the mixture of experts with applications to multi-task learning
H Hazimeh, Z Zhao, A Chowdhery, M Sathiamoorthy, Y Chen, ...
Advances in Neural Information Processing Systems 34, 29335-29347, 2021
A computational framework for multivariate convex regression and its variants
R Mazumder, A Choudhury, G Iyengar, B Sen
Journal of the American Statistical Association 114 (525), 318-331, 2019
Subset selection with shrinkage: Sparse linear modeling when the SNR is low
R Mazumder, P Radchenko, A Dedieu
Operations Research 71 (1), 129-147, 2023
Sparse regression at scale: branch-and-bound rooted in first-order optimization
H Hazimeh, R Mazumder, A Saab
Mathematical Programming, 2021
The tree ensemble layer: Differentiability meets conditional computation
H Hazimeh, N Ponomareva, P Mol, Z Tan, R Mazumder
International Conference on Machine Learning, 4138-4148, 2020
A new perspective on boosting in linear regression via subgradient optimization and relatives
RM Freund, P Grigas, R Mazumder
Annals of Statistics 45 (6), 2328-2364, 2017
Least quantile regression via modern optimization
D Bertsimas, R Mazumder
Annals of Statistics 42 (6), 2494-2525, 2014
Certifiably optimal low rank factor analysis
D Bertsimas, MS Copenhaver, R Mazumder
Journal of Machine Learning Research 18 (29), 1-53, 2017
Learning sparse classifiers: Continuous and mixed integer optimization perspectives
A Dedieu, H Hazimeh, R Mazumder
Journal of Machine Learning Research 22 (135), 1-47, 2021
The trimmed lasso: Sparsity and robustness
D Bertsimas, MS Copenhaver, R Mazumder
arXiv preprint arXiv:1708.04527, 2017
The Discrete Dantzig Selector: Estimating Sparse Linear Models via Mixed Integer Linear Optimization
R Mazumder, P Radchenko
IEEE Transactions on Information Theory 63 (5), 3053-3075, 2017
Assessing the significance of global and local correlations under spatial autocorrelation: a nonparametric approach
J Viladomat, R Mazumder, A McInturff, DJ McCauley, T Hastie
Biometrics 70 (2), 409-418, 2014