Dominik Stöger
Assistant Professor, KU Eichstätt-Ingolstadt
Verified email at ku.de - Homepage
Title
Cited by
Year
Small random initialization is akin to spectral learning: Optimization and generalization guarantees for overparameterized low-rank matrix reconstruction
D Stöger, M Soltanolkotabi
Advances in Neural Information Processing Systems 34, 2021
Cited by 80 · 2021
Blind demixing and deconvolution at near-optimal rate
P Jung, F Krahmer, D Stöger
IEEE Transactions on Information Theory 64 (2), 704-727, 2017
Cited by 53 · 2017
Understanding overparameterization in generative adversarial networks
Y Balaji, M Sajedi, NM Kalibhat, M Ding, D Stöger, M Soltanolkotabi, ...
International Conference on Learning Representations 1, 2021
Cited by 34 · 2021
Iteratively Reweighted Least Squares for Basis Pursuit with Global Linear Convergence Rate
C Kümmerle, C Mayrink Verdun, D Stöger
Advances in Neural Information Processing Systems 34, 2873-2886, 2021
Cited by 23* · 2021
Implicit balancing and regularization: Generalization and convergence guarantees for overparameterized asymmetric matrix sensing
M Soltanolkotabi, D Stöger, C Xie
The Thirty Sixth Annual Conference on Learning Theory, 5140-5142, 2023
Cited by 20 · 2023
On the convex geometry of blind deconvolution and matrix completion
F Krahmer, D Stöger
Communications on Pure and Applied Mathematics 74 (4), 790-832, 2021
Cited by 17 · 2021
Complex phase retrieval from subgaussian measurements
F Krahmer, D Stöger
Journal of Fourier Analysis and Applications 26 (6), 89, 2020
Cited by 16 · 2020
Rigidity for perimeter inequality under spherical symmetrisation
F Cagnetti, M Perugini, D Stöger
Calculus of Variations and Partial Differential Equations 59, 1-53, 2020
Cited by 13 · 2020
Blind deconvolution and compressed sensing
D Stöger, P Jung, F Krahmer
2016 4th International Workshop on Compressed Sensing Theory and its …, 2016
Cited by 12 · 2016
Randomly initialized alternating least squares: Fast convergence for matrix sensing
K Lee, D Stöger
SIAM Journal on Mathematics of Data Science 5 (3), 774-799, 2023
Cited by 10 · 2023
Sparse Power Factorization: Balancing peakiness and sample complexity
J Geppert, F Krahmer, D Stöger
Advances in Computational Mathematics 45 (3), 1711-1728, 2019
Cited by 10 · 2019
Proof methods for robust low-rank matrix recovery
T Fuchs, D Gross, P Jung, F Krahmer, R Kueng, D Stöger
Compressed Sensing in Information Processing, 37-75, 2022
Cited by 9 · 2022
Refined performance guarantees for sparse power factorization
JA Geppert, F Krahmer, D Stöger
2017 International Conference on Sampling Theory and Applications (SampTA …, 2017
Cited by 7 · 2017
Blind demixing and deconvolution with noisy data: Near-optimal rate
D Stöger, P Jung, F Krahmer
WSA 2017; 21th International ITG Workshop on Smart Antennas, 1-5, 2017
Cited by 5 · 2017
Blind Demixing and Deconvolution with Noisy Data: Near-optimal Rate
P Jung, F Krahmer, D Stoeger
WSA 2017; 21th International ITG Workshop on Smart Antennas; Proceedings of, 1-5, 2017
Cited by 5* · 2017
How to induce regularization in generalized linear models: A guide to reparametrizing gradient flow
HH Chou, J Maly, D Stöger
arXiv preprint arXiv:2308.04921, 2023
Cited by 2 · 2023
Blind Deconvolution: Convex Geometry and Noise Robustness
F Krahmer, D Stöger
2018 52nd Asilomar Conference on Signals, Systems, and Computers, 643-646, 2018
Cited by 2 · 2018
Robust Recovery of Low-Rank Matrices and Low-Tubal-Rank Tensors from Noisy Sketches
A Ma, D Stöger, Y Zhu
SIAM Journal on Matrix Analysis and Applications 44 (4), 1566-1588, 2023
Cited by 1 · 2023
Sparse power factorization with refined peakiness conditions
D Stöger, J Geppert, F Krahmer
2018 IEEE Statistical Signal Processing Workshop (SSP), 816-820, 2018
Cited by 1 · 2018
Non-convex matrix sensing: Breaking the quadratic rank barrier in the sample complexity
D Stöger, Y Zhu
arXiv preprint arXiv:2408.13276, 2024
2024
Articles 1–20