Yunwen Lei
Title
Cited by
Year
Stochastic gradient descent for nonconvex learning without bounded gradient assumptions
Y Lei, T Hu, G Li, K Tang
IEEE Transactions on Neural Networks and Learning Systems 31 (10), 4394-4400, 2020
148 · 2020
Fine-Grained Analysis of Stability and Generalization for Stochastic Gradient Descent
Y Lei, Y Ying
International Conference on Machine Learning, 5809-5819, 2020
146 · 2020
Data-dependent generalization bounds for multi-class classification
Y Lei, Ü Dogan, DX Zhou, M Kloft
IEEE Transactions on Information Theory 65 (5), 2995-3021, 2019
84 · 2019
Multi-class svms: From tighter data-dependent generalization bounds to novel algorithms
Y Lei, Ü Dogan, A Binder, M Kloft
Advances in Neural Information Processing Systems, 2026-2034, 2015
61 · 2015
Stability and Generalization of Stochastic Gradient Methods for Minimax Problems
Y Lei, Z Yang, T Yang, Y Ying
International Conference on Machine Learning, 6175-6186, 2021
51 · 2021
Sharper Generalization Bounds for Learning with Gradient-dominated Objective Functions
Y Lei, Y Ying
International Conference on Learning Representations, 2021
51 · 2021
Sharper generalization bounds for pairwise learning
Y Lei, A Ledent, M Kloft
Advances in Neural Information Processing Systems 33, 21236-21246, 2020
47 · 2020
Generalization performance of radial basis function networks
Y Lei, L Ding, W Zhang
IEEE Transactions on Neural Networks and Learning Systems 26 (3), 551-564, 2015
46 · 2015
Simple stochastic and online gradient descent algorithms for pairwise learning
Z Yang, Y Lei, P Wang, T Yang, Y Ying
Advances in Neural Information Processing Systems 34, 20160-20171, 2021
35 · 2021
Norm-based generalisation bounds for deep multi-class convolutional neural networks
A Ledent, W Mustafa, Y Lei, M Kloft
Proceedings of the AAAI Conference on Artificial Intelligence 35 (9), 8279-8287, 2021
35* · 2021
A generalization error bound for multi-class domain generalization
AA Deshmukh, Y Lei, S Sharma, U Dogan, JW Cutler, C Scott
arXiv preprint arXiv:1905.10392, 2019
35 · 2019
Learning rates for stochastic gradient descent with nonconvex objectives
Y Lei, K Tang
IEEE Transactions on Pattern Analysis and Machine Intelligence 43 (12), 4505 …, 2021
34 · 2021
Local rademacher complexity-based learning guarantees for multi-task learning
N Yousefi, Y Lei, M Kloft, M Mollaghasemi, GC Anagnostopoulos
Journal of Machine Learning Research 19 (38), 1-47, 2018
34 · 2018
Generalization guarantee of SGD for pairwise learning
Y Lei, M Liu, Y Ying
Advances in Neural Information Processing Systems 34, 21216-21228, 2021
33 · 2021
Generalization Performance of Multi-pass Stochastic Gradient Descent with Convex Loss Functions
Y Lei, T Hu, K Tang
Journal of Machine Learning Research 22 (25), 1-41, 2021
33 · 2021
Differentially private SGD with non-smooth losses
P Wang, Y Lei, Y Ying, H Zhang
Applied and Computational Harmonic Analysis 56, 306-336, 2022
32 · 2022
Stochastic Proximal AUC Maximization
Y Lei, Y Ying
Journal of Machine Learning Research 22 (61), 1-45, 2021
30 · 2021
Stochastic composite mirror descent: Optimal bounds with high probabilities
Y Lei, K Tang
Advances in Neural Information Processing Systems 31, 1526-1536, 2018
29 · 2018
On the generalization analysis of adversarial learning
W Mustafa, Y Lei, M Kloft
International Conference on Machine Learning, 16174-16196, 2022
27 · 2022
Boosted Kernel Ridge Regression: Optimal Learning Rates and Early Stopping
SB Lin, Y Lei, DX Zhou
Journal of Machine Learning Research 20 (46), 1-36, 2019
27 · 2019
Articles 1–20