Masaaki Imaizumi
Other names: 今泉允聡
Deep neural networks learn non-smooth functions effectively
M Imaizumi, K Fukumizu
Artificial Intelligence and Statistics, 869-878, 2019
Adaptive Approximation and Generalization of Deep Neural Network with Intrinsic Dimensionality
R Nakada, M Imaizumi
Journal of Machine Learning Research 21 (174), 1-38, 2020
Finite sample analysis of minimax offline reinforcement learning: Completeness, fast rates and first-order efficiency
M Uehara, M Imaizumi, N Jiang, N Kallus, W Sun, T Xie
arXiv preprint arXiv:2102.02981, 2021
Improved generalization bounds of group invariant/equivariant deep networks via quotient feature spaces
A Sannai, M Imaizumi, M Kawano
Uncertainty in Artificial Intelligence, 771-780, 2021
PCA-based estimation for functional linear regression with functional responses
M Imaizumi, K Kato
Journal of Multivariate Analysis 163, 15-36, 2018
On tensor train rank minimization: Statistical efficiency and scalable algorithm
M Imaizumi, T Maehara, K Hayashi
Advances in Neural Information Processing Systems 30, 2017
Instrumental variable regression via kernel maximum moment loss
R Zhang, M Imaizumi, B Schölkopf, K Muandet
Journal of Causal Inference 11 (1), 20220073, 2023
Doubly decomposing nonparametric tensor regression
M Imaizumi, K Hayashi
International Conference on Machine Learning, 727-736, 2016
Advantage of deep neural networks for estimating functions with singularity on hypersurfaces
M Imaizumi, K Fukumizu
Journal of Machine Learning Research 23 (1), 4772-4825, 2022
Tensor decomposition with smoothness
M Imaizumi, K Hayashi
International Conference on Machine Learning, 1597-1606, 2017
On random subsampling of Gaussian process regression: A graphon-based analysis
K Hayashi, M Imaizumi, Y Yoshida
Artificial Intelligence and Statistics, 2055-2065, 2020
A simple method to construct confidence bands in functional linear regression
M Imaizumi, K Kato
Statistica Sinica 29 (4), 2055-2081, 2019
Hypothesis Test and Confidence Analysis With Wasserstein Distance on General Dimension
M Imaizumi, H Ota, T Hamaguchi
Neural Computation 34 (6), 1448-1487, 2022
Learning causal models from conditional moment restrictions by importance weighting
M Kato, M Imaizumi, K McAlinn, H Kakehi, S Yasui
International Conference on Learning Representations, 2021
Statistically efficient estimation for non-smooth probability densities
M Imaizumi, T Maehara, Y Yoshida
Artificial Intelligence and Statistics, 978-987, 2018
Fréchet kernel for trajectory data analysis
K Takeuchi, M Imaizumi, S Kanda, Y Tabei, K Fujii, K Yoda, M Ishihata, ...
Proceedings of the 29th International Conference on Advances in Geographic …, 2021
Inference for projection-based Wasserstein distances on finite spaces
R Okano, M Imaizumi
Statistica Sinica, 2022
Minimum sharpness: Scale-invariant parameter-robustness of neural networks
H Ibayashi, T Hamaguchi, M Imaizumi
arXiv preprint arXiv:2106.12612, 2021
Asymptotically Minimax Optimal Fixed-Budget Best Arm Identification for Expected Simple Regret Minimization
M Kato, M Imaizumi, T Ishihara, T Kitagawa
arXiv preprint arXiv:2302.02988, 2023
Dimension-free bounds for sum of dependent matrices and operators with heavy-tailed distribution
S Nakakita, P Alquier, M Imaizumi
arXiv preprint arXiv:2210.09756, 2022