Yoshihiro Nagano
Verified email at i.kyoto-u.ac.jp - Homepage
Title · Cited by · Year
A Wrapped Normal Distribution on Hyperbolic Space for Gradient-Based Learning
Y Nagano, S Yamaguchi, Y Fujita, M Koyama
ICML 2019, 2019
117* · 2019
On the surrogate gap between contrastive and supervised losses
H Bao, Y Nagano, K Nozawa
International Conference on Machine Learning, 1585-1606, 2022
22* · 2022
Statistical mechanical analysis of catastrophic forgetting in continual learning with teacher and student networks
H Asanuma, S Takagi, Y Nagano, Y Yoshida, Y Igarashi, M Okada
Journal of the Physical Society of Japan 90 (10), 104001, 2021
12 · 2021
Normal mode analysis of a relaxation process with Bayesian inference
I Sakata, Y Nagano, Y Igarashi, S Murata, K Mizoguchi, I Akai, M Okada
Science and Technology of Advanced Materials 21 (1), 67-78, 2020
4 · 2020
Complex energies of the coherent longitudinal optical phonon–plasmon coupled mode according to dynamic mode decomposition analysis
I Sakata, T Sakata, K Mizoguchi, S Tanaka, G Oohata, I Akai, Y Igarashi, ...
Scientific Reports 11 (1), 23169, 2021
2 · 2021
Input response of neural network model with lognormally distributed synaptic weights
Y Nagano, R Karakida, N Watanabe, A Aoyama, M Okada
Journal of the Physical Society of Japan 85 (7), 074001, 2016
1 · 2016
Transforming method, training device, and inference device
Y Nagano, S Yamaguchi
US Patent App. 17/444,301, 2021
2021
Analysis of Trainability of Gradient-based Multi-environment Learning from Gradient Norm Regularization Perspective
S Takagi, Y Nagano, Y Yoshida, M Okada
2021 International Joint Conference on Neural Networks (IJCNN), 1-8, 2021
2021
Collective dynamics of repeated inference in variational autoencoder rapidly find cluster structure
Y Nagano, R Karakida, M Okada
Scientific Reports 10 (1), 16001, 2020
2020
Sparse STC estimation of suppressive elements for neurons in primary visual cortex
R Tanaka, K Sasaki, H Sakamoto, Y Nagano, Y Yue, M Okada, I Ohzawa
IEICE Technical Report 119 (453), 155-159, 2020
2020
Role of two learning rates in convergence of model-agnostic meta-learning
S Takagi, Y Nagano, Y Yoshida, M Okada
2019
Localized Generations with Deep Neural Networks for Multi-Scale Structured Datasets
Y Nagano, S Takagi, Y Yoshida, M Okada
2019
Report on the FY2019 time-limited research meeting: 11th retreat of the Young Researchers' Association for Brain Science (脳科学若手の会), "Lecture & Workshop Retreat for Young Researchers: Exploring the Mechanisms by Which Neural Activity Generates Function, and Putting Theory into Practice and Application"
川端政則, 小山雄太郎, 佐藤元重, 高木志郎, ...
日本神経回路学会誌 (Journal of the Japanese Neural Network Society) 26 (3), 105-109, 2019
2019
Normal mode selection of coherent phonons by dynamic mode decomposition and Bayesian LARS-OLS
I Sakata, Y Nagano, Y Igarashi, S Murata, K Mizoguchi, I Akai, ...
日本物理学会講演概要集 (Meeting Abstracts of the Physical Society of Japan) 74.1, 1487-1487, 2019
2019
Receptive field estimation of V1 complex cells using Bayesian spectral decomposition
S Takao, H Sakamoto, Y Nagano, Y Yue, K Sasaki, I Ohzawa, ...
日本物理学会講演概要集 (Meeting Abstracts of the Physical Society of Japan) 74.1, 2853-2853, 2019
2019
Phase unwrapping based on probability distribution estimation with the replica exchange Monte Carlo method
薬袋洸慈, 長野祥大, 五十嵐康彦, 中嶋恭久, 成瀬康, 岡田真人
日本物理学会講演概要集 (Meeting Abstracts of the Physical Society of Japan) 74.1, 2714-2714, 2019
2019
Normal mode selection of coherent phonons by Bayesian LARS-OLS (in Japanese)
I Sakata, Y Nagano, Y Igarashi, S Murata, K Mizoguchi, I Akai, ...
IEICE Technical Report 118 (284), 255-262, 2018
2018
Normal mode selection of coherent phonons by Bayesian LARS-OLS
I Sakata, Y Nagano, Y Igarashi, S Murata, K Mizoguchi, I Akai, M Okada
IEICE Technical Report 118 (284), 255-262, 2018
2018
Report on the FY2018 time-limited research meeting: 10th retreat of the Young Researchers' Association for Brain Science (脳科学若手の会), "Lecture & Workshop Retreat for Young Brain Researchers: Information Representations in the Brain and Artificial Intelligence, from Data Analysis to Model Building"
赤尾旭彦, 川端政則, 小山雄太郎, 長野祥大, ...
日本神経回路学会誌 (Journal of the Japanese Neural Network Society) 25 (2), 38-42, 2018
2018
Dynamics and internal representations of the recognition-generation loop in deep generative models
Y Nagano, R Karakida, M Okada
日本物理学会講演概要集 (Meeting Abstracts of the Physical Society of Japan) 73.1, 2932-2932, 2018
2018