Takateru Yamakoshi
Verified email at g.ecc.u-tokyo.ac.jp - Homepage
Title / Cited by / Year
Reconstructing the cascade of language processing in the brain using the internal computations of a transformer-based language model
S Kumar, TR Sumers, T Yamakoshi, A Goldstein, U Hasson, KA Norman, ...
bioRxiv, 2022.06.08.495348, 2022
Cited by 42 · 2022
Investigating representations of verb bias in neural language models
RD Hawkins, T Yamakoshi, TL Griffiths, AE Goldberg
arXiv preprint arXiv:2010.02375, 2020
Cited by 28 · 2020
Probing BERT's priors with serial reproduction chains
T Yamakoshi, TL Griffiths, RD Hawkins
arXiv preprint arXiv:2202.12226, 2022
Cited by 13 · 2022
Causal interventions expose implicit situation models for commonsense language understanding
T Yamakoshi, JL McClelland, AE Goldberg, RD Hawkins
arXiv preprint arXiv:2306.03882, 2023
Cited by 8 · 2023
Shared functional specialization in transformer-based language models and the human brain
S Kumar, TR Sumers, T Yamakoshi, A Goldstein, U Hasson, KA Norman, ...
Cited by 3*
Neural Constructions
T Yamakoshi, R Hawkins
OSF, 2020
2020
Reconstructing the cascade of language processing in the brain using the internal computations of transformer language models
S Kumar, TR Sumers, T Yamakoshi, A Goldstein, U Hasson, KA Norman, ...
Articles 1–7