Ashish Vaswani
Startup
Verified email at fastmail.com
Title
Cited by
Year
Attention is all you need
A Vaswani, N Shazeer, N Parmar, J Uszkoreit, L Jones, AN Gomez, ...
Advances in neural information processing systems 30, 2017
Cited by 110589 · 2017
Relational inductive biases, deep learning, and graph networks
PW Battaglia, JB Hamrick, V Bapst, A Sanchez-Gonzalez, V Zambaldi, ...
arXiv preprint arXiv:1806.01261, 2018
Cited by 3278 · 2018
Self-attention with relative position representations
P Shaw, J Uszkoreit, A Vaswani
arXiv preprint arXiv:1803.02155, 2018
Cited by 2160 · 2018
Image transformer
N Parmar, A Vaswani, J Uszkoreit, L Kaiser, N Shazeer, A Ku, D Tran
International conference on machine learning, 4055-4064, 2018
Cited by 1729 · 2018
Attention augmented convolutional networks
I Bello, B Zoph, A Vaswani, J Shlens, QV Le
Proceedings of the IEEE/CVF international conference on computer vision …, 2019
Cited by 1177 · 2019
Stand-alone self-attention in vision models
P Ramachandran, N Parmar, A Vaswani, I Bello, A Levskaya, J Shlens
Advances in neural information processing systems 32, 2019
Cited by 1147 · 2019
Attention is all you need. arXiv 2017
A Vaswani, N Shazeer, N Parmar, J Uszkoreit, L Jones, AN Gomez, ...
arXiv preprint arXiv:1706.03762 30, 2017
Cited by 953 · 2017
Bottleneck transformers for visual recognition
A Srinivas, TY Lin, N Parmar, J Shlens, P Abbeel, A Vaswani
Proceedings of the IEEE/CVF conference on computer vision and pattern …, 2021
Cited by 952 · 2021
Music transformer
CZA Huang, A Vaswani, J Uszkoreit, N Shazeer, I Simon, C Hawthorne, ...
arXiv preprint arXiv:1809.04281, 2018
Cited by 781 · 2018
Attention is All you Need
A Vaswani, N Shazeer, N Parmar, J Uszkoreit, L Jones, AN Gomez, ...
Advances in neural information processing systems, 2017
Cited by 771 · 2017
Tensor2tensor for neural machine translation
A Vaswani, S Bengio, E Brevdo, F Chollet, AN Gomez, S Gouws, L Jones, ...
arXiv preprint arXiv:1803.07416, 2018
Cited by 599 · 2018
Attention is all you need (2017)
A Vaswani, N Shazeer, N Parmar, J Uszkoreit, L Jones, AN Gomez, ...
arXiv preprint arXiv:1706.03762, 2019
Cited by 463 · 2019
Efficient content-based sparse attention with routing transformers
A Roy, M Saffar, A Vaswani, D Grangier
Transactions of the Association for Computational Linguistics 9, 53-68, 2021
Cited by 442 · 2021
One model to learn them all
L Kaiser, AN Gomez, N Shazeer, A Vaswani, N Parmar, L Jones, ...
arXiv preprint arXiv:1706.05137, 2017
Cited by 378 · 2017
Learning whom to trust with MACE
D Hovy, T Berg-Kirkpatrick, A Vaswani, E Hovy
Proceedings of the 2013 Conference of the North American Chapter of the …, 2013
Cited by 362 · 2013
Mesh-tensorflow: Deep learning for supercomputers
N Shazeer, Y Cheng, N Parmar, D Tran, A Vaswani, P Koanantakool, ...
Advances in neural information processing systems 31, 2018
Cited by 355 · 2018
Scaling local self-attention for parameter efficient visual backbones
A Vaswani, P Ramachandran, A Srinivas, N Parmar, B Hechtman, ...
Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern …, 2021
Cited by 354 · 2021
Decoding with large-scale neural language models improves translation
A Vaswani, Y Zhao, V Fossum, D Chiang
Proceedings of the 2013 conference on empirical methods in natural language …, 2013
Cited by 298 · 2013
Attention is all you need. CoRR abs/1706.03762 (2017)
A Vaswani, N Shazeer, N Parmar, J Uszkoreit, L Jones, AN Gomez, ...
Cited by 256 · 2017
Relational inductive biases, deep learning, and graph networks. arXiv 2018
PW Battaglia, JB Hamrick, V Bapst, A Sanchez-Gonzalez, V Zambaldi, ...
arXiv preprint arXiv:1806.01261, 2018
Cited by 164 · 2018
Articles 1–20