Yuan Cao
Google DeepMind
Verified email at google.com
Title · Cited by · Year
Google's neural machine translation system: Bridging the gap between human and machine translation
Y Wu, M Schuster, Z Chen, QV Le, M Norouzi, W Macherey, M Krikun, ...
arXiv preprint arXiv:1609.08144, 2016
Cited by 7967 · 2016
SimVLM: Simple visual language model pretraining with weak supervision
Z Wang, J Yu, AW Yu, Z Dai, Y Tsvetkov, Y Cao
arXiv preprint arXiv:2108.10904, 2021
Cited by 596 · 2021
ReAct: Synergizing reasoning and acting in language models
S Yao, J Zhao, D Yu, N Du, I Shafran, K Narasimhan, Y Cao
arXiv preprint arXiv:2210.03629, 2022
Cited by 592 · 2022
Tree of thoughts: Deliberate problem solving with large language models
S Yao, D Yu, J Zhao, I Shafran, T Griffiths, Y Cao, K Narasimhan
Advances in Neural Information Processing Systems 36, 2024
Cited by 480 · 2024
Massively multilingual neural machine translation in the wild: Findings and challenges
N Arivazhagan, A Bapna, O Firat, D Lepikhin, M Johnson, M Krikun, ...
arXiv preprint arXiv:1907.05019, 2019
Cited by 356 · 2019
Google’s neural machine translation system: Bridging the gap between human and machine translation. arXiv 2016
Y Wu, M Schuster, Z Chen, QV Le, M Norouzi, W Macherey, M Krikun, ...
arXiv preprint arXiv:1609.08144 2, 2019
Cited by 286 · 2019
Hierarchical generative modeling for controllable speech synthesis
WN Hsu, Y Zhang, RJ Weiss, H Zen, Y Wu, Y Wang, Y Cao, Y Jia, Z Chen, ...
arXiv preprint arXiv:1810.07217, 2018
Cited by 255 · 2018
Gmail smart compose: Real-time assisted writing
MX Chen, BN Lee, G Bansal, Y Cao, S Zhang, J Lu, J Tsay, Y Wang, ...
Proceedings of the 25th ACM SIGKDD International Conference on Knowledge …, 2019
Cited by 205 · 2019
Lingvo: a modular and scalable framework for sequence-to-sequence modeling
J Shen, P Nguyen, Y Wu, Z Chen, MX Chen, Y Jia, A Kannan, T Sainath, ...
arXiv preprint arXiv:1902.08295, 2019
Cited by 193 · 2019
Leveraging weakly supervised data to improve end-to-end speech-to-text translation
Y Jia, M Johnson, W Macherey, RJ Weiss, Y Cao, CC Chiu, N Ari, ...
ICASSP 2019-2019 IEEE International Conference on Acoustics, Speech and …, 2019
Cited by 169 · 2019
Gemini: a family of highly capable multimodal models
G Team, R Anil, S Borgeaud, Y Wu, JB Alayrac, J Yu, R Soricut, ...
arXiv preprint arXiv:2312.11805, 2023
Cited by 141 · 2023
Fully-hierarchical fine-grained prosody modeling for interpretable speech synthesis
G Sun, Y Zhang, RJ Weiss, Y Cao, H Zen, Y Wu
ICASSP 2020-2020 IEEE international conference on acoustics, speech and …, 2020
Cited by 134 · 2020
Gradient vaccine: Investigating and improving multi-task optimization in massively multilingual models
Z Wang, Y Tsvetkov, O Firat, Y Cao
arXiv preprint arXiv:2010.05874, 2020
Cited by 128 · 2020
Training deeper neural machine translation models with transparent attention
A Bapna, MX Chen, O Firat, Y Cao, Y Wu
arXiv preprint arXiv:1808.07561, 2018
Cited by 119 · 2018
Your GAN is secretly an energy-based model and you should use discriminator driven latent sampling
T Che, R Zhang, J Sohl-Dickstein, H Larochelle, L Paull, Y Cao, Y Bengio
Advances in Neural Information Processing Systems 33, 12275-12287, 2020
Cited by 103 · 2020
Generating diverse and natural text-to-speech samples using a quantized fine-grained VAE and autoregressive prosody prior
G Sun, Y Zhang, RJ Weiss, Y Cao, H Zen, A Rosenberg, B Ramabhadran, ...
ICASSP 2020-2020 IEEE International Conference on Acoustics, Speech and …, 2020
Cited by 96 · 2020
Leveraging monolingual data with self-supervision for multilingual neural machine translation
A Siddhant, A Bapna, Y Cao, O Firat, M Chen, S Kudugunta, ...
arXiv preprint arXiv:2005.04816, 2020
Cited by 77 · 2020
Towards zero-label language learning
Z Wang, AW Yu, O Firat, Y Cao
arXiv preprint arXiv:2109.09193, 2021
Cited by 64 · 2021
Building machine translation systems for the next thousand languages
A Bapna, I Caswell, J Kreutzer, O Firat, D van Esch, A Siddhant, M Niu, ...
arXiv preprint arXiv:2205.03983, 2022
Cited by 44 · 2022
Joshua 4.0: Packing, PRO, and paraphrases
J Ganitkevitch, Y Cao, J Weese, M Post, C Callison-Burch
Proceedings of the Seventh Workshop on Statistical Machine Translation, 283-291, 2012
Cited by 42 · 2012