Luke Zettlemoyer
Verified email at cs.washington.edu - Homepage
Title | Cited by | Year
RoBERTa: A robustly optimized BERT pretraining approach
Y Liu, M Ott, N Goyal, J Du, M Joshi, D Chen, O Levy, M Lewis, ...
arXiv preprint arXiv:1907.11692, 2019
Cited by 18373* · 2019
Deep contextualized word representations
ME Peters, M Neumann, M Iyyer, M Gardner, C Clark, K Lee, ...
NAACL, 2018
Cited by 13821* · 2018
BART: Denoising sequence-to-sequence pre-training for natural language generation, translation, and comprehension
M Lewis, Y Liu, N Goyal, M Ghazvininejad, A Mohamed, O Levy, ...
arXiv preprint arXiv:1910.13461, 2019
Cited by 6676 · 2019
Unsupervised cross-lingual representation learning at scale
A Conneau, K Khandelwal, N Goyal, V Chaudhary, G Wenzek, F Guzmán, ...
arXiv preprint arXiv:1911.02116, 2019
Cited by 4070 · 2019
SpanBERT: Improving pre-training by representing and predicting spans
M Joshi, D Chen, Y Liu, DS Weld, L Zettlemoyer, O Levy
Transactions of the Association for Computational Linguistics 8, 64-77, 2020
Cited by 1693 · 2020
TriviaQA: A large scale distantly supervised challenge dataset for reading comprehension
M Joshi, E Choi, DS Weld, L Zettlemoyer
arXiv preprint arXiv:1705.03551, 2017
Cited by 1420 · 2017
AllenNLP: A deep semantic natural language processing platform
M Gardner, J Grus, M Neumann, O Tafjord, P Dasigi, N Liu, M Peters, ...
arXiv preprint arXiv:1803.07640, 2018
Cited by 1241 · 2018
Multilingual denoising pre-training for neural machine translation
Y Liu, J Gu, N Goyal, X Li, S Edunov, M Ghazvininejad, M Lewis, ...
Transactions of the Association for Computational Linguistics 8, 726-742, 2020
Cited by 1182 · 2020
Knowledge-based weak supervision for information extraction of overlapping relations
R Hoffmann, C Zhang, X Ling, L Zettlemoyer, DS Weld
Proceedings of the 49th Annual Meeting of the Association for Computational …, 2011
Cited by 1157 · 2011
Learning to map sentences to logical form: Structured classification with probabilistic categorial grammars
LS Zettlemoyer, M Collins
Conference on Uncertainty in Artificial Intelligence (UAI), 2005
Cited by 1082* · 2005
End-to-end neural coreference resolution
K Lee, L He, M Lewis, L Zettlemoyer
arXiv preprint arXiv:1707.07045, 2017
Cited by 997 · 2017
OPT: Open pre-trained transformer language models
S Zhang, S Roller, N Goyal, M Artetxe, M Chen, S Chen, C Dewan, ...
arXiv preprint arXiv:2205.01068, 2022
Cited by 989* · 2022
QuAC: Question answering in context
E Choi, H He, M Iyyer, M Yatskar, W Yih, Y Choi, P Liang, L Zettlemoyer
arXiv preprint arXiv:1808.07036, 2018
Cited by 708 · 2018
Summarizing source code using a neural attention model
S Iyer, I Konstas, A Cheung, L Zettlemoyer
54th Annual Meeting of the Association for Computational Linguistics 2016 …, 2016
Cited by 693 · 2016
Adversarial example generation with syntactically controlled paraphrase networks
M Iyyer, J Wieting, K Gimpel, L Zettlemoyer
arXiv preprint arXiv:1804.06059, 2018
Cited by 623 · 2018
Deep semantic role labeling: What works and what’s next
L He, K Lee, M Lewis, L Zettlemoyer
Proceedings of the 55th Annual Meeting of the Association for Computational …, 2017
Cited by 519 · 2017
Weakly supervised learning of semantic parsers for mapping instructions to actions
Y Artzi, L Zettlemoyer
Transactions of the Association for Computational Linguistics 1, 49-62, 2013
Cited by 510 · 2013
Higher-order coreference resolution with coarse-to-fine inference
K Lee, L He, L Zettlemoyer
arXiv preprint arXiv:1804.05392, 2018
Cited by 505 · 2018
Online learning of relaxed CCG grammars for parsing to logical form
L Zettlemoyer, M Collins
Proceedings of the 2007 Joint Conference on Empirical Methods in Natural …, 2007
Cited by 501 · 2007
Open question answering over curated and extracted knowledge bases
A Fader, L Zettlemoyer, O Etzioni
Proceedings of the 20th ACM SIGKDD International Conference on Knowledge …, 2014
Cited by 486 · 2014
Articles 1–20