Luke Zettlemoyer
Title · Cited by · Year
RoBERTa: A robustly optimized BERT pretraining approach
Y Liu, M Ott, N Goyal, J Du, M Joshi, D Chen, O Levy, M Lewis, ...
arXiv preprint arXiv:1907.11692, 2019
Cited by 15195* · 2019
Deep contextualized word representations
ME Peters, M Neumann, M Iyyer, M Gardner, C Clark, K Lee, ...
NAACL, 2018
Cited by 12965* · 2018
BART: Denoising sequence-to-sequence pre-training for natural language generation, translation, and comprehension
M Lewis, Y Liu, N Goyal, M Ghazvininejad, A Mohamed, O Levy, ...
arXiv preprint arXiv:1910.13461, 2019
Cited by 5343 · 2019
Unsupervised cross-lingual representation learning at scale
A Conneau, K Khandelwal, N Goyal, V Chaudhary, G Wenzek, F Guzmán, ...
arXiv preprint arXiv:1911.02116, 2019
Cited by 3333 · 2019
SpanBERT: Improving pre-training by representing and predicting spans
M Joshi, D Chen, Y Liu, DS Weld, L Zettlemoyer, O Levy
Transactions of the Association for Computational Linguistics 8, 64-77, 2020
Cited by 1469 · 2020
TriviaQA: A large scale distantly supervised challenge dataset for reading comprehension
M Joshi, E Choi, DS Weld, L Zettlemoyer
arXiv preprint arXiv:1705.03551, 2017
Cited by 1282 · 2017
AllenNLP: A deep semantic natural language processing platform
M Gardner, J Grus, M Neumann, O Tafjord, P Dasigi, N Liu, M Peters, ...
arXiv preprint arXiv:1803.07640, 2018
Cited by 1153 · 2018
Knowledge-based weak supervision for information extraction of overlapping relations
R Hoffmann, C Zhang, X Ling, L Zettlemoyer, DS Weld
Proceedings of the 49th Annual Meeting of the Association for Computational …, 2011
Cited by 1135 · 2011
Learning to map sentences to logical form: Structured classification with probabilistic categorial grammars
LS Zettlemoyer, M Collins
Conference on Uncertainty in Artificial Intelligence (UAI), 2005
Cited by 1057* · 2005
Multilingual denoising pre-training for neural machine translation
Y Liu, J Gu, N Goyal, X Li, S Edunov, M Ghazvininejad, M Lewis, ...
Transactions of the Association for Computational Linguistics 8, 726-742, 2020
Cited by 986 · 2020
End-to-end neural coreference resolution
K Lee, L He, M Lewis, L Zettlemoyer
arXiv preprint arXiv:1707.07045, 2017
Cited by 920 · 2017
QuAC: Question answering in context
E Choi, H He, M Iyyer, M Yatskar, W Yih, Y Choi, P Liang, L Zettlemoyer
arXiv preprint arXiv:1808.07036, 2018
Cited by 652 · 2018
Summarizing source code using a neural attention model
S Iyer, I Konstas, A Cheung, L Zettlemoyer
Proceedings of the 54th Annual Meeting of the Association for Computational …, 2016
Cited by 639 · 2016
OPT: Open pre-trained transformer language models
S Zhang, S Roller, N Goyal, M Artetxe, M Chen, S Chen, C Dewan, ...
arXiv preprint arXiv:2205.01068, 2022
Cited by 568* · 2022
Adversarial example generation with syntactically controlled paraphrase networks
M Iyyer, J Wieting, K Gimpel, L Zettlemoyer
arXiv preprint arXiv:1804.06059, 2018
Cited by 560 · 2018
Deep semantic role labeling: What works and what’s next
L He, K Lee, M Lewis, L Zettlemoyer
Proceedings of the 55th Annual Meeting of the Association for Computational …, 2017
Cited by 494 · 2017
Weakly supervised learning of semantic parsers for mapping instructions to actions
Y Artzi, L Zettlemoyer
Transactions of the Association for Computational Linguistics 1, 49-62, 2013
Cited by 494 · 2013
Online learning of relaxed CCG grammars for parsing to logical form
L Zettlemoyer, M Collins
Proceedings of the 2007 Joint Conference on Empirical Methods in Natural …, 2007
Cited by 494 · 2007
Open question answering over curated and extracted knowledge bases
A Fader, L Zettlemoyer, O Etzioni
Proceedings of the 20th ACM SIGKDD International Conference on Knowledge …, 2014
Cited by 471 · 2014
Higher-order coreference resolution with coarse-to-fine inference
K Lee, L He, L Zettlemoyer
arXiv preprint arXiv:1804.05392, 2018
Cited by 461 · 2018
Articles 1–20