Lifu Tu
Salesforce AI Research
Verified email at ttic.edu · Homepage
Title · Cited by · Year
CodeGen: An open large language model for code with multi-turn program synthesis
E Nijkamp, B Pang, H Hayashi, L Tu, H Wang, Y Zhou, S Savarese, ...
arXiv preprint arXiv:2203.13474, 2022
Cited by 236 · 2022
Commonsense knowledge base completion
X Li, A Taheri, L Tu, K Gimpel
Proceedings of the 54th Annual Meeting of the Association for Computational …, 2016
Cited by 186 · 2016
An empirical study on robustness to spurious correlations using pre-trained language models
L Tu, G Lalwani, S Gella, H He
Transactions of the Association for Computational Linguistics 8, 621-633, 2020
Cited by 144 · 2020
A conversational paradigm for program synthesis
E Nijkamp, B Pang, H Hayashi, L Tu, H Wang, Y Zhou, S Savarese, ...
arXiv preprint arXiv:2203.13474, 2022
Cited by 107 · 2022
Pay attention to the ending: Strong neural baselines for the ROC story cloze task
Z Cai, L Tu, K Gimpel
Proceedings of the 55th Annual Meeting of the Association for Computational …, 2017
Cited by 66 · 2017
Learning approximate inference networks for structured prediction
L Tu, K Gimpel
arXiv preprint arXiv:1803.03376, 2018
Cited by 57 · 2018
ENGINE: Energy-based inference networks for non-autoregressive machine translation
L Tu, RY Pang, S Wiseman, K Gimpel
arXiv preprint arXiv:2005.00850, 2020
Cited by 47 · 2020
Generating diverse story continuations with controllable semantics
L Tu, X Ding, D Yu, K Gimpel
arXiv preprint arXiv:1909.13434, 2019
Cited by 18 · 2019
Learning to embed words in context for syntactic tasks
L Tu, K Gimpel, K Livescu
arXiv preprint arXiv:1706.02807, 2017
Cited by 18 · 2017
Quality signals in generated stories
M Sagarkar, J Wieting, L Tu, K Gimpel
Proceedings of the Seventh Joint Conference on Lexical and Computational …, 2018
Cited by 16 · 2018
Benchmarking approximate inference methods for neural structured prediction
L Tu, K Gimpel
arXiv preprint arXiv:1904.01138, 2019
Cited by 15 · 2019
Improving joint training of inference networks and structured prediction energy networks
L Tu, RY Pang, K Gimpel
arXiv preprint arXiv:1911.02891, 2019
Cited by 12 · 2019
Prompt-Tuning Can Be Much Better Than Fine-Tuning on Cross-lingual Understanding With Multilingual Language Models
L Tu, C Xiong, Y Zhou
arXiv preprint arXiv:2210.12360, 2022
Cited by 6 · 2022
An Exploration of Arbitrary-Order Sequence Labeling via Energy-Based Inference Networks
L Tu, T Liu, K Gimpel
arXiv preprint arXiv:2010.02789, 2020
Cited by 3 · 2020
Unsupervised Dense Retrieval Deserves Better Positive Pairs: Scalable Augmentation with Query Extraction and Generation
R Meng, Y Liu, S Yavuz, D Agarwal, L Tu, N Yu, J Zhang, M Bhat, Y Zhou
arXiv preprint arXiv:2212.08841, 2022
Cited by 2 · 2022
Learning Energy-Based Approximate Inference Networks for Structured Applications in NLP
L Tu
arXiv preprint arXiv:2108.12522, 2021
2021
Improving and Stabilizing Deep Energy-Based Learning
L Tu, RY Pang, K Gimpel
2019
Network Inference by Learned Node-Specific Degree Prior
Q Tang, L Tu, W Wang, J Xu
arXiv preprint arXiv:1602.02386, 2016
2016
Protein-Protein Interaction Prediction via Structured Matrix Completion
Q Tang, L Tu, AA Khan, J Xu
Articles 1–19