Noah Constant
Email address verified at google.com - Homepage
Title
Cited by
Year
Universal sentence encoder
D Cer, Y Yang, S Kong, N Hua, N Limtiaco, RS John, N Constant, ...
arXiv preprint arXiv:1803.11175, 2018
Cited by 1724* · 2018
mT5: A massively multilingual pre-trained text-to-text transformer
L Xue, N Constant, A Roberts, M Kale, R Al-Rfou, A Siddhant, A Barua, ...
arXiv preprint arXiv:2010.11934, 2020
Cited by 630 · 2020
The power of scale for parameter-efficient prompt tuning
B Lester, R Al-Rfou, N Constant
arXiv preprint arXiv:2104.08691, 2021
Cited by 483 · 2021
Character-level language modeling with deeper self-attention
R Al-Rfou, D Choe, N Constant, M Guo, L Jones
Proceedings of the AAAI conference on artificial intelligence 33 (01), 3159-3166, 2019
Cited by 296 · 2019
Multilingual universal sentence encoder for semantic retrieval
Y Yang, D Cer, A Ahmad, M Guo, J Law, N Constant, GH Abrego, S Yuan, ...
arXiv preprint arXiv:1907.04307, 2019
Cited by 287 · 2019
Contrastive topic: Meanings and realizations
N Constant
Cited by 177 · 2014
Learning semantic textual similarity from conversations
Y Yang, S Yuan, D Cer, S Kong, N Constant, P Pilar, H Ge, YH Sung, ...
arXiv preprint arXiv:1804.07754, 2018
Cited by 143 · 2018
English rise-fall-rise: A study in the semantics and pragmatics of intonation
N Constant
Linguistics & Philosophy 35 (5), 407-442, 2012
Cited by 111 · 2012
ByT5: Towards a token-free future with pre-trained byte-to-byte models
L Xue, A Barua, N Constant, R Al-Rfou, S Narang, M Kale, A Roberts, ...
Transactions of the Association for Computational Linguistics 10, 291-306, 2022
Cited by 94 · 2022
Effective parallel corpus mining using bilingual sentence embeddings
M Guo, Q Shen, Y Yang, H Ge, D Cer, GH Abrego, K Stevens, N Constant, ...
arXiv preprint arXiv:1807.11906, 2018
Cited by 83 · 2018
The pragmatics of expressive content: Evidence from large corpora
N Constant, C Davis, C Potts, F Schwarz
Sprache und Datenverarbeitung 33 (1-2), 5-21, 2009
Cited by 66 · 2009
XTREME-R: Towards more challenging and nuanced multilingual evaluation
S Ruder, N Constant, J Botha, A Siddhant, O Firat, J Fu, P Liu, J Hu, ...
arXiv preprint arXiv:2104.07412, 2021
Cited by 60 · 2021
SPoT: Better frozen model adaptation through soft prompt transfer
T Vu, B Lester, N Constant, R Al-Rfou, D Cer
arXiv preprint arXiv:2110.07904, 2021
Cited by 49* · 2021
ReQA: An evaluation for end-to-end answer retrieval models
A Ahmad, N Constant, Y Yang, D Cer
arXiv preprint arXiv:1907.04780, 2019
Cited by 46 · 2019
Beyond the Imitation Game: Quantifying and extrapolating the capabilities of language models
A Srivastava, A Rastogi, A Rao, AAM Shoeb, A Abid, A Fisch, AR Brown, ...
arXiv preprint arXiv:2206.04615, 2022
Cited by 44 · 2022
Sentence-T5: Scalable sentence encoders from pre-trained text-to-text models
J Ni, GH Ábrego, N Constant, J Ma, KB Hall, D Cer, Y Yang
arXiv preprint arXiv:2108.08877, 2021
Cited by 33 · 2021
LAReQA: Language-agnostic answer retrieval from a multilingual pool
U Roy, N Constant, R Al-Rfou, A Barua, A Phillips, Y Yang
arXiv preprint arXiv:2004.05484, 2020
Cited by 26 · 2020
TextSETTR: Few-shot text style extraction and tunable targeted restyling
P Riley, N Constant, M Guo, G Kumar, D Uthus, Z Parekh
arXiv preprint arXiv:2010.03802, 2020
Cited by 24* · 2020
MultiReQA: A Cross-Domain Evaluation for Retrieval Question Answering Models
M Guo, Y Yang, D Cer, Q Shen, N Constant
arXiv preprint arXiv:2005.02507, 2020
Cited by 24 · 2020
Witnessable quantifiers license type-e meaning: Evidence from contrastive topic, equatives and supplements
N Constant
Semantics and Linguistic Theory (SALT) 22, 286-306, 2012
Cited by 18 · 2012
Articles 1–20