Saksham Singhal
Verified email at microsoft.com
Title · Cited by · Year
InfoXLM: An information-theoretic framework for cross-lingual language model pre-training
Z Chi, L Dong, F Wei, N Yang, S Singhal, W Wang, X Song, XL Mao, ...
arXiv preprint arXiv:2007.07834, 2020
64 · 2020
Consistency Regularization for Cross-Lingual Fine-Tuning
B Zheng, L Dong, S Huang, W Wang, Z Chi, S Singhal, W Che, T Liu, ...
arXiv preprint arXiv:2106.08226, 2021
7 · 2021
DeltaLM: Encoder-decoder pre-training for language generation and translation by augmenting pretrained multilingual encoders
S Ma, L Dong, S Huang, D Zhang, A Muzio, S Singhal, HH Awadalla, ...
arXiv preprint arXiv:2106.13736, 2021
6 · 2021
XLM-E: Cross-lingual language model pre-training via ELECTRA
Z Chi, S Huang, L Dong, S Ma, S Singhal, P Bajaj, X Song, F Wei
arXiv preprint arXiv:2106.16138, 2021
5 · 2021
XLM-T: Scaling up Multilingual Machine Translation with Pretrained Cross-lingual Transformer Encoders
S Ma, J Yang, H Huang, Z Chi, L Dong, D Zhang, HH Awadalla, A Muzio, ...
arXiv preprint arXiv:2012.15547, 2020
4 · 2020
Multilingual Machine Translation Systems from Microsoft for WMT21 Shared Task
J Yang, S Ma, H Huang, D Zhang, L Dong, S Huang, A Muzio, S Singhal, ...
arXiv preprint arXiv:2111.02086, 2021
1 · 2021
Dispersion based similarity for mining similar papers in citation network
S Singhal, V Pudi
2015 IEEE International Conference on Data Mining Workshop (ICDMW), 524-531, 2015
1 · 2015
Allocating Large Vocabulary Capacity for Cross-lingual Language Model Pre-training
B Zheng, L Dong, S Huang, S Singhal, W Che, T Liu, X Song, F Wei
arXiv preprint arXiv:2109.07306, 2021
2021