Xiaodong Liu
Microsoft Research, Redmond
Verified email at microsoft.com - Homepage
Title
Cited by
Year
On the variance of the adaptive learning rate and beyond
L Liu, H Jiang, P He, W Chen, X Liu, J Gao, J Han
arXiv preprint arXiv:1908.03265, 2019
Cited by 1991 · 2019
DeBERTa: Decoding-enhanced BERT with disentangled attention
P He, X Liu, J Gao, W Chen
arXiv preprint arXiv:2006.03654, 2020
Cited by 1931 · 2020
Unified Language Model Pre-training for Natural Language Understanding and Generation
L Dong, N Yang, W Wang, F Wei, X Liu, Y Wang, J Gao, M Zhou, HW Hon
arXiv preprint arXiv:1905.03197, 2019
Cited by 1584 · 2019
SMART: Robust and efficient fine-tuning for pre-trained natural language models through principled regularized optimization
H Jiang, P He, W Chen, X Liu, J Gao, T Zhao
arXiv preprint arXiv:1911.03437, 2019
Cited by 1495* · 2019
MS MARCO: A human generated machine reading comprehension dataset
T Nguyen, M Rosenberg, X Song, J Gao, S Tiwary, R Majumder, L Deng
CoCo@NIPS, 2016
Cited by 1478 · 2016
Domain-specific language model pretraining for biomedical natural language processing
Y Gu, R Tinn, H Cheng, M Lucas, N Usuyama, X Liu, T Naumann, J Gao, ...
ACM Transactions on Computing for Healthcare (HEALTH) 3 (1), 1-23, 2021
Cited by 1428 · 2021
Multi-task deep neural networks for natural language understanding
X Liu, P He, W Chen, J Gao
arXiv preprint arXiv:1901.11504, 2019
Cited by 1330 · 2019
Representation learning using multi-task deep neural networks for semantic classification and information retrieval
X Liu, J Gao, X He, L Deng, K Duh, YY Wang
Cited by 488 · 2015
RAT-SQL: Relation-aware schema encoding and linking for text-to-SQL parsers
B Wang, R Shin, X Liu, O Polozov, M Richardson
arXiv preprint arXiv:1911.04942, 2019
Cited by 467 · 2019
Cyclical annealing schedule: A simple approach to mitigating KL vanishing
H Fu, C Li, X Liu, J Gao, A Celikyilmaz, L Carin
arXiv preprint arXiv:1903.10145, 2019
Cited by 376 · 2019
UniLMv2: Pseudo-masked language models for unified language model pre-training
H Bao, L Dong, F Wei, W Wang, N Yang, X Liu, Y Wang, J Gao, S Piao, ...
International Conference on Machine Learning, 642-652, 2020
Cited by 371 · 2020
Understanding the difficulty of training transformers
L Liu, X Liu, J Gao, W Chen, J Han
arXiv preprint arXiv:2004.08249, 2020
Cited by 240 · 2020
ReCoRD: Bridging the gap between human and machine commonsense reading comprehension
S Zhang, X Liu, J Liu, J Gao, K Duh, B Van Durme
arXiv preprint arXiv:1810.12885, 2018
Cited by 240 · 2018
Stochastic answer networks for machine reading comprehension
X Liu, Y Shen, K Duh, J Gao
arXiv preprint arXiv:1712.03556, 2017
Cited by 231 · 2017
Improving Multi-Task Deep Neural Networks via Knowledge Distillation for Natural Language Understanding
X Liu, P He, W Chen, J Gao
arXiv preprint arXiv:1904.09482, 2019
Cited by 198 · 2019
Generation-augmented retrieval for open-domain question answering
Y Mao, P He, X Liu, Y Shen, J Gao, J Han, W Chen
arXiv preprint arXiv:2009.08553, 2020
Cited by 161 · 2020
Adversarial training for large neural language models
X Liu, H Cheng, P He, W Chen, Y Wang, H Poon, J Gao
arXiv preprint arXiv:2004.08994, 2020
Cited by 160 · 2020
Tuning large neural networks via zero-shot hyperparameter transfer
G Yang, E Hu, I Babuschkin, S Sidor, X Liu, D Farhi, N Ryder, J Pachocki, ...
Advances in Neural Information Processing Systems 34, 17084-17097, 2021
Cited by 126* · 2021
Language-based image editing with recurrent attentive models
J Chen, Y Shen, J Gao, J Liu, X Liu
Proceedings of the IEEE conference on computer vision and pattern …, 2018
Cited by 123 · 2018
Articles 1–20