Jianfeng Gao
A Hybrid Neural Network Model for Commonsense Reasoning

Jul 27, 2019
Pengcheng He, Xiaodong Liu, Weizhu Chen, Jianfeng Gao

Model Adaptation via Model Interpolation and Boosting for Web Search Ranking

Jul 22, 2019
Jianfeng Gao, Qiang Wu, Chris Burges, Krysta Svore, Yi Su, Nazan Khan, Shalin Shah, Hongyan Zhou

DoubleTransfer at MEDIQA 2019: Multi-Source Transfer Learning for Natural Language Understanding in the Medical Domain

Jun 11, 2019
Yichong Xu, Xiaodong Liu, Chunyuan Li, Hoifung Poon, Jianfeng Gao

Towards Amortized Ranking-Critical Training for Collaborative Filtering

Jun 10, 2019
Sam Lobel, Chunyuan Li, Jianfeng Gao, Lawrence Carin

Conversing by Reading: Contentful Neural Conversation with On-demand Machine Reading

Jun 07, 2019
Lianhui Qin, Michel Galley, Chris Brockett, Xiaodong Liu, Xiang Gao, Bill Dolan, Yejin Choi, Jianfeng Gao

Budgeted Policy Learning for Task-Oriented Dialogue Systems

Jun 02, 2019
Zhirui Zhang, Xiujun Li, Jianfeng Gao, Enhong Chen

Challenges in Building Intelligent Open-domain Dialog Systems

May 13, 2019
Minlie Huang, Xiaoyan Zhu, Jianfeng Gao

Unified Language Model Pre-training for Natural Language Understanding and Generation

May 08, 2019
Li Dong, Nan Yang, Wenhui Wang, Furu Wei, Xiaodong Liu, Yu Wang, Jianfeng Gao, Ming Zhou, Hsiao-Wuen Hon

Improving Multi-Task Deep Neural Networks via Knowledge Distillation for Natural Language Understanding

Apr 20, 2019
Xiaodong Liu, Pengcheng He, Weizhu Chen, Jianfeng Gao
