Naman Goyal

Better Fine-Tuning by Reducing Representational Collapse

Aug 06, 2020
Armen Aghajanyan, Akshat Shrivastava, Anchit Gupta, Naman Goyal, Luke Zettlemoyer, Sonal Gupta

Multilingual Translation with Extensible Multilingual Pretraining and Finetuning

Aug 02, 2020
Yuqing Tang, Chau Tran, Xian Li, Peng-Jen Chen, Naman Goyal, Vishrav Chaudhary, Jiatao Gu, Angela Fan

Retrieval-Augmented Generation for Knowledge-Intensive NLP Tasks

May 22, 2020
Patrick Lewis, Ethan Perez, Aleksandara Piktus, Fabio Petroni, Vladimir Karpukhin, Naman Goyal, Heinrich Küttler, Mike Lewis, Wen-tau Yih, Tim Rocktäschel, Sebastian Riedel, Douwe Kiela

Recipes for building an open-domain chatbot

Apr 30, 2020
Stephen Roller, Emily Dinan, Naman Goyal, Da Ju, Mary Williamson, Yinhan Liu, Jing Xu, Myle Ott, Kurt Shuster, Eric M. Smith, Y-Lan Boureau, Jason Weston

Multilingual Denoising Pre-training for Neural Machine Translation

Jan 23, 2020
Yinhan Liu, Jiatao Gu, Naman Goyal, Xian Li, Sergey Edunov, Marjan Ghazvininejad, Mike Lewis, Luke Zettlemoyer

Unsupervised Cross-lingual Representation Learning at Scale

Nov 05, 2019
Alexis Conneau, Kartikay Khandelwal, Naman Goyal, Vishrav Chaudhary, Guillaume Wenzek, Francisco Guzmán, Edouard Grave, Myle Ott, Luke Zettlemoyer, Veselin Stoyanov

BART: Denoising Sequence-to-Sequence Pre-training for Natural Language Generation, Translation, and Comprehension

Oct 29, 2019
Mike Lewis, Yinhan Liu, Naman Goyal, Marjan Ghazvininejad, Abdelrahman Mohamed, Omer Levy, Ves Stoyanov, Luke Zettlemoyer

RoBERTa: A Robustly Optimized BERT Pretraining Approach

Jul 26, 2019
Yinhan Liu, Myle Ott, Naman Goyal, Jingfei Du, Mandar Joshi, Danqi Chen, Omer Levy, Mike Lewis, Luke Zettlemoyer, Veselin Stoyanov

The Social Dynamics of Language Change in Online Networks

Sep 07, 2016
Rahul Goel, Sandeep Soni, Naman Goyal, John Paparrizos, Hanna Wallach, Fernando Diaz, Jacob Eisenstein