Gokmen Oz

Knowledge Distillation Transfer Sets and their Impact on Downstream NLU Tasks

Oct 11, 2022
Charith Peris, Lizhen Tan, Thomas Gueudre, Turan Gojayev, Pan Wei, Gokmen Oz

Alexa Teacher Model: Pretraining and Distilling Multi-Billion-Parameter Encoders for Natural Language Understanding Systems

Jun 15, 2022
Jack FitzGerald, Shankar Ananthakrishnan, Konstantine Arkoudas, Davide Bernardi, Abhishek Bhagia, Claudio Delli Bovi, Jin Cao, Rakesh Chada, Amit Chauhan, Luoxin Chen, Anurag Dwarakanath, Satyam Dwivedi, Turan Gojayev, Karthik Gopalakrishnan, Thomas Gueudre, Dilek Hakkani-Tur, Wael Hamza, Jonathan Hueser, Kevin Martin Jose, Haidar Khan, Beiye Liu, Jianhua Lu, Alessandro Manzotti, Pradeep Natarajan, Karolina Owczarzak, Gokmen Oz, Enrico Palumbo, Charith Peris, Chandana Satya Prakash, Stephen Rawls, Andy Rosenbaum, Anjali Shenoy, Saleh Soltan, Mukund Harakere Sridhar, Liz Tan, Fabian Triefenbach, Pan Wei, Haiyang Yu, Shuai Zheng, Gokhan Tur, Prem Natarajan

Using multiple ASR hypotheses to boost i18n NLU performance

Dec 14, 2020
Charith Peris, Gokmen Oz, Khadige Abboud, Venkata sai Varada, Prashan Wanigasekara, Haidar Khan
