Improving Multi-Task Deep Neural Networks via Knowledge Distillation for Natural Language Understanding

Apr 20, 2019

View paper on arXiv