Every Expert Matters: Towards Effective Knowledge Distillation for Mixture-of-Experts Language Models

Feb 18, 2025