Gyouk Chu

Every Expert Matters: Towards Effective Knowledge Distillation for Mixture-of-Experts Language Models

Feb 18, 2025