Thong T. Doan

On DeepSeekMoE: Statistical Benefits of Shared Experts and Normalized Sigmoid Gating

May 16, 2025

LIBMoE: A Library for comprehensive benchmarking Mixture of Experts in Large Language Models

Nov 01, 2024