
Entong Li

High-Throughput LLM Inference on Heterogeneous Clusters

Apr 18, 2025

LocMoE: A Low-overhead MoE for Large Language Model Training

Jan 25, 2024