Changho Hwang

ForestColl: Efficient Collective Communications on Heterogeneous Network Fabrics

Feb 09, 2024

Pre-gated MoE: An Algorithm-System Co-Design for Fast and Scalable Mixture-of-Expert Inference

Aug 23, 2023

Tutel: Adaptive Mixture-of-Experts at Scale

Jun 07, 2022

Confident Multiple Choice Learning

Sep 22, 2017