
A Frequency-aware Software Cache for Large Recommendation System Embeddings


Aug 08, 2022
Jiarui Fang, Geng Zhang, Jiatong Han, Shenggui Li, Zhengda Bian, Yongbin Li, Jin Liu, Yang You


Colossal-AI: A Unified Deep Learning System For Large-Scale Parallel Training


Oct 28, 2021
Zhengda Bian, Hongxin Liu, Boxiang Wang, Haichen Huang, Yongbin Li, Chuanrui Wang, Fan Cui, Yang You


Online Evolutionary Batch Size Orchestration for Scheduling Deep Learning Workloads in GPU Clusters


Aug 08, 2021
Zhengda Bian, Shenggui Li, Wei Wang, Yang You

* Accepted at the International Conference for High Performance Computing, Networking, Storage, and Analysis (SC21), Nov 14-19, 2021, St. Louis, USA.


2.5-dimensional distributed model training


May 30, 2021
Boxiang Wang, Qifan Xu, Zhengda Bian, Yang You


Maximizing Parallelism in Distributed Training for Huge Neural Networks


May 30, 2021
Zhengda Bian, Qifan Xu, Boxiang Wang, Yang You

* Technical report of the NUS HPC-AI Lab (https://ai.comp.nus.edu.sg). The two leading authors contributed equally.
