Zheng Chai

Towards Quantized Model Parallelism for Graph-Augmented MLPs Based on Gradient-Free ADMM Framework


May 20, 2021
Junxiang Wang, Hongyi Li, Zheng Chai, Yongchao Wang, Yue Cheng, Liang Zhao

* Junxiang Wang and Hongyi Li contributed equally to this work; Yongchao Wang and Liang Zhao are corresponding authors. This work is in progress. arXiv admin note: substantial text overlap with arXiv:2009.02868


FedAT: A Communication-Efficient Federated Learning Method with Asynchronous Tiers under Non-IID Data


Oct 12, 2020
Zheng Chai, Yujing Chen, Liang Zhao, Yue Cheng, Huzefa Rangwala



Tunable Subnetwork Splitting for Model-parallelism of Neural Network Training


Sep 16, 2020
Junxiang Wang, Zheng Chai, Yue Cheng, Liang Zhao

* ICML 2020 Workshop on "Beyond first-order methods in ML systems" 


TiFL: A Tier-based Federated Learning System


Jan 25, 2020
Zheng Chai, Ahsan Ali, Syed Zawad, Stacey Truex, Ali Anwar, Nathalie Baracaldo, Yi Zhou, Heiko Ludwig, Feng Yan, Yue Cheng



Federated Multi-task Hierarchical Attention Model for Sensor Analytics


May 13, 2019
Yujing Chen, Yue Ning, Zheng Chai, Huzefa Rangwala

