
Shaohuai Shi

FADNet++: Real-Time and Accurate Disparity Estimation with Configurable Networks

Oct 06, 2021

Accelerating Distributed K-FAC with Smart Parallelism of Computing and Communication Tasks

Jul 14, 2021

Automated Model Design and Benchmarking of 3D Deep Learning Models for COVID-19 Detection with Chest CT Scans

Feb 12, 2021

Towards Scalable Distributed Training of Deep Learning on Public Cloud Clusters

Oct 20, 2020

Communication-Efficient Distributed Deep Learning: Survey, Evaluation, and Challenges

May 27, 2020

FADNet: A Fast and Accurate Network for Disparity Estimation

Mar 24, 2020

Communication-Efficient Distributed Deep Learning: A Comprehensive Survey

Mar 10, 2020

Communication Contention Aware Scheduling of Multiple Deep Learning Training Jobs

Feb 24, 2020

Communication-Efficient Decentralized Learning with Sparsification and Adaptive Peer Selection

Feb 22, 2020

MG-WFBP: Merging Gradients Wisely for Efficient Communication in Distributed Deep Learning

Dec 18, 2019