
Jacob Nelson

Synthesizing Collective Communication Algorithms for Heterogeneous Networks with TACCL

Nov 15, 2021

Cloud Collectives: Towards Cloud-aware Collectives for ML Workloads with Rank Reordering

May 28, 2021

Scaling Distributed Machine Learning with In-Network Aggregation

Feb 22, 2019

Parameter Hub: a Rack-Scale Parameter Server for Distributed Deep Neural Network Training

May 21, 2018