
Jacob Nelson


Synthesizing Collective Communication Algorithms for Heterogeneous Networks with TACCL

Nov 15, 2021
Aashaka Shah, Vijay Chidambaram, Meghan Cowan, Saeed Maleki, Madan Musuvathi, Todd Mytkowicz, Jacob Nelson, Olli Saarikivi, Rachee Singh


Cloud Collectives: Towards Cloud-aware Collectives for ML Workloads with Rank Reordering

May 28, 2021
Liang Luo, Jacob Nelson, Arvind Krishnamurthy, Luis Ceze


Scaling Distributed Machine Learning with In-Network Aggregation

Feb 22, 2019
Amedeo Sapio, Marco Canini, Chen-Yu Ho, Jacob Nelson, Panos Kalnis, Changhoon Kim, Arvind Krishnamurthy, Masoud Moshref, Dan R. K. Ports, Peter Richtárik


Parameter Hub: a Rack-Scale Parameter Server for Distributed Deep Neural Network Training

May 21, 2018
Liang Luo, Jacob Nelson, Luis Ceze, Amar Phanishayee, Arvind Krishnamurthy
