Naoya Maruyama

The Case for Strong Scaling in Deep Learning: Training Large 3D CNNs with Hybrid Parallelism

Jul 25, 2020

Improving Strong-Scaling of CNN Training by Exploiting Finer-Grained Parallelism

Mar 15, 2019

Effective Quantization Approaches for Recurrent Neural Networks

Feb 07, 2018