Naoya Maruyama

The Case for Strong Scaling in Deep Learning: Training Large 3D CNNs with Hybrid Parallelism

Jul 25, 2020
Yosuke Oyama, Naoya Maruyama, Nikoli Dryden, Erin McCarthy, Peter Harrington, Jan Balewski, Satoshi Matsuoka, Peter Nugent, Brian Van Essen

Improving Strong-Scaling of CNN Training by Exploiting Finer-Grained Parallelism

Mar 15, 2019
Nikoli Dryden, Naoya Maruyama, Tom Benson, Tim Moon, Marc Snir, Brian Van Essen

Effective Quantization Approaches for Recurrent Neural Networks

Feb 07, 2018
Md Zahangir Alom, Adam T. Moody, Naoya Maruyama, Brian C. Van Essen, Tarek M. Taha
