Michael Houston

Using DeepSpeed and Megatron to Train Megatron-Turing NLG 530B, A Large-Scale Generative Language Model

Feb 04, 2022

Highly-scalable, physics-informed GANs for learning solutions of stochastic PDEs

Oct 29, 2019

Mixed Precision Training

Feb 15, 2018