Jeffrey Pennington

Dynamical Isometry and a Mean Field Theory of RNNs: Gating Enables Signal Propagation in Recurrent Neural Networks

Aug 15, 2018

Dynamical Isometry and a Mean Field Theory of CNNs: How to Train 10,000-Layer Vanilla Convolutional Neural Networks

Jul 10, 2018

Sensitivity and Generalization in Neural Networks: an Empirical Study

Jun 18, 2018

Deep Neural Networks as Gaussian Processes

Mar 03, 2018

The Emergence of Spectral Universality in Deep Networks

Feb 27, 2018

Estimating the Spectral Density of Large Implicit Matrices

Feb 09, 2018

Resurrecting the sigmoid in deep learning through dynamical isometry: theory and practice

Nov 13, 2017

A Correspondence Between Random Neural Networks and Statistical Field Theory

Oct 18, 2017