Jascha Sohl-Dickstein

Bayesian Convolutional Neural Networks with Many Channels are Gaussian Processes
Oct 11, 2018
Roman Novak, Lechao Xiao, Jaehoon Lee, Yasaman Bahri, Daniel A. Abolafia, Jeffrey Pennington, Jascha Sohl-Dickstein

Dynamical Isometry and a Mean Field Theory of CNNs: How to Train 10,000-Layer Vanilla Convolutional Neural Networks
Jul 10, 2018
Lechao Xiao, Yasaman Bahri, Jascha Sohl-Dickstein, Samuel S. Schoenholz, Jeffrey Pennington

Adversarial Reprogramming of Neural Networks
Jun 28, 2018
Gamaleldin F. Elsayed, Ian Goodfellow, Jascha Sohl-Dickstein

Guided evolutionary strategies: escaping the curse of dimensionality in random search
Jun 28, 2018
Niru Maheswaranathan, Luke Metz, George Tucker, Jascha Sohl-Dickstein

PCA of high dimensional random walks with comparison to neural network training
Jun 22, 2018
Joseph M. Antognini, Jascha Sohl-Dickstein

Sensitivity and Generalization in Neural Networks: an Empirical Study
Jun 18, 2018
Roman Novak, Yasaman Bahri, Daniel A. Abolafia, Jeffrey Pennington, Jascha Sohl-Dickstein

Learning Unsupervised Learning Rules
May 23, 2018
Luke Metz, Niru Maheswaranathan, Brian Cheung, Jascha Sohl-Dickstein

Adversarial Examples that Fool both Computer Vision and Time-Limited Humans
May 22, 2018
Gamaleldin F. Elsayed, Shreya Shankar, Brian Cheung, Nicolas Papernot, Alex Kurakin, Ian Goodfellow, Jascha Sohl-Dickstein

Deep Neural Networks as Gaussian Processes
Mar 03, 2018
Jaehoon Lee, Yasaman Bahri, Roman Novak, Samuel S. Schoenholz, Jeffrey Pennington, Jascha Sohl-Dickstein

Generalizing Hamiltonian Monte Carlo with Neural Networks
Mar 02, 2018
Daniel Levy, Matthew D. Hoffman, Jascha Sohl-Dickstein