Alexander A. Alemi

Speed Limits for Deep Learning

Jul 27, 2023
Inbar Seroussi, Alexander A. Alemi, Moritz Helias, Zohar Ringel

Variational Prediction

Jul 14, 2023
Alexander A. Alemi, Ben Poole

Weighted Ensemble Self-Supervised Learning

Nov 18, 2022
Yangjun Ruan, Saurabh Singh, Warren Morningstar, Alexander A. Alemi, Sergey Ioffe, Ian Fischer, Joshua V. Dillon

Bayesian Imitation Learning for End-to-End Mobile Manipulation

Feb 15, 2022
Yuqing Du, Daniel Ho, Alexander A. Alemi, Eric Jang, Mohi Khansari

A Closer Look at the Adversarial Robustness of Information Bottleneck Models

Jul 12, 2021
Iryna Korshunova, David Stutz, Alexander A. Alemi, Olivia Wiles, Sven Gowal

Does Knowledge Distillation Really Work?

Jun 10, 2021
Samuel Stanton, Pavel Izmailov, Polina Kirichenko, Alexander A. Alemi, Andrew Gordon Wilson

PAC$^m$-Bayes: Narrowing the Empirical Risk Gap in the Misspecified Bayesian Regime

Oct 19, 2020
Warren R. Morningstar, Alexander A. Alemi, Joshua V. Dillon

Density of States Estimation for Out-of-Distribution Detection

Jun 22, 2020
Warren R. Morningstar, Cusuh Ham, Andrew G. Gallagher, Balaji Lakshminarayanan, Alexander A. Alemi, Joshua V. Dillon

CEB Improves Model Robustness

Feb 13, 2020
Ian Fischer, Alexander A. Alemi
