
Alexandros G. Dimakis

Robust compressed sensing of generative models
Jun 18, 2020

Deep Learning Techniques for Inverse Problems in Imaging
May 12, 2020

Compressed Sensing with Invertible Generative Models and Dependent Noise
Mar 18, 2020

Exactly Computing the Local Lipschitz Constant of ReLU Networks
Mar 02, 2020

Conditional Sampling from Invertible Generative Models with Applications to Inverse Problems
Feb 26, 2020

Your Local GAN: Designing Two Dimensional Local Attention Mechanisms for Generative Models
Dec 02, 2019

Communication-Efficient Asynchronous Stochastic Frank-Wolfe over Nuclear-norm Balls
Oct 17, 2019

SGD Learns One-Layer Networks in WGANs
Oct 15, 2019

Learning Distributions Generated by One-Layer ReLU Networks
Sep 19, 2019

Inverting Deep Generative models, One layer at a time
Jun 19, 2019