Yuansi Chen

EECS, INRIA Grenoble Rhône-Alpes / LJK Laboratoire Jean Kuntzmann

Prominent Roles of Conditionally Invariant Components in Domain Adaptation: Theory and Algorithms

Sep 19, 2023

When does Metropolized Hamiltonian Monte Carlo provably outperform Metropolis-adjusted Langevin algorithm?

Apr 10, 2023

A Simple Proof of the Mixing of Metropolis-Adjusted Langevin Algorithm under Smoothness and Isoperimetry

Apr 08, 2023

Minimax Mixing Time of the Metropolis-Adjusted Langevin Algorithm for Log-Concave Sampling

Sep 27, 2021

Domain adaptation under structural causal models

Oct 29, 2020

Fast mixing of Metropolized Hamiltonian Monte Carlo: Benefits of multi-step gradients

May 29, 2019

Sampling Can Be Faster Than Optimization

Nov 20, 2018

Fast MCMC sampling algorithms on polytopes

Jul 08, 2018

Log-concave sampling: Metropolis-Hastings algorithms are fast!

Jul 08, 2018

Stability and Convergence Trade-off of Iterative Optimization Algorithms

Apr 04, 2018