Quoc Tran-Dinh

Randomized Primal-Dual Algorithms for Composite Convex Minimization with Faster Convergence Rates

Mar 03, 2020

A Hybrid Stochastic Policy Gradient Algorithm for Reinforcement Learning

Mar 01, 2020

A Unified Convergence Analysis for Shuffling-Type Gradient Methods

Feb 19, 2020

Stochastic Gauss-Newton Algorithms for Nonconvex Compositional Optimization

Feb 17, 2020

A Newton Frank-Wolfe Method for Constrained Self-Concordant Minimization

Feb 17, 2020

Using positive spanning sets to achieve stationarity with the Boosted DC Algorithm

Jul 26, 2019

A Hybrid Stochastic Optimization Framework for Stochastic Composite Nonconvex Optimization

Jul 08, 2019

Hybrid Stochastic Gradient Descent Algorithms for Stochastic Nonconvex Optimization

May 15, 2019

ProxSARAH: An Efficient Algorithmic Framework for Stochastic Composite Nonconvex Optimization

Mar 29, 2019

Non-stationary Douglas-Rachford and alternating direction method of multipliers: adaptive stepsizes and convergence

Sep 27, 2018