
Quoc Tran-Dinh

A Unified Convergence Analysis for Shuffling-Type Gradient Methods

Feb 19, 2020

Stochastic Gauss-Newton Algorithms for Nonconvex Compositional Optimization

Feb 17, 2020

A Newton Frank-Wolfe Method for Constrained Self-Concordant Minimization

Feb 17, 2020

Using positive spanning sets to achieve stationarity with the Boosted DC Algorithm

Jul 26, 2019

A Hybrid Stochastic Optimization Framework for Stochastic Composite Nonconvex Optimization

Jul 08, 2019

Hybrid Stochastic Gradient Descent Algorithms for Stochastic Nonconvex Optimization

May 15, 2019

ProxSARAH: An Efficient Algorithmic Framework for Stochastic Composite Nonconvex Optimization

Mar 29, 2019

Non-stationary Douglas-Rachford and alternating direction method of multipliers: adaptive stepsizes and convergence

Sep 27, 2018

Generalized Self-Concordant Functions: A Recipe for Newton-Type Methods

May 08, 2018

Composite convex minimization involving self-concordant-like cost functions

Jan 20, 2018