Dongruo Zhou

Risk Bounds of Accelerated SGD for Overparameterized Linear Regression

Nov 23, 2023

Variance-Dependent Regret Bounds for Linear Bandits and Reinforcement Learning: Adaptivity and Computational Efficiency

Feb 21, 2023

Nearly Minimax Optimal Reinforcement Learning for Linear Markov Decision Processes

Dec 12, 2022

Learning Two-Player Mixture Markov Games: Kernel Function Approximation and Correlated Equilibrium

Aug 10, 2022

Computationally Efficient Horizon-Free Reinforcement Learning for Linear Mixture MDPs

May 23, 2022

Nearly Optimal Algorithms for Linear Contextual Bandits with Adversarial Corruptions

May 13, 2022

Bandit Learning with General Function Classes: Heteroscedastic Noise and Variance-dependent Regret Bounds

Feb 28, 2022

Learning Contextual Bandits Through Perturbed Rewards

Jan 24, 2022

Faster Perturbed Stochastic Gradient Methods for Finding Local Minima

Oct 25, 2021

Linear Contextual Bandits with Adversarial Corruptions

Oct 25, 2021