Shubhada Agrawal

Cover meets Robbins while Betting on Bounded Data: $\ln n$ Regret and Almost Sure $\ln\ln n$ Regret
Apr 22, 2026

Regret Tail Characterization of Optimal Bandit Algorithms with Generic Rewards
Apr 16, 2026

Asymptotically Optimal Sequential Testing with Markovian Data
Feb 19, 2026

Eventually LIL Regret: Almost Sure $\ln\ln T$ Regret for a sub-Gaussian Mixture on Unbounded Data
Dec 13, 2025

On Stopping Times of Power-one Sequential Tests: Tight Lower and Upper Bounds
Apr 28, 2025

Markov Chain Variance Estimation: A Stochastic Approximation Approach
Sep 09, 2024

Optimal Top-Two Method for Best Arm Identification and Fluid Analysis
Mar 14, 2024

CRIMED: Lower and Upper Bounds on Regret for Bandits with Unbounded Stochastic Corruption
Sep 28, 2023

Optimal Best-Arm Identification in Bandits with Access to Offline Data
Jun 15, 2023

Regret Minimization in Heavy-Tailed Bandits
Feb 07, 2021