
Tor Lattimore

On Explore-Then-Commit Strategies

Nov 14, 2016

The End of Optimism? An Asymptotic Analysis of Finite-Armed Linear Bandits

Oct 14, 2016

Free Lunch for Optimisation under the Universal Distribution

Aug 16, 2016

Causal Bandits: Learning Good Interventions via Causal Inference

Jun 10, 2016

Thompson Sampling is Asymptotically Optimal in General Environments

Jun 03, 2016

Regret Analysis of the Finite-Horizon Gittins Index Strategy for Multi-Armed Bandits

May 27, 2016

Regret Analysis of the Anytime Optimally Confident UCB Algorithm

May 06, 2016

Optimally Confident UCB: Improved Regret for Finite-Armed Bandits

Feb 24, 2016

Conservative Bandits

Feb 13, 2016

The Pareto Regret Frontier for Bandits

Oct 30, 2015