Branislav Kveton

Differentiable Bandit Exploration

Feb 17, 2020
Craig Boutilier, Chih-Wei Hsu, Branislav Kveton, Martin Mladenov, Csaba Szepesvari, Manzil Zaheer

Old Dog Learns New Tricks: Randomized UCB for Bandit Problems

Oct 11, 2019
Sharan Vaswani, Abbas Mehrabian, Audrey Durand, Branislav Kveton

Randomized Exploration in Generalized Linear Bandits

Jun 21, 2019
Branislav Kveton, Manzil Zaheer, Csaba Szepesvari, Lihong Li, Mohammad Ghavamzadeh, Craig Boutilier

Waterfall Bandits: Learning to Sell Ads Online

Apr 20, 2019
Branislav Kveton, Saied Mahdian, S. Muthukrishnan, Zheng Wen, Yikun Xian

Empirical Bayes Regret Minimization

Apr 04, 2019
Chih-Wei Hsu, Branislav Kveton, Ofer Meshi, Martin Mladenov, Csaba Szepesvari

Perturbed-History Exploration in Stochastic Linear Bandits

Mar 21, 2019
Branislav Kveton, Csaba Szepesvari, Mohammad Ghavamzadeh, Craig Boutilier

Perturbed-History Exploration in Stochastic Multi-Armed Bandits

Feb 26, 2019
Branislav Kveton, Csaba Szepesvari, Mohammad Ghavamzadeh, Craig Boutilier

Garbage In, Reward Out: Bootstrapping Exploration in Multi-Armed Bandits

Nov 13, 2018
Branislav Kveton, Csaba Szepesvari, Zheng Wen, Mohammad Ghavamzadeh, Tor Lattimore

Online Diverse Learning to Rank from Partial-Click Feedback

Nov 01, 2018
Prakhar Gupta, Gaurush Hiranandani, Harvineet Singh, Branislav Kveton, Zheng Wen, Iftikhar Ahamath Burhanuddin

Online Influence Maximization under Independent Cascade Model with Semi-Bandit Feedback

Jun 19, 2018
Zheng Wen, Branislav Kveton, Michal Valko, Sharan Vaswani