Incentivizing Combinatorial Bandit Exploration



Xinyan Hu, Dung Daniel Ngo, Aleksandrs Slivkins, Zhiwei Steven Wu

* 9 pages of main text, 21 pages in total 

Sayer: Using Implicit Feedback to Optimize System Policies



Mathias Lécuyer, Sang Hoon Kim, Mihir Nanavati, Junchen Jiang, Siddhartha Sen, Amit Sharma, Aleksandrs Slivkins


Exploration and Incentives in Reinforcement Learning



Max Simchowitz, Aleksandrs Slivkins


Competing Bandits: The Perils of Exploration Under Competition



Guy Aridor, Yishay Mansour, Aleksandrs Slivkins, Zhiwei Steven Wu

* Merged and extended version of arXiv:1702.08533 and arXiv:1902.05590

Adaptive Discretization for Adversarial Bandits with Continuous Action Spaces



Chara Podimata, Aleksandrs Slivkins


Efficient Contextual Bandits with Continuous Actions



Maryam Majzoubi, Chicheng Zhang, Rajan Chari, Akshay Krishnamurthy, John Langford, Aleksandrs Slivkins


Constrained episodic reinforcement learning in concave-convex and knapsack settings



Kianté Brantley, Miroslav Dudik, Thodoris Lykouris, Sobhan Miryoosefi, Max Simchowitz, Aleksandrs Slivkins, Wen Sun


Greedy Algorithm almost Dominates in Smoothed Contextual Bandits



Manish Raghavan, Aleksandrs Slivkins, Jennifer Wortman Vaughan, Zhiwei Steven Wu

* Results in this paper, without any proofs, were announced in an extended abstract (Raghavan et al., 2018a) and fleshed out in the technical report (Raghavan et al., 2018b [arXiv:1806.00543]). This manuscript covers a subset of results from Raghavan et al. (2018a,b), focusing on the greedy algorithm, and is streamlined accordingly.

Sample Complexity of Incentivized Exploration



Mark Sellke, Aleksandrs Slivkins


Advances in Bandits with Knapsacks



Karthik Abinav Sankararaman, Aleksandrs Slivkins

