
Mitigating Disparity while Maximizing Reward: Tight Anytime Guarantee for Improving Bandits

Aug 19, 2022
Vishakha Patil, Vineet Nair, Ganesh Ghalme, Arindam Khan



State-Visitation Fairness in Average-Reward MDPs

Mar 02, 2021
Ganesh Ghalme, Vineet Nair, Vishakha Patil, Yilun Zhou



Budgeted and Non-budgeted Causal Bandits

Dec 13, 2020
Vineet Nair, Vishakha Patil, Gaurav Sinha



Streaming Algorithms for Stochastic Multi-armed Bandits

Dec 09, 2020
Arnab Maiti, Vishakha Patil, Arindam Khan

* 24 pages, 2 figures, 4 algorithms


Achieving Fairness in the Stochastic Multi-armed Bandit Problem

Jul 23, 2019
Vishakha Patil, Ganesh Ghalme, Vineet Nair, Y. Narahari

* arXiv admin note: substantial text overlap with arXiv:1905.11260


Stochastic Multi-armed Bandits with Arm-specific Fairness Guarantees

May 27, 2019
Vishakha Patil, Ganesh Ghalme, Vineet Nair, Y. Narahari

