Arindam Khan

Mitigating Disparity while Maximizing Reward: Tight Anytime Guarantee for Improving Bandits

Aug 19, 2022
Vishakha Patil, Vineet Nair, Ganesh Ghalme, Arindam Khan

Fairness and Welfare Quantification for Regret in Multi-Armed Bandits

May 27, 2022
Siddharth Barman, Arindam Khan, Arnab Maiti, Ayush Sawarni

Approximation Algorithms for ROUND-UFP and ROUND-SAP

Feb 07, 2022
Debajyoti Kar, Arindam Khan, Andreas Wiese

Streaming Algorithms for Stochastic Multi-armed Bandits

Dec 09, 2020
Arnab Maiti, Vishakha Patil, Arindam Khan
