
Aditya Mate

A resource-constrained stochastic scheduling algorithm for homeless street outreach and gleaning edible food

Mar 15, 2024

Improved Policy Evaluation for Randomized Trials of Algorithmic Resource Allocation

Feb 06, 2023

Decision-Focused Evaluation: Analyzing Performance of Deployed Restless Multi-Arm Bandits

Jan 19, 2023

Decision-Focused Learning in Restless Multi-Armed Bandits with Application to Maternal and Child Care Domain

Feb 02, 2022

Field Study in Deploying Restless Multi-Armed Bandits: Assisting Non-Profits in Improving Maternal and Child Health

Sep 16, 2021

Selective Intervention Planning using Restless Multi-Armed Bandits to Improve Maternal and Child Health Outcomes

Apr 05, 2021

Efficient Algorithms for Finite Horizon and Streaming Restless Multi-Armed Bandit Problems

Mar 08, 2021

Collapsing Bandits and Their Application to Public Health Interventions

Jul 05, 2020

Decision-Focused Learning of Adversary Behavior in Security Games

Mar 03, 2019