
Milind Tambe


Reflections from the Workshop on AI-Assisted Decision Making for Conservation

Jul 17, 2023
Lily Xu, Esther Rolf, Sara Beery, Joseph R. Bennett, Tanya Berger-Wolf, Tanya Birch, Elizabeth Bondi-Kelly, Justin Brashares, Melissa Chapman, Anthony Corso, Andrew Davies, Nikhil Garg, Angela Gaylard, Robert Heilmayr, Hannah Kerner, Konstantin Klemmer, Vipin Kumar, Lester Mackey, Claire Monteleoni, Paul Moorcroft, Jonathan Palmer, Andrew Perrault, David Thau, Milind Tambe


Leaving the Nest: Going Beyond Local Loss Functions for Predict-Then-Optimize

May 26, 2023
Sanket Shah, Andrew Perrault, Bryan Wilder, Milind Tambe


Limited Resource Allocation in a Non-Markovian World: The Case of Maternal and Child Healthcare

May 22, 2023
Panayiotis Danassis, Shresth Verma, Jackson A. Killian, Aparna Taneja, Milind Tambe


Fairness for Workers Who Pull the Arms: An Index Based Policy for Allocation of Restless Bandit Tasks

Mar 01, 2023
Arpita Biswas, Jackson A. Killian, Paula Rodriguez Diaz, Susobhan Ghosh, Milind Tambe


Improved Policy Evaluation for Randomized Trials of Algorithmic Resource Allocation

Feb 06, 2023
Aditya Mate, Bryan Wilder, Aparna Taneja, Milind Tambe


Decision-Focused Evaluation: Analyzing Performance of Deployed Restless Multi-Arm Bandits

Jan 19, 2023
Paritosh Verma, Shresth Verma, Aditya Mate, Aparna Taneja, Milind Tambe


Indexability is Not Enough for Whittle: Improved, Near-Optimal Algorithms for Restless Bandits

Oct 31, 2022
Abheek Ghosh, Dheeraj Nagaraj, Manish Jain, Milind Tambe


Artificial Intelligence and Life in 2030: The One Hundred Year Study on Artificial Intelligence

Oct 31, 2022
Peter Stone, Rodney Brooks, Erik Brynjolfsson, Ryan Calo, Oren Etzioni, Greg Hager, Julia Hirschberg, Shivaram Kalyanakrishnan, Ece Kamar, Sarit Kraus, Kevin Leyton-Brown, David Parkes, William Press, AnnaLee Saxenian, Julie Shah, Milind Tambe, Astro Teller


Artificial Replay: A Meta-Algorithm for Harnessing Historical Data in Bandits

Sep 30, 2022
Siddhartha Banerjee, Sean R. Sinclair, Milind Tambe, Lily Xu, Christina Lee Yu


Optimistic Whittle Index Policy: Online Learning for Restless Bandits

May 30, 2022
Kai Wang, Lily Xu, Aparna Taneja, Milind Tambe
