Yangyi Lu

Offline Policy Evaluation and Optimization under Confounding

Dec 01, 2022
Kevin Tan, Yangyi Lu, Chinmaya Kausik, Yixin Wang, Ambuj Tewari

Bandit Algorithms for Precision Medicine

Aug 10, 2021
Yangyi Lu, Ziping Xu, Ambuj Tewari

Causal Bandits with Unknown Graph Structure

Jun 05, 2021
Yangyi Lu, Amirhossein Meisami, Ambuj Tewari

Causal Markov Decision Processes: Learning Good Interventions Efficiently

Feb 15, 2021
Yangyi Lu, Amirhossein Meisami, Ambuj Tewari

Low-Rank Generalized Linear Bandit Problems

Jun 04, 2020
Yangyi Lu, Amirhossein Meisami, Ambuj Tewari

Regret Analysis of Causal Bandit Problems

Oct 11, 2019
Yangyi Lu, Amirhossein Meisami, Ambuj Tewari, Zhenyu Yan
