
Maryam Aziz

SEQUEL, CNRS, CRIStAL

Improving Content Retrievability in Search with Controllable Query Generation

Mar 21, 2023

On Multi-Armed Bandit Designs for Phase I Clinical Trials

Mar 17, 2019

Pure-Exploration for Infinite-Armed Bandits with General Arm Reservoirs

Nov 15, 2018

Adaptively Pruning Features for Boosted Decision Trees

May 19, 2018

Pure Exploration in Infinitely-Armed Bandit Models with Fixed-Confidence

Mar 13, 2018