
Mirco Mutti

From Parameters to Behavior: Unsupervised Compression of the Policy Space

Sep 26, 2025

State Entropy Regularization for Robust Reinforcement Learning

Jun 08, 2025

Enhancing Diversity in Parallel Agents: A Maximum State Entropy Exploration Story

May 02, 2025

A Classification View on Meta Learning Bandits

Apr 06, 2025

Towards Principled Multi-Agent Task Agnostic Exploration

Feb 12, 2025

Reward Compatibility: A Framework for Inverse RL

Jan 14, 2025

Geometric Active Exploration in Markov Decision Processes: the Benefit of Abstraction

Jul 18, 2024

The Limits of Pure Exploration in POMDPs: When the Observation Entropy is Enough

Jun 18, 2024

How to Scale Inverse RL to Large State Spaces? A Provably Efficient Approach

Jun 06, 2024

How to Explore with Belief: State Entropy Maximization in POMDPs

Jun 04, 2024