Jürgen Leitner

Deep Learning Approaches to Grasp Synthesis: A Review

Jul 06, 2022
Rhys Newbury, Morris Gu, Lachlan Chumbley, Arsalan Mousavian, Clemens Eppner, Jürgen Leitner, Jeannette Bohg, Antonio Morales, Tamim Asfour, Danica Kragic, Dieter Fox, Akansel Cosgun

Follow the Gradient: Crossing the Reality Gap using Differentiable Physics (RealityGrad)

Sep 10, 2021
Jack Collins, Ross Brown, Jürgen Leitner, David Howard

EGAD! an Evolved Grasping Analysis Dataset for diversity and reproducibility in robotic manipulation

Mar 03, 2020
Douglas Morrison, Peter Corke, Jürgen Leitner

Benchmarking Simulated Robotic Manipulation through a Real World Dataset

Nov 27, 2019
Jack Collins, Jessie McVicar, David Wedlock, Ross Brown, David Howard, Jürgen Leitner

Evaluating task-agnostic exploration for fixed-batch learning of arbitrary future tasks

Nov 20, 2019
Vibhavari Dasagi, Robert Lee, Jake Bruce, Jürgen Leitner

A Perceived Environment Design using a Multi-Modal Variational Autoencoder for learning Active-Sensing

Nov 01, 2019
Timo Korthals, Malte Schilling, Jürgen Leitner

Ctrl-Z: Recovering from Instability in Reinforcement Learning

Oct 09, 2019
Vibhavari Dasagi, Jake Bruce, Thierry Peynot, Jürgen Leitner

Quantifying the Reality Gap in Robotic Manipulation Tasks

Nov 08, 2018
Jack Collins, David Howard, Jürgen Leitner

Multi-View Picking: Next-best-view Reaching for Improved Grasping in Clutter

Sep 23, 2018
Douglas Morrison, Peter Corke, Jürgen Leitner
