
Thomas M. Howard


Resolving Ambiguity via Dialogue to Correct Unsynthesizable Controllers for Free-Flying Robots

Apr 11, 2023
Joshua Rosser, Jacob Arkin, Siddharth Patki, Thomas M. Howard


Language Understanding for Field and Service Robots in a Priori Unknown Environments

May 21, 2021
Matthew R. Walter, Siddharth Patki, Andrea F. Daniele, Ethan Fahnestock, Felix Duvallet, Sachithra Hemachandra, Jean Oh, Anthony Stentz, Nicholas Roy, Thomas M. Howard


Language-guided Semantic Mapping and Mobile Manipulation in Partially Observable Environments

Oct 22, 2019
Siddharth Patki, Ethan Fahnestock, Thomas M. Howard, Matthew R. Walter


Language-guided Adaptive Perception with Hierarchical Symbolic Representations for Mobile Manipulators

Sep 21, 2019
Ethan Fahnestock, Siddharth Patki, Thomas M. Howard


Inferring Compact Representations for Efficient Natural Language Understanding of Robot Instructions

Mar 21, 2019
Siddharth Patki, Andrea F. Daniele, Matthew R. Walter, Thomas M. Howard


Adaptive Grasp Control through Multi-Modal Interactions for Assistive Prosthetic Devices

Oct 18, 2018
Michelle Esponda, Thomas M. Howard


Learning Models for Following Natural Language Directions in Unknown Environments

Mar 17, 2015
Sachithra Hemachandra, Felix Duvallet, Thomas M. Howard, Nicholas Roy, Anthony Stentz, Matthew R. Walter
