
Ruohan Gao

Meerkat: Audio-Visual Large Language Model for Grounding in Space and Time

Jul 01, 2024

Hearing Anything Anywhere

Jun 11, 2024

The Audio-Visual Conversational Graph: From an Egocentric-Exocentric Perspective

Dec 20, 2023

SoundCam: A Dataset for Finding Humans Using Room Acoustics

Nov 06, 2023

NOIR: Neural Signal Operated Intelligent Robots for Everyday Activities

Nov 02, 2023

RealImpact: A Dataset of Impact Sound Fields for Real Objects

Jun 16, 2023

Sonicverse: A Multisensory Simulation Platform for Embodied Household Agents that See and Hear

Jun 01, 2023

The ObjectFolder Benchmark: Multisensory Learning with Neural and Real Objects

Jun 01, 2023

Learning Object-Centric Neural Scattering Functions for Free-viewpoint Relighting and Scene Composition

Mar 10, 2023

See, Hear, and Feel: Smart Sensory Fusion for Robotic Manipulation

Dec 08, 2022