Michael Neumann

Investigating the Utility of Multimodal Conversational Technology and Audiovisual Analytic Measures for the Assessment and Monitoring of Amyotrophic Lateral Sclerosis at Scale
Apr 15, 2021
Michael Neumann, Oliver Roesler, Jackson Liscombe, Hardik Kothare, David Suendermann-Oeft, David Pautler, Indu Navar, Aria Anvar, Jochen Kumm, Raquel Norel, Ernest Fraenkel, Alexander V. Sherman, James D. Berry, Gary L. Pattee, Jun Wang, Jordan R. Green, Vikram Ramanarayanan

Investigations on Audiovisual Emotion Recognition in Noisy Conditions
Mar 02, 2021
Michael Neumann, Ngoc Thang Vu

URoboSim -- An Episodic Simulation Framework for Prospective Reasoning in Robotic Agents
Dec 08, 2020
Michael Neumann, Sebastian Koralewski, Michael Beetz

Imagination-enabled Robot Perception
Nov 27, 2020
Patrick Mania, Franklin Kenghagho Kenfack, Michael Neumann, Michael Beetz

ADVISER: A Toolkit for Developing Multi-modal, Multi-domain and Socially-engaged Conversational Agents
May 04, 2020
Chia-Yu Li, Daniel Ortega, Dirk Väth, Florian Lux, Lindsey Vanderlyn, Maximilian Schmidt, Michael Neumann, Moritz Völkel, Pavel Denisov, Sabrina Jenne, Zorica Kacarevic, Ngoc Thang Vu

Cross-lingual and Multilingual Speech Emotion Recognition on English and French
Mar 01, 2018
Michael Neumann, Ngoc Thang Vu

Attentive Convolutional Neural Network based Speech Emotion Recognition: A Study on the Impact of Input Features, Signal Length, and Acted Speech
Jun 02, 2017
Michael Neumann, Ngoc Thang Vu

Figure 1 for Attentive Convolutional Neural Network based Speech Emotion Recognition: A Study on the Impact of Input Features, Signal Length, and Acted Speech
Figure 2 for Attentive Convolutional Neural Network based Speech Emotion Recognition: A Study on the Impact of Input Features, Signal Length, and Acted Speech
Figure 3 for Attentive Convolutional Neural Network based Speech Emotion Recognition: A Study on the Impact of Input Features, Signal Length, and Acted Speech
Figure 4 for Attentive Convolutional Neural Network based Speech Emotion Recognition: A Study on the Impact of Input Features, Signal Length, and Acted Speech
Viaarxiv icon