
Didan Deng


RMES: Real-Time Micro-Expression Spotting Using Phase From Riesz Pyramid

May 09, 2023
Yini Fang, Didan Deng, Liang Wu, Frederic Jumelle, Bertram Shi


Multiple Emotion Descriptors Estimation at the ABAW3 Challenge

Mar 29, 2022
Didan Deng


Towards Better Uncertainty: Iterative Training of Efficient Networks for Multitask Emotion Recognition

Jul 21, 2021
Didan Deng, Liang Wu, Bertram E. Shi


Individual risk profiling for portable devices using a neural network to process the cognitive reactions and the emotional responses to a multivariate situational risk assessment

Mar 07, 2021
Frederic Jumelle, Kelvin So, Didan Deng


Individual risk profiling for portable devices using a neural network to process the recording of 30 successive pairs of cognitive reaction and emotional response to a multivariate situational risk assessment

Feb 28, 2021
Frederic Jumelle, Kelvin So, Didan Deng


Functional neural network for decision processing, a racing network of programmable neurons with fuzzy logic where the target operating model relies on the network itself

Feb 24, 2021
Frederic Jumelle, Kelvin So, Didan Deng


Multitask Emotion Recognition with Incomplete Labels

Mar 10, 2020
Didan Deng, Zhaokang Chen, Bertram E. Shi


FAU, Facial Expressions, Valence and Arousal: A Multi-task Solution

Feb 10, 2020
Didan Deng, Zhaokang Chen, Bertram E. Shi


MIMAMO Net: Integrating Micro- and Macro-motion for Video Emotion Recognition

Nov 21, 2019
Didan Deng, Zhaokang Chen, Yuqian Zhou, Bertram Shi


Multimodal Utterance-level Affect Analysis using Visual, Audio and Text Features

May 04, 2018
Didan Deng, Yuqian Zhou, Jimin Pi, Bertram E. Shi
