
Yan Zhao

Learning to Adapt Foundation Model DINOv2 for Capsule Endoscopy Diagnosis

Jun 15, 2024

A Practice in Enrollment Prediction with Markov Chain Models

May 22, 2024

LightTR: A Lightweight Framework for Federated Trajectory Recovery

May 06, 2024

A Unified Replay-based Continuous Learning Framework for Spatio-Temporal Prediction on Streaming Data

Apr 23, 2024

E2USD: Efficient-yet-effective Unsupervised State Detection for Multivariate Time Series

Mar 01, 2024

LB-KBQA: Large-language-model and BERT based Knowledge-Based Question and Answering System

Feb 09, 2024

Emotion-Aware Contrastive Adaptation Network for Source-Free Cross-Corpus Speech Emotion Recognition

Jan 23, 2024

Speech Swin-Transformer: Exploring a Hierarchical Transformer with Shifted Windows for Speech Emotion Recognition

Jan 19, 2024

Improving Speaker-independent Speech Emotion Recognition Using Dynamic Joint Distribution Adaptation

Jan 18, 2024

Towards Domain-Specific Cross-Corpus Speech Emotion Recognition Approach

Dec 11, 2023