Shikun Li

LearnAlign: Reasoning Data Selection for Reinforcement Learning in Large Language Models Based on Improved Gradient Alignment
Jun 13, 2025

Learning Natural Consistency Representation for Face Forgery Video Detection
Jul 15, 2024

DANCE: Dual-View Distribution Alignment for Dataset Condensation
Jun 03, 2024

M3D: Dataset Condensation by Minimizing Maximum Mean Discrepancy
Jan 03, 2024

Coupled Confusion Correction: Learning from Crowds with Sparse Annotations
Dec 26, 2023

Multi-Label Noise Transition Matrix Estimation with Label Correlations: Theory and Algorithm
Sep 22, 2023

Transferring Annotator- and Instance-dependent Transition Matrix for Learning from Crowds
Jun 05, 2023

Trustable Co-label Learning from Multiple Noisy Annotators
Mar 08, 2022

Selective-Supervised Contrastive Learning with Noisy Labels
Mar 08, 2022

Student Network Learning via Evolutionary Knowledge Distillation
Mar 23, 2021