Chengchao Shen

Diversity-Guided MLP Reduction for Efficient Large Vision Transformers

Jun 10, 2025

Multiple Object Stitching for Unsupervised Representation Learning

Jun 09, 2025

Learning Compact Vision Tokens for Efficient Large Multimodal Models

Jun 08, 2025

Multi-Grained Contrast for Data-Efficient Unsupervised Representation Learning

Jul 02, 2024

Inter-Instance Similarity Modeling for Contrastive Learning

Jun 29, 2023

Asymmetric Patch Sampling for Contrastive Learning

Jun 05, 2023

Modeling Global Distribution for Federated Learning with Label Distribution Skew

Dec 17, 2022

Learning Dynamic Preference Structure Embedding From Temporal Networks

Nov 23, 2021

Mosaicking to Distill: Knowledge Distillation from Out-of-Domain Data

Oct 27, 2021

Contrastive Model Inversion for Data-Free Knowledge Distillation

May 18, 2021