Chaofei Wang

Smooth Diffusion: Crafting Smooth Latent Spaces in Diffusion Models
Dec 07, 2023

Avalon's Game of Thoughts: Battle Against Deception through Recursive Contemplation
Oct 06, 2023

Latency-aware Unified Dynamic Networks for Efficient Image Recognition
Sep 02, 2023

Computation-efficient Deep Learning for Computer Vision: A Survey
Aug 27, 2023

Zero-shot Generative Model Adaptation via Image-specific Prompt Learning
Apr 06, 2023

Efficient Knowledge Distillation from Model Checkpoints
Oct 12, 2022

Learning to Weight Samples for Dynamic Early-exiting Networks
Sep 17, 2022

Few Shot Generative Model Adaption via Relaxed Spatial Structural Alignment
Mar 31, 2022

Learn From the Past: Experience Ensemble Knowledge Distillation
Feb 25, 2022

Fine-Grained Few Shot Learning with Foreground Object Transformation
Sep 13, 2021