Defang Chen

Knowledge Translation: A New Pathway for Model Compression

Jan 11, 2024
Wujie Sun, Defang Chen, Jiawei Chen, Yan Feng, Chun Chen, Can Wang

Fast ODE-based Sampling for Diffusion Models in Around 5 Steps

Nov 30, 2023
Zhenyu Zhou, Defang Chen, Can Wang, Chun Chen

Customizing Synthetic Data for Data-Free Student Learning

Jul 10, 2023
Shiya Luo, Defang Chen, Can Wang

Adaptive Multi-Teacher Knowledge Distillation with Meta-Learning

Jun 11, 2023
Hailin Zhang, Defang Chen, Can Wang

A Geometric Perspective on Diffusion Models

May 31, 2023
Defang Chen, Zhenyu Zhou, Jian-Ping Mei, Chunhua Shen, Chun Chen, Can Wang

Accelerating Diffusion Sampling with Classifier-based Feature Distillation

Nov 22, 2022
Wujie Sun, Defang Chen, Can Wang, Deshi Ye, Yan Feng, Chun Chen

Online Cross-Layer Knowledge Distillation on Graph Neural Networks with Deep Supervision

Oct 25, 2022
Jiongyu Guo, Defang Chen, Can Wang

Label-Efficient Domain Generalization via Collaborative Exploration and Generalization

Aug 07, 2022
Junkun Yuan, Xu Ma, Defang Chen, Kun Kuang, Fei Wu, Lanfen Lin

Improving Knowledge Graph Embedding via Iterative Self-Semantic Knowledge Distillation

Jun 07, 2022
Zhehui Zhou, Defang Chen, Can Wang, Yan Feng, Chun Chen

Alignahead: Online Cross-Layer Knowledge Extraction on Graph Neural Networks

May 05, 2022
Jiongyu Guo, Defang Chen, Can Wang
