Gongfan Fang

0.1% Data Makes Segment Anything Slim

Dec 12, 2023
Zigeng Chen, Gongfan Fang, Xinyin Ma, Xinchao Wang

DeepCache: Accelerating Diffusion Models for Free

Dec 07, 2023
Xinyin Ma, Gongfan Fang, Xinchao Wang

LLM-Pruner: On the Structural Pruning of Large Language Models

May 19, 2023
Xinyin Ma, Gongfan Fang, Xinchao Wang

Structural Pruning for Diffusion Models

May 18, 2023
Gongfan Fang, Xinyin Ma, Xinchao Wang

DepGraph: Towards Any Structural Pruning

Jan 30, 2023
Gongfan Fang, Xinyin Ma, Mingli Song, Michael Bi Mi, Xinchao Wang

Federated Selective Aggregation for Knowledge Amalgamation

Jul 27, 2022
Donglin Xie, Ruonan Yu, Gongfan Fang, Jie Song, Zunlei Feng, Xinchao Wang, Li Sun, Mingli Song

Prompting to Distill: Boosting Data-Free Knowledge Distillation via Reinforced Prompt

May 16, 2022
Xinyin Ma, Xinchao Wang, Gongfan Fang, Yongliang Shen, Weiming Lu

Knowledge Amalgamation for Object Detection with Transformers

Mar 07, 2022
Haofei Zhang, Feng Mao, Mengqi Xue, Gongfan Fang, Zunlei Feng, Jie Song, Mingli Song

Up to 100x Faster Data-free Knowledge Distillation

Dec 12, 2021
Gongfan Fang, Kanya Mo, Xinchao Wang, Jie Song, Shitao Bei, Haofei Zhang, Mingli Song