Zhiwei Hao

GhostNetV3: Exploring the Training Strategies for Compact Models

Apr 17, 2024
Zhenhua Liu, Zhiwei Hao, Kai Han, Yehui Tang, Yunhe Wang


SAM-DiffSR: Structure-Modulated Diffusion Model for Image Super-Resolution

Feb 27, 2024
Chengcheng Wang, Zhiwei Hao, Yehui Tang, Jianyuan Guo, Yujie Yang, Kai Han, Yunhe Wang


Data-efficient Large Vision Models through Sequential Autoregression

Feb 07, 2024
Jianyuan Guo, Zhiwei Hao, Chengcheng Wang, Yehui Tang, Han Wu, Han Hu, Kai Han, Chang Xu


One-for-All: Bridge the Gap Between Heterogeneous Architectures in Knowledge Distillation

Oct 30, 2023
Zhiwei Hao, Jianyuan Guo, Kai Han, Yehui Tang, Han Hu, Yunhe Wang, Chang Xu


DeViT: Decomposing Vision Transformers for Collaborative Inference in Edge Devices

Sep 10, 2023
Guanyu Xu, Zhiwei Hao, Yong Luo, Han Hu, Jianping An, Shiwen Mao


VanillaKD: Revisit the Power of Vanilla Knowledge Distillation from Small Scale to Large Scale

May 25, 2023
Zhiwei Hao, Jianyuan Guo, Kai Han, Han Hu, Chang Xu, Yunhe Wang


Multi-Agent Collaborative Inference via DNN Decoupling: Intermediate Feature Compression and Edge Learning

May 24, 2022
Zhiwei Hao, Guanyu Xu, Yong Luo, Han Hu, Jianping An, Shiwen Mao


CDFKD-MFS: Collaborative Data-free Knowledge Distillation via Multi-level Feature Sharing

May 24, 2022
Zhiwei Hao, Yong Luo, Zhi Wang, Han Hu, Jianping An
