Lujun Li

ParZC: Parametric Zero-Cost Proxies for Efficient NAS

Feb 03, 2024
Peijie Dong, Lujun Li, Xinglin Pan, Zimian Wei, Xiang Liu, Qiang Wang, Xiaowen Chu

Auto-Prox: Training-Free Vision Transformer Architecture Search via Automatic Proxy Discovery

Dec 14, 2023
Zimian Wei, Lujun Li, Peijie Dong, Zheng Hui, Anggeng Li, Menglong Lu, Hengyue Pan, Zhiliang Tian, Dongsheng Li

TVT: Training-Free Vision Transformer Search on Tiny Datasets

Nov 24, 2023
Zimian Wei, Hengyue Pan, Lujun Li, Peijie Dong, Zhiliang Tian, Xin Niu, Dongsheng Li

EMQ: Evolving Training-free Proxies for Automated Mixed Precision Quantization

Jul 20, 2023
Peijie Dong, Lujun Li, Zimian Wei, Xin Niu, Zhiliang Tian, Hengyue Pan

NORM: Knowledge Distillation via N-to-One Representation Matching

May 23, 2023
Xiaolong Liu, Lujun Li, Chao Li, Anbang Yao

Catch-Up Distillation: You Only Need to Train Once for Accelerating Sampling

May 21, 2023
Shitong Shao, Xu Dai, Shouyi Yin, Lujun Li, Huanran Chen, Yang Hu

DisWOT: Student Architecture Search for Distillation WithOut Training

Mar 28, 2023
Peijie Dong, Lujun Li, Zimian Wei

Progressive Meta-Pooling Learning for Lightweight Image Classification Model

Jan 24, 2023
Peijie Dong, Xin Niu, Zhiliang Tian, Lujun Li, Xiaodong Wang, Zimian Wei, Hengyue Pan, Dongsheng Li

RD-NAS: Enhancing One-shot Supernet Ranking Ability via Ranking Distillation from Zero-cost Proxies

Jan 24, 2023
Peijie Dong, Xin Niu, Lujun Li, Zhiliang Tian, Xiaodong Wang, Zimian Wei, Hengyue Pan, Dongsheng Li

GP-NAS-ensemble: a model for NAS Performance Prediction

Jan 23, 2023
Kunlong Chen, Liu Yang, Yitian Chen, Kunjin Chen, Yidan Xu, Lujun Li
