Yunxin Liu

Institute for AI Industry Research, Shanghai AI Laboratory, Shanghai, China

PatchBackdoor: Backdoor Attack against Deep Neural Networks without Model Modification

Aug 22, 2023
Yizhen Yuan, Rui Kong, Shenghao Xie, Yuanchun Li, Yunxin Liu

AIGC Empowering Telecom Sector White Paper (Chinese)

Jul 24, 2023
Ye Ouyang, Yaqin Zhang, Xiaozhou Ye, Yunxin Liu, Yong Song, Yang Liu, Sen Bian, Zhiyong Liu

6G Network Business Support System

Jul 19, 2023
Ye Ouyang, Yaqin Zhang, Peng Wang, Yunxin Liu, Wen Qiao, Jun Zhu, Yang Liu, Feng Zhang, Shuling Wang, Xidong Wang

AdaptiveNet: Post-deployment Neural Architecture Adaptation for Diverse Edge Environments

Mar 13, 2023
Hao Wen, Yuanchun Li, Zunshuai Zhang, Shiqi Jiang, Xiaozhou Ye, Ye Ouyang, Ya-Qin Zhang, Yunxin Liu

LUT-NN: Towards Unified Neural Network Inference by Table Lookup

Feb 07, 2023
Xiaohu Tang, Yang Wang, Ting Cao, Li Lyna Zhang, Qi Chen, Deng Cai, Yunxin Liu, Mao Yang

StrokeGAN+: Few-Shot Semi-Supervised Chinese Font Generation with Stroke Encoding

Nov 11, 2022
Jinshan Zeng, Yefei Wang, Qi Chen, Yunxin Liu, Mingwen Wang, Yuan Yao

Nesting Forward Automatic Differentiation for Memory-Efficient Deep Neural Network Training

Sep 22, 2022
Cong Guo, Yuxian Qiu, Jingwen Leng, Chen Zhang, Ying Cao, Quanlu Zhang, Yunxin Liu, Fan Yang, Minyi Guo

ANT: Exploiting Adaptive Numerical Data Type for Low-bit Deep Neural Network Quantization

Aug 30, 2022
Cong Guo, Chen Zhang, Jingwen Leng, Zihan Liu, Fan Yang, Yunxin Liu, Minyi Guo, Yuhao Zhu

Reducing Capacity Gap in Knowledge Distillation with Review Mechanism for Crowd Counting

Jun 11, 2022
Yunxin Liu, Qiaosi Yi, Jinshan Zeng

SQuant: On-the-Fly Data-Free Quantization via Diagonal Hessian Approximation

Feb 14, 2022
Cong Guo, Yuxian Qiu, Jingwen Leng, Xiaotian Gao, Chen Zhang, Yunxin Liu, Fan Yang, Yuhao Zhu, Minyi Guo
