SangJeong Lee

A Selective Survey on Versatile Knowledge Distillation Paradigm for Neural Network Models

Nov 30, 2020
Jeong-Hoe Ku, JiHun Oh, YoungYoon Lee, Gaurav Pooniwala, SangJeong Lee

Weight Equalizing Shift Scaler-Coupled Post-training Quantization

Aug 13, 2020
Jihun Oh, SangJeong Lee, Meejeong Park, Pooni Walagaurav, Kiseok Kwon

Iterative Compression of End-to-End ASR Model using AutoML

Aug 06, 2020
Abhinav Mehrotra, Łukasz Dudziak, Jinsu Yeo, Young-yoon Lee, Ravichander Vipperla, Mohamed S. Abdelfattah, Sourav Bhattacharya, Samin Ishtiaq, Alberto Gil C. P. Ramos, SangJeong Lee, Daehyun Kim, Nicholas D. Lane
