Jongpil Kim

Co-training and Co-distillation for Quality Improvement and Compression of Language Models

Nov 07, 2023
Hayeon Lee, Rui Hou, Jongpil Kim, Davis Liang, Hongbo Zhang, Sung Ju Hwang, Alexander Min

A Study on Knowledge Distillation from Weak Teacher for Scaling Up Pre-trained Language Models

May 26, 2023
Hayeon Lee, Rui Hou, Jongpil Kim, Davis Liang, Sung Ju Hwang, Alexander Min

Discovering Characteristic Landmarks on Ancient Coins using Convolutional Networks

Jul 01, 2015
Jongpil Kim, Vladimir Pavlovic
