
Seonghak Kim

Maximizing Discrimination Capability of Knowledge Distillation with Energy-based Score

Nov 24, 2023
Seonghak Kim, Gyeongdo Ham, Suin Lee, Donggon Jang, Daeshik Kim


Cosine Similarity Knowledge Distillation for Individual Class Information Transfer

Nov 24, 2023
Gyeongdo Ham, Seonghak Kim, Suin Lee, Jae-Hyeok Lee, Daeshik Kim


Robustness-Reinforced Knowledge Distillation with Correlation Distance and Network Pruning

Nov 23, 2023
Seonghak Kim, Gyeongdo Ham, Yucheol Cho, Daeshik Kim


A$^3$: Accelerating Attention Mechanisms in Neural Networks with Approximation

Feb 22, 2020
Tae Jun Ham, Sung Jun Jung, Seonghak Kim, Young H. Oh, Yeonhong Park, Yoonho Song, Jung-Hun Park, Sanghee Lee, Kyoung Park, Jae W. Lee, Deog-Kyoon Jeong
