
Seonghak Kim

FiGKD: Fine-Grained Knowledge Distillation via High-Frequency Detail Transfer

May 17, 2025

Cosine Similarity Knowledge Distillation for Individual Class Information Transfer

Nov 24, 2023

Maximizing Discrimination Capability of Knowledge Distillation with Energy-based Score

Nov 24, 2023

Robustness-Reinforced Knowledge Distillation with Correlation Distance and Network Pruning

Nov 23, 2023

A$^3$: Accelerating Attention Mechanisms in Neural Networks with Approximation

Feb 22, 2020