
Joonyoung Kim

Attention-aware Post-training Quantization without Backpropagation

Jun 19, 2024

Towards Next-Level Post-Training Quantization of Hyper-Scale Transformers

Feb 14, 2024

Intuitive Access to Smartphone Settings Using Relevance Model Trained by Contrastive Learning

Jul 15, 2023

Augment & Valuate: A Data Enhancement Pipeline for Data-Centric AI

Dec 07, 2021

Neural Sequence-to-grid Module for Learning Symbolic Rules

Jan 13, 2021