Seungyeon Kim

Supervision Complexity and its Role in Knowledge Distillation

Jan 28, 2023
Hrayr Harutyunyan, Ankit Singh Rawat, Aditya Krishna Menon, Seungyeon Kim, Sanjiv Kumar

EmbedDistill: A Geometric Knowledge Distillation for Information Retrieval

Jan 27, 2023
Seungyeon Kim, Ankit Singh Rawat, Manzil Zaheer, Sadeep Jayasumana, Veeranjaneyulu Sadhanala, Wittawat Jitkrittum, Aditya Krishna Menon, Rob Fergus, Sanjiv Kumar

Teacher Guided Training: An Efficient Framework for Knowledge Transfer

Aug 14, 2022
Manzil Zaheer, Ankit Singh Rawat, Seungyeon Kim, Chong You, Himanshu Jain, Andreas Veit, Rob Fergus, Sanjiv Kumar

Balancing Robustness and Sensitivity using Feature Contrastive Learning

May 19, 2021
Seungyeon Kim, Daniel Glasner, Srikumar Ramalingam, Cho-Jui Hsieh, Kishore Papineni, Sanjiv Kumar

On the Reproducibility of Neural Network Predictions

Feb 05, 2021
Srinadh Bhojanapalli, Kimberly Wilber, Andreas Veit, Ankit Singh Rawat, Seungyeon Kim, Aditya Menon, Sanjiv Kumar

Semantic Label Smoothing for Sequence to Sequence Problems

Oct 15, 2020
Michal Lukasik, Himanshu Jain, Aditya Krishna Menon, Seungyeon Kim, Srinadh Bhojanapalli, Felix Yu, Sanjiv Kumar

Evaluations and Methods for Explanation through Robustness Analysis

May 31, 2020
Cheng-Yu Hsieh, Chih-Kuan Yeh, Xuanqing Liu, Pradeep Ravikumar, Seungyeon Kim, Sanjiv Kumar, Cho-Jui Hsieh

Why distillation helps: a statistical perspective

May 21, 2020
Aditya Krishna Menon, Ankit Singh Rawat, Sashank J. Reddi, Seungyeon Kim, Sanjiv Kumar

Why ADAM Beats SGD for Attention Models

Dec 06, 2019
Jingzhao Zhang, Sai Praneeth Karimireddy, Andreas Veit, Seungyeon Kim, Sashank J Reddi, Sanjiv Kumar, Suvrit Sra

Local Space-Time Smoothing for Version Controlled Documents

Aug 08, 2013
Seungyeon Kim, Guy Lebanon
