Kuluhan Binici

CRISP: Hybrid Structured Sparsity for Class-aware Model Pruning

Nov 24, 2023
Shivam Aggarwal, Kuluhan Binici, Tulika Mitra

Visual-Policy Learning through Multi-Camera View to Single-Camera View Knowledge Distillation for Robot Manipulation Tasks

Mar 13, 2023
Cihan Acar, Kuluhan Binici, Alp Tekirdağ, Wu Yan

Robust and Resource-Efficient Data-Free Knowledge Distillation by Generative Pseudo Replay

Jan 09, 2022
Kuluhan Binici, Shivam Aggarwal, Nam Trung Pham, Karianto Leman, Tulika Mitra

Preventing Catastrophic Forgetting and Distribution Mismatch in Knowledge Distillation via Synthetic Data

Aug 11, 2021
Kuluhan Binici, Nam Trung Pham, Tulika Mitra, Karianto Leman
