Seyedarmin Azizi

Memory-Efficient Vision Transformers: An Activation-Aware Mixed-Rank Compression Strategy

Feb 08, 2024
Seyedarmin Azizi, Mahdi Nazemi, Massoud Pedram

Low-Precision Mixed-Computation Models for Inference on Edge

Dec 03, 2023
Seyedarmin Azizi, Mahdi Nazemi, Mehdi Kamal, Massoud Pedram

Sensitivity-Aware Mixed-Precision Quantization and Width Optimization of Deep Neural Networks Through Cluster-Based Tree-Structured Parzen Estimation

Aug 16, 2023
Seyedarmin Azizi, Mahdi Nazemi, Arash Fayyazi, Massoud Pedram

SNT: Sharpness-Minimizing Network Transformation for Fast Compression-friendly Pretraining

May 08, 2023
Jung Hwan Heo, Seyedarmin Azizi, Arash Fayyazi, Mahdi Nazemi, Massoud Pedram
