Charbel Sakr

VaPr: Variable-Precision Tensors to Accelerate Robot Motion Planning

Oct 11, 2023
Yu-Shun Hsiao, Siva Kumar Sastry Hari, Balakumar Sundaralingam, Jason Yik, Thierry Tambe, Charbel Sakr, Stephen W. Keckler, Vijay Janapa Reddi

Optimal Clipping and Magnitude-aware Differentiation for Improved Quantization-aware Training

Jun 13, 2022
Charbel Sakr, Steve Dai, Rangharajan Venkatesan, Brian Zimmer, William J. Dally, Brucek Khailany

Fundamental Limits on Energy-Delay-Accuracy of In-memory Architectures in Inference Applications

Dec 25, 2020
Sujan Kumar Gonugondla, Charbel Sakr, Hassan Dbouk, Naresh R. Shanbhag

HarDNN: Feature Map Vulnerability Evaluation in CNNs

Feb 25, 2020
Abdulrahman Mahmoud, Siva Kumar Sastry Hari, Christopher W. Fletcher, Sarita V. Adve, Charbel Sakr, Naresh Shanbhag, Pavlo Molchanov, Michael B. Sullivan, Timothy Tsai, Stephen W. Keckler

Accumulation Bit-Width Scaling For Ultra-Low Precision Training Of Deep Networks

Jan 19, 2019
Charbel Sakr, Naigang Wang, Chia-Yu Chen, Jungwook Choi, Ankur Agrawal, Naresh Shanbhag, Kailash Gopalakrishnan

Per-Tensor Fixed-Point Quantization of the Back-Propagation Algorithm

Dec 31, 2018
Charbel Sakr, Naresh Shanbhag

Understanding the Energy and Precision Requirements for Online Learning

Aug 26, 2016
Charbel Sakr, Ameya Patil, Sai Zhang, Yongjune Kim, Naresh Shanbhag
