Andrew Anderson

Domino Saliency Metrics: Improving Existing Channel Saliency Metrics with Structural Information
May 04, 2022
Kaveena Persand, Andrew Anderson, David Gregg

Winograd Convolution for Deep Neural Networks: Efficient Point Selection
Jan 25, 2022
Syed Asad Alam, Andrew Anderson, Barbara Barabasz, David Gregg

TASO: Time and Space Optimization for Memory-Constrained DNN Inference
May 21, 2020
Yuan Wen, Andrew Anderson, Valentin Radu, Michael F. P. O'Boyle, David Gregg

Composition of Saliency Metrics for Channel Pruning with a Myopic Oracle
Apr 03, 2020
Kaveena Persand, Andrew Anderson, David Gregg

Performance-Oriented Neural Architecture Search
Jan 09, 2020
Andrew Anderson, Jing Su, Rozenn Dahyot, David Gregg

A Taxonomy of Channel Pruning Signals in CNNs
Jun 11, 2019
Kaveena Persand, Andrew Anderson, David Gregg

Explaining Reinforcement Learning to Mere Mortals: An Empirical Study
Mar 22, 2019
Andrew Anderson, Jonathan Dodge, Amrita Sadarangani, Zoe Juozapaitis, Evan Newman, Jed Irvine, Souti Chattopadhyay, Alan Fern, Margaret Burnett

Optimal DNN Primitive Selection with Partitioned Boolean Quadratic Programming
Nov 02, 2018
Andrew Anderson, David Gregg

Scalar Arithmetic Multiple Data: Customizable Precision for Deep Neural Networks
Sep 27, 2018
Andrew Anderson, David Gregg

Error Analysis and Improving the Accuracy of Winograd Convolution for Deep Neural Networks
Sep 22, 2018
Barbara Barabasz, Andrew Anderson, David Gregg
