GRANITE: A Graph Neural Network Model for Basic Block Throughput Estimation


Oct 11, 2022
Ondrej Sykora, Phitchaya Mangpo Phothilimthana, Charith Mendis, Amir Yazdanbakhsh


* 13 pages; 5 figures; published at IISWC 2022; includes the IEEE copyright notice


Text and Patterns: For Effective Chain of Thought, It Takes Two to Tango


Sep 16, 2022
Aman Madaan, Amir Yazdanbakhsh


* 115 pages, 15 figures, and 84 tables. The authors contributed equally. Work done when Aman Madaan was a student researcher at Google Research, Brain Team.


Training Recipe for N:M Structured Sparsity with Decaying Pruning Mask


Sep 15, 2022
Sheng-Chun Kao, Amir Yazdanbakhsh, Suvinay Subramanian, Shivani Agrawal, Utku Evci, Tushar Krishna


* 11 pages, 2 figures, and 9 tables. Published at the ICML Workshop on Sparsity in Neural Networks: Advancing Understanding and Practice, 2022. First two authors contributed equally.


Sparse Attention Acceleration with Synergistic In-Memory Pruning and On-Chip Recomputation


Sep 01, 2022
Amir Yazdanbakhsh, Ashkan Moradifirouzabadi, Zheng Li, Mingu Kang


* 15 pages; 14 figures; published at MICRO 2022; First three authors contributed equally 


Accelerating Attention through Gradient-Based Learned Runtime Pruning


Apr 15, 2022
Zheng Li, Soroush Ghodrati, Amir Yazdanbakhsh, Hadi Esmaeilzadeh, Mingu Kang


* First three authors contributed equally; published at ISCA 2022 


Data-Driven Offline Optimization for Architecting Hardware Accelerators


Oct 20, 2021
Aviral Kumar, Amir Yazdanbakhsh, Milad Hashemi, Kevin Swersky, Sergey Levine


* First two authors contributed equally 


An Evaluation of Edge TPU Accelerators for Convolutional Neural Networks


Feb 20, 2021
Amir Yazdanbakhsh, Kiran Seshadri, Berkin Akin, James Laudon, Ravi Narayanaswami


* 11 pages, 15 figures, submitted to ISCA 2021 


Rethinking Co-design of Neural Architectures and Hardware Accelerators


Feb 17, 2021
Yanqi Zhou, Xuanyi Dong, Berkin Akin, Mingxing Tan, Daiyi Peng, Tianjian Meng, Amir Yazdanbakhsh, Da Huang, Ravi Narayanaswami, James Laudon


Apollo: Transferable Architecture Exploration


Feb 02, 2021
Amir Yazdanbakhsh, Christof Angermueller, Berkin Akin, Yanqi Zhou, Albin Jones, Milad Hashemi, Kevin Swersky, Satrajit Chatterjee, Ravi Narayanaswami, James Laudon


* 10 pages, 5 figures, accepted to the Workshop on ML for Systems at the 34th Conference on Neural Information Processing Systems (NeurIPS 2020)
