On the Validity of Self-Attention as Explanation in Transformer Models

Aug 12, 2019
Gino Brunner, Yang Liu, Damián Pascual, Oliver Richter, Roger Wattenhofer

Preprint. Work in progress.
