Do Transformers Need Deep Long-Range Memory?

Jul 07, 2020
Jack W. Rae, Ali Razavi

* Published at the 58th Annual Meeting of the Association for Computational Linguistics (ACL 2020). 6 pages, 4 figures, 1 table

Compressive Transformers for Long-Range Sequence Modelling

Nov 13, 2019
Jack W. Rae, Anna Potapenko, Siddhant M. Jayakumar, Timothy P. Lillicrap

* 19 pages, 6 figures, 10 tables 

Stabilizing Transformers for Reinforcement Learning

Oct 13, 2019
Emilio Parisotto, H. Francis Song, Jack W. Rae, Razvan Pascanu, Caglar Gulcehre, Siddhant M. Jayakumar, Max Jaderberg, Raphael Lopez Kaufman, Aidan Clark, Seb Noury, Matthew M. Botvinick, Nicolas Heess, Raia Hadsell

V-MPO: On-Policy Maximum a Posteriori Policy Optimization for Discrete and Continuous Control

Sep 26, 2019
H. Francis Song, Abbas Abdolmaleki, Jost Tobias Springenberg, Aidan Clark, Hubert Soyer, Jack W. Rae, Seb Noury, Arun Ahuja, Siqi Liu, Dhruva Tirumala, Nicolas Heess, Dan Belov, Martin Riedmiller, Matthew M. Botvinick

* Equal contribution

Memory-based Parameter Adaptation

Feb 28, 2018
Pablo Sprechmann, Siddhant M. Jayakumar, Jack W. Rae, Alexander Pritzel, Adrià Puigdomènech Badia, Benigno Uria, Oriol Vinyals, Demis Hassabis, Razvan Pascanu, Charles Blundell

* Published as a conference paper at ICLR 2018 