Reactive Exploration to Cope with Non-Stationarity in Lifelong Reinforcement Learning


Jul 12, 2022
Christian Steinparz, Thomas Schmied, Fabian Paischer, Marius-Constantin Dinu, Vihang Patil, Angela Bitto-Nemling, Hamid Eghbal-zadeh, Sepp Hochreiter

* CoLLAs 2022 

Few-Shot Learning by Dimensionality Reduction in Gradient Space


Jun 07, 2022
Martin Gauch, Maximilian Beck, Thomas Adler, Dmytro Kotsur, Stefan Fiel, Hamid Eghbal-zadeh, Johannes Brandstetter, Johannes Kofler, Markus Holzleitner, Werner Zellinger, Daniel Klotz, Sepp Hochreiter, Sebastian Lehner

* Accepted at Conference on Lifelong Learning Agents (CoLLAs) 2022. Code: https://github.com/ml-jku/subgd Blog post: https://ml-jku.github.io/subgd 

History Compression via Language Models in Reinforcement Learning


May 24, 2022
Fabian Paischer, Thomas Adler, Vihang Patil, Angela Bitto-Nemling, Markus Holzleitner, Sebastian Lehner, Hamid Eghbal-zadeh, Sepp Hochreiter


Efficient Training of Audio Transformers with Patchout


Oct 29, 2021
Khaled Koutini, Jan Schlüter, Hamid Eghbal-zadeh, Gerhard Widmer

* Source code: https://github.com/kkoutini/PaSST 

Over-Parameterization and Generalization in Audio Classification


Jul 19, 2021
Khaled Koutini, Hamid Eghbal-zadeh, Florian Henkel, Jan Schlüter, Gerhard Widmer

* Presented at the ICML 2021 Workshop on Overparameterization: Pitfalls & Opportunities 

Receptive Field Regularization Techniques for Audio Classification and Tagging with Deep Convolutional Neural Networks


May 26, 2021
Khaled Koutini, Hamid Eghbal-zadeh, Gerhard Widmer

* Accepted in IEEE/ACM Transactions on Audio, Speech, and Language Processing. Code available: https://github.com/kkoutini/cpjku_dcase20 

Low-Complexity Models for Acoustic Scene Classification Based on Receptive Field Regularization and Frequency Damping


Nov 05, 2020
Khaled Koutini, Florian Henkel, Hamid Eghbal-zadeh, Gerhard Widmer

* Proceedings of the Detection and Classification of Acoustic Scenes and Events 2020 Workshop (DCASE2020) 

On Data Augmentation and Adversarial Risk: An Empirical Analysis


Jul 06, 2020
Hamid Eghbal-zadeh, Khaled Koutini, Paul Primus, Verena Haunschmid, Michal Lewandowski, Werner Zellinger, Bernhard A. Moser, Gerhard Widmer

* 21 pages, 15 figures, 3 tables 

Emotion and Theme Recognition in Music with Frequency-Aware RF-Regularized CNNs


Oct 28, 2019
Khaled Koutini, Shreyan Chowdhury, Verena Haunschmid, Hamid Eghbal-zadeh, Gerhard Widmer

* MediaEval'19, 27-29 October 2019, Sophia Antipolis, France 
