Sequence-Level Knowledge Distillation for Model Compression of Attention-based Sequence-to-Sequence Speech Recognition
Nov 12, 2018
Raden Mu'az Mun'im, Nakamasa Inoue, Koichi Shinoda