Large vocabulary speech recognition for languages of Africa: multilingual modeling and self-supervised learning


Aug 05, 2022
Sandy Ritchie, You-Chi Cheng, Mingqing Chen, Rajiv Mathews, Daan van Esch, Bo Li, Khe Chai Sim



UserLibri: A Dataset for ASR Personalization Using Only Text


Jul 02, 2022
Theresa Breiner, Swaroop Ramaswamy, Ehsan Variani, Shefali Garg, Rajiv Mathews, Khe Chai Sim, Kilol Gupta, Mingqing Chen, Lara McConnaughey

* Accepted for publication in Interspeech 2022. 9 total pages with appendix, 9 total tables, 5 total figures 


Mixed Federated Learning: Joint Decentralized and Centralized Learning


May 26, 2022
Sean Augenstein, Andrew Hard, Lin Ning, Karan Singhal, Satyen Kale, Kurt Partridge, Rajiv Mathews

* 36 pages, 12 figures 


Online Model Compression for Federated Learning with Large Models


May 06, 2022
Tien-Ju Yang, Yonghui Xiao, Giovanni Motta, Françoise Beaufays, Rajiv Mathews, Mingqing Chen

* Submitted to INTERSPEECH 2022 


Detecting Unintended Memorization in Language-Model-Fused ASR


Apr 20, 2022
W. Ronny Huang, Steve Chien, Om Thakkar, Rajiv Mathews



Extracting Targeted Training Data from ASR Models, and How to Mitigate It


Apr 18, 2022
Ehsan Amid, Om Thakkar, Arun Narayanan, Rajiv Mathews, Françoise Beaufays



Production federated keyword spotting via distillation, filtering, and joint federated-centralized training


Apr 11, 2022
Andrew Hard, Kurt Partridge, Neng Chen, Sean Augenstein, Aishanee Shah, Hyun Jin Park, Alex Park, Sara Ng, Jessica Nguyen, Ignacio Lopez Moreno, Rajiv Mathews, Françoise Beaufays

* Submitted to Interspeech 2022 


Capitalization Normalization for Language Modeling with an Accurate and Efficient Hierarchical RNN Model


Feb 16, 2022
Hao Zhang, You-Chi Cheng, Shankar Kumar, W. Ronny Huang, Mingqing Chen, Rajiv Mathews

* arXiv admin note: substantial text overlap with arXiv:2108.11943 


Public Data-Assisted Mirror Descent for Private Model Training


Dec 01, 2021
Ehsan Amid, Arun Ganesh, Rajiv Mathews, Swaroop Ramaswamy, Shuang Song, Thomas Steinke, Vinith M. Suriyakumar, Om Thakkar, Abhradeep Thakurta

* 20 pages, 9 figures, 3 tables 


Jointly Learning from Decentralized (Federated) and Centralized Data to Mitigate Distribution Shift


Nov 23, 2021
Sean Augenstein, Andrew Hard, Kurt Partridge, Rajiv Mathews

* 9 pages, 1 figure. Camera-ready NeurIPS 2021 DistShift workshop version 
