Pratyush Kumar

Empowering Low-Resource Language ASR via Large-Scale Pseudo Labeling

Aug 26, 2024

RoundTable: Leveraging Dynamic Schema and Contextual Autocomplete for Enhanced Query Precision in Tabular Question Answering

Aug 23, 2024

IndicLLMSuite: A Blueprint for Creating Pre-training and Fine-Tuning Datasets for Indian Languages

Mar 11, 2024

IndicVoices: Towards building an Inclusive Multilingual Speech Dataset for Indian Languages

Mar 04, 2024

DSFormer: Effective Compression of Text-Transformers by Dense-Sparse Weight Factorization

Dec 20, 2023

Svarah: Evaluating English ASR Systems on Indian Accents

May 25, 2023

IndicTrans2: Towards High-Quality and Accessible Machine Translation Models for all 22 Scheduled Indian Languages

May 25, 2023

Vistaar: Diverse Benchmarks and Training Sets for Indian Language ASR

May 24, 2023

Large Language Models Humanize Technology

May 09, 2023

An Empirical Study of Leveraging Knowledge Distillation for Compressing Multilingual Neural Machine Translation Models

Apr 19, 2023