Surangika Ranathunga

Automating Research Synthesis with Domain-Specific Large Language Model Fine-Tuning

Apr 08, 2024
Teo Susnjak, Peter Hwang, Napoleon H. Reyes, Andre L. C. Barczak, Timothy R. McIntosh, Surangika Ranathunga

Unlocking Parameter-Efficient Fine-Tuning for Low-Resource Language Translation

Apr 05, 2024
Tong Su, Xin Peng, Sarubi Thillainathan, David Guzmán, Surangika Ranathunga, En-Shiun Annie Lee

Harnessing the power of LLMs for normative reasoning in MASs

Mar 25, 2024
Bastin Tony Roy Savarimuthu, Surangika Ranathunga, Stephen Cranefield

Quality Does Matter: A Detailed Look at the Quality and Utility of Web-Mined Parallel Corpora

Feb 13, 2024
Surangika Ranathunga, Nisansa de Silva, Menan Velayuthan, Aloka Fernando, Charitha Rathnayake

Leveraging Auxiliary Domain Parallel Data in Intermediate Task Fine-tuning for Low-resource Translation

Jun 02, 2023
Shravan Nayak, Surangika Ranathunga, Sarubi Thillainathan, Rikki Hung, Anthony Rinaldi, Yining Wang, Jonah Mackey, Andrew Ho, En-Shiun Annie Lee


Some Languages are More Equal than Others: Probing Deeper into the Linguistic Disparity in the NLP World

Oct 20, 2022
Surangika Ranathunga, Nisansa de Silva


BERTifying Sinhala -- A Comprehensive Analysis of Pre-trained Language Models for Sinhala Text Classification

Aug 17, 2022
Vinura Dhananjaya, Piyumal Demotte, Surangika Ranathunga, Sanath Jayasena


Data Augmentation to Address Out-of-Vocabulary Problem in Low-Resource Sinhala-English Neural Machine Translation

May 18, 2022
Aloka Fernando, Surangika Ranathunga


Pre-Trained Multilingual Sequence-to-Sequence Models: A Hope for Low-Resource Language Translation?

Apr 09, 2022
En-Shiun Annie Lee, Sarubi Thillainathan, Shravan Nayak, Surangika Ranathunga, David Ifeoluwa Adelani, Ruisi Su, Arya D. McCarthy
