Dario Stojanovski

Language-Family Adapters for Multilingual Neural Machine Translation
Sep 30, 2022
Alexandra Chronopoulou, Dario Stojanovski, Alexander Fraser

Improving the Lexical Ability of Pretrained Language Models for Unsupervised Neural Machine Translation
Apr 14, 2021
Alexandra Chronopoulou, Dario Stojanovski, Alexander Fraser

The LMU Munich System for the WMT 2020 Unsupervised Machine Translation Shared Task
Oct 25, 2020
Alexandra Chronopoulou, Dario Stojanovski, Viktor Hangya, Alexander Fraser

Reusing a Pretrained Language Model on Languages with Limited Corpora for Unsupervised NMT
Oct 06, 2020
Alexandra Chronopoulou, Dario Stojanovski, Alexander Fraser

Addressing Zero-Resource Domains Using Document-Level Context in Neural Machine Translation
Apr 30, 2020
Dario Stojanovski, Alexander Fraser
