Jörg Tiedemann

The Impact of Vocabulary Overlaps on Knowledge Transfer in Multilingual Machine Translation

May 05, 2026

On the limited utility of parallel data for learning shared multilingual representations

Mar 30, 2026

Life Cycle-Aware Evaluation of Knowledge Distillation for Machine Translation: Environmental Impact and Translation Quality Trade-offs

Feb 10, 2026

Scaling Low-Resource MT via Synthetic Data Generation with LLMs

May 20, 2025

SemEval-2025 Task 3: Mu-SHROOM, the Multilingual Shared Task on Hallucinations and Related Observable Overgeneration Mistakes

Apr 16, 2025

Rethinking Multilingual Continual Pretraining: Data Mixing for Adapting LLMs Across Languages and Resources

Apr 05, 2025

GlotEval: A Test Suite for Massively Multilingual Evaluation of Large Language Models

Apr 05, 2025

EMMA-500: Enhancing Massively Multilingual Adaptation of Large Language Models

Sep 26, 2024

Two Stacks Are Better Than One: A Comparison of Language Modeling and Translation as Multilingual Pretraining Objectives

Jul 22, 2024

Can Machine Translation Bridge Multilingual Pretraining and Cross-lingual Transfer Learning?

Mar 25, 2024