Lukas Edman

Too Much Information: Keeping Training Simple for BabyLMs

Nov 03, 2023
Lukas Edman, Lisa Bylinina

LCT-1 at SemEval-2023 Task 10: Pre-training and Multi-task Learning for Sexism Detection and Classification

Jun 08, 2023
Konstantin Chernyshev, Ekaterina Garanina, Duygu Bayram, Qiankun Zheng, Lukas Edman

Are Character-level Translations Worth the Wait? An Extensive Comparison of Character- and Subword-level Models for Machine Translation

Feb 28, 2023
Lukas Edman, Antonio Toral, Gertjan van Noord

Subword-Delimited Downsampling for Better Character-Level Translation

Dec 02, 2022
Lukas Edman, Antonio Toral, Gertjan van Noord

Patching Leaks in the Charformer for Efficient Character-Level Generation

May 27, 2022
Lukas Edman, Antonio Toral, Gertjan van Noord

The Importance of Context in Very Low Resource Language Modeling

May 10, 2022
Lukas Edman, Antonio Toral, Gertjan van Noord

Unsupervised Translation of German–Lower Sorbian: Exploring Training and Novel Transfer Methods on a Low-Resource Language

Sep 24, 2021
Lukas Edman, Ahmet Üstün, Antonio Toral, Gertjan van Noord
