Lukas Lange

GradSim: Gradient-Based Language Grouping for Effective Multilingual Training

Oct 23, 2023

TADA: Efficient Task-Agnostic Domain Adaptation for Transformers

May 22, 2023

NLNDE at SemEval-2023 Task 12: Adaptive Pretraining and Source Language Selection for Low-Resource Multilingual Sentiment Analysis

Apr 28, 2023

SwitchPrompt: Learning Domain-Specific Gated Soft Prompts for Classification in Low-Resource Domains

Feb 14, 2023

Multilingual Normalization of Temporal Expressions with Masked Language Models

May 20, 2022

CLIN-X: pre-trained language models and a study on cross-task transfer for concept extraction in the clinical domain

Dec 17, 2021

Boosting Transformers for Job Expression Extraction and Classification in a Low-Resource Setting

Sep 17, 2021

To Share or not to Share: Predicting Sets of Sources for Model Transfer Learning

Apr 16, 2021

ANEA: Distant Supervision for Low-Resource Named Entity Recognition

Feb 25, 2021

NLNDE at CANTEMIST: Neural Sequence Labeling and Parsing Approaches for Clinical Concept Extraction

Oct 23, 2020