Alexandra Chronopoulou

The LMU Munich System for the WMT 2020 Unsupervised Machine Translation Shared Task

Oct 25, 2020
Alexandra Chronopoulou, Dario Stojanovski, Viktor Hangya, Alexander Fraser

Reusing a Pretrained Language Model on Languages with Limited Corpora for Unsupervised NMT

Oct 06, 2020
Alexandra Chronopoulou, Dario Stojanovski, Alexander Fraser

Domain Adversarial Fine-Tuning as an Effective Regularizer

Oct 05, 2020
Giorgos Vernikos, Katerina Margatina, Alexandra Chronopoulou, Ion Androutsopoulos

An Embarrassingly Simple Approach for Transfer Learning from Pretrained Language Models

Apr 10, 2019
Alexandra Chronopoulou, Christos Baziotis, Alexandros Potamianos

NTUA-SLP at IEST 2018: Ensemble of Neural Transfer Methods for Implicit Emotion Classification

Sep 03, 2018
Alexandra Chronopoulou, Aikaterini Margatina, Christos Baziotis, Alexandros Potamianos

NTUA-SLP at SemEval-2018 Task 1: Predicting Affective Content in Tweets with Deep Attentive RNNs and Transfer Learning

Apr 18, 2018
Christos Baziotis, Nikos Athanasiou, Alexandra Chronopoulou, Athanasia Kolovou, Georgios Paraskevopoulos, Nikolaos Ellinas, Shrikanth Narayanan, Alexandros Potamianos