
Abteen Ebrahimi

Since the Scientific Literature Is Multilingual, Our Models Should Be Too

Mar 27, 2024

Meeting the Needs of Low-Resource Languages: The Value of Automatic Alignments via Pretrained Models

Feb 15, 2023

How to Adapt Your Pretrained Multilingual Model to 1600 Languages

Jun 03, 2021

AmericasNLI: Evaluating Zero-shot Natural Language Understanding of Pretrained Multilingual Models in Truly Low-resource Languages

Apr 18, 2021

Athena: Constructing Dialogues Dynamically with Discourse Constraints

Nov 21, 2020

Curate and Generate: A Corpus and Method for Joint Control of Semantics and Style in Neural NLG

Jun 14, 2019