
Laura Pérez-Mayos
How much pretraining data do language models need to learn syntax?

Sep 09, 2021
Laura Pérez-Mayos, Miguel Ballesteros, Leo Wanner


Assessing the Syntactic Capabilities of Transformer-based Multilingual Language Models

May 10, 2021
Laura Pérez-Mayos, Alba Táboas García, Simon Mille, Leo Wanner


On the Evolution of Syntactic Information Encoded by BERT's Contextualized Representations

Feb 10, 2021
Laura Pérez-Mayos, Roberto Carlini, Miguel Ballesteros, Leo Wanner
