Iulia Turc

Pix2Struct: Screenshot Parsing as Pretraining for Visual Language Understanding
Oct 07, 2022
Kenton Lee, Mandar Joshi, Iulia Turc, Hexiang Hu, Fangyu Liu, Julian Eisenschlos, Urvashi Khandelwal, Peter Shaw, Ming-Wei Chang, Kristina Toutanova

Measuring Attribution in Natural Language Generation Models
Dec 23, 2021
Hannah Rashkin, Vitaly Nikolaev, Matthew Lamm, Michael Collins, Dipanjan Das, Slav Petrov, Gaurav Singh Tomar, Iulia Turc, David Reitter

Revisiting the Primacy of English in Zero-shot Cross-lingual Transfer
Jun 30, 2021
Iulia Turc, Kenton Lee, Jacob Eisenstein, Ming-Wei Chang, Kristina Toutanova

The MultiBERTs: BERT Reproductions for Robustness Analysis
Jun 30, 2021
Thibault Sellam, Steve Yadlowsky, Jason Wei, Naomi Saphra, Alexander D'Amour, Tal Linzen, Jasmijn Bastings, Iulia Turc, Jacob Eisenstein, Dipanjan Das, Ian Tenney, Ellie Pavlick

CANINE: Pre-training an Efficient Tokenization-Free Encoder for Language Representation
Mar 31, 2021
Jonathan H. Clark, Dan Garrette, Iulia Turc, John Wieting

Well-Read Students Learn Better: On the Importance of Pre-training Compact Models
Sep 25, 2019
Iulia Turc, Ming-Wei Chang, Kenton Lee, Kristina Toutanova

Well-Read Students Learn Better: The Impact of Student Initialization on Knowledge Distillation
Aug 23, 2019
Iulia Turc, Ming-Wei Chang, Kenton Lee, Kristina Toutanova
