Ethan Gotlieb Wilcox

BabyLM Turns 4 and Goes Multilingual: Call for Papers for the 2026 BabyLM Workshop

Feb 24, 2026

A Unified Assessment of the Poverty of the Stimulus Argument for Neural Language Models

Feb 10, 2026

From Linear Input to Hierarchical Structure: Function Words as Statistical Cues for Language Learning

Jan 29, 2026

Using Information Theory to Characterize Prosodic Typology: The Case of Tone, Pitch-Accent and Stress-Accent

May 12, 2025

Looking forward: Linguistic theory and methods

Feb 25, 2025

Findings of the Second BabyLM Challenge: Sample-Efficient Pretraining on Developmentally Plausible Corpora

Dec 06, 2024

Reverse-Engineering the Reader

Oct 16, 2024

On the Role of Context in Reading Time Prediction

Sep 12, 2024

Revisiting the Optimality of Word Lengths

Dec 06, 2023

Testing the Predictions of Surprisal Theory in 11 Languages

Jul 10, 2023