
Stanislas Dehaene

Cracking the neural code for word recognition in convolutional neural networks

Mar 10, 2024

Aligning individual brains with Fused Unbalanced Gromov-Wasserstein

Jun 19, 2022

Beyond the Imitation Game: Quantifying and extrapolating the capabilities of language models

Jun 10, 2022

Causal Transformers Perform Below Chance on Recursive Nested Constructions, Unlike Humans

Oct 14, 2021

Can RNNs learn Recursive Nested Subject-Verb Agreements?

Jan 06, 2021

Exploring Processing of Nested Dependencies in Neural-Network Language Models and Humans

Jun 19, 2020

The emergence of number and syntax units in LSTM language models

Apr 02, 2019