Francesco Cagnetta

Deep networks learn to parse uniform-depth context-free languages from local statistics
Feb 09, 2026

Deriving Neural Scaling Laws from the statistics of natural language
Feb 07, 2026

Learning curves theory for hierarchically compositional data with power-law distributed features
May 11, 2025

Scaling Laws and Representation Learning in Simple Hierarchical Languages: Transformers vs. Convolutional Architectures
May 11, 2025

How compositional generalization and creativity improve as diffusion models are trained
Feb 17, 2025

Towards a theory of how the structure of language is acquired by deep neural networks
May 28, 2024

How Deep Neural Networks Learn Compositional Data: The Random Hierarchy Model
Jul 31, 2023

Kernels, Data & Physics
Jul 05, 2023

How deep convolutional neural networks lose spatial information with training
Oct 04, 2022

How Wide Convolutional Neural Networks Learn Hierarchical Tasks
Aug 01, 2022