Marco Mondelli

Better Rates for Private Linear Regression in the Proportional Regime via Aggressive Clipping
May 22, 2025

Attention with Trained Embeddings Provably Selects Important Tokens
May 22, 2025

Neural Collapse is Globally Optimal in Deep Regularized ResNets and Transformers
May 21, 2025

Test-Time Training Provably Improves Transformers as In-context Learners
Mar 14, 2025

Spurious Correlations in High Dimensional Regression: The Roles of Regularization, Simplicity Bias and Over-Parameterization
Feb 03, 2025

Spectral Estimators for Multi-Index Models: Precise Asymptotics and Optimal Weak Recovery
Feb 03, 2025

Neural Collapse Beyond the Unconstrained Features Model: Landscape, Dynamics, and Generalization in the Mean-Field Regime
Jan 31, 2025

High-dimensional Analysis of Knowledge Distillation: Weak-to-Strong Generalization and Scaling Laws
Oct 24, 2024

Privacy for Free in the Over-Parameterized Regime
Oct 18, 2024

Wide Neural Networks Trained with Weight Decay Provably Exhibit Neural Collapse
Oct 07, 2024