
Andrea Bragagnolo

Cardiac Output Prediction from Echocardiograms: Self-Supervised Learning with Limited Data

Feb 14, 2026

When Does Pruning Benefit Vision Representations?

Jul 02, 2025

To update or not to update? Neurons at equilibrium in deep models

Jul 19, 2022

SeReNe: Sensitivity based Regularization of Neurons for Structured Sparsity in Neural Networks

Feb 07, 2021

LOss-Based SensiTivity rEgulaRization: towards deep sparse neural networks

Nov 16, 2020

Pruning artificial neural networks: a way to find well-generalizing, high-entropy sharp minima

Apr 30, 2020