Network Pruning


Network pruning is a popular approach for reducing a heavy, over-parameterized network to a lightweight form by removing its redundancy. In this approach, a complex over-parameterized network is first trained, then pruned according to some criterion, and finally fine-tuned to achieve performance comparable to the original with far fewer parameters.
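Below is a minimal sketch of this train-prune-fine-tune pipeline, assuming a small PyTorch classifier and PyTorch's built-in `torch.nn.utils.prune` utilities with magnitude (L1) pruning at an illustrative 50% sparsity; the `train` helper, the model architecture, and the data `loader` are placeholder assumptions, not drawn from any of the papers listed below.

```python
# Sketch: train a dense network, prune small-magnitude weights, then fine-tune.
# Model, sparsity level, and training loop are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.utils.prune as prune


def train(model, loader, epochs, lr=1e-3):
    # Generic supervised training loop over (input, label) batches.
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.CrossEntropyLoss()
    for _ in range(epochs):
        for x, y in loader:
            opt.zero_grad()
            loss = loss_fn(model(x), y)
            loss.backward()
            opt.step()


# 1) Train a (possibly over-parameterized) dense network.
model = nn.Sequential(nn.Linear(784, 512), nn.ReLU(), nn.Linear(512, 10))
# `loader` is a placeholder for any (input, label) DataLoader:
# train(model, loader, epochs=10)

# 2) Prune: zero out the 50% smallest-magnitude weights in each linear layer.
for module in model:
    if isinstance(module, nn.Linear):
        prune.l1_unstructured(module, name="weight", amount=0.5)

# 3) Fine-tune the surviving weights; the pruning masks stay fixed:
# train(model, loader, epochs=3, lr=1e-4)

# 4) Make pruning permanent by folding the masks into the weight tensors.
for module in model:
    if isinstance(module, nn.Linear):
        prune.remove(module, "weight")
```

The sparsity level and the choice of unstructured L1 pruning are only one point in the design space; structured (filter/channel) pruning and other saliency criteria follow the same three-step pattern.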

Topology-Aware Revival for Efficient Sparse Training

Feb 04, 2026

It's not a Lottery, it's a Race: Understanding How Gradient Descent Adapts the Network's Capacity to the Task

Feb 04, 2026

Toward a Sustainable Federated Learning Ecosystem: A Practical Least Core Mechanism for Payoff Allocation

Feb 03, 2026

E-Globe: Scalable ε-Global Verification of Neural Networks via Tight Upper Bounds and Pattern-Aware Branching

Feb 04, 2026

TopoPrune: Robust Data Pruning via Unified Latent Space Topology

Feb 02, 2026

GMAC: Global Multi-View Constraint for Automatic Multi-Camera Extrinsic Calibration

Feb 01, 2026

Lyapunov Stability-Aware Stackelberg Game for Low-Altitude Economy: A Control-Oriented Pruning-Based DRL Approach

Feb 01, 2026

Reliability-Aware Determinantal Point Processes for Robust Informative Data Selection in Large Language Models

Jan 31, 2026

Optimizing Tensor Train Decomposition in DNNs for RISC-V Architectures Using Design Space Exploration and Compiler Optimizations

Feb 02, 2026

Denoising deterministic networks using iterative Fourier transforms

Jan 31, 2026