
Christoph Hertrich

The Computational Complexity of Counting Linear Regions in ReLU Neural Networks

May 22, 2025

Better Neural Network Expressivity: Subdividing the Simplex

May 20, 2025

On the Depth of Monotone ReLU Neural Networks and ICNNs

May 09, 2025

Depth-Bounds for Neural Networks via the Braid Arrangement

Feb 13, 2025

Neural Networks and (Virtual) Extended Formulations

Nov 05, 2024

Decomposition Polyhedra of Piecewise Linear Functions

Oct 07, 2024

Mode Connectivity in Auction Design

May 18, 2023

Training Neural Networks is NP-Hard in Fixed Dimension

Mar 29, 2023

Lower Bounds on the Depth of Integral ReLU Neural Networks via Lattice Polytopes

Feb 24, 2023

Training Fully Connected Neural Networks is $\exists\mathbb{R}$-Complete

Apr 04, 2022