Felix Voigtlaender

On the Lipschitz constant of random neural networks

Nov 02, 2023
Paul Geuchen, Thomas Heindl, Dominik Stöger, Felix Voigtlaender

Optimal approximation of $C^k$-functions using shallow complex-valued neural networks

Mar 29, 2023
Paul Geuchen, Felix Voigtlaender

$L^p$ sampling numbers for the Fourier-analytic Barron space

Aug 16, 2022
Felix Voigtlaender

Training ReLU networks to high uniform accuracy is intractable

May 26, 2022
Julius Berner, Philipp Grohs, Felix Voigtlaender

Optimal learning of high-dimensional classification problems using deep neural networks

Dec 24, 2021
Philipp Petersen, Felix Voigtlaender

Sobolev-type embeddings for neural network approximation spaces

Oct 28, 2021
Philipp Grohs, Felix Voigtlaender

Proof of the Theory-to-Practice Gap in Deep Learning via Sampling Complexity bounds for Neural Network Approximation Spaces

Apr 06, 2021
Philipp Grohs, Felix Voigtlaender

The universal approximation theorem for complex-valued neural networks

Dec 06, 2020
Felix Voigtlaender

Neural network approximation and estimation of classifiers with classification boundary in a Barron class

Nov 18, 2020
Andrei Caragea, Philipp Petersen, Felix Voigtlaender

Phase Transitions in Rate Distortion Theory and Deep Learning

Aug 03, 2020
Philipp Grohs, Andreas Klotz, Felix Voigtlaender
