# Codes and Expansions (CodEx) Seminar

## Felix Voigtländer (Catholic University Eichstätt-Ingolstadt): Sampling numbers of the Fourier-type Barron spaces

The so-called Barron space (roughly, the set of functions whose Fourier transform has a finite first moment) is a popular class of functions in the context of machine learning and neural networks. The main reason for this is the classical result by Barron showing that each function in the Barron space can be approximated up to error $O(N^{-1/2})$ by a shallow neural network with $N$ neurons, where the implied constant depends only very mildly on the input dimension $d$.
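One common formalization of this definition and of Barron's approximation result reads as follows (this is a sketch; the exact normalization of the Barron norm and the choice of activation function $\sigma$ vary between references):

```latex
% Barron space: functions with one finite Fourier moment
\mathcal{B} := \Bigl\{ f : \mathbb{R}^d \to \mathbb{R} \;\Big|\;
  \|f\|_{\mathcal{B}} := \int_{\mathbb{R}^d} |\xi| \, |\widehat{f}(\xi)| \, d\xi < \infty \Bigr\}

% Barron's approximation theorem (schematic form): for each f in the
% Barron space there is a shallow network with N neurons such that
\Bigl\| f - \sum_{k=1}^{N} a_k \, \sigma(\langle w_k, x \rangle + b_k) \Bigr\|_{L^2}
  \;\le\; C \, \|f\|_{\mathcal{B}} \, N^{-1/2},
```

where the constant $C$ depends only mildly on the dimension $d$, matching the $O(N^{-1/2})$ rate stated above.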

In this talk we are concerned with the information-theoretic question of how well a function $f$ from the Barron space can be reconstructed from the knowledge of $N$ point samples $f(x_1),\dots,f(x_N)$, where the locations of the sampling points can be chosen arbitrarily (but independently of the target function). We show that when the error is measured in $L^p$, the optimal reconstruction error behaves like $O(N^{-1/d - 1/\max \{ p, 2 \}})$, up to logarithmic factors. The proof combines results and techniques from approximation theory and compressive sensing.
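The quantity studied here is usually called a sampling number. A schematic definition consistent with the setup above (notation is my own; the talk's precise formulation may differ, e.g. in the choice of domain or unit ball) is:

```latex
% N-th sampling number of the unit ball of the Barron space in L^p:
% infimum over sampling points x_1,...,x_N and reconstruction maps R
g_N \bigl( \mathcal{B}, L^p \bigr)
  := \inf_{x_1,\dots,x_N} \; \inf_{R} \;
     \sup_{\|f\|_{\mathcal{B}} \le 1}
     \bigl\| f - R\bigl( f(x_1), \dots, f(x_N) \bigr) \bigr\|_{L^p},
```

and the result stated in the abstract asserts that, up to logarithmic factors, $g_N \asymp N^{-1/d - 1/\max\{p,2\}}$.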