In this paper, we show that the Kolmogorov two-hidden-layer neural network model with a continuous, a discontinuous bounded, or an unbounded activation function in the second hidden layer can precisely represent continuous, discontinuous bounded, and all unbounded multivariate functions, respectively.
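For orientation, the model rests on the classical Kolmogorov superposition theorem, stated here as a minimal sketch (the paper's precise network construction may differ in details such as the choice of inner functions): every continuous $f : [0,1]^n \to \mathbb{R}$ can be written as
\[
f(x_1,\dots,x_n) \;=\; \sum_{q=0}^{2n} \Phi_q\!\left(\sum_{p=1}^{n} \phi_{pq}(x_p)\right),
\]
where the inner functions $\phi_{pq}$ are fixed continuous univariate functions independent of $f$, while the outer functions $\Phi_q$ are univariate functions depending on $f$. Read as a network, the $\phi_{pq}$ form the first hidden layer and the $\Phi_q$ act as the activation of the second hidden layer, which is why the regularity of the second-layer activation (continuous, discontinuous bounded, or unbounded) governs the class of representable functions.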
In this paper, we study approximation properties of single hidden layer neural networks whose weights vary over finitely many directions and whose thresholds range over an open interval. We obtain a measure-theoretic condition that is both necessary and sufficient for the density of such networks in the space of continuous functions. Further, we prove a density result for neural networks with a specifically constructed activation function and a fixed number of neurons.
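To fix ideas, a plausible formalization of the network class in question (the symbols $A$, $(c,d)$, and $\mathcal{M}$ are introduced here only for illustration and are not the paper's notation): given an activation $\sigma$, a finite set of directions $A = \{a^1,\dots,a^k\} \subset \mathbb{R}^n$, and an open interval $(c,d)$ of thresholds, consider
\[
\mathcal{M}\bigl(\sigma; A, (c,d)\bigr) \;=\; \operatorname{span}\bigl\{\sigma(a \cdot x - \theta) : a \in A,\ \theta \in (c,d)\bigr\}.
\]
The density question then asks when $\mathcal{M}(\sigma; A, (c,d))$ is dense, in the uniform norm, in $C(K)$ for a compact set $K \subset \mathbb{R}^n$; the measure-theoretic condition characterizes exactly when this holds.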
This paper provides an explicit formula for the error of approximation by single hidden layer neural networks with two fixed weights.
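As an illustration of the setting (the symbols $w^1$, $w^2$, and $g$ below are assumptions of this sketch, not notation from the paper): with two weight vectors $w^1, w^2$ fixed in advance, the networks in question take the form
\[
g(x) \;=\; \sum_{i=1}^{m} c_i\,\sigma\!\left(w^{j_i} \cdot x - \theta_i\right), \qquad j_i \in \{1,2\},\quad c_i, \theta_i \in \mathbb{R},
\]
so that only the coefficients $c_i$ and thresholds $\theta_i$ are free parameters. An explicit error formula then quantifies $\inf_g \|f - g\|$, the distance from a given continuous $f$ to this class.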
In 1987, Hecht-Nielsen showed that any continuous multivariate function can be implemented by a certain type of three-layer neural network. This result was widely discussed in the neural network literature. In this paper, we prove that not only continuous functions but also all discontinuous functions can be implemented by such neural networks.