Under the auspices of the Computational Complexity Foundation (CCF)

### Paper:

TR00-002 | 23rd December 1999 00:00

#### Lower Bounds on the Complexity of Approximating Continuous Functions by Sigmoidal Neural Networks

Authors: Michael Schmitt
Publication: 14th January 2000 15:56
Abstract:

We calculate lower bounds on the size of sigmoidal neural networks
that approximate continuous functions. In particular, we show that
for the approximation of polynomials the network size has to grow
as $\Omega((\log k)^{1/4})$, where $k$ is the degree of the polynomials.
This bound is valid for any input dimension, i.e., independently of
the number of variables. The result is obtained by a new method
that employs upper bounds on the Vapnik-Chervonenkis dimension
to prove lower bounds on the size of networks approximating
continuous functions.
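To give a feel for how slowly the stated bound grows, the following sketch evaluates $(\log k)^{1/4}$ for a few degrees $k$. This assumes the asymptotic form with the hidden constant set to 1 and is purely illustrative; the function name `size_lower_bound` is introduced here, not taken from the paper.

```python
import math

def size_lower_bound(k: float) -> float:
    """Illustrative value of (log k)^(1/4), the asymptotic lower
    bound on network size for approximating degree-k polynomials
    (constants omitted; natural logarithm used)."""
    return math.log(k) ** 0.25

# The bound grows, but extremely slowly in the degree k.
for k in (10, 10**3, 10**6, 10**12):
    print(f"degree k = {k:>13}: bound ~ {size_lower_bound(k):.3f}")
```

Even for degree $k = 10^{12}$ the bound is barely above 2, reflecting that the lower bound, while valid for every input dimension, is quantitatively weak.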

ISSN 1433-8092