ECCC
Electronic Colloquium on Computational Complexity

Under the auspices of the Computational Complexity Foundation (CCF)





Paper:

TR99-005 | 21st December 1998

On the Sample Complexity for Nonoverlapping Neural Networks


Abstract:

A neural network is said to be nonoverlapping if each node has at most
one outgoing edge (so the underlying graph is a tree rooted at the
output node). We investigate the number of examples that a learning
algorithm needs when using nonoverlapping neural networks as
hypotheses. We derive bounds for this sample complexity in terms of
the Vapnik-Chervonenkis dimension. In particular, we consider networks
consisting of threshold, sigmoidal, and linear gates. We show that the
class of nonoverlapping threshold networks and the class of
nonoverlapping sigmoidal networks on $n$ inputs both have
Vapnik-Chervonenkis dimension $\Omega(n \log n)$. This bound is
asymptotically tight for the class of nonoverlapping threshold
networks. We also present an upper bound for this class in which the
constants are considerably smaller than in a previous calculation.
Finally, we argue that the Vapnik-Chervonenkis dimension of
nonoverlapping threshold or sigmoidal networks cannot increase when
the nodes are additionally allowed to compute linear functions. This
sheds some light on a recent result that exhibited neural networks
with quadratic VC dimension.
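
For orientation, the connection between VC dimension and sample
complexity invoked above is the classical PAC-learning bound (stated
here as background in its standard form, due to Blumer, Ehrenfeucht,
Haussler, and Warmuth; the paper's exact constants and formulation may
differ): for a hypothesis class of Vapnik-Chervonenkis dimension $d$,

$$m(\varepsilon,\delta) \;=\; O\!\left(\frac{1}{\varepsilon}\left(d \log \frac{1}{\varepsilon} + \log \frac{1}{\delta}\right)\right)$$

examples suffice to learn with error at most $\varepsilon$ and
confidence $1-\delta$, while $\Omega\!\left(\frac{1}{\varepsilon}\left(d + \log \frac{1}{\delta}\right)\right)$
examples are necessary. Plugging in the asymptotically tight bound
$d = \Theta(n \log n)$ for nonoverlapping threshold networks on $n$
inputs thus pins down their sample complexity as
$\Theta\!\left(\frac{n \log n}{\varepsilon}\right)$ up to logarithmic
factors.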



ISSN 1433-8092