ECCC Report TR99-005
https://eccc.weizmann.ac.il/report/1999/005
Published: Mon, 01 Mar 1999 12:31:48 +0200
Paper TR99-005
On the Sample Complexity for Nonoverlapping Neural Networks
Michael Schmitt
A neural network is said to be nonoverlapping if there is at most one
edge outgoing from each node. We investigate the number of examples
that a learning algorithm needs when using nonoverlapping neural
networks as hypotheses. We derive bounds for this sample complexity
in terms of the Vapnik-Chervonenkis dimension. In particular, we
consider networks consisting of threshold, sigmoidal, and linear
gates. We show that the class of nonoverlapping threshold networks and
the class of nonoverlapping sigmoidal networks on $n$ inputs both have
Vapnik-Chervonenkis dimension $\Omega(n\log n)$. This bound is
asymptotically tight for the class of nonoverlapping threshold
networks. We also present an upper bound for this class where the
constants involved are considerably smaller than in a previous
calculation. Finally, we argue that the Vapnik-Chervonenkis dimension
of nonoverlapping threshold or sigmoidal networks cannot become larger
by allowing the nodes to compute linear functions. This sheds some
light on a recent result that exhibited neural networks with quadratic
VC dimension.
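The abstract relates the number of training examples a learner needs to the Vapnik-Chervonenkis dimension of the hypothesis class. As a hedged illustration of that connection, the sketch below plugs a VC dimension of order $n\log n$ (the lower bound shown for nonoverlapping threshold networks) into the standard textbook PAC sample-complexity bound; the constants `c_vc` and `c_pac` and the exact bound form are generic illustrative choices, not values from the paper.

```python
import math

def vc_dim_estimate(n, c_vc=1.0):
    """Illustrative VC dimension of order n*log(n), matching the
    Omega(n log n) lower bound stated in the abstract (constant assumed)."""
    return c_vc * n * math.log(n)

def pac_sample_bound(d, eps, delta, c_pac=4.0):
    """Generic PAC upper bound on the number of examples sufficient to
    learn a class of VC dimension d to accuracy eps with confidence
    1 - delta. The constant c_pac is a textbook-style placeholder."""
    return math.ceil((c_pac / eps) * (d * math.log(2.0 / eps)
                                      + math.log(2.0 / delta)))

# Example: a nonoverlapping threshold network on n = 100 inputs.
n = 100
d = vc_dim_estimate(n)
m = pac_sample_bound(d, eps=0.1, delta=0.05)
```

Under such a bound, the sample complexity grows essentially linearly in the VC dimension, which is why sharpening the dimension bounds (and their constants, as the abstract's improved upper bound does) directly tightens the number of examples required.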