Electronic Colloquium on Computational Complexity (ECCC)

Under the auspices of the Computational Complexity Foundation (CCF)





Paper:

TR94-017 | 12th December 1994

Neural Nets with Superlinear VC-Dimension





Authors:
Publication: 12th December 1994
Downloads: 3263


Abstract:

It has long been known that the Vapnik-Chervonenkis dimension
(VC-dimension) of a feedforward neural net with linear threshold gates is at
most O(w log w), where w is the total number of weights in the net. In this
paper we show that this bound is asymptotically optimal: for every depth
d >= 3 we exhibit a large class of feedforward neural nets of depth d with
w weights whose VC-dimension is Omega(w log w). This lower bound holds even
if the inputs are restricted to Boolean values. The proof relies on a new
method that allows us to encode more "program bits" in the weights of a
neural net than was previously thought possible.
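
For concreteness, the two bounds can be written as follows (a minimal
restatement using the abstract's notation; the symbol N for a single net is
ours, not from the abstract):

    % Upper bound, known before this paper: every feedforward net N of
    % linear threshold gates with w weights satisfies
    \[ \mathrm{VCdim}(N) \;=\; O(w \log w). \]
    % Matching lower bound shown in this paper: for every depth d >= 3
    % there exist depth-d nets N with w weights, even with inputs
    % restricted to Boolean values, such that
    \[ \mathrm{VCdim}(N) \;=\; \Omega(w \log w). \]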



ISSN 1433-8092