ECCC
Electronic Colloquium on Computational Complexity

Under the auspices of the Computational Complexity Foundation (CCF)





Paper:

TR94-012 | 12th December 1994

Bounds for the Computational Power and Learning Complexity of Analog Neural Nets





Authors: Wolfgang Maass
Downloads: 3989


Abstract:

It is shown that high-order feedforward neural nets of constant depth with piecewise
polynomial activation functions and arbitrary real weights can be simulated, for Boolean
inputs and outputs, by neural nets of somewhat larger size and depth with Heaviside
gates and weights from {-1,0,1}. This provides the first known upper bound for the
computational power of the former type of neural nets.
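
As a concrete (if brute-force) illustration of this first result, the sketch below enumerates the Boolean function computed by a single gate with a piecewise polynomial activation on Boolean inputs and rebuilds it as a depth-2 net of Heaviside gates with weights from {-1,0,1}. The activation, the weights, and the minterm-based construction are illustrative choices of ours; the paper's simulation yields only a somewhat larger net, whereas this demonstration is exponential in the number of inputs.

```python
from itertools import product

def heaviside(t):
    return 1 if t >= 0 else 0

# A single "analog" gate: a piecewise polynomial activation (illustrative
# choice) applied to a weighted sum with arbitrary real weights.
def analog_gate(x, w, b):
    s = sum(wi * xi for wi, xi in zip(w, x)) + b
    sigma = 0.0 if s < 0 else (s * s if s < 1 else 1.0)  # piecewise polynomial
    return 1 if sigma >= 0.5 else 0                      # Boolean output convention

n = 3
w, b = [0.7, -1.3, 2.1], -0.4      # arbitrary real weights (illustrative)

# Brute force: record the Boolean function on {0,1}^n and keep its minterms.
minterms = [x for x in product([0, 1], repeat=n) if analog_gate(x, w, b)]

# Layer 1: one Heaviside gate per minterm a; it fires iff x == a.  Its integer
# threshold sum(a) can itself be realized with weights in {-1,0,1} by wiring
# the constant input 1 to the gate sum(a) times with weight -1 (standard trick).
def minterm_gate(x, a):
    return heaviside(sum((1 if ai else -1) * xi for ai, xi in zip(a, x)) - sum(a))

# Layer 2: an OR gate -- weights +1, threshold 1.
def simulating_net(x):
    return heaviside(sum(minterm_gate(x, a) for a in minterms) - 1)

# The two nets agree on every Boolean input.
assert all(analog_gate(x, w, b) == simulating_net(x)
           for x in product([0, 1], repeat=n))
print("Heaviside net with weights in {-1,0,1} reproduces the Boolean function.")
```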
It is also shown that, in the case of first-order nets with piecewise linear activation
functions, one can replace arbitrary real weights by rational numbers with polynomially
many bits without changing the Boolean function that is computed by the neural net. In
order to prove these results we introduce two new methods for reducing nonlinear problems
about weights in multi-layer neural nets to linear problems for a transformed set of
parameters. These transformed parameters can be interpreted as weights in a somewhat
larger neural net.
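
The weight-discretization result can likewise be made concrete with a small experiment. The sketch below is an empirical check under illustrative choices, not the paper's proof (which guarantees that polynomially many bits always suffice): it rounds the real weights of a first-order net with a piecewise linear activation to rationals with denominator 2^k and reports the smallest k at which the Boolean function on {0,1}^n is unchanged.

```python
from itertools import product

def sat_lin(t):
    # Piecewise linear "saturated" activation.
    return 0.0 if t < 0 else (t if t < 1 else 1.0)

def net(x, w, b):
    # First-order net: one piecewise linear gate with a Boolean output.
    return 1 if sat_lin(sum(wi * xi for wi, xi in zip(w, x)) + b) >= 0.5 else 0

def quantize(v, k):
    # Nearest rational with denominator 2**k, i.e. a k-bit fractional part.
    return round(v * 2**k) / 2**k

n = 4
w, b = [0.61, -1.37, 2.05, 0.83], -0.49   # arbitrary real weights (illustrative)
inputs = list(product([0, 1], repeat=n))
truth = [net(x, w, b) for x in inputs]

# Find the smallest bit count k for which the rounded weights compute
# the same Boolean function on all 2**n inputs.
for k in range(1, 32):
    wq, bq = [quantize(wi, k) for wi in w], quantize(b, k)
    if [net(x, wq, bq) for x in inputs] == truth:
        print(f"{k}-bit rational weights already preserve the Boolean function.")
        break
```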
As another application of our new proof technique we show that neural nets with
piecewise polynomial activation functions and a constant number of analog inputs are
probably approximately learnable (in Valiant's model for PAC-learning).
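
For readers unfamiliar with Valiant's model, the following toy sketch shows the PAC-learning setup in its simplest instance: learning a single threshold on one analog input from i.i.d. labelled examples. The learner, the sample bound, and the uniform example distribution are standard textbook choices, not the paper's learning algorithm for piecewise polynomial nets.

```python
import math, random

def pac_learn_threshold(sample):
    # Consistent-hypothesis learner: place the threshold at the largest
    # negatively labelled point seen so far.
    negs = [x for x, y in sample if y == 0]
    return max(negs, default=0.0)

eps, delta = 0.05, 0.05
m = math.ceil(1 / eps * math.log(1 / delta))  # classic sample bound for thresholds

target = 0.73                                 # hidden target threshold
def label(x):
    return 1 if x > target else 0

# Valiant's model: examples drawn i.i.d. from an (unknown) distribution,
# here uniform on [0,1], labelled by the target concept.
sample = [(x, label(x)) for x in (random.random() for _ in range(m))]
h = pac_learn_threshold(sample)

# Estimate the hypothesis error under the same distribution; with the sample
# size above it exceeds eps with probability at most delta.
test = [random.random() for _ in range(100_000)]
err = sum(label(x) != (1 if x > h else 0) for x in test) / len(test)
print(f"m = {m} examples, estimated error = {err:.4f} (target eps = {eps})")
```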



ISSN 1433-8092