ECCC
Electronic Colloquium on Computational Complexity

Under the auspices of the Computational Complexity Foundation (CCF)


Paper:

TR00-055 | 14th July 2000 00:00

Learning of Depth Two Neural Networks with Constant Fan-in at the Hidden Nodes

Authors: Peter Auer, Stephen Kwek, Manfred K. Warmuth
Publication: 14th July 2000 17:56
Downloads: 3453


Abstract:

We present algorithms for learning depth-two neural networks whose hidden
nodes are threshold gates with constant fan-in. The transfer function of
the output node may be more general: we give results for the cases where
the threshold function, the logistic function, or the identity function is
used as the transfer function at the output node. We give batch and
on-line learning algorithms for these classes of neural networks and prove
bounds on their performance. The batch algorithms work for real-valued
inputs, whereas the on-line algorithms assume that the inputs are
discretized.
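
As a rough illustration of the function class (a minimal sketch, not the
paper's construction; the class name DepthTwoNet and all identifiers are
our own), such a network can be written as follows:

    import numpy as np

    def threshold(z):
        # Threshold gate: 1 where the weighted sum is non-negative, else 0.
        return (np.asarray(z) >= 0).astype(float)

    def logistic(z):
        return 1.0 / (1.0 + np.exp(-z))

    def identity(z):
        return z

    class DepthTwoNet:
        # Depth-two network: hidden threshold gates of constant fan-in
        # feeding one output node with a selectable transfer function.
        def __init__(self, hidden_weights, hidden_biases, output_weights,
                     transfer=threshold):
            self.W = np.asarray(hidden_weights)  # (m, n); each row has at most k nonzeros (fan-in k)
            self.b = np.asarray(hidden_biases)   # (m,)
            self.v = np.asarray(output_weights)  # (m,)
            self.transfer = transfer             # threshold, logistic, or identity

        def predict(self, x):
            h = threshold(self.W @ x + self.b)   # hidden layer of threshold gates
            return self.transfer(self.v @ h)     # output node

The choice of transfer corresponds to the three output-node cases treated
in the paper.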

The hypotheses produced by our algorithms are essentially also neural
networks of depth two. However, their number of hidden nodes may be much
larger than the number of hidden nodes of the network being learned. Our
algorithms can handle such a large number of hidden nodes because they
rely on multiplicative weight updates at the output node, and the
performance of such algorithms scales only logarithmically with the number
of hidden nodes used.
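
To see why the dependence on the number of hidden nodes can be only
logarithmic, here is a minimal sketch of a multiplicative (exponentiated-
gradient style) update on the output weights, assuming squared loss and
output weights kept on the probability simplex; the name eg_update, the
learning rate eta, and the loss are illustrative assumptions, not the
paper's exact update rule:

    import numpy as np

    def eg_update(weights, hidden_outputs, y, eta=0.1):
        # One multiplicative step on the output-node weights: each weight
        # is scaled by an exponential of its loss gradient, then the
        # weights are renormalized. The regret of such schemes typically
        # grows only logarithmically with the number of hidden nodes.
        y_hat = weights @ hidden_outputs            # prediction of the output node
        grad = 2.0 * (y_hat - y) * hidden_outputs   # gradient of the squared loss
        w = weights * np.exp(-eta * grad)           # multiplicative update
        return w / w.sum()                          # renormalize onto the simplex

Because the update is multiplicative, a large pool of candidate hidden
nodes costs only a logarithmic factor in the bounds, which is how the
hypotheses here can afford many more hidden nodes than the target network.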



ISSN 1433-8092