ECCC Report TR00-055
https://eccc.weizmann.ac.il/report/2000/055
Comments and Revisions published for TR00-055
Fri, 14 Jul 2000 17:56:30 +0300
Paper TR00-055
Learning of Depth Two Neural Networks with Constant Fan-in at the Hidden Nodes
Peter Auer,
Stephen Kwek,
Manfred K. Warmuth
We present algorithms for learning depth two neural networks where the
hidden nodes are threshold gates with constant fan-in. The transfer
function at the output node may be more general: we give results for
the cases in which the threshold function, the logistic function, or
the identity function is used as the transfer function at the output node.
We give batch and on-line learning algorithms for these classes of
neural networks and prove bounds on the performance of our algorithms.
The batch algorithms work for real-valued inputs, whereas the on-line
algorithms assume that the inputs are discretized.
The hypotheses produced by our algorithms are essentially also neural
networks of depth two. However, their number of hidden nodes may be
much larger than that of the target network. Our algorithms can handle
such a large number of hidden nodes because they rely on multiplicative
weight updates at the output node, and the performance of these
algorithms scales only logarithmically with the number of hidden nodes
used.
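The multiplicative-update idea at the output node can be illustrated with a small, self-contained sketch. This is not the paper's actual construction: here the expanded hidden layer is simply all conjunctions of a fixed fan-in k over boolean inputs (a special case of constant fan-in threshold gates), and a Winnow-style learner with threshold output is run over those features; all function names and parameters are illustrative assumptions.

```python
import itertools


def candidate_gates(n, k):
    """Enumerate all conjunctions of k literals over n boolean inputs.

    These play the role of the expanded hidden layer: their number is
    polynomial in n for constant k, and the learner's mistake bound
    grows only logarithmically in this number.
    """
    gates = []
    for idx in itertools.combinations(range(n), k):
        for signs in itertools.product((1, -1), repeat=k):
            gates.append((idx, signs))
    return gates


def eval_gate(gate, x):
    """Evaluate one conjunction: sign +1 means x_i, sign -1 means NOT x_i."""
    idx, signs = gate
    return all((x[i] == 1) if s == 1 else (x[i] == 0)
               for i, s in zip(idx, signs))


def winnow_learn(examples, n, k):
    """On-line learner with multiplicative weight updates at the output.

    Standard Winnow over the expanded gate features: promote active
    weights by 2 on a false negative, demote by 1/2 on a false positive.
    Returns the final weights, the gate list, and the mistake count.
    """
    gates = candidate_gates(n, k)
    threshold = len(gates)          # standard Winnow threshold
    w = [1.0] * len(gates)
    mistakes = 0
    for x, y in examples:
        feats = [1 if eval_gate(g, x) else 0 for g in gates]
        pred = 1 if sum(wi * fi for wi, fi in zip(w, feats)) >= threshold else 0
        if pred != y:
            mistakes += 1
            factor = 2.0 if y == 1 else 0.5
            w = [wi * factor if fi else wi for wi, fi in zip(w, feats)]
    return w, gates, mistakes
```

For instance, a target that is an OR of two fan-in-2 conjunctions over four boolean inputs is learned after a number of mistakes logarithmic in the number of candidate gates, even though the hypothesis keeps a weight for every one of them.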