We introduce a new method for proving explicit upper bounds on the VC
Dimension of general functional basis networks, and prove as an
application, for the first time, that the VC Dimension of analog neural
networks with the sigmoidal activation function
$\sigma(y) = 1/(1+e^{-y})$ ...
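To make the objects in the abstract concrete, the following is a minimal sketch (not the paper's construction) of an analog neural network with the sigmoidal activation $\sigma(y) = 1/(1+e^{-y})$; the one-hidden-layer architecture, the function names, and the sample weights are illustrative assumptions, with the weights and biases playing the role of the programmable parameters over which such VC Dimension bounds are stated.

```python
import math

def sigmoid(y: float) -> float:
    """Sigmoidal activation sigma(y) = 1 / (1 + e^{-y})."""
    return 1.0 / (1.0 + math.exp(-y))

def network_output(x, hidden_weights, hidden_biases, output_weights, output_bias):
    """Real-valued output of a one-hidden-layer sigmoidal network.

    Illustrative only: the programmable parameters are the weights and
    biases listed in the arguments.
    """
    hidden = [
        sigmoid(sum(w * xi for w, xi in zip(ws, x)) + b)
        for ws, b in zip(hidden_weights, hidden_biases)
    ]
    return sum(v * h for v, h in zip(output_weights, hidden)) + output_bias

# A Boolean classification is obtained by thresholding the analog output.
label = 1 if network_output(
    x=[0.5, -1.0],
    hidden_weights=[[1.0, -2.0], [0.3, 0.7]],
    hidden_biases=[0.1, -0.4],
    output_weights=[2.0, -1.5],
    output_bias=0.05,
) >= 0.5 else 0
```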