We introduce a model for analog computation with discrete
time in the presence of analog noise
that is flexible enough to cover the most important concrete
cases, such as noisy analog neural nets and networks of spiking neurons.
This model subsumes the classical ...
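As a concrete illustration of one system such a model must cover, here is a minimal sketch, assuming a discrete-time analog recurrent net with additive Gaussian state noise; the sigmoid update rule, weight matrix, and noise magnitude are illustrative assumptions, not the paper's formal definitions.

```python
# Sketch: one concrete instance the abstract mentions -- a discrete-time
# analog neural net whose real-valued state update is perturbed by noise.
# The update x' = sigma(W x) + noise and the Gaussian noise model are
# assumptions made for this example only.
import numpy as np

rng = np.random.default_rng(0)

def noisy_step(x, W, noise_scale=0.05):
    """One discrete-time update of a noisy analog net."""
    pre = W @ x
    activation = 1.0 / (1.0 + np.exp(-pre))  # analog (real-valued) states in (0, 1)
    return activation + noise_scale * rng.standard_normal(x.shape)

W = rng.standard_normal((4, 4))   # illustrative random weights
x = rng.random(4)                 # initial analog state
for _ in range(10):               # run the noisy discrete-time dynamics
    x = noisy_step(x, W)
print(x)
```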
We present a new approach to the composition
of learning algorithms (in various models) for
classes of constant VC-dimension into learning algorithms for
more complicated classes.
We prove that if a class $\mathcal{C}$ is learnable
in time $t$ from a hypothesis class $\mathcal{H}$ of constant VC-dimension
then the class ...
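The theorem statement above is cut off before its conclusion; for reference, the standard definition of the VC-dimension it quantifies over is (textbook material, not taken from the truncated abstract):
\[
  \mathrm{VCdim}(\mathcal{H}) \;=\; \max\bigl\{\, |S| \;:\; S \subseteq X,\ \bigl|\{\, h|_S : h \in \mathcal{H} \,\}\bigr| = 2^{|S|} \,\bigr\},
\]
i.e. the size of the largest subset $S$ of the domain $X$ on which the hypotheses of $\mathcal{H}$ realize all $2^{|S|}$ possible labelings.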
One can learn any hypothesis class $H$ with $O(\log|H|)$ labeled examples. Alas, learning with so few examples requires saving the examples in memory, and this requires $|X|^{O(\log|H|)}$ memory states, where $X$ is the set of all labeled examples. A question that arises is how many labeled examples are needed in ...
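A minimal sketch of the textbook argument behind the $O(\log|H|)$ bound the abstract invokes: a learner that stores the labeled sample verbatim and outputs any hypothesis consistent with it, which is exactly the memory-hungry strategy the abstract highlights. The toy threshold class, sample size, and function names below are illustrative assumptions, not the paper's construction.

```python
# Sketch: the "consistent learner" over a finite class H. For constant
# accuracy and confidence, a sample of size O(log|H|) suffices in the
# realizable PAC setting; storing the raw sample is what drives the
# |X|^{O(log|H|)} memory-state count quoted above.
import random

def consistent_learner(H, labeled_examples):
    """Return any hypothesis in H agreeing with every stored example.

    H: list of hypotheses, each a function x -> {0, 1}.
    labeled_examples: list of (x, y) pairs kept in memory.
    """
    for h in H:
        if all(h(x) == y for (x, y) in labeled_examples):
            return h
    return None  # unreachable in the realizable case

# Hypothetical toy class: threshold functions on {0, ..., 15}, so |H| = 17.
H = [(lambda x, t=t: int(x >= t)) for t in range(17)]
target = H[9]
# Sample size on the order of log|H| (times a constant for accuracy).
sample = [(x, target(x)) for x in random.choices(range(16), k=24)]
h = consistent_learner(H, sample)
print([h(x) for x in range(16)])
```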