It is shown that high-order feedforward neural nets of constant depth with piecewise polynomial activation functions and arbitrary real weights can be simulated, for Boolean inputs and outputs, by neural nets of a somewhat larger size and depth with Heaviside gates and weights ...
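The result above contrasts two gate types: piecewise-polynomial activations and binary Heaviside (threshold) gates. A minimal sketch of each, with illustrative names not taken from the paper (ReLU stands in as the simplest piecewise-polynomial activation):

```python
def heaviside_gate(inputs, weights, threshold):
    """Threshold gate: outputs 1 iff the weighted sum reaches the threshold."""
    s = sum(w * x for w, x in zip(weights, inputs))
    return 1 if s >= threshold else 0

def relu(x):
    """ReLU, a piecewise-polynomial (degree-1) activation on the reals."""
    return max(0.0, x)

# A single Heaviside gate computing AND on Boolean inputs:
print(heaviside_gate([1, 1], [1.0, 1.0], 2.0))  # -> 1
print(heaviside_gate([1, 0], [1.0, 1.0], 2.0))  # -> 0
```

The simulation statement says that, on Boolean inputs, constant-depth circuits built from the first kind of gate lose little by being re-expressed with the second.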

Pekka Orponen

We introduce a model for analog computation with discrete time in the presence of analog noise that is flexible enough to cover the most important concrete cases, such as noisy analog neural nets and networks of spiking neurons. This model subsumes the classical ...

Eduardo D. Sontag

We consider recurrent analog neural nets where the output of each gate is subject to Gaussian noise, or any other common noise distribution that is nonzero on a large set. We show that many regular languages cannot be recognized by networks of this type, and ...
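The gate model described above can be sketched as follows; this is an illustrative assumption about the setup (logistic activation, additive Gaussian noise on the output), not the paper's exact formalism:

```python
import math
import random

def noisy_sigmoid_gate(inputs, weights, sigma):
    """Analog gate whose real-valued output is perturbed by additive
    Gaussian noise of standard deviation sigma (illustrative sketch)."""
    s = sum(w * x for w, x in zip(weights, inputs))
    clean = 1.0 / (1.0 + math.exp(-s))  # logistic activation in (0, 1)
    return clean + random.gauss(0.0, sigma)

# With sigma = 0 the gate is noiseless; with sigma > 0 every output is
# smeared over a neighborhood, which is the obstacle to exact recognition
# of many regular languages by such networks.
```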
