We consider recurrent analog neural nets in which the output of each
gate is subject to Gaussian noise, or to any other common noise
distribution whose density is nonzero on a large set.
We show that many regular languages cannot be recognized by
networks of this type, and
we give a precise characterization of the languages that can be
recognized. This result implies severe constraints on the possibility
of constructing recurrent analog neural nets that are robust
against realistic types of analog noise. On the other hand, we
present a method for constructing feedforward analog neural nets
that are robust against analog noise of this type.
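
To fix ideas, the following is a minimal sketch of the kind of noisy
update rule the first sentence refers to; the activation $\sigma$, the
weight matrices $W$ and $W_{\mathrm{in}}$, the bias $b$, and the i.i.d.
Gaussian noise term $Z_t$ are illustrative assumptions, not the paper's
exact definitions:
\[
  x_{t+1} \;=\; \sigma\bigl(W x_t + W_{\mathrm{in}} u_t + b\bigr) + Z_t,
  \qquad Z_t \sim \mathcal{N}\bigl(0,\,\rho^2 I\bigr),
\]
where $x_t$ collects the gate outputs at step $t$, $u_t$ encodes the
current input symbol, and the Gaussian density of $Z_t$ is nonzero on a
large set, as required above.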