ECCC
Electronic Colloquium on Computational Complexity

Under the auspices of the Computational Complexity Foundation (CCF)

All reports by Author Michael Schmitt:

TR04-075 | 21st July 2004
Michael Schmitt

Some dichotomy theorems for neural learning problems

The computational complexity of learning from binary examples is
investigated for linear threshold neurons. We introduce
combinatorial measures that create classes of infinitely many
learning problems with sample restrictions. We analyze how the
complexity of these problems depends on the values of these measures.
... more >>>
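
As a concrete reference point for the model named here, the following is a minimal sketch of a linear threshold neuron over binary inputs; the function name and the AND example are illustrative, not taken from the report.

    def threshold_neuron(weights, bias, x):
        """Output 1 iff the weighted sum of the binary inputs reaches the bias."""
        return 1 if sum(w * xi for w, xi in zip(weights, x)) >= bias else 0

    # Example: with weights (1, 1) and bias 2, the neuron computes AND.
    assert threshold_neuron([1, 1], 2, (1, 1)) == 1
    assert threshold_neuron([1, 1], 2, (0, 1)) == 0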


TR04-033 | 23rd January 2004
Michael Schmitt

On the sample complexity of learning for networks of spiking neurons with nonlinear synaptic interactions

We study networks of spiking neurons that use the timing of pulses
to encode information. Nonlinear interactions model the spatial
groupings of synapses on the dendrites and describe the computations
performed at local branches. We analyze the question of how many
examples these networks must ... more >>>
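
For readers unfamiliar with the setting, this toy sketch shows one way nonlinear interactions at local dendritic branches can be modeled: each branch combines its synaptic responses by a product, and the soma sums the branch values. The product-of-decaying-responses choice is a simplification of my own, not the report's model.

    import math

    def branch_response(spike_times, t, tau=1.0):
        """Combine the synapses on one branch nonlinearly (here: a product
        of exponentially decaying responses to the input spikes)."""
        resp = 1.0
        for s in spike_times:
            if t < s:
                return 0.0  # a spike on this branch has not yet arrived
            resp *= math.exp(-(t - s) / tau)
        return resp

    def soma_potential(branches, t):
        """Sum the nonlinear branch contributions at the soma."""
        return sum(branch_response(b, t) for b in branches)

    # Two branches: spikes at times 0.0 and 0.5 on one, a spike at 1.0 on the other.
    print(soma_potential([[0.0, 0.5], [1.0]], t=1.5))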


TR01-045 | 26th April 2001
Michael Schmitt

Neural Networks with Local Receptive Fields and Superlinear VC Dimension

Local receptive field neurons comprise such well-known and widely
used unit types as radial basis function neurons and neurons with
center-surround receptive fields. We study the Vapnik-Chervonenkis
(VC) dimension of feedforward neural networks with one hidden layer
of these units. For several variants of local receptive field
neurons we show ... more >>>
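
A hedged sketch of one of the unit types named in the abstract, a Gaussian radial basis function neuron; the parameter names are illustrative.

    import math

    def rbf_neuron(center, width, x):
        """Gaussian RBF unit: response peaks at 1 when x equals the center
        and decays with the squared distance ||x - center||^2."""
        dist_sq = sum((xi - ci) ** 2 for xi, ci in zip(x, center))
        return math.exp(-dist_sq / (2 * width ** 2))

    print(rbf_neuron(center=(0.0, 0.0), width=1.0, x=(0.0, 0.0)))  # 1.0
    print(rbf_neuron(center=(0.0, 0.0), width=1.0, x=(2.0, 0.0)))  # ~0.135

The response is localized around the center, which is what "local receptive field" refers to.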


TR00-086 | 26th September 2000
Michael Schmitt

On the Complexity of Computing and Learning with Multiplicative Neural Networks

In a great variety of neuron models, neural inputs are
combined using the summing operation. We introduce the concept of
multiplicative neural networks which contain units that multiply
their inputs instead of summing them and, thus, allow inputs to
interact nonlinearly. The class of multiplicative networks
comprises such widely known ... more >>>
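
To make the contrast concrete: a conventional summing unit next to a product unit, one widely known multiplicative unit type in which inputs are raised to weighted powers and multiplied. The sketch is illustrative and assumes positive inputs.

    def summing_unit(weights, x):
        """Conventional unit: weighted sum of the inputs."""
        return sum(w * xi for w, xi in zip(weights, x))

    def product_unit(weights, x):
        """Multiplicative unit: product of inputs raised to weight powers."""
        result = 1.0
        for w, xi in zip(weights, x):
            result *= xi ** w
        return result

    print(summing_unit([1.0, 2.0], [3.0, 4.0]))  # 11.0
    print(product_unit([1.0, 2.0], [3.0, 4.0]))  # 48.0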


TR00-002 | 23rd December 1999
Michael Schmitt

Lower Bounds on the Complexity of Approximating Continuous Functions by Sigmoidal Neural Networks

We calculate lower bounds on the size of sigmoidal neural networks
that approximate continuous functions. In particular, we show that
for the approximation of polynomials the network size has to grow
as $\Omega((\log k)^{1/4})$ where $k$ is the degree of the polynomials.
This bound is ... more >>>
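
To get a feel for the stated bound, the snippet below evaluates $(\log k)^{1/4}$ for a few degrees $k$; the constants and the base of the logarithm are irrelevant to the $\Omega(\cdot)$ statement.

    import math

    # How slowly the lower bound (log k)^(1/4) grows with the degree k.
    for k in (2 ** 8, 2 ** 32, 2 ** 128):
        print(f"k = 2^{k.bit_length() - 1}: (log k)^(1/4) ~ {math.log(k) ** 0.25:.2f}")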


TR99-005 | 21st December 1998
Michael Schmitt

On the Sample Complexity for Nonoverlapping Neural Networks

A neural network is said to be nonoverlapping if there is at most one
edge outgoing from each node. We investigate the number of examples
that a learning algorithm needs when using nonoverlapping neural
networks as hypotheses. We derive bounds for this sample complexity
in terms of the Vapnik-Chervonenkis dimension. ... more >>>
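
The defining property is easy to check mechanically; here is a small sketch (the graph representation is my own choice): a network is nonoverlapping exactly when no node has more than one outgoing edge, so the architecture is a tree.

    def is_nonoverlapping(edges):
        """edges: (source, target) pairs; nonoverlapping means every node
        occurs as a source at most once."""
        sources = [src for src, _ in edges]
        return len(sources) == len(set(sources))

    tree = [("x1", "h"), ("x2", "h"), ("h", "y")]
    shared = tree + [("x1", "y")]  # x1 now feeds two nodes

    print(is_nonoverlapping(tree))    # True
    print(is_nonoverlapping(shared))  # False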


TR97-049 | 22nd October 1997
Michael Schmitt

On the Complexity of Learning for Spiking Neurons with Temporal Coding

Spiking neurons are models for the computational units in
biological neural systems where information is considered to be encoded
mainly in the temporal pattern of their activity. In a network of
spiking neurons a new set of parameters becomes relevant which has no
counterpart in traditional ... more >>>
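
As a toy illustration of temporal coding (my own sketch, not the report's model), suppose each synapse carries a weight and a transmission delay, and the neuron fires once enough delayed spikes have arrived; such delays are one natural example of a timing parameter with no analogue in conventional units.

    def first_firing_time(inputs, threshold):
        """inputs: (spike_time, weight, delay) triples. Event-based scan:
        accumulate weights in order of arrival time spike_time + delay and
        report the first time the potential reaches the threshold."""
        arrivals = sorted((s + d, w) for s, w, d in inputs)
        potential = 0.0
        for t, w in arrivals:
            potential += w
            if potential >= threshold:
                return t
        return None  # the neuron never fires

    # Two spikes emitted at time 0 arrive at different times via their delays.
    print(first_firing_time([(0.0, 0.6, 1.0), (0.0, 0.6, 2.5)], threshold=1.0))  # 2.5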




ISSN 1433-8092