ECCC
Electronic Colloquium on Computational Complexity

Under the auspices of the Computational Complexity Foundation (CCF)




Reports tagged with Vapnik-Chervonenkis dimension:
TR97-049 | 22nd October 1997
Michael Schmitt

On the Complexity of Learning for Spiking Neurons with Temporal Coding

Spiking neurons are models for the computational units in
biological neural systems where information is considered to be encoded
mainly in the temporal pattern of their activity. In a network of
spiking neurons, a new set of parameters becomes relevant that has no
counterpart in traditional ... more >>>
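
For intuition (a toy sketch added here, not the report's model): in temporally coded spiking-neuron models the new parameters are typically the synaptic transmission delays. The hypothetical Python snippet below shows a threshold unit whose output depends on whether delayed, decaying input pulses ever line up strongly enough to fire; all names, the exponential response shape, and the constants are illustrative assumptions.

import math

def fires(spike_times, weights, delays, threshold, tau=1.0, horizon=10.0, dt=0.01):
    # Toy spiking neuron: an input spike at time s arrives after a per-synapse
    # delay d and then contributes a decaying potential w * exp(-(t - s - d)/tau).
    # The neuron fires iff the summed potential ever reaches the threshold.
    t = 0.0
    while t <= horizon:
        p = sum(w * math.exp(-(t - s - d) / tau)
                for s, w, d in zip(spike_times, weights, delays)
                if t >= s + d)
        if p >= threshold:
            return True
        t += dt
    return False

# With delays (0.5, 0.0) the two pulses arrive simultaneously and the unit fires;
# with delays (0.0, 0.0) they never coincide and it does not.
print(fires([0.0, 0.5], [1.0, 1.0], [0.5, 0.0], threshold=1.8))  # True
print(fires([0.0, 0.5], [1.0, 1.0], [0.0, 0.0], threshold=1.8))  # False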


TR99-005 | 21st December 1998
Michael Schmitt

On the Sample Complexity for Nonoverlapping Neural Networks

A neural network is said to be nonoverlapping if there is at most one
outgoing edge from each node. We investigate the number of examples
that a learning algorithm needs when using nonoverlapping neural
networks as hypotheses. We derive bounds for this sample complexity
in terms of the Vapnik-Chervonenkis dimension. ... more >>>
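
The abstract is truncated before the bounds themselves; for orientation, the standard PAC-learning relationship between the Vapnik-Chervonenkis dimension and sample complexity (a textbook fact, not the report's specific result) is that for a hypothesis class of VC dimension $d$, some $m = O\left(\frac{1}{\epsilon}\left(d\log\frac{1}{\epsilon}+\log\frac{1}{\delta}\right)\right)$ examples suffice to learn to accuracy $\epsilon$ with confidence $1-\delta$, while $\Omega\left(\frac{1}{\epsilon}\left(d+\log\frac{1}{\delta}\right)\right)$ examples are necessary.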


TR00-002 | 23rd December 1999
Michael Schmitt

Lower Bounds on the Complexity of Approximating Continuous Functions by Sigmoidal Neural Networks

We calculate lower bounds on the size of sigmoidal neural networks
that approximate continuous functions. In particular, we show that
for the approximation of polynomials the network size has to grow
as $\Omega((\log k)^{1/4})$, where $k$ is the degree of the polynomials.
This bound is ... more >>>
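
To illustrate how slowly this lower bound grows (an added calculation, not part of the report), take logarithms to base 2: $(\log k)^{1/4} \ge 10$ requires $\log_2 k \ge 10^4$, i.e. $k \ge 2^{10000}$, so the quantity $(\log k)^{1/4}$ only reaches 10 once the degree of the polynomial exceeds $2^{10000}$.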


TR00-086 | 26th September 2000
Michael Schmitt

On the Complexity of Computing and Learning with Multiplicative Neural Networks

In a great variety of neuron models, neural inputs are
combined using a summing operation. We introduce the concept of
multiplicative neural networks, which contain units that multiply
their inputs instead of summing them and thus allow inputs to
interact nonlinearly. The class of multiplicative networks
comprises such widely known ... more >>>
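
To make the contrast concrete, here is a minimal Python sketch of a conventional summing unit next to a product unit, one standard kind of multiplicative unit (in the sense of Durbin and Rumelhart); the function names and the choice of a sigmoid are illustrative assumptions, and the report's exact unit definitions lie beyond the truncation.

import math

def summing_unit(x, w, b=0.0):
    # Conventional neuron: a weighted sum of the inputs passed through a sigmoid.
    s = sum(wi * xi for wi, xi in zip(w, x)) + b
    return 1.0 / (1.0 + math.exp(-s))

def product_unit(x, w):
    # Multiplicative unit: inputs are raised to trainable exponents and
    # multiplied, so the inputs interact nonlinearly with one another.
    return math.prod(xi ** wi for wi, xi in zip(w, x))

# With exponents (2, 1) the product unit computes the monomial x1^2 * x2
# exactly, something no single summing unit can represent.
print(summing_unit([0.5, 1.0], [1.0, -2.0]))   # ~0.18
print(product_unit([0.5, 1.0], [2.0, 1.0]))    # 0.25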


TR04-033 | 23rd January 2004
Michael Schmitt

On the sample complexity of learning for networks of spiking neurons with nonlinear synaptic interactions

We study networks of spiking neurons that use the timing of pulses
to encode information. Nonlinear interactions model the spatial
groupings of synapses on the dendrites and describe the computations
performed at local branches. We analyze the question of how many
examples these networks must ... more >>>
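
One common way to make such a local nonlinearity concrete (an illustrative formalization; the report's precise model is behind the truncation) is to let the synapses on a common dendritic branch interact multiplicatively before the branches are summed at the soma, e.g. a potential of the form $P(t) = \sum_{b}\prod_{i\in b} w_i\,\varepsilon(t - t_i - d_i)$, where $b$ ranges over the branches, $t_i$ are the input spike times, $d_i$ the transmission delays, and $\varepsilon$ the synaptic response function.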




ISSN 1433-8092