Electronic Colloquium on Computational Complexity

Under the auspices of the Computational Complexity Foundation (CCF)


Reports tagged with learnability:
TR00-014 | 16th February 2000
Matthias Krause, Stefan Lucks

On Learning versus Distinguishing and the Minimal Hardware Complexity of Pseudorandom Function Generators

A set $F$ of $n$-ary Boolean functions is called a pseudorandom function generator
(PRFG) if communicating
with a randomly chosen secret function from $F$ cannot be
efficiently distinguished from communicating with a truly random function.
We ask for the minimal hardware complexity of a PRFG. This question ...
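The definition above can be made concrete as a distinguishing game: an adversary queries an oracle that is either a secret member of $F$ or a truly random function, and its advantage is the gap between its acceptance probabilities in the two cases. The following toy sketch (illustrative only, not from the paper; all names are hypothetical) uses the family of parity functions as $F$ and a one-round BLR linearity test as the distinguisher — parities are linear, so this $F$ is emphatically *not* a PRFG and the advantage comes out large.

```python
import itertools
import random

def truly_random_function(n):
    """A uniformly random Boolean function on n bits: a random truth table."""
    table = {x: random.randrange(2) for x in itertools.product((0, 1), repeat=n)}
    return lambda x: table[x]

def random_parity_function(n):
    """A random member of the (insecure) candidate family F: parity functions."""
    s = [random.randrange(2) for _ in range(n)]
    return lambda x: sum(si * xi for si, xi in zip(s, x)) % 2

def linearity_distinguisher(oracle, n):
    """Accept iff the oracle passes one round of the linearity test
    f(x) XOR f(y) == f(x XOR y)."""
    x = tuple(random.randrange(2) for _ in range(n))
    y = tuple(random.randrange(2) for _ in range(n))
    z = tuple(xi ^ yi for xi, yi in zip(x, y))
    return oracle(x) ^ oracle(y) == oracle(z)

def acceptance_rate(make_oracle, n, trials=2000):
    """Fraction of trials in which the distinguisher accepts a fresh oracle."""
    return sum(linearity_distinguisher(make_oracle(n), n) for _ in range(trials)) / trials

random.seed(0)
n = 4
p_family = acceptance_rate(random_parity_function, n)  # parities always pass the test
p_random = acceptance_rate(truly_random_function, n)   # a random function passes about half the time
advantage = abs(p_family - p_random)                   # large gap => F is not pseudorandom
```

A secure PRFG is exactly a family for which *every* efficient distinguisher, not just this linearity test, has negligible advantage.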

TR00-086 | 26th September 2000
Michael Schmitt

On the Complexity of Computing and Learning with Multiplicative Neural Networks

In a great variety of neuron models, neural inputs are
combined by summation. We introduce the concept of
multiplicative neural networks, which contain units that multiply
their inputs instead of summing them and thus allow inputs to
interact nonlinearly. The class of multiplicative networks
comprises such widely known ...
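To illustrate the distinction being drawn (a minimal sketch with hypothetical function names, not code from the paper): a standard summing unit computes a weighted sum of its inputs, while a multiplicative unit of the product-unit kind multiplies its inputs, raised to trainable exponents, so the inputs interact nonlinearly.

```python
import math

def summing_unit(x, w):
    """Standard neuron input: weighted sum of the inputs."""
    return sum(wi * xi for wi, xi in zip(w, x))

def product_unit(x, w):
    """Multiplicative unit: product of inputs raised to (trainable) exponents,
    i.e. x1**w1 * x2**w2 * ... -- a single unit represents a monomial."""
    return math.prod(xi ** wi for wi, xi in zip(w, x))

x = [2.0, 3.0]
additive = summing_unit(x, [1.0, 1.0])      # 2 + 3 = 5
interaction = product_unit(x, [1.0, 1.0])   # 2 * 3 = 6: a second-order interaction term
```

A single summing unit is linear in its inputs and so cannot represent the product term exactly; a single product unit can, which is the nonlinear input interaction the abstract refers to.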

TR04-033 | 23rd January 2004
Michael Schmitt

On the sample complexity of learning for networks of spiking neurons with nonlinear synaptic interactions

We study networks of spiking neurons that use the timing of pulses
to encode information. Nonlinear interactions model the spatial
groupings of synapses on the dendrites and describe the computations
performed at local branches. We analyze the question of how many
examples these networks must ...

ISSN 1433-8092