Abstract:
There is strong evidence that biological neurons encode information not only in their firing rate but also in the timing of individual spikes. This thesis explores various ways of computing and learning with networks of simplified spiking neurons (essentially of the leaky integrate-and-fire type) using temporal coding. We present both supervised and unsupervised learning rules; they are Hebbian in the sense that the strength of a synapse is modified if a pre- and a postsynaptic spike occur within a certain learning window. Recent neurobiological findings have confirmed such a dependency and have shown that the sign and magnitude of the change depend on the relative timing of the two spikes. On the basis of these principles we show how methods originally designed for artificial neural networks, such as competitive learning, self-organizing behavior, and radial basis functions, can be realized in this context. We also address the question of whether these results still hold for biologically more realistic neurons and show by computer simulation, for example, that such neurons can compute linear functions in a very natural way.
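The timing-dependent learning window described above can be illustrated with a minimal sketch. The exponential form and all parameter values below are common illustrative choices for such spike-timing-dependent rules, not taken from the thesis itself: the synapse is strengthened when the presynaptic spike precedes the postsynaptic one, and weakened otherwise.

```python
import math

def stdp_dw(delta_t, a_plus=0.1, a_minus=0.12, tau_plus=20.0, tau_minus=20.0):
    """Weight change for one pre/post spike pair.

    delta_t = t_post - t_pre in milliseconds. A positive delta_t
    (pre before post) yields potentiation, a negative one depression;
    the magnitude decays exponentially with the spike-time difference.
    Amplitudes and time constants here are illustrative assumptions.
    """
    if delta_t >= 0:
        return a_plus * math.exp(-delta_t / tau_plus)
    return -a_minus * math.exp(delta_t / tau_minus)

# Pre before post strengthens the synapse, post before pre weakens it,
# and widely separated spikes have little effect.
print(stdp_dw(5.0) > 0)              # potentiation
print(stdp_dw(-5.0) < 0)             # depression
print(abs(stdp_dw(100.0)) < 0.01)    # negligible change for large |delta_t|
```

The asymmetry between `a_plus` and `a_minus` is one conventional way to keep the rule stable, since slightly stronger depression prevents unbounded weight growth.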