TR97-049 | 22nd October 1997 00:00

#### On the Complexity of Learning for Spiking Neurons with Temporal Coding

**Abstract:**
Spiking neurons are models for the computational units in biological neural systems, where information is considered to be encoded mainly in the temporal pattern of their activity. In a network of spiking neurons a new set of parameters becomes relevant that has no counterpart in traditional neural network models: the time that a pulse needs to travel through a connection between two neurons (also known as the delay of a connection). It is known that these delays are tuned in biological neural systems through a variety of mechanisms. We investigate the VC-dimension of networks of spiking neurons where the delays are viewed as programmable parameters, and we prove tight bounds for this VC-dimension. Thus we obtain quantitative estimates for the diversity of functions that a network with fixed architecture can compute with different settings of its delays. In particular, it turns out that a network of spiking neurons with $k$ adjustable delays is able to compute a much richer class of functions than a threshold circuit with $k$ adjustable weights. The results also yield bounds on the number of training examples that an algorithm needs for tuning the delays of a network of spiking neurons. Results about the computational complexity of such algorithms are also given.
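
To illustrate the role of delays as programmable parameters, the following is a minimal sketch (not taken from the report) of a common formalization of a spiking neuron: each input spike at time $s_i$ arrives after delay $d_i$ and contributes a response $\varepsilon(t - s_i - d_i)$ to the membrane potential, and the neuron fires as soon as the summed potential reaches a threshold. The response function, threshold, and all parameter values below are illustrative assumptions.

```python
import math

def eps(t, tau=1.0):
    """Illustrative postsynaptic response: an alpha-shaped pulse peaking at t = tau."""
    return (t / tau) * math.exp(1.0 - t / tau) if t > 0 else 0.0

def firing_time(spike_times, weights, delays, theta=1.5, dt=0.01, t_max=10.0):
    """First time the summed delayed responses reach the threshold theta, else None."""
    steps = int(t_max / dt)
    for k in range(steps + 1):
        t = k * dt
        potential = sum(w * eps(t - s - d)
                        for s, w, d in zip(spike_times, weights, delays))
        if potential >= theta:
            return t
    return None

# Changing only the delays changes the computed input-output behaviour:
# with coincident arrivals the two responses superimpose and cross theta,
# while a large delay on one connection keeps the potential subthreshold.
t_coincident = firing_time([0.0, 0.0], [1.0, 1.0], [0.0, 0.0])
t_separated = firing_time([0.0, 0.0], [1.0, 1.0], [0.0, 4.0])
```

In this sketch the delays alone decide whether the neuron fires at all, which gives some intuition for why the class of functions computable by varying $k$ delays can be rich.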