Dynamic stochastic synapses as computational units
In most neural network models, synapses are treated as static weights that
change only on the slow time scales of learning. In fact, however, synapses
are highly dynamic and show use-dependent plasticity over a wide range of
time scales. Moreover, synaptic transmission is an inherently stochastic
process: a spike arriving at a presynaptic terminal triggers release of a
vesicle of neurotransmitter from a release site with a probability that can
be much less than one. Changes in release probability represent one of the
main mechanisms by which synaptic efficacy is modulated in neural circuits.
We propose and investigate a simple model for dynamic stochastic synapses
that can easily be integrated into common models for neural computation. We
show through computer simulations and rigorous theoretical analysis that
this model for a dynamic stochastic synapse increases computational power in
a nontrivial way. Our results may have implications for the processing of
time-varying signals by both biological and artificial neural networks.
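The stochastic release mechanism described above can be illustrated with a minimal sketch. It assumes a release probability of the form p = 1 - exp(-C * V), with a facilitation variable C that jumps at each spike and decays back to baseline, and a depression variable V that drops after each successful release and recovers; all parameter names and values here are illustrative, not taken from the paper.

```python
import math
import random

def simulate_synapse(spike_times, C0=1.0, V0=1.0, alpha=0.3,
                     tau_C=0.1, tau_V=0.2, seed=0):
    """Sketch of a dynamic stochastic synapse.

    For each presynaptic spike, vesicle release occurs with probability
    p = 1 - exp(-C * V). Facilitation C is incremented by each spike
    and decays toward C0; depression V is reduced after each release
    and recovers toward V0. Parameter values are illustrative only.
    """
    rng = random.Random(seed)
    C, V = C0, V0
    last_t = None
    releases = []
    for t in spike_times:
        if last_t is not None:
            dt = t - last_t
            # exponential relaxation of both state variables
            C = C0 + (C - C0) * math.exp(-dt / tau_C)
            V = V0 + (V - V0) * math.exp(-dt / tau_V)
        p = 1.0 - math.exp(-C * V)   # release probability for this spike
        released = rng.random() < p
        releases.append(released)
        C += alpha                    # facilitation from this spike
        if released:
            V = max(V - alpha, 0.0)   # depression after a release
        last_t = t
    return releases
```

Because release is probabilistic and the state variables depend on the recent spike history, the same input spike train can produce different release patterns, and closely spaced spikes see different release probabilities than isolated ones.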
Reference: W. Maass and A. M. Zador.
Dynamic stochastic synapses as computational units.
In Advances in Neural Information Processing Systems, volume 10, pages
194-200. MIT Press, Cambridge, MA, 1998.