Movement Generation and Control with Generic Neural Microcircuits
Abstract:
Simple linear readouts from generic neural microcircuit models consisting of
spiking neurons and dynamic synapses can be trained to generate and control
basic movements, for example, reaching with an arm to various target points.
After suitable training of these readouts on a small number of target points,
reaching movements to other target points can also be generated. Sensory or
proprioceptive feedback turns out to be essential for such movement control,
even if it is noisy and substantially delayed. In fact, such feedback improves
the performance of the neural microcircuit model most if it arrives with a
biologically realistic delay of 100 to 200 ms. Furthermore, additional feedback
in the form of ``predictions of sensory variables'' is shown to improve the
performance significantly. The proposed model also provides a new
approach for movement control in robotics. This approach takes existing control
methods in robotics, which account for the particular dynamics of the sensors
and actuators (``embodiment of robot control''), one step further by also
exploiting the ``embodiment of computation'', i.e., the inherent dynamics and
spatial structure of neural circuits, in the design of robot movement controllers.
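
To illustrate the basic scheme summarized above, the sketch below trains a
linear readout on the states of a generic recurrent network that is driven by a
target input and by noisy, delayed feedback of its own output. This is only a
minimal rate-based stand-in for the spiking microcircuit model of the paper;
the network size, the roughly 150 ms feedback delay, the ridge regularization,
and the ramp teacher signal are illustrative assumptions, not values taken from
the paper.

# Minimal sketch (not the authors' implementation): a generic recurrent
# "microcircuit" is approximated by a random leaky rate network; a linear
# readout is fit by ridge regression to map the circuit state, driven by
# target coordinates and delayed output feedback, onto desired motor outputs.
# All parameter values below are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

N = 200            # circuit size (assumed)
DT = 0.005         # 5 ms time step (assumed)
DELAY_STEPS = 30   # ~150 ms feedback delay, within the 100-200 ms range
T = 400            # time steps per reaching trial

W = rng.normal(0, 1.0 / np.sqrt(N), (N, N))   # recurrent weights
W_in = rng.normal(0, 1.0, (N, 2))             # input: target coordinates (x, y)
W_fb = rng.normal(0, 1.0, (N, 2))             # delayed feedback of the output

def run_trial(target, readout=None):
    """Drive the circuit with a constant target; return state and output histories."""
    x = np.zeros(N)
    states, outputs = [], []
    out_hist = [np.zeros(2)] * DELAY_STEPS    # buffer implementing the feedback delay
    for t in range(T):
        fb = out_hist[-DELAY_STEPS]           # delayed "proprioceptive" signal
        fb = fb + rng.normal(0, 0.05, 2)      # plus noise
        x = (1 - DT / 0.03) * x + (DT / 0.03) * np.tanh(
            W @ x + W_in @ target + W_fb @ fb)
        # During training, a straight ramp toward the target serves as teacher signal.
        y = readout @ x if readout is not None else target * (t + 1) / T
        states.append(x.copy())
        outputs.append(y)
        out_hist.append(y)
    return np.array(states), np.array(outputs)

# Train the linear readout on a small number of target points.
targets = [np.array([0.5, 0.2]), np.array([-0.3, 0.7]), np.array([0.1, -0.6])]
X, Y = [], []
for tgt in targets:
    s, y = run_trial(tgt)                     # teacher-forced trial
    X.append(s)
    Y.append(y)
X, Y = np.vstack(X), np.vstack(Y)
readout = np.linalg.solve(X.T @ X + 1e-2 * np.eye(N), X.T @ Y).T   # ridge regression

# Generalization check: reach toward a target point not seen during training.
_, traj = run_trial(np.array([0.4, -0.4]), readout=readout)
print("final position:", traj[-1])

In this toy setting the readout is trained in closed loop with teacher forcing,
which mirrors the role of the delayed sensory feedback in the paper only in
spirit; the spiking dynamics, dynamic synapses, and arm model of the original
work are not reproduced here.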
Reference: P. Joshi and W. Maass.
Movement generation and control with generic neural microcircuits.
In A. J. Ijspeert, M. Murata, and N. Wakamiya, editors, Biologically
Inspired Approaches to Advanced Information Technology. First International
Workshop, BioADIT 2004, Lausanne, Switzerland, January 2004, Revised
Selected Papers, volume 3141 of Lecture Notes in Computer Science,
pages 258-273. Springer Verlag, 2004.