A normative framework for learning top-down predictions through synaptic
plasticity in apical dendrites
A. Rao, R. Legenstein, A. Subramoney, and W. Maass
Abstract:
Predictive coding has been identified as a key aspect of computation and
learning in cortical microcircuits. But we do not know how synaptic
plasticity processes install and maintain predictive coding capabilities in
these neural circuits. Predictions are inherently uncertain, and learning
rules that aim at discriminating linearly separable classes of inputs, such
as the perceptron learning rule, do not perform well if the goal is learning
to predict. We show that experimental data on synaptic plasticity in apical
dendrites of pyramidal cells support another learning rule that is suitable
for learning to predict. More precisely, it enables a spike-based
approximation to logistic regression, a well-known gold standard for
probabilistic prediction. We also show that experimentally documented
interactions between apical dendrites support learning of predictions for
more complex probability distributions than single dendrites can handle. The
resulting learning theory for top-down inputs to pyramidal cells provides a
normative framework for evaluating experimental data, and suggests further
experiments for tracking the emergence of predictive coding through synaptic
plasticity in apical dendrites.
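
To make the contrast drawn in the abstract concrete: a perceptron-style rule
only corrects the sign of a decision, whereas the logistic-regression update
follows the cross-entropy gradient, so the sigmoid of its weighted sum
converges toward an estimate of P(y = 1 | x). The sketch below is illustrative
background only, written in plain NumPy; it is not the spike-based dendritic
plasticity rule of the paper, and the helper names (sigmoid,
perceptron_update, logistic_update) and the toy data are introduced here
purely for illustration.

    import numpy as np

    # Illustrative comparison only: classic perceptron vs. logistic-regression
    # updates on a toy task with a stochastic label; NOT the dendritic
    # plasticity rule proposed in the paper.

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    def perceptron_update(w, x, y, lr=0.1):
        # Perceptron: only corrects misclassified signs, yields no probability.
        y_hat = 1.0 if w @ x > 0 else 0.0
        return w + lr * (y - y_hat) * x

    def logistic_update(w, x, y, lr=0.1):
        # Logistic regression: gradient of the cross-entropy loss, so the
        # sigmoid of w @ x converges toward an estimate of P(y = 1 | x).
        p = sigmoid(w @ x)
        return w + lr * (y - p) * x

    rng = np.random.default_rng(0)
    dim = 5
    w_perc = np.zeros(dim)
    w_logi = np.zeros(dim)
    for _ in range(5000):
        x = rng.normal(size=dim)
        y = float(rng.random() < sigmoid(x[0]))   # label is stochastic given x
        w_perc = perceptron_update(w_perc, x, y)
        w_logi = logistic_update(w_logi, x, y)

    print("perceptron weights:", np.round(w_perc, 2))
    print("logistic weights:  ", np.round(w_logi, 2))

On such probabilistic data the perceptron weights keep fluctuating, because
misclassifications never cease, while the logistic-regression weights settle
on values that reflect the underlying label probabilities.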
Reference: A. Rao, R. Legenstein, A. Subramoney, and W. Maass.
A normative framework for learning top-down predictions through synaptic
plasticity in apical dendrites.
bioRxiv 2021/433822, 2021.