A learning rule for dynamic recruitment and decorrelation

K. P. Körding, P. König*

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review


Abstract

The interest in neuronal networks stems in large part from the option of training them rather than constructing them by hand. The mechanisms governing synaptic modifications during such training are assumed to depend on signals locally available at the synapses. In contrast, the performance of a network is suitably measured on a global scale. Here we propose a learning rule that addresses this conflict. It is inspired by recent physiological experiments and exploits the interaction of inhibitory input and backpropagating action potentials in pyramidal neurons. This mechanism makes information on the global scale available as a local signal. As a result, several desirable features can be combined: the learning rule allows fast synaptic modifications approaching one-shot learning. Nevertheless, it leads to stable representations during ongoing learning. Furthermore, the response properties of the neurons are not globally correlated, but cover the whole stimulus space. Copyright (C) 1999 Elsevier Science Ltd.
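The core idea of the abstract can be illustrated with a toy model. In the sketch below (a hypothetical illustration, not the paper's exact rule), global inhibition is crudely modeled as winner-take-all: only the most active neuron retains its backpropagating action potential (bAP), so only that neuron's synapses receive the global learning signal locally. The stimuli, network sizes, learning rate, and weight initialization are all assumptions chosen for the demo.

```python
import numpy as np

n_inputs, n_neurons = 10, 4
# Deterministic init: each neuron gets a slightly different baseline
# weight so that winners are well defined from the first step.
W = np.outer(0.02 * (np.arange(n_neurons) + 1), np.ones(n_inputs))

def step(x, W, lr=0.5):
    """One presentation of stimulus x; returns updated weights and bAP mask."""
    y = W @ x                        # somatic activations
    bap = np.zeros(n_neurons)
    bap[np.argmax(y)] = 1.0          # inhibition blocks all other bAPs
    # Hebbian update gated by the local bAP signal. Pulling W toward x
    # keeps the weights bounded, giving stable representations during
    # ongoing learning rather than runaway growth.
    W = W + lr * bap[:, None] * (x[None, :] - W)
    return W, bap

# Two non-overlapping stimuli: repeated presentation recruits different
# neurons, i.e. responses decorrelate across the stimulus set.
x1 = np.r_[np.ones(5), np.zeros(5)]
x2 = np.r_[np.zeros(5), np.ones(5)]
for _ in range(20):
    W, _ = step(x1, W)
    W, _ = step(x2, W)

_, bap1 = step(x1, W)
_, bap2 = step(x2, W)
winner1, winner2 = int(np.argmax(bap1)), int(np.argmax(bap2))
```

After training, `winner1` and `winner2` are distinct neurons, showing the decorrelation property: each stimulus recruits its own part of the network while the gating signal remains local to the winning neuron's synapses.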

Original language: English (US)
Pages (from-to): 1-9
Number of pages: 9
Journal: Neural Networks
Volume: 13
Issue number: 1
DOIs
State: Published - Jan 2000

Keywords

  • Backpropagating action potential
  • Inhibition
  • Learning
  • Oscillation
  • Receptive field
  • Synchronization
  • Synaptic plasticity

ASJC Scopus subject areas

  • Cognitive Neuroscience
  • Artificial Intelligence

