Emergence of symmetric, modular, and reciprocal connections in recurrent networks with Hebbian learning

Sherwin E. Hua, James C. Houk, Ferdinando A. Mussa-Ivaldi*

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

6 Scopus citations

Abstract

While learning and development are well characterized in feedforward networks, these features are more difficult to analyze in recurrent networks due to the increased complexity of dual dynamics: the rapid dynamics arising from activation states and the slow dynamics arising from learning or developmental plasticity. We present analytical and numerical results that consider dual dynamics in a recurrent network undergoing Hebbian learning with either constant weight decay or weight normalization. Starting from initially random connections, the recurrent network develops symmetric or near-symmetric connections through Hebbian learning. Reciprocity and modularity arise naturally through correlations in the activation states. Additionally, weight normalization may be better than constant weight decay for the development of multiple attractor states that allow a diverse representation of the inputs. These results suggest a natural mechanism by which synaptic plasticity in recurrent networks such as cortical and brainstem premotor circuits could enhance neural computation and the generation of motor programs.
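The dual dynamics described above can be illustrated with a minimal numerical sketch: a small recurrent network whose activations relax quickly for each input (rapid dynamics) while the weights change slowly under a Hebbian rule with multiplicative normalization (slow dynamics). This is not the paper's actual model; the network size, tanh activation, learning rate, and relaxation loop are illustrative assumptions. Because each Hebbian update is an outer product of the activity with itself, the accumulated weight matrix drifts toward symmetry, which the sketch measures as the correlation between the weight matrix and its transpose.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 20        # number of units (illustrative)
eta = 0.05    # Hebbian learning rate (illustrative)
steps = 500   # slow-dynamics (learning) steps

# Random initial recurrent weights, no self-connections
W = rng.normal(0.0, 0.1, (N, N))
np.fill_diagonal(W, 0.0)

def symmetry_index(W):
    """Correlation between off-diagonal W and W.T; 1.0 = fully symmetric."""
    mask = ~np.eye(W.shape[0], dtype=bool)
    return np.corrcoef(W[mask], W.T[mask])[0, 1]

before = symmetry_index(W)

for _ in range(steps):
    # Rapid dynamics: relax activations toward a fixed point for a random input
    x = rng.normal(0.0, 1.0, N)
    r = np.tanh(x)
    for _ in range(20):
        r = np.tanh(W @ r + x)
    # Slow dynamics: Hebbian update (outer product of pre/post activity)
    W += eta * np.outer(r, r)
    np.fill_diagonal(W, 0.0)
    # Multiplicative normalization: rescale each row's incoming weights to unit norm
    W /= np.linalg.norm(W, axis=1, keepdims=True)

after = symmetry_index(W)
print(f"symmetry before: {before:.3f}, after: {after:.3f}")
```

Starting from a near-zero symmetry index (random weights), the index rises markedly after learning, consistent with the abstract's claim that near-symmetric connections emerge from initially random ones; the residual asymmetry here comes from the per-row normalization.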

Original language: English (US)
Pages (from-to): 211-225
Number of pages: 15
Journal: Biological Cybernetics
Volume: 81
Issue number: 3
State: Published - 1999

Keywords

  • Attractor dynamics
  • Hebbian learning rule
  • Multiplicative normalization
  • Self-organization
  • Stability
  • Symmetric connections

ASJC Scopus subject areas

  • General Computer Science
  • Biotechnology

