Spike-Based Bayesian-Hebbian Learning of Temporal Sequences

Philip J Tully, Henrik Lindén, Matthias H Hennig, Anders Lansner

Abstract

Many cognitive and motor functions are enabled by the temporal representation and processing of stimuli, but it remains an open question how neocortical microcircuits can reliably encode and replay such sequences of information. To better understand this, a modular attractor memory network is proposed in which meta-stable sequential attractor transitions are learned through changes to synaptic weights and intrinsic excitabilities via the spike-based Bayesian Confidence Propagation Neural Network (BCPNN) learning rule. We find that the formation of distributed memories, embodied by increased periods of firing in pools of excitatory neurons, together with asymmetrical associations between these distinct network states, can be acquired through plasticity. The model's feasibility is demonstrated using simulations of adaptive exponential integrate-and-fire (AdEx) model neurons. We show that the learning and speed of sequence replay depend on a confluence of biophysically relevant parameters including stimulus duration, level of background noise, ratio of synaptic currents, and strengths of short-term depression and adaptation. Moreover, sequence elements are shown to flexibly participate multiple times in the sequence, suggesting that spiking attractor networks of this type can support an efficient combinatorial code. The model provides a principled approach towards understanding how multiple interacting plasticity mechanisms can coordinate hetero-associative learning in unison.
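The abstract's central mechanism, the spike-based BCPNN rule, can be summarized compactly: spike trains are low-pass filtered into fast Z traces, then into slower E (eligibility) traces, and finally into slow P traces that estimate firing and co-firing probabilities, from which a synaptic weight w = log(P_ij / (P_i P_j)) and an intrinsic-excitability bias beta_j = log(P_j) are read out. The Python sketch below illustrates this trace cascade for a single synapse. It is a minimal illustration, not the paper's implementation: all parameter values, the f_max normalization constant, the epsilon regularizer, and the specific choice of differing pre/post trace time constants (one way to obtain the temporally asymmetric weights that drive sequential transitions) are assumptions made for this example.

import numpy as np

# Illustrative constants -- assumed for this sketch, not taken from the paper.
dt = 1e-3        # time step [s]
f_max = 50.0     # assumed maximal firing rate [Hz], normalizes traces toward [0, 1]
tau_zi = 5e-3    # presynaptic Z-trace time constant [s] (fast, AMPA-like)
tau_zj = 100e-3  # postsynaptic Z-trace time constant [s] (slow, NMDA-like)
tau_e = 0.1      # E (eligibility) trace time constant [s]
tau_p = 5.0      # P (probability) trace time constant [s]
eps = 1e-4       # regularizer keeping the logarithms finite at low activity

def bcpnn_synapse(pre, post):
    """Trace cascade for one synapse; pre/post are 0/1 spike arrays per dt bin.

    Returns time courses of the weight w = log(P_ij / (P_i * P_j)) and the
    postsynaptic bias beta = log(P_j)."""
    zi = zj = ei = ej = eij = pi = pj = pij = 0.0
    w = np.zeros(len(pre))
    beta = np.zeros(len(pre))
    for t in range(len(pre)):
        # Z traces: low-pass filtered, rate-normalized spike trains
        zi += dt * (pre[t] / (f_max * dt) - zi) / tau_zi
        zj += dt * (post[t] / (f_max * dt) - zj) / tau_zj
        # E traces: filtered Z traces; the joint trace filters the coincidence product
        ei += dt * (zi - ei) / tau_e
        ej += dt * (zj - ej) / tau_e
        eij += dt * (zi * zj - eij) / tau_e
        # P traces: slow estimates of firing and co-firing probabilities
        pi += dt * (ei - pi) / tau_p
        pj += dt * (ej - pj) / tau_p
        pij += dt * (eij - pij) / tau_p
        # Bayesian readout: synaptic weight and intrinsic-excitability bias
        w[t] = np.log((pij + eps ** 2) / ((pi + eps) * (pj + eps)))
        beta[t] = np.log(pj + eps)
    return w, beta

# Usage: a postsynaptic pool that fires shortly after a presynaptic pool
# acquires a positive weight, the kind of forward association that supports
# sequential attractor transitions in the model.
rng = np.random.default_rng(0)
steps = int(10.0 / dt)
pre = (rng.random(steps) < 20.0 * dt).astype(float)  # ~20 Hz Bernoulli/Poisson pre
post = np.roll(pre, 5)                               # post trails pre by 5 ms
w, beta = bcpnn_synapse(pre, post)
print(f"final w = {w[-1]:.3f}, final beta = {beta[-1]:.3f}")

In the full model the weight modulates the synaptic conductance while the bias term shifts the neuron's intrinsic excitability, which is how the rule trains both quantities named in the abstract.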

Original language: English
Article number: e1004954
Journal: PLoS Computational Biology
Volume: 12
Issue number: 5
Number of pages: 35
ISSN: 1553-7358
DOIs: 10.1371/journal.pcbi.1004954
Publication status: Published - May 2016
