Spike-Based Bayesian-Hebbian Learning of Temporal Sequences

Philip J Tully, Henrik Lindén, Matthias H Hennig, Anders Lansner

17 Citations (Scopus)
67 Downloads (Pure)

Abstract

Many cognitive and motor functions are enabled by the temporal representation and processing of stimuli, but it remains an open issue how neocortical microcircuits can reliably encode and replay such sequences of information. To better understand this, a modular attractor memory network is proposed in which meta-stable sequential attractor transitions are learned through changes to synaptic weights and intrinsic excitabilities via the spike-based Bayesian Confidence Propagation Neural Network (BCPNN) learning rule. We find that the formation of distributed memories, embodied by increased periods of firing in pools of excitatory neurons, together with asymmetrical associations between these distinct network states, can be acquired through plasticity. The model's feasibility is demonstrated using simulations of adaptive exponential integrate-and-fire (AdEx) model neurons. We show that the learning and speed of sequence replay depend on a confluence of biophysically relevant parameters including stimulus duration, level of background noise, ratio of synaptic currents, and strengths of short-term depression and adaptation. Moreover, sequence elements are shown to flexibly participate multiple times in the sequence, suggesting that spiking attractor networks of this type can support an efficient combinatorial code. The model provides a principled approach towards understanding how multiple interacting plasticity mechanisms can coordinate hetero-associative learning in unison.
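The central idea of BCPNN is that a synaptic weight estimates the log-odds of pre- and postsynaptic co-activation, computed from low-pass-filtered spike traces. The following is a minimal sketch of that idea, not the paper's implementation: the two-stage trace cascade, the parameter values, and the function name are my own simplifying assumptions, and the full model additionally includes intrinsic-excitability (bias) learning and a deeper chain of eligibility traces.

```python
import math

def run_bcpnn(pre, post, dt=1e-3, tau_z=5e-3, tau_p=0.5, eps=1e-3):
    """Toy two-trace BCPNN sketch (illustrative parameters, not the paper's).

    pre, post: lists of 0/1 spike indicators per time step.
    z traces low-pass filter the spike trains; slower p traces estimate
    firing probabilities and their coincidence; the weight is the
    log-odds of co-activation, log(p_ij / (p_i * p_j)).
    """
    z_i = z_j = 0.0
    p_i = p_j = eps          # small priors avoid log(0) / division by zero
    p_ij = eps * eps
    for s_i, s_j in zip(pre, post):
        # fast traces: exponential decay, each spike adds 1
        z_i += -z_i / tau_z * dt + s_i
        z_j += -z_j / tau_z * dt + s_j
        # slow probability traces: relax toward the z traces / their product
        p_i += (z_i - p_i) / tau_p * dt
        p_j += (z_j - p_j) / tau_p * dt
        p_ij += (z_i * z_j - p_ij) / tau_p * dt
    return math.log(p_ij / (p_i * p_j))

# Correlated pre/post spiking yields a positive weight,
# temporally decorrelated spiking a negative one:
T = 5000
pre_spikes = [1 if t % 100 == 0 else 0 for t in range(T)]
post_offset = [1 if t % 100 == 50 else 0 for t in range(T)]
w_corr = run_bcpnn(pre_spikes, pre_spikes)
w_uncorr = run_bcpnn(pre_spikes, post_offset)
```

Because the weight is a log-odds rather than a pure correlation, it naturally becomes negative for cells that fire at comparable rates but rarely together, which is what allows the rule to carve out the asymmetric between-attractor associations described in the abstract.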

Original language: English
Article number: e1004954
Journal: PLoS Computational Biology
Volume: 12
Issue number: 5
Number of pages: 35
ISSN: 1553-7358
DOI
Status: Published - May 2016

