1. Learning with filopodia and spines: Complementary strong and weak competition lead to specialized, graded, and protected receptive fields. PLoS Comput Biol 2024; 20:e1012110. [PMID: 38743789] [PMCID: PMC11125506] [DOI: 10.1371/journal.pcbi.1012110]
Abstract
Filopodia are thin synaptic protrusions that have long been known to play an important role in early development. Recently, they have been found to be more abundant in the adult cortex than previously thought, and more plastic than spines (button-shaped mature synapses). Inspired by these findings, we introduce a new model of synaptic plasticity that jointly describes learning in filopodia and spines. The model assumes that filopodia exhibit strongly competitive learning dynamics, similar to additive spike-timing-dependent plasticity (STDP). At the same time, it proposes that filopodia consolidate into spines once they undergo sufficient potentiation. Spines follow weakly competitive learning, classically associated with multiplicative, soft-bounded models of STDP, which makes them more stable and sensitive to the fine structure of input correlations. We show that our learning rule achieves a selectivity comparable to additive STDP and captures input correlations as well as multiplicative models of STDP do. We also show how it can protect previously formed memories and perform synaptic consolidation. Overall, our results can be seen as a phenomenological description of how filopodia and spines could cooperate to overcome the individual difficulties faced by strong and weak competition mechanisms.
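The contrast the abstract draws, between strongly competitive (additive) and weakly competitive (multiplicative, soft-bounded) plasticity, can be sketched with a generic pair-based STDP update. This is not the authors' filopodia-spine rule, only an illustration of the two competition regimes it combines; all parameter values (a_plus, a_minus, tau, w_max) are arbitrary choices.

```python
import numpy as np

def stdp_update(w, dt, a_plus=0.01, a_minus=0.012, tau=20.0,
                w_max=1.0, mode="additive"):
    """Pair-based STDP weight change for one pre/post spike pair.

    dt = t_post - t_pre (ms). 'additive' updates are weight-independent
    (strong competition, hard bounds); 'multiplicative' updates scale
    potentiation by the remaining headroom (w_max - w) and depression
    by the current weight (weak competition, soft bounds)."""
    if dt > 0:   # pre before post -> potentiation
        dw = a_plus * np.exp(-dt / tau)
        if mode == "multiplicative":
            dw *= (w_max - w)
    else:        # post before pre -> depression
        dw = -a_minus * np.exp(dt / tau)
        if mode == "multiplicative":
            dw *= w
    return float(np.clip(w + dw, 0.0, w_max))
```

Under the multiplicative variant, a near-saturated synapse gains much less from the same spike pairing than a weak one, which is what stabilizes spines relative to additively competing filopodia.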
2. Stability against fluctuations: a two-dimensional study of scaling, bifurcations and spontaneous symmetry breaking in stochastic models of synaptic plasticity. Biol Cybern 2024; 118:39-81. [PMID: 38583095] [DOI: 10.1007/s00422-024-00985-0]
Abstract
Stochastic models of synaptic plasticity must confront the corrosive influence of fluctuations in synaptic strength on patterns of synaptic connectivity. To solve this problem, we have proposed that synapses act as filters, integrating plasticity induction signals and expressing changes in synaptic strength only upon reaching filter threshold. Our earlier analytical study calculated the lifetimes of quasi-stable patterns of synaptic connectivity with synaptic filtering. We showed that the plasticity step size in a stochastic model of spike-timing-dependent plasticity (STDP) acts as a temperature-like parameter, exhibiting a critical value below which neuronal structure formation occurs. The filter threshold scales this temperature-like parameter downwards, cooling the dynamics and enhancing stability. A key step in this calculation was a resetting approximation, essentially reducing the dynamics to one-dimensional processes. Here, we revisit our earlier study to examine this resetting approximation, with the aim of understanding in detail why it works so well by comparing it, and a simpler approximation, to the system's full dynamics consisting of various embedded two-dimensional processes without resetting. Comparing the full system to the simpler approximation, to our original resetting approximation, and to a one-afferent system, we show that their equilibrium distributions of synaptic strengths and critical plasticity step sizes are all qualitatively similar, and increasingly quantitatively similar as the filter threshold increases. This increasing similarity is due to the decorrelation in changes in synaptic strength between different afferents caused by our STDP model, and the amplification of this decorrelation with larger synaptic filters.
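The core mechanism described here, a synapse that integrates induction signals and expresses a strength change only at filter threshold, can be sketched in a few lines. The threshold, step size, and ±1 induction signals are illustrative simplifications of the authors' model, not its actual parameters.

```python
def filtered_synapse(inductions, threshold=5, step=0.05, w0=0.5):
    """Accumulate +1/-1 plasticity induction signals in a filter and
    express a weight step only when the filter reaches +/-threshold,
    resetting the filter afterwards (the 'resetting' dynamics the
    abstract refers to)."""
    w, f = w0, 0
    for s in inductions:
        f += s
        if f >= threshold:
            w += step   # express potentiation
            f = 0
        elif f <= -threshold:
            w -= step   # express depression
            f = 0
    return w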
3. A dynamic attractor network model of memory formation, reinforcement and forgetting. PLoS Comput Biol 2023; 19:e1011727. [PMID: 38117859] [PMCID: PMC10766193] [DOI: 10.1371/journal.pcbi.1011727]
Abstract
Empirical evidence shows that memories that are frequently revisited are easy to recall, and that familiar items involve larger hippocampal representations than less familiar ones. In line with these observations, here we develop a modelling approach to provide a mechanistic understanding of how hippocampal neural assemblies evolve differently depending on the frequency of presentation of the stimuli. For this, we added an online Hebbian learning rule, background firing activity, neural adaptation and heterosynaptic plasticity to a rate attractor network model, thus creating dynamic memory representations that can persist, increase or fade according to the frequency of presentation of the corresponding memory patterns. Specifically, we show that a dynamic interplay between Hebbian learning and background firing activity can explain the relationship between memory assembly sizes and their frequency of stimulation. Frequently stimulated assemblies increase their size independently of each other (i.e. they form orthogonal representations that do not share neurons, thus avoiding interference). Importantly, connections between neurons of assemblies that receive no further stimulation become labile, so that these neurons can be recruited by other assemblies, providing a neuronal mechanism of forgetting.
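A minimal sketch of the kind of update that combines online Hebbian strengthening with heterosynaptic weakening, two of the ingredients listed above. The outer-product form and the decay constant are generic textbook choices, not the paper's actual rule.

```python
import numpy as np

def evolve_assembly(w, pre, post, eta=0.1, decay=0.02, w_max=1.0):
    """One step of an online Hebbian rule with heterosynaptic decay:
    co-active pre/post pairs strengthen their connection, while
    synapses onto an active postsynaptic neuron from inactive inputs
    are weakly depressed, freeing those neurons for other assemblies."""
    hebb = eta * np.outer(post, pre)          # Hebbian strengthening
    hetero = decay * np.outer(post, 1 - pre)  # heterosynaptic depression
    return np.clip(w + hebb - hetero, 0.0, w_max)
```

Repeated stimulation of one pattern grows its internal weights while eroding its neurons' connections to silent inputs, which is the labile-connection mechanism of forgetting the abstract describes.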
4. Robust encoding of natural stimuli by neuronal response sequences in monkey visual cortex. Nat Commun 2023; 14:3021. [PMID: 37231014] [DOI: 10.1038/s41467-023-38587-2]
Abstract
Parallel multisite recordings in the visual cortex of trained monkeys revealed that the responses of spatially distributed neurons to natural scenes are ordered in sequences. The rank order of these sequences is stimulus-specific and maintained even if the absolute timing of the responses is modified by manipulating stimulus parameters. The stimulus specificity of these sequences was highest when they were evoked by natural stimuli and deteriorated for stimulus versions in which certain statistical regularities were removed. This suggests that the response sequences result from a matching operation between sensory evidence and priors stored in the cortical network. Decoders trained on sequence order performed as well as decoders trained on rate vectors, but the former could decode stimulus identity from considerably shorter response intervals than the latter. A simulated recurrent network reproduced similarly structured stimulus-specific response sequences, particularly once it had been familiarized with the stimuli through unsupervised Hebbian learning. We propose that recurrent processing transforms signals from stationary visual scenes into sequential responses whose rank order is the result of a Bayesian matching operation. If this temporal code were used by the visual system it would allow for ultrafast processing of visual scenes.
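The key property, that rank order survives a rescaling of absolute response timing, is easy to demonstrate with a rank-based similarity measure. This is a generic Spearman-style comparison of latency vectors, offered as an illustration of the decoding idea rather than the authors' decoder.

```python
import numpy as np

def rank_order(latencies):
    """Rank of each neuron's response latency (0 = earliest)."""
    return np.argsort(np.argsort(latencies))

def rank_similarity(lat_a, lat_b):
    """Spearman-style correlation between two response sequences,
    computed from their rank vectors. Any monotone rescaling of the
    absolute timings leaves the ranks, and hence the similarity,
    unchanged, as the abstract describes."""
    ra = rank_order(lat_a) - (len(lat_a) - 1) / 2.0
    rb = rank_order(lat_b) - (len(lat_b) - 1) / 2.0
    return float(ra @ rb / np.sqrt((ra @ ra) * (rb @ rb)))
```

Stretching and shifting all latencies (e.g. by changing stimulus contrast) yields similarity 1.0 with the original sequence, so a rank-order decoder is insensitive to such manipulations by construction.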
5. Meta-SpikePropamine: learning to learn with synaptic plasticity in spiking neural networks. Front Neurosci 2023; 17:1183321. [PMID: 37250397] [PMCID: PMC10213417] [DOI: 10.3389/fnins.2023.1183321]
Abstract
We propose that to harness our understanding of neuroscience for machine learning, we must first have powerful tools for training brain-like models of learning. Although substantial progress has been made toward understanding the dynamics of learning in the brain, neuroscience-derived models of learning have yet to demonstrate the same performance capabilities as deep learning methods such as gradient descent. Inspired by the successes of machine learning using gradient descent, we introduce a bi-level optimization framework that seeks both to solve online learning tasks and to improve the ability to learn online using models of plasticity from neuroscience. We demonstrate that models of three-factor learning with synaptic plasticity taken from the neuroscience literature can be trained in spiking neural networks (SNNs) with gradient descent via a learning-to-learn framework to address challenging online learning problems. This framework opens a new path toward developing neuroscience-inspired online learning algorithms.
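The inner-loop building block here, a three-factor plasticity rule, has a standard generic form: a Hebbian coincidence is written into a decaying eligibility trace, and the weight only changes when a third (e.g. neuromodulatory) signal arrives. The sketch below shows that generic form with illustrative constants; it is not the paper's specific parameterization.

```python
import numpy as np

def three_factor_step(w, e, pre, post, m, eta=0.05, tau_e=0.9):
    """One step of a generic three-factor rule.

    e     -- eligibility trace (same shape as w)
    m     -- scalar third factor gating the update (0 = no learning)
    tau_e -- per-step retention of the eligibility trace"""
    e = tau_e * e + np.outer(post, pre)  # Hebbian term enters the trace
    w = w + eta * m * e                  # expressed only when m != 0
    return w, e
```

Because the trace decays, a delayed third factor credits recent coincidences more than old ones; meta-learning frameworks like the one above then tune quantities such as eta and tau_e in an outer loop.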
6. STDP-based associative memory formation and retrieval. J Math Biol 2023; 86:49. [PMID: 36826758] [DOI: 10.1007/s00285-023-01883-y]
Abstract
Spike-timing-dependent plasticity (STDP) is a biological process in which the precise order and timing of neuronal spikes affect the degree of synaptic modification. While numerous studies have focused on the role of STDP in neural coding, the functional implications of STDP at the macroscopic level in the brain have not yet been fully explored. In this work, we propose a neurodynamical model based on STDP that supports storage and retrieval of a group of associative memories. We show that the function of STDP at the macroscopic level is to form a "memory plane" in the neural state space which dynamically encodes high-dimensional data. We derive the analytic relation between the input, the memory plane, and the induced macroscopic neural oscillations around the memory plane. Such a plane produces a limit cycle in reaction to a similar memory cue, which can be used for retrieval of the original input.
7. Binary and analog variation of synapses between cortical pyramidal neurons. eLife 2022; 11:e76120. [PMID: 36382887] [PMCID: PMC9704804] [DOI: 10.7554/elife.76120]
Abstract
Learning from experience depends at least in part on changes in neuronal connections. We present the largest map of connectivity to date between cortical neurons of a defined type (layer 2/3 [L2/3] pyramidal cells in mouse primary visual cortex), which was enabled by automated analysis of serial section electron microscopy images with improved handling of image defects (250 × 140 × 90 μm3 volume). We used the map to identify constraints on the learning algorithms employed by the cortex. Previous cortical studies modeled a continuum of synapse sizes by a log-normal distribution. A continuum is consistent with most neural network models of learning, in which synaptic strength is a continuously graded analog variable. Here, we show that synapse size, when restricted to synapses between L2/3 pyramidal cells, is well modeled by the sum of a binary variable and an analog variable drawn from a log-normal distribution. Two synapses sharing the same presynaptic and postsynaptic cells are known to be correlated in size. We show that the binary variables of the two synapses are highly correlated, while the analog variables are not. Binary variation could be the outcome of a Hebbian or other synaptic plasticity rule depending on activity signals that are relatively uniform across neuronal arbors, while analog variation may be dominated by other influences such as spontaneous dynamical fluctuations. We discuss the implications for the longstanding hypothesis that activity-dependent plasticity switches synapses between bistable states.
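The proposed size model, a shared binary variable plus an independent log-normal analog part for each synapse of a same-pair connection, can be sketched as a generative sampler. The mixture weight, jump size delta, and log-normal parameters below are illustrative placeholders, not the fitted values from the paper.

```python
import numpy as np

def sample_synapse_pairs(n, p_large=0.5, delta=0.4,
                         mu=-1.0, sigma=0.5, seed=0):
    """Draw n pairs of synapse sizes, each pair sharing pre- and
    postsynaptic cells: the pair shares one binary state (correlated
    component) while each synapse draws its own log-normal analog
    component (uncorrelated component), as the abstract proposes."""
    rng = np.random.default_rng(seed)
    state = rng.random(n) < p_large        # shared bistable state
    s1 = state * delta + rng.lognormal(mu, sigma, n)
    s2 = state * delta + rng.lognormal(mu, sigma, n)
    return s1, s2, state
```

Under this model the size correlation between same-pair synapses comes entirely from the shared binary state; setting delta to 0 removes it, which is one way to see why the binary/analog decomposition explains the observed correlation structure.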
8. Synaptic reshaping of plastic neuronal networks by periodic multichannel stimulation with single-pulse and burst stimuli. PLoS Comput Biol 2022; 18:e1010568. [PMID: 36327232] [PMCID: PMC9632832] [DOI: 10.1371/journal.pcbi.1010568]
Abstract
Synaptic dysfunction is associated with several brain disorders, including Alzheimer's disease, Parkinson's disease (PD) and obsessive-compulsive disorder (OCD). Utilizing synaptic plasticity, brain stimulation is capable of reshaping synaptic connectivity. This may pave the way for novel therapies that specifically counteract pathological synaptic connectivity. For instance, in PD, novel multichannel coordinated reset stimulation (CRS) was designed to counteract neuronal synchrony and down-regulate pathological synaptic connectivity. CRS was shown to entail long-lasting therapeutic aftereffects in PD patients and related animal models. This is in marked contrast to conventional deep brain stimulation (DBS) therapy, where PD symptoms return shortly after stimulation ceases. In the present paper, we study synaptic reshaping by periodic multichannel stimulation (PMCS) in networks of leaky integrate-and-fire (LIF) neurons with spike-timing-dependent plasticity (STDP). During PMCS, phase-shifted periodic stimulus trains are delivered to segregated neuronal subpopulations. Harnessing STDP, PMCS leads to changes of the synaptic network structure. We found that the PMCS-induced changes of the network structure depend on both the phase lags between stimuli and the shape of individual stimuli. Single-pulse stimuli and burst stimuli with low intraburst frequency down-regulate synapses between neurons receiving stimuli simultaneously. In contrast, burst stimuli with high intraburst frequency up-regulate these synapses. We derive theoretical approximations of the stimulation-induced network structure. This enables us to formulate stimulation strategies for inducing a variety of network structures. Our results provide testable hypotheses for future pre-clinical and clinical studies and suggest that periodic multichannel stimulation may be suitable for reshaping plastic neuronal networks to counteract pathological synaptic connectivity. Furthermore, we provide novel insight into how the stimulus type may affect the long-lasting outcome of conventional DBS. This may strongly impact parameter adjustment procedures for clinical DBS, which have so far focused primarily on acute effects of stimulation.
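The stimulation protocol itself, identical periodic trains delivered to each subpopulation with a fixed phase lag between channels, is simple to generate. The period, duration, and equal spacing below are illustrative choices for the timing geometry only; stimulus shape (single pulse vs. burst), which the paper shows is decisive, is not modelled here.

```python
import numpy as np

def pmcs_schedule(n_channels=4, period=100.0, t_end=1000.0):
    """Stimulus onset times (ms) for periodic multichannel stimulation:
    every channel receives the same periodic train, phase-shifted by
    period / n_channels relative to its neighbour."""
    base = np.arange(0.0, t_end, period)
    return [base + k * period / n_channels for k in range(n_channels)]
```

With 4 channels and a 100 ms period, channel k fires at 25k, 25k + 100, 25k + 200, ... ms; the inter-channel lag is what determines which pre/post spike orderings STDP sees between subpopulations.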
9. Formation and computational implications of assemblies in neural circuits. J Physiol 2022. [PMID: 36068723] [DOI: 10.1113/jp282750]
Abstract
In the brain, patterns of neural activity represent sensory information and store it in non-random synaptic connectivity. A prominent theoretical hypothesis states that assemblies, groups of neurons that are strongly connected to each other, are the key computational units underlying perception and memory formation. Compatible with these hypothesised assemblies, experiments have revealed groups of neurons that display synchronous activity, either spontaneously or upon stimulus presentation, and exhibit behavioural relevance. While it remains unclear how assemblies form in the brain, theoretical work has contributed greatly to the understanding of the various interacting mechanisms in this process. Here, we review the recent theoretical literature on assembly formation by categorising the involved mechanisms into four components: synaptic plasticity, symmetry breaking, competition and stability. We highlight different approaches and assumptions behind assembly formation and discuss recent ideas of assemblies as the key computational unit in the brain.
Abstract figure legend: Assemblies are groups of strongly connected neurons formed by the interaction of multiple mechanisms and with vast computational implications. Four interacting components are thought to drive assembly formation: synaptic plasticity, symmetry breaking, competition and stability.
10. Multiplicative Shot-Noise: A New Route to Stability of Plastic Networks. Phys Rev Lett 2022; 129:068101. [PMID: 36018633] [DOI: 10.1103/physrevlett.129.068101]
Abstract
Fluctuations of synaptic weights, among many other physical, biological, and ecological quantities, are driven by coincident events of two "parent" processes. We propose a multiplicative shot-noise model that can capture the behaviors of a broad range of such natural phenomena, and analytically derive an approximation that accurately predicts its statistics. We apply our results to study the effects of a multiplicative synaptic plasticity rule that was recently extracted from measurements in physiological conditions. Using mean-field theory analysis and network simulations, we investigate how this rule shapes the connectivity and dynamics of recurrent spiking neural networks. The multiplicative plasticity rule is shown to support efficient learning of input stimuli, and it gives a stable, unimodal synaptic-weight distribution with a large fraction of strong synapses. The strong synapses remain stable over long times but do not "run away." Our results suggest that the multiplicative shot-noise offers a new route to understand the tradeoff between flexibility and stability in neural circuits and other dynamic networks.
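A toy discrete-time caricature of the process described above: a weight that decays slowly between events and takes multiplicative jumps at random coincidence times of the two parent processes. The event probability, gain, and leak are arbitrary illustration values, and this sketch ignores the paper's analytical treatment entirely.

```python
import numpy as np

def multiplicative_shot_noise(n_steps=10000, p_event=0.05, gain=0.1,
                              leak=0.005, w0=1.0, seed=1):
    """Simulate a weight driven by multiplicative shot-noise: at each
    step it decays by a small leak, and with probability p_event (a
    coincidence of the two parent processes) it jumps by gain * w,
    i.e. an amount proportional to its current value."""
    rng = np.random.default_rng(seed)
    w = w0
    traj = np.empty(n_steps)
    for i in range(n_steps):
        w -= leak * w                  # relaxation between events
        if rng.random() < p_event:     # coincidence event
            w += gain * w              # multiplicative jump
        traj[i] = w
    return traj
```

Because every jump is proportional to the current weight, the weight can never cross zero and strong weights fluctuate in proportion to their size, a qualitative signature of the multiplicative regime.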
11. Weight dependence in BCM leads to adjustable synaptic competition. J Comput Neurosci 2022; 50:431-444. [PMID: 35764852] [PMCID: PMC9666303] [DOI: 10.1007/s10827-022-00824-w]
Abstract
Models of synaptic plasticity have been used to better understand neural development as well as learning and memory. One prominent classic model is the Bienenstock-Cooper-Munro (BCM) model, which has been particularly successful in explaining plasticity of the visual cortex. Here, in an effort to include more biophysical detail in the BCM model, we incorporate 1) feedforward inhibition, and 2) the experimental observation that large synapses are relatively harder to potentiate than weak ones, while synaptic depression is proportional to synaptic strength. These modifications change the outcome of unsupervised plasticity under the BCM model. The amount of feedforward inhibition adds a parameter to BCM that turns out to determine the strength of competition. In the limit of strong inhibition the learning outcome is identical to standard BCM and the neuron becomes selective to one stimulus only (winner-take-all). For smaller values of inhibition, competition is weaker and the receptive fields are less selective. However, both BCM variants can yield realistic receptive fields.
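The two modifications can be sketched on top of the textbook BCM update. The specific functional forms below (subtractive feedforward inhibition, headroom-scaled potentiation, weight-proportional depression, exponentially sliding threshold) are my illustrative stand-ins for the paper's equations, with arbitrary constants.

```python
import numpy as np

def bcm_step(w, x, theta, eta=0.01, w_max=2.0, inh=1.0):
    """One step of a weight-dependent BCM variant.

    inh scales feedforward inhibition (strong inh -> strong
    competition). Potentiation is scaled by (1 - w/w_max), so large
    weights are harder to potentiate; depression is scaled by
    w/w_max, so it is proportional to synaptic strength."""
    y = max(w @ x - inh * x.mean(), 0.0)      # rate after inhibition
    phi = y * (y - theta)                     # BCM nonlinearity
    if phi > 0:
        dw = eta * phi * x * (1 - w / w_max)  # soft-bounded LTP
    else:
        dw = eta * phi * x * (w / w_max)      # weight-proportional LTD
    theta = 0.9 * theta + 0.1 * y ** 2        # sliding threshold
    return np.clip(w + dw, 0.0, w_max), theta
```

Raising inh drives y toward zero for all but the best-matching stimulus, recovering winner-take-all selectivity; lowering it lets several stimuli stay above threshold, giving the graded receptive fields the abstract describes.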
12. Spontaneous dynamics of synaptic weights in stochastic models with pair-based spike-timing-dependent plasticity. Phys Rev E 2022; 105:054405. [PMID: 35706237] [DOI: 10.1103/physreve.105.054405]
Abstract
We investigate spike-timing-dependent plasticity (STDP) in the case of a synapse connecting two neuronal cells. We develop a theoretical analysis of several STDP rules using Markovian theory. In this context there are two different timescales: fast neuronal activity and slower synaptic weight updates. Exploiting this timescale separation, we derive the long-time limits of a single synaptic weight subject to STDP. We show that the pairing model of presynaptic and postsynaptic spikes controls the synaptic weight dynamics for small external input on an excitatory synapse. This result implies in particular that a mean-field analysis of plasticity may miss some important properties of STDP. Anti-Hebbian STDP favors the emergence of a stable synaptic weight. In the case of an inhibitory synapse the pairing schemes matter less, and we observe convergence of the synaptic weight to a nonnull value only for Hebbian STDP. We extensively study different asymptotic regimes for STDP rules, raising interesting questions for future work on adaptive neuronal networks and, more generally, on adaptive systems.
13. Long-Lasting Desynchronization of Plastic Neuronal Networks by Double-Random Coordinated Reset Stimulation. Front Netw Physiol 2022; 2:864859. [PMID: 36926109] [PMCID: PMC10013062] [DOI: 10.3389/fnetp.2022.864859]
Abstract
Hypersynchrony of neuronal activity is associated with several neurological disorders, including essential tremor and Parkinson's disease (PD). Chronic high-frequency deep brain stimulation (HF DBS) is the standard of care for medically refractory PD. Symptoms may effectively be suppressed by HF DBS, but return shortly after cessation of stimulation. Coordinated reset (CR) stimulation is a theory-based stimulation technique that was designed to specifically counteract neuronal synchrony by desynchronization. During CR, phase-shifted stimuli are delivered to multiple neuronal subpopulations. Computational studies on CR stimulation of plastic neuronal networks revealed long-lasting desynchronization effects obtained by down-regulating abnormal synaptic connectivity. This way, networks are moved into attractors of stable desynchronized states such that stimulation-induced desynchronization persists after cessation of stimulation. Preclinical and clinical studies confirmed corresponding long-lasting therapeutic and desynchronizing effects in PD. As PD symptoms are associated with different pathological synchronous rhythms, stimulation-induced long-lasting desynchronization effects should favorably be robust to variations of the stimulation frequency. Recent computational studies suggested that this robustness can be improved by randomizing the timings of stimulus deliveries. Here, performing computer simulations and analytical calculations in networks of leaky integrate-and-fire (LIF) neurons with spike-timing-dependent plasticity, we study the long-lasting desynchronization effects of CR with randomization of stimulus amplitudes alone, randomization of stimulus times alone, and the combination of both. Varying the CR stimulation frequency (with respect to the frequency of the abnormal target rhythm) and the number of separately stimulated neuronal subpopulations, we reveal parameter regions, and the related mechanisms, where the two qualitatively different randomization mechanisms improve the robustness of the long-lasting desynchronization effects of CR. In particular, for clinically relevant parameter ranges, double-random CR stimulation, i.e., CR stimulation with the specific combination of stimulus amplitude randomization and stimulus time randomization, may outperform regular CR stimulation with respect to long-lasting desynchronization. In addition, our results provide the first evidence that an effective reduction of the overall stimulation current by stimulus amplitude randomization may improve the frequency robustness of the long-lasting therapeutic effects of brain stimulation.
14. Characterization of Generalizability of Spike Timing Dependent Plasticity Trained Spiking Neural Networks. Front Neurosci 2021; 15:695357. [PMID: 34776837] [PMCID: PMC8589121] [DOI: 10.3389/fnins.2021.695357]
Abstract
A Spiking Neural Network (SNN) is trained with Spike Timing Dependent Plasticity (STDP), which is a neuro-inspired unsupervised learning method for various machine learning applications. This paper studies the generalizability properties of the STDP learning processes using the Hausdorff dimension of the trajectories of the learning algorithm. The paper analyzes the effects of STDP learning models and associated hyper-parameters on the generalizability properties of an SNN. The analysis is used to develop a Bayesian optimization approach to optimize the hyper-parameters for an STDP model for improving the generalizability properties of an SNN.
15. Astrocyte GluN2C NMDA receptors control basal synaptic strengths of hippocampal CA1 pyramidal neurons in the stratum radiatum. eLife 2021; 10:e70818. [PMID: 34693906] [PMCID: PMC8594917] [DOI: 10.7554/elife.70818]
Abstract
Experience-dependent plasticity is a key feature of brain synapses for which neuronal N-methyl-D-aspartate receptors (NMDARs) play a major role, from developmental circuit refinement to learning and memory. Astrocytes also express NMDARs, although their exact function has remained controversial. Here, we identify in the mouse hippocampus a circuit function for GluN2C NMDARs, a subtype highly expressed in astrocytes, in the layer-specific tuning of synaptic strengths in CA1 pyramidal neurons. Interfering with astrocyte NMDAR or GluN2C NMDAR activity reduces the range of the presynaptic strength distribution specifically in the stratum radiatum inputs, without an appreciable change in the mean presynaptic strength. Mathematical modeling shows that narrowing the width of the presynaptic release probability distribution compromises the expression of long-term synaptic plasticity. Our findings suggest a novel feedback signaling system that uses astrocyte GluN2C NMDARs to adjust the basal synaptic weight distribution of Schaffer collateral inputs, which in turn impacts computations performed by the CA1 pyramidal neuron.
16. Robust rhythmogenesis via spike-timing-dependent plasticity. Phys Rev E 2021; 104:024413. [PMID: 34525545] [DOI: 10.1103/physreve.104.024413]
Abstract
Rhythmic activity has been observed in numerous animal species ranging from insects to humans, and in relation to a wide range of cognitive tasks. Various experimental and theoretical studies have investigated rhythmic activity. The theoretical efforts have mainly been focused on the neuronal dynamics, under the assumption that network connectivity satisfies certain fine-tuning conditions required to generate oscillations. However, it remains unclear how this fine-tuning is achieved. Here we investigated the hypothesis that spike-timing-dependent plasticity (STDP) can provide the underlying mechanism for tuning synaptic connectivity to generate rhythmic activity. We addressed this question in a modeling study. We examined STDP dynamics in the framework of a network of excitatory and inhibitory neuronal populations that has been suggested to underlie the generation of oscillations in the gamma range. Mean-field Fokker-Planck equations for the synaptic weight dynamics are derived in the limit of slow learning. We drew on this approximation to determine which types of STDP rules drive the system to exhibit rhythmic activity, and we demonstrate how the parameters that characterize the plasticity rule govern the rhythmic activity. Finally, we propose a mechanism that can ensure the robustness of self-developing processes in general, and for rhythmogenesis in particular.
17. Stochastic binary synapses having sigmoidal cumulative distribution functions for unsupervised learning with spike timing-dependent plasticity. Sci Rep 2021; 11:18282. [PMID: 34521895] [PMCID: PMC8440757] [DOI: 10.1038/s41598-021-97583-y]
Abstract
Spike timing-dependent plasticity (STDP), which is widely studied as a fundamental synaptic update rule for neuromorphic hardware, requires precise control of continuous weights. From the viewpoint of hardware implementation, a simplified update rule is desirable. Although simplified STDP with stochastic binary synapses was proposed previously, we find that it leads to degradation of memory maintenance during learning, which is unfavourable for unsupervised online learning. In this work, we propose a stochastic binary synaptic model where the cumulative probability of the weight change evolves in a sigmoidal fashion with potentiation or depression trials, which can be implemented using a pair of switching devices consisting of serially connected multiple binary memristors. As a benchmark test we perform simulations of unsupervised learning of MNIST images with a two-layer network and show that simplified STDP in combination with this model can outperform conventional rules with continuous weights not only in memory maintenance but also in recognition accuracy. Our method achieves 97.3% in recognition accuracy, which is higher than that reported with standard STDP in the same framework. We also show that the high performance of our learning rule is robust against device-to-device variability of the memristor's probabilistic behaviour.
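One way to realize a sigmoidal cumulative switching probability with serially connected binary elements, as the abstract proposes, is to require M internal binary devices to all switch before the synaptic weight flips. The class below is a behavioral sketch of that idea; m, p, and the symmetric depression path are illustrative assumptions, not the paper's device parameters.

```python
import random

class StochasticBinarySynapse:
    """Binary synapse built from m serially connected binary elements,
    each switching with probability p per potentiation trial. The
    expressed weight is 1 only once all m have switched, so the
    cumulative probability of a weight flip grows sigmoidally with
    the number of trials instead of jumping on the first one."""

    def __init__(self, m=5, p=0.3, seed=None):
        self.m, self.p = m, p
        self.switched = 0                  # internal elements switched
        self.rng = random.Random(seed)

    @property
    def weight(self):
        return 1 if self.switched == self.m else 0

    def potentiate(self):
        if self.switched < self.m and self.rng.random() < self.p:
            self.switched += 1
        return self.weight

    def depress(self):
        if self.switched > 0 and self.rng.random() < self.p:
            self.switched -= 1
        return self.weight
```

The hidden counter gives the synapse memory of past trials even while its expressed weight is unchanged, which is what improves memory maintenance over a memoryless stochastic binary synapse.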
18. Multistability in a star network of Kuramoto-type oscillators with synaptic plasticity. Sci Rep 2021; 11:9840. [PMID: 33972613] [PMCID: PMC8110549] [DOI: 10.1038/s41598-021-89198-0]
Abstract
We analyze multistability in a star-type network of phase oscillators with coupling weights governed by phase-difference-dependent plasticity. It is shown that a network with N leaves can evolve into 2^N various asymptotic states, characterized by different values of the coupling strength between the hub and the leaves. Starting from the simple case of two coupled oscillators, we develop an analytical approach based on two small parameters ε and μ, where ε is the ratio of the time scales of the phase variables and synaptic weights, and μ defines the sharpness of the plasticity boundary function. The limit μ → 0 corresponds to a hard boundary. The analytical results obtained on the model of two oscillators are generalized for multi-leaf star networks. Multistability with 2^N various asymptotic states is numerically demonstrated for one-, two-, three- and nine-leaf star-type networks.
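As a rough illustration of how hub-leaf weights in such a network can settle into distinct values, the sketch below Euler-integrates a small star of phase oscillators whose slow weights relax towards a function of the phase difference. The plasticity rule dw/dt = ε(cos Δφ − w) and all parameter values are illustrative assumptions, not the paper's exact model:

```python
import math

def simulate_star(n_leaves=3, eps=0.05, dt=0.01, steps=20000,
                  omega_hub=1.0, omega_leaf=1.2, w0=0.5):
    """Euler-integrate a hub coupled to n_leaves phase oscillators; each
    hub-leaf weight follows dw/dt = eps * (cos(dphi) - w), a simple
    phase-difference-dependent plasticity rule."""
    theta_hub = 0.0
    theta = [0.5 * i for i in range(n_leaves)]   # staggered initial phases
    w = [w0] * n_leaves
    for _ in range(steps):
        dphi = [theta[i] - theta_hub for i in range(n_leaves)]
        dth_hub = omega_hub + sum(w[i] * math.sin(dphi[i])
                                  for i in range(n_leaves))
        dth = [omega_leaf - w[i] * math.sin(dphi[i])
               for i in range(n_leaves)]
        dw = [eps * (math.cos(dphi[i]) - w[i]) for i in range(n_leaves)]
        theta_hub += dt * dth_hub
        theta = [t + dt * d for t, d in zip(theta, dth)]
        w = [x + dt * d for x, d in zip(w, dw)]
    return w
```

Because the weights relax towards cos Δφ ∈ [−1, 1], they remain bounded; which of the coexisting states is reached depends on the initial phases and weights, which is the essence of the 2^N multistability.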
Collapse
|
19
|
Non-linear Memristive Synaptic Dynamics for Efficient Unsupervised Learning in Spiking Neural Networks. Front Neurosci 2021; 15:580909. [PMID: 33633531 PMCID: PMC7901913 DOI: 10.3389/fnins.2021.580909] [Citation(s) in RCA: 11] [Impact Index Per Article: 3.7] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/07/2020] [Accepted: 01/06/2021] [Indexed: 11/13/2022] Open
Abstract
Spiking neural networks (SNNs) are a computational tool in which information is coded into spikes, as in some parts of the brain, unlike conventional neural networks (NNs), which compute over real numbers. SNNs can therefore implement intelligent information extraction in real time at the edge of data acquisition, complementing conventional NNs, which are geared towards cloud computing. Both NN classes face hardware constraints due to limited computing parallelism and the separation of logic and memory. Emerging memory devices, like resistive switching memories, phase change memories, or memristive devices in general, are strong candidates to remove these hurdles for NN applications. The well-established training procedures of conventional NNs helped in defining the desiderata for memristive device dynamics implementing synaptic units. The generally agreed requirements are a linear evolution of memristive conductance upon stimulation with trains of identical pulses and a symmetric conductance change for conductance increase and decrease. Conversely, little work has been done to understand the main properties of memristive devices supporting efficient SNN operation. The reason lies in the lack of a background theory for their training. As a consequence, the requirements for conventional NNs have been taken as a reference to develop memristive devices for SNNs. In the present work, we show that, for efficient CMOS/memristive SNNs, the requirements for synaptic memristive dynamics are very different from the needs of a conventional NN. System-level simulations of an SNN trained to classify hand-written digit images through a spike-timing-dependent plasticity protocol are performed considering various linear and non-linear plausible synaptic memristive dynamics. We consider memristive dynamics bounded by artificial hard conductance values and dynamics limited by the natural evolution toward asymptotic values (soft boundaries).
We quantitatively analyze the impact of resolution and non-linearity properties of the synapses on the network training and classification performance. Finally, we demonstrate that the non-linear synapses with hard boundary values enable higher classification performance and realize the best trade-off between classification accuracy and required training time. With reference to the obtained results, we discuss how memristive devices with non-linear dynamics constitute a technologically convenient solution for the development of on-line SNN training.
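The hard- versus soft-boundary distinction can be made concrete with two toy conductance update rules (step size and bounds below are illustrative, not device-calibrated values):

```python
def update_hard(g, dg=0.05, g_min=0.0, g_max=1.0, potentiate=True):
    """Linear conductance step, clipped at artificial hard boundaries."""
    g = g + dg if potentiate else g - dg
    return min(max(g, g_min), g_max)

def update_soft(g, dg=0.05, g_min=0.0, g_max=1.0, potentiate=True):
    """Nonlinear step: the change shrinks as g approaches its asymptote,
    so the conductance saturates smoothly (soft boundary)."""
    if potentiate:
        return g + dg * (g_max - g)
    return g - dg * (g - g_min)
```

Under repeated identical pulses the hard-bounded rule walks linearly and then pins exactly at the bound, whereas the soft-bounded rule only approaches its asymptote, with ever-smaller steps near saturation; the abstract's finding is that for SNN training the clipped, nonlinear behaviour gives the better accuracy/training-time trade-off.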
Collapse
|
20
|
Necessary conditions for STDP-based pattern recognition learning in a memristive spiking neural network. Neural Netw 2020; 134:64-75. [PMID: 33291017 DOI: 10.1016/j.neunet.2020.11.005] [Citation(s) in RCA: 48] [Impact Index Per Article: 12.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/22/2020] [Revised: 09/19/2020] [Accepted: 11/12/2020] [Indexed: 11/28/2022]
Abstract
This work aims to study experimental and theoretical approaches for finding effective local training rules for unsupervised pattern recognition by high-performance memristor-based Spiking Neural Networks (SNNs). First, the possibility of weight change using Spike-Timing-Dependent Plasticity (STDP) is demonstrated with a pair of hardware analog neurons connected through a (CoFeB)x(LiNbO3)1-x nanocomposite memristor. Next, the learning convergence to a solution of a binary clusterization task is analyzed in a wide range of memristive STDP parameters for a single-layer fully connected feedforward SNN. The memristive STDP behavior supplying convergence in this simple task is shown also to provide it in the handwritten digit recognition domain with a more complex SNN architecture featuring Winner-Take-All competition between neurons. To investigate the basic conditions necessary for training convergence, an original probabilistic generative model of a rate-based single-layer network with independent or competing neurons is built and thoroughly analyzed. The main result is a statement of the "correlation growth-anticorrelation decay" principle, which prompts a near-optimal policy for configuring model parameters. This principle is in line with requiring binary clusterization convergence, which can be defined as the necessary condition for optimal learning and used as a simple benchmark for tuning parameters of various neural network realizations with population-rate information coding. Finally, a heuristic algorithm is described to experimentally find the convergence conditions in a memristive SNN, including robustness to device variability. Due to the generality of the proposed approach, it can be applied to a wide range of memristors and neurons of software- or hardware-based rate-coding single-layer SNNs when searching for local rules that ensure their unsupervised learning convergence in a pattern recognition task domain.
Collapse
|
21
|
Effect of interpopulation spike-timing-dependent plasticity on synchronized rhythms in neuronal networks with inhibitory and excitatory populations. Cogn Neurodyn 2020; 14:535-567. [PMID: 32655716 DOI: 10.1007/s11571-020-09580-y] [Citation(s) in RCA: 11] [Impact Index Per Article: 2.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/04/2019] [Revised: 02/11/2020] [Accepted: 03/06/2020] [Indexed: 02/07/2023] Open
Abstract
We consider a two-population network consisting of both inhibitory (I) interneurons and excitatory (E) pyramidal cells. This I-E neuronal network has adaptive dynamic I to E and E to I interpopulation synaptic strengths, governed by interpopulation spike-timing-dependent plasticity (STDP). In previous works without STDPs, fast sparsely synchronized rhythms, related to diverse cognitive functions, were found to appear in a range of noise intensity D for static synaptic strengths. Here, by varying D, we investigate the effect of interpopulation STDPs on fast sparsely synchronized rhythms that emerge in both the I- and the E-populations. Depending on values of D, long-term potentiation (LTP) and long-term depression (LTD) for population-averaged values of saturated interpopulation synaptic strengths are found to occur. Then, the degree of fast sparse synchronization varies due to effects of LTP and LTD. In a broad region of intermediate D, the degree of good synchronization (with higher synchronization degree) becomes decreased, while in a region of large D, the degree of bad synchronization (with lower synchronization degree) gets increased. Consequently, in each I- or E-population, the synchronization degree becomes nearly the same in a wide range of D (including both the intermediate and the large D regions). This kind of "equalization effect" is found to occur via cooperative interplay between the average occupation and pacing degrees of spikes (i.e., the average fraction of firing neurons and the average degree of phase coherence between spikes in each synchronized stripe of spikes in the raster plot of spikes) in fast sparsely synchronized rhythms. Finally, emergences of LTP and LTD of interpopulation synaptic strengths (leading to occurrence of equalization effect) are intensively investigated via a microscopic method based on the distributions of time delays between the pre- and the post-synaptic spike times.
Collapse
|
22
|
Competitive Learning in a Spiking Neural Network: Towards an Intelligent Pattern Classifier. SENSORS 2020; 20:s20020500. [PMID: 31963143 PMCID: PMC7014236 DOI: 10.3390/s20020500] [Citation(s) in RCA: 20] [Impact Index Per Article: 5.0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Subscribe] [Scholar Register] [Received: 12/03/2019] [Revised: 01/10/2020] [Accepted: 01/14/2020] [Indexed: 12/24/2022]
Abstract
One of the modern trends in the design of human–machine interfaces (HMI) is to involve so-called spiking neural networks (SNNs) in signal processing. SNNs can be trained by simple and efficient biologically inspired algorithms. In particular, we have shown that sensory neurons in the input layer of SNNs can simultaneously encode the input signal based both on the spiking frequency rate and on varying the latency in generating spikes. In the case of such mixed temporal-rate coding, the SNN should implement learning that works properly for both types of coding. Based on this, we investigate how a single neuron can be trained with pure rate and temporal patterns, and then build a universal SNN that is trained using mixed coding. In particular, we study Hebbian and competitive learning in SNNs in the context of temporal and rate coding problems. We show that Hebbian learning through the pair-based and triplet-based spike-timing-dependent plasticity (STDP) rules is feasible for temporal coding, but not for rate coding. Synaptic competition inducing depression of poorly used synapses is required to ensure neural selectivity in rate coding. This kind of competition can be implemented by a so-called forgetting function that depends on neuron activity. We show that coherent use of the triplet-based STDP and synaptic competition with the forgetting function is sufficient for rate coding. Next, we propose an SNN capable of classifying electromyographic (EMG) patterns using an unsupervised learning procedure. Neuron competition achieved via lateral inhibition ensures the "winner takes all" principle among classifier neurons. The SNN also provides a graded output response dependent on muscular contraction strength. Furthermore, we modify the SNN to implement a supervised learning method based on stimulation of the target classifier neuron synchronously with the network input.
In a problem of discriminating three EMG patterns, the SNN with supervised learning shows a median accuracy of 99.5%, which is close to the result demonstrated by a multi-layer perceptron trained by error backpropagation.
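The two ingredients the abstract combines, a Hebbian STDP kernel and an activity-dependent forgetting function, can be sketched as follows. The exponential pair-based kernel is standard; the specific forgetting form and all parameter values here are illustrative assumptions, not the paper's fitted model:

```python
import math

def stdp_dw(delta_t, a_plus=0.01, a_minus=0.012, tau=20.0):
    """Pair-based STDP kernel; delta_t = t_post - t_pre in ms.
    Pre-before-post (delta_t > 0) potentiates, the reverse depresses."""
    if delta_t >= 0:
        return a_plus * math.exp(-delta_t / tau)
    return -a_minus * math.exp(delta_t / tau)

def forget(w, post_rate, rate_ref=10.0, eta=1e-3):
    """Activity-dependent forgetting: weights decay in proportion to the
    postsynaptic firing rate, depressing poorly used synapses and thereby
    inducing the synaptic competition needed for rate-coded selectivity."""
    return w - eta * (post_rate / rate_ref) * w
```

The kernel alone suffices for temporal coding; for rate coding the forgetting term matters, because a strongly firing neuron erodes all of its weights and only synapses that keep earning STDP potentiation survive.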
Collapse
|
23
|
STDP Plasticity in TRN Within Hierarchical Spike Timing Model of Visual Information Processing. IFIP ADVANCES IN INFORMATION AND COMMUNICATION TECHNOLOGY 2020. [PMCID: PMC7256410 DOI: 10.1007/978-3-030-49161-1_24] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Subscribe] [Scholar Register] [Indexed: 11/05/2022]
Abstract
We investigated age-related synaptic plasticity in the thalamic reticular nucleus (TRN) as part of the visual information processing system in the brain. Simulation experiments were performed using a hierarchical spike timing neural network model in the NEST simulator. The model consists of multiple layers, starting with retinal photoreceptors through the thalamic relay and primary visual cortex layers up to the lateral intraparietal cortex (LIP), responsible for decision making and preparation of the motor response. All synaptic inter- and intra-layer connections of our model are structured according to information from the literature. The present work extends the model with spike-timing-dependent plastic (STDP) synapses within the TRN as well as from the visual cortex to the LIP area. Synaptic strength changes were driven by a teaching signal typical for three different age groups (young, middle-aged and elderly), determined experimentally from eye movement data collected with an eye-tracking device from human subjects performing a simplified simulated visual navigation task.
Collapse
|
24
|
Frequency cluster formation and slow oscillations in neural populations with plasticity. PLoS One 2019; 14:e0225094. [PMID: 31725782 PMCID: PMC6855470 DOI: 10.1371/journal.pone.0225094] [Citation(s) in RCA: 16] [Impact Index Per Article: 3.2] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/18/2019] [Accepted: 10/29/2019] [Indexed: 11/20/2022] Open
Abstract
We report the phenomenon of frequency clustering in a network of Hodgkin-Huxley neurons with spike timing-dependent plasticity. The clustering leads to a splitting of a neural population into a few groups synchronized at different frequencies. In this regime, the amplitude of the mean field undergoes low-frequency modulations, which may contribute to the mechanism of the emergence of slow oscillations of neural activity observed in spectral power of local field potentials or electroencephalographic signals at high frequencies. In addition to numerical simulations of such multi-clusters, we investigate the mechanisms of the observed phenomena using the simplest case of two clusters. In particular, we propose a phenomenological model which describes the dynamics of two clusters taking into account the adaptation of coupling weights. We also determine the set of plasticity functions (update rules), which lead to multi-clustering.
Collapse
|
25
|
Spike burst-pause dynamics of Purkinje cells regulate sensorimotor adaptation. PLoS Comput Biol 2019; 15:e1006298. [PMID: 30860991 PMCID: PMC6430425 DOI: 10.1371/journal.pcbi.1006298] [Citation(s) in RCA: 13] [Impact Index Per Article: 2.6] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/12/2018] [Revised: 03/22/2019] [Accepted: 01/08/2019] [Indexed: 11/25/2022] Open
Abstract
Cerebellar Purkinje cells mediate accurate eye movement coordination. However, it remains unclear how oculomotor adaptation depends on the interplay between the characteristic Purkinje cell response patterns, namely tonic, bursting, and spike pauses. Here, a spiking cerebellar model assesses the role of Purkinje cell firing patterns in vestibular ocular reflex (VOR) adaptation. The model captures the cerebellar microcircuit properties and it incorporates spike-based synaptic plasticity at multiple cerebellar sites. A detailed Purkinje cell model reproduces the three spike-firing patterns that are shown to regulate the cerebellar output. Our results suggest that pauses following Purkinje complex spikes (bursts) encode transient disinhibition of target medial vestibular nuclei, critically gating the vestibular signals conveyed by mossy fibres. This gating mechanism accounts for early and coarse VOR acquisition, prior to the late reflex consolidation. In addition, properly timed and sized Purkinje cell bursts allow the ratio between long-term depression and potentiation (LTD/LTP) to be finely shaped at mossy fibre-medial vestibular nuclei synapses, which optimises VOR consolidation. Tonic Purkinje cell firing maintains the consolidated VOR through time. Importantly, pauses are crucial to facilitate VOR phase-reversal learning, by reshaping previously learnt synaptic weight distributions. Altogether, these results predict that Purkinje spike burst-pause dynamics are instrumental to VOR learning and reversal adaptation. Cerebellar Purkinje cells regulate accurate eye movement coordination. However, it remains unclear how cerebellar-dependent oculomotor adaptation depends on the interplay between Purkinje cell characteristic response patterns: tonic, high frequency bursting, and post-complex spike pauses. We explore the role of Purkinje spike burst-pause dynamics in VOR adaptation. 
A biophysical model of a Purkinje cell is at the core of a spiking network model, which captures the cerebellar microcircuit properties and incorporates spike-based synaptic plasticity mechanisms at different cerebellar sites. We show that Purkinje spike burst-pause dynamics are critical for (1) gating the vestibular-motor response association during VOR acquisition; (2) mediating the LTD/LTP balance for VOR consolidation; (3) reshaping synaptic efficacy distributions for VOR phase-reversal adaptation; (4) explaining the reversal VOR gain discontinuities during sleep.
Collapse
|
26
|
Training and Spontaneous Reinforcement of Neuronal Assemblies by Spike Timing Plasticity. Cereb Cortex 2019; 29:937-951. [PMID: 29415191 PMCID: PMC7963120 DOI: 10.1093/cercor/bhy001] [Citation(s) in RCA: 23] [Impact Index Per Article: 4.6] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/07/2016] [Revised: 01/01/2018] [Accepted: 01/05/2018] [Indexed: 12/15/2022] Open
Abstract
The synaptic connectivity of cortex is plastic, with experience shaping the ongoing interactions between neurons. Theoretical studies of spike timing-dependent plasticity (STDP) have focused on either just pairs of neurons or large-scale simulations. A simple analytic account for how fast spike time correlations affect both microscopic and macroscopic network structure is lacking. We develop a low-dimensional mean field theory for STDP in recurrent networks and show the emergence of assemblies of strongly coupled neurons with shared stimulus preferences. After training, this connectivity is actively reinforced by spike train correlations during the spontaneous dynamics. Furthermore, the stimulus coding by cell assemblies is actively maintained by these internally generated spiking correlations, suggesting a new role for noise correlations in neural coding. Assembly formation has often been associated with firing rate-based plasticity schemes; our theory provides an alternative and complementary framework, where fine temporal correlations and STDP form and actively maintain learned structure in cortical networks.
Collapse
|
27
|
The Virtual Electrode Recording Tool for EXtracellular Potentials (VERTEX) Version 2.0: Modelling in vitro electrical stimulation of brain tissue. Wellcome Open Res 2019; 4:20. [PMID: 30984877 PMCID: PMC6439485 DOI: 10.12688/wellcomeopenres.15058.1] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Accepted: 01/21/2019] [Indexed: 11/20/2022] Open
Abstract
Neuronal circuits can be modelled in detail, allowing us to predict the effects of stimulation on individual neurons. Electrical stimulation of neuronal circuits in vitro and in vivo excites a range of neurons within the tissue, and measurements of neural activity, e.g. the local field potential (LFP), are likewise an aggregate over a large pool of cells. The previous version of our Virtual Electrode Recording Tool for EXtracellular Potentials (VERTEX) allowed for the simulation of the LFP generated by a patch of brain tissue. Here, we extend VERTEX to simulate the effect of electrical stimulation through a focal electric field. We observe both direct changes in neural activity and changes in synaptic plasticity. Testing our software in a model of a rat neocortical slice, we determine the currents contributing to the LFP, the effects of paired-pulse stimulation to induce short-term plasticity (STP), and the effect of theta burst stimulation (TBS) to induce long-term potentiation (LTP).
Collapse
|
28
|
Extended memory lifetime in spiking neural networks employing memristive synapses with nonlinear conductance dynamics. NANOTECHNOLOGY 2019; 30:015102. [PMID: 30378572 DOI: 10.1088/1361-6528/aae81c] [Citation(s) in RCA: 13] [Impact Index Per Article: 2.6] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 06/08/2023]
Abstract
Spiking neural networks (SNNs) employing memristive synapses are capable of life-long online learning. Because of their ability to process and classify large amounts of data in real-time using compact and low-power electronic systems, they promise a substantial technology breakthrough. However, the critical issue that memristor-based SNNs have to face is the fundamental limitation in their memory capacity due to finite resolution of the synaptic elements, which leads to the replacement of old memories with new ones and to a finite memory lifetime. In this study we demonstrate that the nonlinear conductance dynamics of memristive devices can be exploited to improve the memory lifetime of a network. The network is simulated on the basis of a spiking neuron model of mixed-signal digital-analogue sub-threshold neuromorphic CMOS circuits, and on memristive synapse models derived from the experimental nonlinear conductance dynamics of resistive memory devices when stimulated by trains of identical pulses. The network learning circuits implement a spike-based plasticity rule compatible with both spike-timing and rate-based learning rules. In order to get an insight on the memory lifetime of the network, we analyse the learning dynamics in the context of a classical benchmark of neural network learning, that is hand-written digit classification. In the proposed architecture, the memory lifetime and the performance of the network are improved for memristive synapses with nonlinear dynamics with respect to linear synapses with similar resolution. These results demonstrate the importance of following holistic approaches that combine the study of theoretical learning models with the development of neuromorphic CMOS SNNs with memristive devices used to implement life-long on-chip learning.
Collapse
|
29
|
Behavioral Learning in a Cognitive Neuromorphic Robot: An Integrative Approach. IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS 2018; 29:6132-6144. [PMID: 29994007 DOI: 10.1109/tnnls.2018.2816518] [Citation(s) in RCA: 4] [Impact Index Per Article: 0.7] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 06/08/2023]
Abstract
We present here a learning system using the iCub humanoid robot and the SpiNNaker neuromorphic chip to solve the real-world task of object-specific attention. Integrating spiking neural networks with robots introduces considerable complexity for questionable benefit if the objective is simply task performance. But, we suggest, in a cognitive robotics context, where the goal is understanding how to compute, such an approach may yield useful insights to neural architecture as well as learned behavior, especially if dedicated neural hardware is available. Recent advances in cognitive robotics and neuromorphic processing now make such systems possible. Using a scalable, structured, modular approach, we build a spiking neural network where the effects and impact of learning can be predicted and tested, and the network can be scaled or extended to new tasks automatically. We introduce several enhancements to a basic network and show how they can be used to direct performance toward behaviorally relevant goals. Results show that using a simple classical spike-timing-dependent plasticity (STDP) rule on selected connections, we can get the robot (and network) to progress from poor task-specific performance to good performance. Behaviorally relevant STDP appears to contribute strongly to positive learning: "do this" but less to negative learning: "don't do that." In addition, we observe that the effect of structural enhancements tends to be cumulative. The overall system suggests that it is by being able to exploit combinations of effects, rather than any one effect or property in isolation, that spiking networks can achieve compelling, task-relevant behavior.
Collapse
|
30
|
Propagation delays determine neuronal activity and synaptic connectivity patterns emerging in plastic neuronal networks. CHAOS (WOODBURY, N.Y.) 2018; 28:106308. [PMID: 30384625 DOI: 10.1063/1.5037309] [Citation(s) in RCA: 14] [Impact Index Per Article: 2.3] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 04/23/2018] [Accepted: 08/01/2018] [Indexed: 06/08/2023]
Abstract
In plastic neuronal networks, the synaptic strengths are adapted to the neuronal activity. Specifically, spike-timing-dependent plasticity (STDP) is a fundamental mechanism that modifies the synaptic strengths based on the relative timing of pre- and postsynaptic spikes, taking into account the spikes' temporal order. In many studies, propagation delays were neglected to avoid additional dynamic complexity or computational costs. So far, networks equipped with a classic STDP rule typically rule out bidirectional couplings (i.e., either loops or uncoupled states) and are, hence, not able to reproduce fundamental experimental findings. In this review paper, we consider additional features, e.g., extensions of the classic STDP rule or additional aspects like noise, in order to overcome the contradictions between theory and experiment. In addition, we review in detail recent studies showing that a classic STDP rule combined with realistic propagation patterns is able to capture relevant experimental findings. In two coupled oscillatory neurons with propagation delays, bidirectional synapses can be preserved and potentiated. This result also holds for large networks of type-II phase oscillators. In addition, not only the mean of the initial distribution of synaptic weights, but also its standard deviation crucially determines the emergent structural connectivity, i.e., the mean final synaptic weight, the number of two-neuron loops, and the symmetry of the final connectivity pattern. The latter is affected by the firing rates, where more symmetric synaptic configurations emerge at higher firing rates. Finally, we discuss these findings in the context of the computational neuroscience-based development of desynchronizing brain stimulation techniques.
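The effect of propagation delays can be illustrated by shifting the spike times seen at the synapse before applying a standard pair-based kernel: with a dendritic back-propagation delay larger than the axonal delay, a pairing that is post-before-pre at the somata can still be pre-before-post at the synapse. The delay values below are illustrative assumptions:

```python
import math

def stdp_kernel(dt, a_plus=0.01, a_minus=0.012, tau=20.0):
    """Classic pair-based STDP kernel; dt = t_post - t_pre at the synapse."""
    if dt >= 0:
        return a_plus * math.exp(-dt / tau)
    return -a_minus * math.exp(dt / tau)

def effective_dt(t_pre, t_post, d_axonal=1.0, d_backprop=5.0):
    """Timing as seen at the synapse: the presynaptic spike arrives after
    the axonal delay, the postsynaptic spike after the dendritic
    back-propagation delay."""
    return (t_post + d_backprop) - (t_pre + d_axonal)
```

With these delays, a somatic timing of t_post − t_pre = −2 ms maps to an effective +2 ms at the synapse, so the pairing potentiates instead of depressing; this is one way realistic propagation patterns can preserve and potentiate the bidirectional couplings that a delay-free classic STDP rule eliminates.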
Collapse
|
31
|
Burst synchronization in a scale-free neuronal network with inhibitory spike-timing-dependent plasticity. Cogn Neurodyn 2018; 13:53-73. [PMID: 30728871 DOI: 10.1007/s11571-018-9505-1] [Citation(s) in RCA: 20] [Impact Index Per Article: 3.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/09/2018] [Revised: 08/19/2018] [Accepted: 08/28/2018] [Indexed: 01/09/2023] Open
Abstract
We are concerned with burst synchronization (BS), related to neural information processing in health and disease, in the Barabási-Albert scale-free network (SFN) composed of inhibitory bursting Hindmarsh-Rose neurons. This inhibitory neuronal population has adaptive dynamic synaptic strengths governed by inhibitory spike-timing-dependent plasticity (iSTDP). In previous works without iSTDP, BS was found to appear in a range of noise intensities for fixed synaptic inhibition strengths. In contrast, in our present work, we take iSTDP into consideration and investigate its effect on BS by varying the noise intensity. Our new main result is the occurrence of a Matthew effect in inhibitory synaptic plasticity: good BS gets better via LTD, while bad BS gets worse via LTP. This kind of Matthew effect in inhibitory synaptic plasticity is in contrast to that in excitatory synaptic plasticity, where good (bad) synchronization gets better (worse) via LTP (LTD). We note that, due to inhibition, the roles of LTD and LTP in inhibitory synaptic plasticity are reversed in comparison with those in excitatory synaptic plasticity. Moreover, emergences of LTD and LTP of synaptic inhibition strengths are intensively investigated via a microscopic method based on the distributions of time delays between the pre- and the post-synaptic burst onset times. Finally, in the presence of iSTDP, we investigate the effects of network architecture on BS by varying the symmetric attachment degree l* and the asymmetry parameter Δl in the SFN.
Collapse
|
32
|
Rhythmogenesis evolves as a consequence of long-term plasticity of inhibitory synapses. Sci Rep 2018; 8:13050. [PMID: 30158555 PMCID: PMC6115462 DOI: 10.1038/s41598-018-31412-7] [Citation(s) in RCA: 8] [Impact Index Per Article: 1.3] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/11/2018] [Accepted: 08/07/2018] [Indexed: 11/08/2022] Open
Abstract
Brain rhythms are widely believed to reflect numerous cognitive processes. Changes in rhythmicity have been associated with pathological states. However, the mechanism underlying these rhythms remains unknown. Here, we present a theoretical analysis of the evolution of rhythm-generating capabilities in neuronal circuits. We tested the hypothesis that brain rhythms can be acquired via an intrinsic unsupervised learning process of activity-dependent plasticity. Specifically, we focused on spike-timing-dependent plasticity (STDP) of inhibitory synapses. We detail how rhythmicity can develop via STDP under certain conditions that serve as a natural prediction of the hypothesis. We show how global features of the STDP rule govern and stabilize the resultant rhythmic activity. Finally, we demonstrate how rhythmicity is retained even in the face of synaptic variability. This study suggests a role for inhibitory plasticity beyond homeostatic processes.
|
33
|
Interplay of multiple pathways and activity-dependent rules in STDP. PLoS Comput Biol 2018; 14:e1006184. [PMID: 30106953 PMCID: PMC6112684 DOI: 10.1371/journal.pcbi.1006184] [Citation(s) in RCA: 7] [Impact Index Per Article: 1.2] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/15/2018] [Revised: 08/28/2018] [Accepted: 05/09/2018] [Indexed: 12/13/2022] Open
Abstract
Hebbian plasticity describes a basic mechanism for synaptic plasticity whereby synaptic weights evolve depending on the relative timing of paired activity of the pre- and postsynaptic neurons. Spike-timing-dependent plasticity (STDP) constitutes a central experimental and theoretical synaptic Hebbian learning rule. Various mechanisms, mostly calcium-based, account for the induction and maintenance of STDP. Classically, STDP is assumed to emerge gradually and monotonically as the number of pairings increases. However, non-monotonic STDP accounting for fast associative learning led us to challenge this monotonicity hypothesis and to explore how the existence of multiple plasticity pathways affects the dynamical establishment of plasticity. To account for distinct forms of STDP emerging from increasing numbers of pairings and for the variety of signaling pathways involved, we developed a general class of simple mathematical models of plasticity based on calcium transients and accommodating various calcium-based plasticity mechanisms. These mechanisms can either compete or cooperate in the establishment of long-term potentiation (LTP) and depression (LTD), which emerge depending on past calcium activity. Our model accurately reproduces the striatal STDP that involves endocannabinoid and NMDAR signaling pathways. Moreover, we predict how stimulus frequency alters plasticity, and how triplet rules are affected by the number of pairings. We further investigate the general model with an arbitrary number of pathways and show that, depending on those pathways and their properties, a variety of plasticities may emerge upon variation of the number and/or frequency of pairings, even when the outcome after large numbers of pairings is identical. These findings, built upon a biologically realistic example and generalized to other applications, argue that in order to fully describe synaptic plasticity it is not sufficient to record STDP curves at fixed pairing numbers and frequencies. In fact, considering the whole spectrum of activity-dependent parameters could have a great impact on the description of plasticity and a better understanding of the engram.
The brain's capacity to process information, learn and store memory relies on synaptic connectivity patterns, which are altered through synaptic plasticity mechanisms. Experimentally, such plasticities were evidenced through protocols involving numerous repetitive stimulations of a given synapse, and were shown to be supported by multiple pathways. Using a simple, biologically grounded mathematical model, we show how the activation timescales and inactivation levels of each pathway interact and alter plasticity in an intricate manner as stimuli are presented. Building upon data from the synapse between cortex and striatum, we show that synaptic changes may revert or re-emerge as stimuli are presented, and we predict specific responses to changes in stimulus frequency or to distinct stimulation patterns. Our general model shows that a given plasticity profile emerging in response to a repetitive stimulation protocol can unfold into various scenarios upon variation of the number of stimulus presentations or of the stimulation patterns, in a way that tightly depends on the underlying activated pathways. Altogether, these results argue that, for a better understanding of learning and memory, single plasticity responses obtained through intensive stimulation do not reveal the complexity of the responses to smaller numbers of presentations, which may matter greatly for fast learning of stimuli presented only a few times.
|
34
|
Eligibility Traces and Plasticity on Behavioral Time Scales: Experimental Support of NeoHebbian Three-Factor Learning Rules. Front Neural Circuits 2018; 12:53. [PMID: 30108488 PMCID: PMC6079224 DOI: 10.3389/fncir.2018.00053] [Citation(s) in RCA: 96] [Impact Index Per Article: 16.0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/11/2018] [Accepted: 06/19/2018] [Indexed: 11/13/2022] Open
Abstract
Most elementary behaviors such as moving the arm to grasp an object or walking into the next room to explore a museum evolve on the time scale of seconds; in contrast, neuronal action potentials occur on the time scale of a few milliseconds. Learning rules of the brain must therefore bridge the gap between these two different time scales. Modern theories of synaptic plasticity have postulated that the co-activation of pre- and postsynaptic neurons sets a flag at the synapse, called an eligibility trace, that leads to a weight change only if an additional factor is present while the flag is set. This third factor, signaling reward, punishment, surprise, or novelty, could be implemented by the phasic activity of neuromodulators or specific neuronal inputs signaling special events. While the theoretical framework has been developed over the last decades, experimental evidence in support of eligibility traces on the time scale of seconds has been collected only during the last few years. Here we review, in the context of three-factor rules of synaptic plasticity, four key experiments that support the role of synaptic eligibility traces in combination with a third factor as a biological implementation of neoHebbian three-factor learning rules.
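The flag-plus-third-factor scheme described in this abstract can be sketched in a few lines of Python. This is a toy illustration, not a model from any of the reviewed experiments; the time constants, learning rate, reward timing, and the coincidence proxy are all assumptions chosen for clarity.

```python
def three_factor_run(reward_window):
    """Toy neoHebbian three-factor rule: a pre/post coincidence sets an
    eligibility trace that decays over seconds, and the weight changes
    only while a third factor (e.g. phasic dopamine) is present.
    All constants are illustrative assumptions."""
    dt, tau_e, eta = 1e-3, 2.0, 0.5   # step (s), trace decay (s), learning rate
    w, e = 0.5, 0.0                   # synaptic weight and its eligibility trace
    for step in range(int(4.0 / dt)):
        t = step * dt
        coincidence = 1.0 if abs(t - 1.0) < dt / 2 else 0.0  # one pre/post pairing at t = 1 s
        e += -dt * e / tau_e + coincidence                   # flag set by Hebbian coincidence
        m = 1.0 if reward_window[0] <= t < reward_window[1] else 0.0  # third factor
        w += eta * m * e * dt                                # weight change gated by m
    return w

w_rewarded = three_factor_run((2.0, 2.5))        # reward arrives 1 s after the pairing
w_unrewarded = three_factor_run((100.0, 101.0))  # no reward within the trial
```

With the reward arriving inside the trace's lifetime the weight grows; without it, the eligibility trace simply decays and the weight is untouched.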
|
35
|
A Survey of Robotics Control Based on Learning-Inspired Spiking Neural Networks. Front Neurorobot 2018; 12:35. [PMID: 30034334 PMCID: PMC6043678 DOI: 10.3389/fnbot.2018.00035] [Citation(s) in RCA: 50] [Impact Index Per Article: 8.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/19/2018] [Accepted: 06/14/2018] [Indexed: 11/30/2022] Open
Abstract
Biological intelligence processes information using impulses or spikes, which makes living creatures able to perceive and act in the real world exceptionally well and to outperform state-of-the-art robots in almost every aspect of life. To make up for this deficit, emerging hardware technologies and software knowledge in the fields of neuroscience, electronics, and computer science have made it possible to design biologically realistic robots controlled by spiking neural networks (SNNs), inspired by the mechanisms of the brain. However, a comprehensive review of robot control based on SNNs is still missing. In this paper, we survey the developments of the past decade in the field of spiking neural networks for control tasks, with a particular focus on the fast-emerging robotics-related applications. We first highlight the primary impetuses of SNN-based robotics tasks in terms of speed, energy efficiency, and computational capability. We then classify SNN-based robotic applications according to different learning rules and explicate those learning rules with their corresponding robotic applications. We also briefly present some existing platforms that offer an interaction between SNNs and robotics simulations for exploration and exploitation. Finally, we conclude our survey with a forecast of future challenges and associated potential research topics in controlling robots based on SNNs.
|
36
|
Effect of inhibitory spike-timing-dependent plasticity on fast sparsely synchronized rhythms in a small-world neuronal network. Neural Netw 2018; 106:50-66. [PMID: 30025272 DOI: 10.1016/j.neunet.2018.06.013] [Citation(s) in RCA: 8] [Impact Index Per Article: 1.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/05/2018] [Revised: 05/14/2018] [Accepted: 06/25/2018] [Indexed: 02/06/2023]
Abstract
We consider the Watts-Strogatz small-world network (SWN) consisting of inhibitory fast-spiking Izhikevich interneurons. This inhibitory population has adaptive synaptic strengths governed by inhibitory spike-timing-dependent plasticity (iSTDP). In previous works without iSTDP, fast sparsely synchronized rhythms, associated with diverse cognitive functions, were found to appear over a range of large noise intensities for fixed, strong synaptic inhibition strengths. Here, we investigate the effect of iSTDP on fast sparse synchronization (FSS) by varying the noise intensity D. We employ an asymmetric anti-Hebbian time window for the iSTDP update rule, in contrast to the Hebbian time window for excitatory STDP (eSTDP). Depending on the value of D, population-averaged saturated synaptic inhibition strengths are potentiated [long-term potentiation (LTP)] or depressed [long-term depression (LTD)] relative to the initial mean value, and the dispersion about the mean is much larger than the initial dispersion, independently of D. In most cases of LTD, where the effect of mean LTD dominates the effect of dispersion, good synchronization (with a higher spiking measure) gets better via LTD, while bad synchronization (with a lower spiking measure) gets worse via LTP. This kind of Matthew effect in inhibitory synaptic plasticity contrasts with that in excitatory synaptic plasticity, where good (bad) synchronization gets better (worse) via LTP (LTD). The emergence of LTD and LTP of synaptic inhibition strengths is investigated in detail via a microscopic method based on the distributions of time delays between the pre- and post-synaptic spike times. Furthermore, we investigate the effects of network architecture on FSS by changing the rewiring probability p of the SWN in the presence of iSTDP.
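As a schematic of the sign conventions involved (the exact kernel and parameters used in the study may differ), an anti-Hebbian iSTDP window can be written as a sign-flipped Hebbian eSTDP window: pre-before-post pairs now depress, and post-before-pre pairs potentiate, the inhibitory synaptic strength. Here Δt = t_post − t_pre in ms; amplitudes and the time constant are illustrative assumptions.

```python
import math

def estdp_window(dt_ms, a_plus=1.0, a_minus=1.0, tau=20.0):
    """Hebbian eSTDP kernel: pre-before-post (dt_ms > 0) gives LTP,
    post-before-pre (dt_ms < 0) gives LTD."""
    if dt_ms > 0:
        return a_plus * math.exp(-dt_ms / tau)
    return -a_minus * math.exp(dt_ms / tau)

def istdp_window(dt_ms, **kwargs):
    """Anti-Hebbian iSTDP kernel: the sign-flipped eSTDP window, so
    pre-before-post depresses and post-before-pre potentiates."""
    return -estdp_window(dt_ms, **kwargs)
```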
|
37
|
A memristive plasticity model of voltage-based STDP suitable for recurrent bidirectional neural networks in the hippocampus. Sci Rep 2018; 8:9367. [PMID: 29921840 PMCID: PMC6008480 DOI: 10.1038/s41598-018-27616-6] [Citation(s) in RCA: 15] [Impact Index Per Article: 2.5] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/04/2017] [Accepted: 06/04/2018] [Indexed: 01/02/2023] Open
Abstract
Memristive systems have gained considerable attention in the field of neuromorphic engineering because they allow the emulation of synaptic functionality in solid-state nano-physical systems. In this study, we show that memristive behavior provides a broad working framework for the phenomenological modelling of cellular synaptic mechanisms. In particular, we seek to understand how closely a memristive system can account for biological realism. The basic characteristics of memristive systems, i.e. voltage and memory behavior, are used to derive a voltage-based plasticity rule. We show that this model is suitable to account for a variety of electrophysiological plasticity data. Furthermore, we incorporate the plasticity model into an all-to-all connected network scheme. Motivated by the auto-associative CA3 network of the hippocampus, we show that the implemented network allows the discrimination and processing of mnemonic pattern information, i.e. the formation of functional bidirectional connections resulting in local receptive fields. Since the presented plasticity model can be applied to real memristive devices as well, the presented theoretical framework can support both the design of appropriate memristive devices for neuromorphic computing and the development of complex neuromorphic networks, which account for the specific advantages of memristive devices.
|
38
|
Effect of spike-timing-dependent plasticity on stochastic burst synchronization in a scale-free neuronal network. Cogn Neurodyn 2018; 12:315-342. [PMID: 29765480 DOI: 10.1007/s11571-017-9470-0] [Citation(s) in RCA: 24] [Impact Index Per Article: 4.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/14/2017] [Revised: 11/29/2017] [Accepted: 12/26/2017] [Indexed: 01/02/2023] Open
Abstract
We consider an excitatory population of subthreshold Izhikevich neurons which cannot fire spontaneously without noise. As the coupling strength passes a threshold, individual neurons exhibit noise-induced burstings. This neuronal population has adaptive dynamic synaptic strengths governed by spike-timing-dependent plasticity (STDP). However, STDP was not considered in previous works on stochastic burst synchronization (SBS) between noise-induced burstings of subthreshold neurons. Here, we study the effect of additive STDP on SBS by varying the noise intensity D in the Barabási-Albert scale-free network (SFN). One of our main findings is a Matthew effect in synaptic plasticity, which occurs due to a positive feedback process. Good burst synchronization (with a higher bursting measure) gets better via long-term potentiation (LTP) of synaptic strengths, while bad burst synchronization (with a lower bursting measure) gets worse via long-term depression (LTD). Consequently, a step-like rapid transition to SBS occurs as D is changed, in contrast to the relatively smooth transition in the absence of STDP. We also investigate the effects of network architecture on SBS by varying the symmetric attachment degree and the asymmetry parameter in the SFN, and Matthew effects are also found to occur under these variations. Furthermore, the emergence of LTP and LTD of synaptic strengths is investigated in detail via microscopic methods based on both the distributions of time delays between the burst onset times of the pre- and post-synaptic neurons and the pair-correlations between the pre- and post-synaptic instantaneous individual burst rates (IIBRs). Finally, a multiplicative STDP case (depending on states) with soft bounds is investigated in comparison with the additive STDP case (independent of states) with hard bounds. Due to the soft bounds, a Matthew effect with some quantitative differences is also found to occur for multiplicative STDP.
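The additive/multiplicative contrast in the final sentences can be made concrete with a toy update step (parameters are illustrative, not those of the paper): additive STDP takes state-independent steps and needs hard clipping at the bounds, while multiplicative STDP scales LTP by the remaining headroom and LTD by the current weight, so the bounds are approached softly.

```python
import math

TAU, W_MAX = 20.0, 1.0  # STDP time constant (ms) and upper weight bound

def stdp_kernel(dt_ms):
    """Exponential STDP kernel; dt_ms = t_post - t_pre."""
    return math.copysign(math.exp(-abs(dt_ms) / TAU), dt_ms)

def additive_update(w, dt_ms, lr=0.05):
    """Additive STDP: state-independent step, hard bounds via clipping."""
    return min(W_MAX, max(0.0, w + lr * stdp_kernel(dt_ms)))

def multiplicative_update(w, dt_ms, lr=0.05):
    """Multiplicative STDP: LTP scales with headroom (W_MAX - w),
    LTD scales with w, giving soft bounds without clipping."""
    f = lr * stdp_kernel(dt_ms)
    if f > 0:
        return w + f * (W_MAX - w)
    return w + f * w
```

Near the upper bound, the additive rule saturates by clipping while the multiplicative rule's LTP step shrinks; near zero, multiplicative LTD vanishes instead of being clipped.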
|
39
|
Scalable excitatory synaptic circuit design using floating gate based leaky integrators. Sci Rep 2017; 7:17579. [PMID: 29242504 PMCID: PMC5730552 DOI: 10.1038/s41598-017-17889-8] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.4] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/20/2017] [Accepted: 12/01/2017] [Indexed: 11/09/2022] Open
Abstract
We propose a scalable synaptic circuit realizing spike-timing-dependent plasticity (STDP), compatible with randomly spiking neurons. The feasibility of the circuit was examined via circuit simulation using the BSIM 4.6.0 model. A distinguishing feature of the circuit is the use of floating-gate integrators, which provide a compact implementation of biologically plausible relaxation time scales. This relaxation occurs on the basis of charge tunneling, which mainly relies upon area-independent tunnel-barrier properties (e.g. barrier width and height) rather than capacitance. The circuit simulations feature (i) weight-dependent STDP that spontaneously limits synaptic weight growth, and (ii) competitive synaptic adaptation within both unsupervised and supervised frameworks with randomly spiking neurons. The estimated power consumption is merely 34 pW, perhaps meeting one of the most crucial principles (power efficiency) of neuromorphic engineering. Finally, a means of fine-tuning the STDP behavior is provided.
|
40
|
Hebbian plasticity requires compensatory processes on multiple timescales. Philos Trans R Soc Lond B Biol Sci 2017; 372:rstb.2016.0259. [PMID: 28093557 PMCID: PMC5247595 DOI: 10.1098/rstb.2016.0259] [Citation(s) in RCA: 89] [Impact Index Per Article: 12.7] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Accepted: 11/09/2016] [Indexed: 01/19/2023] Open
Abstract
We review a body of theoretical and experimental research on Hebbian and homeostatic plasticity, starting from a puzzling observation: while homeostasis of synapses found in experiments is a slow compensatory process, most mathematical models of synaptic plasticity use rapid compensatory processes (RCPs). Even worse, with the slow homeostatic plasticity reported in experiments, simulations of existing plasticity models cannot maintain network stability unless further control mechanisms are implemented. To solve this paradox, we suggest that in addition to slow forms of homeostatic plasticity there are RCPs which stabilize synaptic plasticity on short timescales. These rapid processes may include heterosynaptic depression triggered by episodes of high postsynaptic firing rate. While slower forms of homeostatic plasticity are not sufficient to stabilize Hebbian plasticity, they are important for fine-tuning neural circuits. Taken together we suggest that learning and memory rely on an intricate interplay of diverse plasticity mechanisms on different timescales which jointly ensure stability and plasticity of neural circuits. This article is part of the themed issue 'Integrating Hebbian and homeostatic plasticity'.
|
41
|
Stochastic spike synchronization in a small-world neural network with spike-timing-dependent plasticity. Neural Netw 2017; 97:92-106. [PMID: 29096205 DOI: 10.1016/j.neunet.2017.09.016] [Citation(s) in RCA: 9] [Impact Index Per Article: 1.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/13/2017] [Revised: 08/17/2017] [Accepted: 09/29/2017] [Indexed: 10/18/2022]
Abstract
We consider the Watts-Strogatz small-world network (SWN) consisting of subthreshold neurons which exhibit noise-induced spikings. This neuronal network has adaptive dynamic synaptic strengths governed by spike-timing-dependent plasticity (STDP). In previous works without STDP, stochastic spike synchronization (SSS) between noise-induced spikings of subthreshold neurons was found to occur in a range of intermediate noise intensities. Here, we investigate the effect of additive STDP on the SSS by varying the noise intensity. A "Matthew" effect in synaptic plasticity is found to occur due to a positive feedback process. As a result, good synchronization gets better via long-term potentiation of synaptic strengths, while bad synchronization gets worse via long-term depression. The emergence of long-term potentiation and long-term depression of synaptic strengths is investigated in detail via microscopic studies based on the pair-correlations between the pre- and post-synaptic IISRs (instantaneous individual spike rates) as well as the distributions of time delays between the pre- and post-synaptic spike times. Furthermore, the effects of multiplicative STDP (which depends on states) on the SSS are studied and discussed in comparison with the case of additive STDP (independent of states). These effects of STDP on the SSS in the SWN are also compared with those in the regular lattice and the random graph.
|
42
|
Natural Firing Patterns Imply Low Sensitivity of Synaptic Plasticity to Spike Timing Compared with Firing Rate. J Neurosci 2017; 36:11238-11258. [PMID: 27807166 DOI: 10.1523/jneurosci.0104-16.2016] [Citation(s) in RCA: 28] [Impact Index Per Article: 4.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/11/2016] [Accepted: 09/02/2016] [Indexed: 01/28/2023] Open
Abstract
Synaptic plasticity is sensitive to the rate and the timing of presynaptic and postsynaptic action potentials. In experimental protocols inducing plasticity, the imposed spike trains are typically regular and the relative timing between every presynaptic and postsynaptic spike is fixed. This is at odds with firing patterns observed in the cortex of intact animals, where cells fire irregularly and the timing between presynaptic and postsynaptic spikes varies. To investigate synaptic changes elicited by in vivo-like firing, we used numerical simulations and mathematical analysis of synaptic plasticity models. We found that the influence of spike timing on plasticity is weaker than expected from regular stimulation protocols. Moreover, when neurons fire irregularly, synaptic changes induced by precise spike timing can be equivalently induced by a modest firing rate variation. Our findings bridge the gap between existing results on synaptic plasticity and plasticity occurring in vivo, and challenge the dominant role of spike timing in plasticity. SIGNIFICANCE STATEMENT Synaptic plasticity, the change in efficacy of connections between neurons, is thought to underlie learning and memory. The dominant paradigm posits that the precise timing of neural action potentials (APs) is central for plasticity induction. This concept is based on experiments using highly regular and stereotyped patterns of APs, in stark contrast with natural neuronal activity. Using synaptic plasticity models, we investigated how irregular, in vivo-like activity shapes synaptic plasticity. We found that synaptic changes induced by precise timing of APs are much weaker than suggested by regular stimulation protocols, and can be equivalently induced by modest variations of the AP rate alone. Our results call into question the dominant role of precise AP timing for plasticity in natural conditions.
|
43
|
Partial Breakdown of Input Specificity of STDP at Individual Synapses Promotes New Learning. J Neurosci 2017; 36:8842-55. [PMID: 27559167 DOI: 10.1523/jneurosci.0552-16.2016] [Citation(s) in RCA: 19] [Impact Index Per Article: 2.7] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/16/2016] [Accepted: 06/30/2016] [Indexed: 11/21/2022] Open
Abstract
Hebbian-type learning rules, which underlie learning and refinement of neuronal connectivity, postulate input specificity of synaptic changes. However, theoretical analyses have long appreciated that additional mechanisms, not restricted to activated synapses, are needed to counteract positive feedback imposed by Hebbian-type rules on synaptic weight changes and to achieve stable operation of learning systems. The biological basis of such mechanisms has remained elusive. Here we show that, in layer 2/3 pyramidal neurons from slices of visual cortex of rats, synaptic changes induced at individual synapses by spike timing-dependent plasticity do not strictly follow the input specificity rule. Spike timing-dependent plasticity is accompanied by changes in unpaired synapses: heterosynaptic plasticity. The direction of heterosynaptic changes is weight-dependent, with balanced potentiation and depression, so that the total synaptic input to a cell remains preserved despite potentiation or depression of individual synapses. Importantly, this form of heterosynaptic plasticity is induced at unpaired synapses by the same pattern of postsynaptic activity that induces homosynaptic changes at paired synapses. In computer simulations, we show that experimentally observed heterosynaptic plasticity can indeed serve the theoretically predicted role of robustly preventing runaway dynamics of synaptic weights and activity. Moreover, it endows model neurons and networks with essential computational features: enhancement of synaptic competition, facilitation of the development of specific intrinsic connectivity, and the ability for relearning. We conclude that heterosynaptic plasticity is an inherent property of plastic synapses, crucial for normal operation of learning systems. SIGNIFICANCE STATEMENT We show that spike timing-dependent plasticity in L2/L3 pyramids from rat visual cortex is accompanied by plastic changes in unpaired synapses. These heterosynaptic changes are weight-dependent and balanced: individual synapses expressed significant LTP or LTD, but the average over all synapses did not change. Thus, the rule of input specificity breaks down at individual synapses but holds for responses averaged over many inputs. In model neurons and networks, this experimentally characterized form of heterosynaptic plasticity prevents runaway dynamics of synaptic weights and activity, enhances synaptic competition, facilitates development of specific intrinsic connectivity, and enables relearning. This new form of heterosynaptic plasticity represents the cellular basis of a theoretically postulated mechanism, which is additional to Hebbian-type rules, and is necessary for stable operation of learning systems.
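A minimal numerical sketch of the balancing idea (the drift rule below is hypothetical, chosen only to illustrate weight-dependent, sum-preserving heterosynaptic changes; it is not the fitted model from the study): paired synapses take the Hebbian change, unpaired synapses drift toward their mean so strong ones depress and weak ones potentiate, and the drift is offset so the summed input to the cell is preserved.

```python
import numpy as np

def heterosynaptic_step(w, paired, dw_homo, rate=0.5):
    """Paired synapses receive the homosynaptic (Hebbian) change dw_homo;
    unpaired synapses get a weight-dependent drift toward their mean,
    offset so the total synaptic input to the cell stays constant."""
    w = np.asarray(w, dtype=float).copy()
    w[paired] += dw_homo
    unpaired = np.ones(w.size, dtype=bool)
    unpaired[paired] = False
    drift = rate * (w[unpaired].mean() - w[unpaired])                # weight-dependent part
    drift -= (dw_homo * len(paired) + drift.sum()) / unpaired.sum()  # sum-preserving offset
    w[unpaired] += drift
    return w

w_before = [0.2, 0.4, 0.6, 0.8]
w_after = heterosynaptic_step(w_before, paired=[0], dw_homo=0.1)
```

After the step the paired synapse has grown, the strongest unpaired synapse has depressed, the weakest unpaired one has potentiated, and the summed weight is unchanged.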
|
44
|
A Model of Fast Hebbian Spike Latency Normalization. Front Comput Neurosci 2017; 11:33. [PMID: 28555102 PMCID: PMC5430963 DOI: 10.3389/fncom.2017.00033] [Citation(s) in RCA: 9] [Impact Index Per Article: 1.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/13/2016] [Accepted: 04/13/2017] [Indexed: 11/13/2022] Open
Abstract
Hebbian changes of excitatory synapses are driven by and enhance correlations between pre- and postsynaptic neuronal activations, forming a positive feedback loop that can lead to instability in simulated neural networks. Because Hebbian learning may occur on time scales of seconds to minutes, it is conjectured that some form of fast stabilization of neural firing is necessary to avoid runaway excitation, but both the theoretical underpinning and the biological implementation of such a homeostatic mechanism have yet to be fully investigated. Supported by analytical and computational arguments, we show that a Hebbian spike-timing-dependent metaplasticity rule accounts for inherently stable, rapid tuning of the total input weight of a single neuron in the general scenario of asynchronous neural firing characterized by UP and DOWN states of activity.
|
45
|
The temporal paradox of Hebbian learning and homeostatic plasticity. Curr Opin Neurobiol 2017; 43:166-176. [DOI: 10.1016/j.conb.2017.03.015] [Citation(s) in RCA: 104] [Impact Index Per Article: 14.9] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/04/2016] [Revised: 03/07/2017] [Accepted: 03/22/2017] [Indexed: 11/16/2022]
|
46
|
The Role of Neuromodulators in Cortical Plasticity. A Computational Perspective. Front Synaptic Neurosci 2017; 8:38. [PMID: 28119596 PMCID: PMC5222801 DOI: 10.3389/fnsyn.2016.00038] [Citation(s) in RCA: 17] [Impact Index Per Article: 2.4] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/19/2016] [Accepted: 12/12/2016] [Indexed: 11/13/2022] Open
Abstract
Neuromodulators play a ubiquitous role across the brain in regulating plasticity. With recent advances in experimental techniques, it is possible to study the effects of diverse neuromodulatory states in specific brain regions. Neuromodulators are thought to impact plasticity predominantly through two mechanisms: the gating of plasticity and the upregulation of neuronal activity. However, the consequences of these mechanisms are poorly understood and there is a need for both experimental and theoretical exploration. Here we illustrate how neuromodulatory state affects cortical plasticity through these two mechanisms. First, we explore the ability of neuromodulators to gate plasticity by reshaping the learning window for spike-timing-dependent plasticity. Using a simple computational model, we implement four different learning rules and demonstrate their effects on receptive field plasticity. We then compare the neuromodulatory effects of upregulating learning rate versus the effects of upregulating neuronal activity. We find that these seemingly similar mechanisms do not yield the same outcome: upregulating neuronal activity can lead to either a broadening or a sharpening of receptive field tuning, whereas upregulating learning rate only intensifies the sharpening of receptive field tuning. This simple model demonstrates the need for further exploration of the rich landscape of neuromodulator-mediated plasticity. Future experiments, coupled with biologically detailed computational models, will elucidate the diversity of mechanisms by which neuromodulatory state regulates cortical plasticity.
|
47
|
Dendritic and Axonal Propagation Delays Determine Emergent Structures of Neuronal Networks with Plastic Synapses. Sci Rep 2017; 7:39682. [PMID: 28045109 PMCID: PMC5206725 DOI: 10.1038/srep39682] [Citation(s) in RCA: 35] [Impact Index Per Article: 5.0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/08/2016] [Accepted: 11/25/2016] [Indexed: 11/09/2022] Open
Abstract
Spike-timing-dependent plasticity (STDP) modifies synaptic strengths based on the relative timing of pre- and postsynaptic spikes. The temporal order of spikes turned out to be crucial. We here take into account how propagation delays, composed of dendritic and axonal delay times, may affect the temporal order of spikes. In a minimal setting, characterized by neglecting dendritic and axonal propagation delays, STDP eliminates bidirectional connections between two coupled neurons and turns them into unidirectional connections. In this paper, however, we show that depending on the dendritic and axonal propagation delays, the temporal order of spikes at the synapses can be different from those in the cell bodies and, consequently, qualitatively different connectivity patterns emerge. In particular, we show that for a system of two coupled oscillatory neurons, bidirectional synapses can be preserved and potentiated. Intriguingly, this finding also translates to large networks of type-II phase oscillators and, hence, crucially impacts the overall hierarchical connectivity patterns of oscillatory neuronal networks.
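The bookkeeping behind this result is compact: at the synapse, the presynaptic spike is seen after the axonal delay and the postsynaptic (backpropagating) spike after the dendritic delay, so the effective pairing interval can differ in sign from the somatic one. A minimal sketch with illustrative numbers (all times in ms):

```python
def dt_at_synapse(t_pre, t_post, d_axon, d_dend):
    """Effective post-minus-pre interval seen at the synapse (ms): the pre
    spike arrives after the axonal delay, the post spike (backpropagating)
    after the dendritic delay."""
    return (t_post + d_dend) - (t_pre + d_axon)

# At the somata, post leads pre by 1 ms (a depressing pairing under Hebbian STDP) ...
dt_soma = dt_at_synapse(t_pre=1.0, t_post=0.0, d_axon=0.0, d_dend=0.0)
# ... but a 3 ms dendritic delay reverses the order seen at the synapse.
dt_syn = dt_at_synapse(t_pre=1.0, t_post=0.0, d_axon=0.0, d_dend=3.0)
```

The sign flip from dt_soma to dt_syn is what lets delayed systems preserve and potentiate bidirectional connections that the zero-delay setting would prune.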
|
48
|
Propagation of Collective Temporal Regularity in Noisy Hierarchical Networks. IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS 2017; 28:191-205. [PMID: 28055909 DOI: 10.1109/tnnls.2015.2502993] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 06/06/2023]
Abstract
Neuronal communication between different brain areas is achieved in terms of spikes. Consequently, spike-time regularity is closely related to many cognitive tasks and to the timing precision of neural information processing. A recent experiment on primate parietal cortex reports that spike-time regularity increases consistently from primary sensory to higher cortical regions. This observation conflicts with the influential view that spikes in the neocortex are fundamentally irregular. To uncover the underlying network mechanism, we construct a multilayered feedforward neural information transmission pathway and investigate how spike-time regularity evolves across subsequent layers. Numerical results reveal that despite obviously irregular spiking patterns in the first several layers, neurons in downstream layers can generate rather regular spikes, depending on the network topology. In particular, we find that collective temporal regularity in deeper layers exhibits resonance-like behavior with respect to both synaptic connection probability and synaptic weight, i.e., an optimal topology parameter maximizes the spike-timing regularity. Furthermore, it is demonstrated that synaptic properties, including inhibition, synaptic transient dynamics, and plasticity, have significant impacts on spike-timing regularity propagation. The emergence of increasingly regular spiking (RS) patterns in higher parietal regions can thus be viewed as a natural consequence of spiking activity propagation between different brain areas. Finally, we validate an important function served by increased RS: promoting reliable propagation of spike-rate signals across downstream layers.
Collapse
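The regularization-by-integration intuition behind this result can be sketched in a few lines: a leaky integrate-and-fire unit that pools many irregular inputs averages out their noise on the way to threshold, so its output inter-spike intervals (ISIs) have a lower coefficient of variation (CV) than a Poisson train. This is a minimal single-unit illustration, not the paper's multilayered network; all parameter values (`tau`, `w`, `v_th`, input rates) are illustrative assumptions.

```python
import numpy as np

def isi_cv(spike_times):
    """Coefficient of variation of inter-spike intervals.
    CV ~ 1 for Poisson-like firing, CV -> 0 for clock-like firing."""
    isi = np.diff(np.asarray(spike_times, dtype=float))
    return isi.std() / isi.mean()

def lif_output(input_counts, dt=1.0, tau=20.0, w=0.15, v_th=1.0):
    """Leaky integrate-and-fire unit driven by pooled per-bin input spike
    counts; returns output spike times in ms. Integrating many noisy
    inputs toward threshold averages out their fluctuations."""
    v, spikes = 0.0, []
    for i, c in enumerate(input_counts):
        v += dt * (-v / tau) + w * c   # leak plus pooled synaptic drive
        if v >= v_th:                  # fire and reset
            spikes.append(i * dt)
            v = 0.0
    return spikes

rng = np.random.default_rng(1)
# Pooled drive from ~100 Poisson inputs at ~10 Hz: 1 ms bins, 10 s
pooled = rng.poisson(1.0, 10_000)
out = lif_output(pooled)
print(f"output ISI CV: {isi_cv(out):.2f}")  # noticeably below the Poisson value of ~1
```

The same mechanism applied layer after layer is what lets regularity build up along the pathway, while the resonance with topology parameters requires the full network model.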
|
49
|
Adenosine Shifts Plasticity Regimes between Associative and Homeostatic by Modulating Heterosynaptic Changes. J Neurosci 2016; 37:1439-1452. [PMID: 28028196 DOI: 10.1523/jneurosci.2984-16.2016] [Citation(s) in RCA: 16] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/22/2016] [Revised: 11/18/2016] [Accepted: 12/15/2016] [Indexed: 12/18/2022] Open
Abstract
The endogenous extracellular adenosine level fluctuates in an activity-dependent manner and with the sleep-wake cycle, modulating synaptic transmission and short-term plasticity. Hebbian-type long-term plasticity introduces intrinsic positive feedback on synaptic weight changes, making them prone to runaway dynamics. We previously demonstrated that co-occurring, weight-dependent heterosynaptic plasticity can robustly prevent runaway dynamics. Here we show that at neocortical synapses in slices from rat visual cortex, adenosine modulates the weight dependence of heterosynaptic plasticity: blockade of adenosine A1 receptors abolished weight dependence, while an increased adenosine level strengthened it. Using model simulations, we found that the strength of weight dependence determines the ability of heterosynaptic plasticity to prevent runaway dynamics of synaptic weights imposed by Hebbian-type learning. Changing the weight dependence of heterosynaptic plasticity within an experimentally observed range gradually shifted the operating point of neurons between an unbalancing regime dominated by associative plasticity and a homeostatic regime of tightly constrained synaptic changes. Because adenosine tone is a natural correlate of activity level (activity increases adenosine tone) and brain state (elevated adenosine tone increases sleep pressure), modulation of heterosynaptic plasticity by adenosine represents an endogenous mechanism that translates changes of brain state into a shift of the regime of synaptic plasticity and learning. We speculate that adenosine modulation may provide a mechanism for fine-tuning plasticity and learning according to brain state and activity.
SIGNIFICANCE STATEMENT Associative learning depends on brain state and is impaired when the subject is sleepy or tired. However, the link between changes of brain state and modulation of synaptic plasticity and learning remains elusive. Here we show that adenosine regulates the weight dependence of heterosynaptic plasticity: adenosine strengthened it, while blockade of adenosine A1 receptors abolished it. In model neurons, such changes of the weight dependence of heterosynaptic plasticity shifted the operating point between regimes dominated by associative plasticity or by synaptic homeostasis. Because adenosine tone is a natural correlate of activity level and brain state, modulation of plasticity by adenosine represents an endogenous mechanism for translating brain state changes into a shift of the regime of synaptic plasticity and learning.
Collapse
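The core simulation argument, that weight-dependent heterosynaptic plasticity constrains runaway Hebbian dynamics, can be caricatured with a toy weight-update rule. This is a hedged sketch, not the authors' model: `hetero_strength` merely stands in for the adenosine-modulated weight dependence, and the Hebbian term is reduced to multiplicative positive feedback with random drive.

```python
import numpy as np

def simulate_weights(hetero_strength, n_steps=2000, n_syn=50, lr=0.01, seed=0):
    """Toy weight dynamics: a Hebbian term with positive feedback
    (potentiation proportional to the current weight) plus a weight-dependent
    heterosynaptic term that pulls every weight toward a set point w0."""
    rng = np.random.default_rng(seed)
    w = rng.uniform(0.4, 0.6, n_syn)
    w0 = 0.5
    for _ in range(n_steps):
        hebb = lr * w * rng.random(n_syn)            # runaway positive feedback
        hetero = -hetero_strength * lr * (w - w0)    # homeostatic, weight-dependent pull
        w = np.clip(w + hebb + hetero, 0.0, None)
    return w

# No weight dependence (associative regime): weights run away.
# Strong weight dependence (homeostatic regime): weights stay constrained.
print("no weight dependence    :", simulate_weights(0.0).max())
print("strong weight dependence:", simulate_weights(2.0).max())
```

Sweeping `hetero_strength` between these extremes is the toy analogue of shifting the operating point between the associative and homeostatic regimes described in the abstract.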
|
50
|
Timing Intervals Using Population Synchrony and Spike Timing Dependent Plasticity. Front Comput Neurosci 2016; 10:123. [PMID: 27990109 PMCID: PMC5133049 DOI: 10.3389/fncom.2016.00123] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/21/2016] [Accepted: 11/15/2016] [Indexed: 11/13/2022] Open
Abstract
We present a computational model by which ensembles of regularly spiking neurons can encode different time intervals through synchronous firing. We show that a neuron responding to a large population of convergent inputs has the potential to learn to produce an appropriately timed output via spike-timing-dependent plasticity. We explain why the temporal variability of this population synchrony increases with increasing time intervals. We also show that the scalar property of timing, and its violation at short intervals, can be explained by the spike-wise accumulation of jitter in the inter-spike intervals of timing neurons. We explore how the challenge of encoding longer time intervals can be overcome and conclude that this may involve a switch to a different population of neurons with a lower firing rate, with the added effect of producing an earlier bias in response. Experimental data on human timing performance show features in agreement with the model's output.
Collapse
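The jitter-accumulation mechanism invoked in the abstract can be illustrated with a short simulation: if an interval is read out as the arrival time of the n-th spike of a noisy pacemaker, independent per-interval jitter accumulates, so the standard deviation of the readout grows with the number of spikes counted. This is only the simplest independent-jitter form of the argument; the pacemaker parameters below are illustrative assumptions, not values from the paper, and the full model additionally involves population synchrony and STDP.

```python
import numpy as np

def nth_spike_time(n_spikes, isi_mean=10.0, isi_sd=2.0, n_trials=5000, seed=0):
    """Per trial, the timed interval is read out as the arrival time of the
    n-th spike of a noisy pacemaker with independent Gaussian ISI jitter."""
    rng = np.random.default_rng(seed)
    isis = rng.normal(isi_mean, isi_sd, size=(n_trials, n_spikes))
    return isis.sum(axis=1)  # time of the n-th spike on each trial

# Jitter accumulates spike by spike: the SD of the readout grows ~ sqrt(n)
for n in (25, 100):
    t = nth_spike_time(n)
    print(f"n={n:3d}  mean={t.mean():7.1f} ms  sd={t.std():5.2f} ms")
```

With independent jitter the SD grows as sqrt(n); how this accumulation yields the scalar property, and its breakdown at short intervals, is worked out in the paper itself.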
|