1. Bhasin BJ, Raymond JL, Goldman MS. Synaptic weight dynamics underlying memory consolidation: implications for learning rules, circuit organization, and circuit function. bioRxiv 2024:2024.03.20.586036. [PMID: 38585936] [PMCID: PMC10996481] [DOI: 10.1101/2024.03.20.586036]
Abstract
Systems consolidation is a common feature of learning and memory systems, in which a long-term memory initially stored in one brain region becomes persistently stored in another region. We studied the dynamics of systems consolidation in simple circuit architectures with two sites of plasticity, one in an early-learning and one in a late-learning brain area. We show that the synaptic dynamics of the circuit during consolidation of an analog memory can be understood as a temporal integration process, by which transient changes in activity driven by plasticity in the early-learning area are accumulated into persistent synaptic changes at the late-learning site. This simple principle naturally leads to a speed-accuracy tradeoff in systems consolidation and provides insight into how the circuit mitigates the stability-plasticity dilemma of storing new memories while preserving core features of older ones. Furthermore, it imposes two constraints on the circuit. First, the plasticity rule at the late-learning site must stably support a continuum of possible outputs for a given input. We show that this is readily achieved by heterosynaptic but not standard Hebbian rules. Second, to turn off the consolidation process and prevent erroneous changes at the late-learning site, neural activity in the early-learning area must be reset to its baseline activity. We propose two biologically plausible implementations for this reset that suggest novel roles for core elements of the cerebellar circuit.
Significance Statement: How are memories transformed over time? We propose a simple organizing principle for how long-term memories are moved from an initial to a final site of storage. We show that successful transfer occurs when the late site of memory storage is endowed with synaptic plasticity rules that stably accumulate changes in activity occurring at the early site of memory storage. We instantiate this principle in a simple computational model that is representative of brain circuits underlying a variety of behaviors. The model suggests how a neural circuit can store new memories while preserving core features of older ones, and suggests novel roles for core elements of the cerebellar circuit.
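The integration principle in this abstract can be illustrated with a minimal two-site sketch (our illustration, not the authors' code; the time constants, the error-driven teaching signal, and the transfer term are assumptions):

```python
# Minimal sketch of consolidation as temporal integration (all constants
# and the error-driven teaching signal are illustrative assumptions).
dt, T = 1.0, 20_000
tau_e, tau_l = 20.0, 2000.0   # fast early-learning vs slow late-learning site
w_e, w_l = 0.0, 0.0           # plastic weights at the two sites
target = 1.0                  # analog memory to be stored

for _ in range(T):
    err = target - (w_e + w_l)       # mismatch drives the early site
    w_e += dt / tau_e * err          # fast acquisition of the memory
    w_l += dt / tau_l * w_e          # late site integrates the early trace
    w_e -= dt / tau_l * w_e          # early trace decays as it is transferred

print(f"early: {w_e:.3f}  late: {w_l:.3f}  total: {w_e + w_l:.3f}")
```

In this toy, shortening tau_l speeds up consolidation but, once the teaching signal is noisy, passes more of that noise into the persistent weight, which is one way to read the speed-accuracy tradeoff noted above.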
2. Yang X, La Camera G. Co-existence of synaptic plasticity and metastable dynamics in a spiking model of cortical circuits. PLoS Comput Biol 2024; 20:e1012220. [PMID: 38950068] [PMCID: PMC11244818] [DOI: 10.1371/journal.pcbi.1012220]
Abstract
Evidence for metastable dynamics and its role in brain function is emerging at a fast pace and is changing our understanding of neural coding by putting an emphasis on hidden states of transient activity. Clustered networks of spiking neurons have enhanced synaptic connections among groups of neurons forming structures called cell assemblies; such networks are capable of producing metastable dynamics that is in agreement with many experimental results. However, it is unclear how a clustered network structure producing metastable dynamics may emerge from a fully local plasticity rule, i.e., a plasticity rule where each synapse only has access to the activity of the neurons it connects (as opposed to the activity of other neurons or other synapses). Here, we propose a local plasticity rule producing ongoing metastable dynamics in a deterministic, recurrent network of spiking neurons. The metastable dynamics co-exists with ongoing plasticity and is the consequence of a self-tuning mechanism that keeps the synaptic weights close to the instability line where memories are spontaneously reactivated. In turn, the synaptic structure is stable to ongoing dynamics and random perturbations, yet it remains sufficiently plastic to remap sensory representations to encode new sets of stimuli. Both the plasticity rule and the metastable dynamics scale well with network size, with synaptic stability increasing with the number of neurons. Overall, our results show that it is possible to generate metastable dynamics over meaningful hidden states using a simple but biologically plausible plasticity rule which co-exists with ongoing neural dynamics.
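The self-tuning mechanism, a local rule that parks recurrent weights near the instability line, can be caricatured in a one-cluster rate model (a sketch under strong simplifying assumptions; the paper's model is a deterministic recurrent spiking network, and the drive, set point, and rates below are invented):

```python
import numpy as np

# Rate-model caricature of local self-tuning toward the instability line.
# Drive amplitude b, set point r_target, and all rates are assumed values.
rng = np.random.default_rng(1)
dt, eta, b, r_target = 0.1, 0.01, 0.02, 0.4
w, r = 0.2, 0.0

for _ in range(200_000):
    drive = b + 0.005 * rng.standard_normal()   # weak background input
    r += dt * (-r + np.tanh(w * r + drive))     # cluster rate dynamics
    w += dt * eta * (r_target - r)              # local homeostatic rule

# Linearizing tanh, the low branch gives r ~ b / (1 - w), so holding the
# set point forces w close to the instability at w = 1.
print(f"self-tuned weight: {w:.3f} (instability at w = 1)")
```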
Affiliation(s)
- Xiaoyu Yang
- Graduate Program in Physics and Astronomy, Stony Brook University, Stony Brook, New York, United States of America
- Department of Neurobiology and Behavior, Stony Brook University, Stony Brook, New York, United States of America
- Center for Neural Circuit Dynamics, Stony Brook University, Stony Brook, New York, United States of America
- Giancarlo La Camera
- Department of Neurobiology and Behavior, Stony Brook University, Stony Brook, New York, United States of America
- Center for Neural Circuit Dynamics, Stony Brook University, Stony Brook, New York, United States of America
3. Yang X, La Camera G. Co-existence of synaptic plasticity and metastable dynamics in a spiking model of cortical circuits. bioRxiv 2024:2023.12.07.570692. [PMID: 38106233] [PMCID: PMC10723399] [DOI: 10.1101/2023.12.07.570692]
Affiliation(s)
- Xiaoyu Yang
- Graduate Program in Physics and Astronomy, Stony Brook University
- Department of Neurobiology & Behavior, Stony Brook University
- Center for Neural Circuit Dynamics, Stony Brook University
- Giancarlo La Camera
- Department of Neurobiology & Behavior, Stony Brook University
- Center for Neural Circuit Dynamics, Stony Brook University
4. Jauch J, Becker M, Tetzlaff C, Fauth MJ. Differences in the consolidation by spontaneous and evoked ripples in the presence of active dendrites. PLoS Comput Biol 2024; 20:e1012218. [PMID: 38917228] [PMCID: PMC11230591] [DOI: 10.1371/journal.pcbi.1012218]
Abstract
Ripples are a typical form of neural activity in hippocampal neural networks associated with the replay of episodic memories during sleep as well as sleep-related plasticity and memory consolidation. The emergence of ripples has been observed both dependent as well as independent of input from other brain areas and often coincides with dendritic spikes. Yet, it is unclear how input-evoked and spontaneous ripples as well as dendritic excitability affect plasticity and consolidation. Here, we use mathematical modeling to compare these cases. We find that consolidation as well as the emergence of spontaneous ripples depends on a reliable propagation of activity in feed-forward structures which constitute memory representations. This propagation is facilitated by excitable dendrites, which entail that a few strong synapses are sufficient to trigger neuronal firing. In this situation, stimulation-evoked ripples lead to the potentiation of weak synapses within the feed-forward structure and, thus, to a consolidation of a more general sequence memory. However, spontaneous ripples that occur without stimulation only consolidate a sparse backbone of the existing strong feed-forward structure. Based on this, we test a recently hypothesized scenario in which the excitability of dendrites is transiently enhanced after learning, and show that such a transient increase can strengthen, restructure and consolidate even weak hippocampal memories, which would otherwise be forgotten. Hence, a transient increase in dendritic excitability would indeed provide a mechanism for stabilizing memories.
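The claim that excitable dendrites let a few strong synapses trigger firing, and thereby support propagation through feed-forward structures, can be illustrated with a toy cascade (our sketch, not the paper's model; group sizes, weights, and thresholds are assumptions):

```python
import numpy as np

# Toy cascade: activity must propagate through successive groups. Each neuron
# receives a few strong "backbone" synapses plus weak ones; a dendritic
# nonlinearity amplifies sufficiently clustered input. Values are assumed.
rng = np.random.default_rng(2)
n_groups, n_per, theta = 10, 100, 1.0
w_strong, w_weak = 0.25, 0.02
n_strong, n_weak = 3, 10                 # inputs per neuron from previous group

def propagate(dendritic_gain):
    active_frac = 0.8                    # stimulate most of the first group
    for _ in range(n_groups - 1):
        drive = (w_strong * rng.binomial(n_strong, active_frac, n_per)
                 + w_weak * rng.binomial(n_weak, active_frac, n_per))
        # dendritic spike: clustered input above a local threshold is amplified
        drive = np.where(drive > 0.6, drive * dendritic_gain, drive)
        active_frac = float((drive > theta).mean())
    return active_frac

print("surviving activity with dendritic spikes:   ", propagate(2.0))
print("surviving activity without dendritic spikes:", propagate(1.0))
```

Without the dendritic amplification, the summed input of the assumed synapses stays below the somatic threshold and activity dies out after one stage.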
Affiliation(s)
- Jannik Jauch
- Third Institute for Physics, Georg-August-University, Göttingen, Germany
- Moritz Becker
- Group of Computational Synaptic Physiology, Department for Neuro- and Sensory Physiology, University Medical Center Göttingen, Göttingen, Germany
- Christian Tetzlaff
- Group of Computational Synaptic Physiology, Department for Neuro- and Sensory Physiology, University Medical Center Göttingen, Göttingen, Germany
- Michael Jan Fauth
- Third Institute for Physics, Georg-August-University, Göttingen, Germany
5. Tamosiunaite M, Tetzlaff C, Wörgötter F. Unsupervised learning of perceptual feature combinations. PLoS Comput Biol 2024; 20:e1011926. [PMID: 38442095] [PMCID: PMC10942261] [DOI: 10.1371/journal.pcbi.1011926]
Abstract
In many situations it is behaviorally relevant for an animal to respond to co-occurrences of perceptual, possibly polymodal features, while these features alone may have no importance. Thus, it is crucial for animals to learn such feature combinations in spite of the fact that they may occur with variable intensity and occurrence frequency. Here, we present a novel unsupervised learning mechanism that is largely independent of these contingencies and allows neurons in a network to achieve specificity for different feature combinations. This is achieved by a novel correlation-based (Hebbian) learning rule, which allows for linear weight growth and which is combined with a mechanism for gradually reducing the learning rate as soon as the neuron's response becomes feature-combination specific. In a set of control experiments, we show that other existing advanced learning rules cannot satisfactorily form ordered multi-feature representations. In addition, we show that networks that use this type of learning always stabilize and converge to subsets of neurons with different feature-combination specificity. Neurons with this property may thus serve as an initial stage for the processing of ecologically relevant real world situations for an animal.
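A minimal sketch of the scheme described here, correlation-based growth plus a learning rate that is gradually reduced once the response becomes feature-combination specific (our toy under assumed statistics, not the paper's exact rule):

```python
import numpy as np

# Two input features matter only in combination. Hebbian growth follows the
# input correlations (dominated by the co-occurrence direction [1, 1]); the
# learning rate shrinks once the combined drive clearly exceeds any
# single-feature drive. All constants are illustrative assumptions.
rng = np.random.default_rng(3)
w, eta = np.array([0.15, 0.05]), 0.02

for _ in range(5000):
    if rng.random() < 0.3:
        x = np.ones(2)                   # the features co-occur
    else:
        x = np.eye(2)[rng.integers(2)]   # a feature occurs alone
    x = x * rng.uniform(0.5, 1.5)        # variable stimulus intensity
    y = w @ x                            # linear response
    w += eta * y * x                     # correlation-based (Hebbian) growth
    if w.sum() > 1.0 and w.sum() > 1.8 * w.max():
        eta *= 0.95                      # gradual learning-rate reduction

print(f"w = {np.round(w, 2)}, combo drive = {w.sum():.2f}, "
      f"best single-feature drive = {w.max():.2f}")
```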
Affiliation(s)
- Minija Tamosiunaite
- Department for Computational Neuroscience, Third Physics Institute, University of Göttingen, Göttingen, Germany
- Vytautas Magnus University, Faculty of Informatics, Kaunas, Lithuania
- Christian Tetzlaff
- Computational Synaptic Physiology, Department for Neuro- and Sensory Physiology, University Medical Center Göttingen, Göttingen, Germany
- Campus Institute Data Science, Göttingen, Germany
- Florentin Wörgötter
- Department for Computational Neuroscience, Third Physics Institute, University of Göttingen, Göttingen, Germany
6. Bouhadjar Y, Wouters DJ, Diesmann M, Tetzlaff T. Coherent noise enables probabilistic sequence replay in spiking neuronal networks. PLoS Comput Biol 2023; 19:e1010989. [PMID: 37130121] [PMCID: PMC10153753] [DOI: 10.1371/journal.pcbi.1010989]
Abstract
Animals rely on different decision strategies when faced with ambiguous or uncertain cues. Depending on the context, decisions may be biased towards events that were most frequently experienced in the past, or be more explorative. A particular type of decision making central to cognition is sequential memory recall in response to ambiguous cues. A previously developed spiking neuronal network implementation of sequence prediction and recall learns complex, high-order sequences in an unsupervised manner by local, biologically inspired plasticity rules. In response to an ambiguous cue, the model deterministically recalls the sequence shown most frequently during training. Here, we present an extension of the model enabling a range of different decision strategies. In this model, explorative behavior is generated by supplying neurons with noise. As the model relies on population encoding, uncorrelated noise averages out, and the recall dynamics remain effectively deterministic. In the presence of locally correlated noise, the averaging effect is avoided without impairing the model performance, and without the need for large noise amplitudes. We investigate two forms of correlated noise occurring in nature: shared synaptic background inputs, and random locking of the stimulus to spatiotemporal oscillations in the network activity. Depending on the noise characteristics, the network adopts various recall strategies. This study thereby provides potential mechanisms explaining how the statistics of learned sequences affect decision making, and how decision strategies can be adjusted after learning.
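The population-averaging argument at the center of this abstract is easy to verify numerically: independent noise shrinks as 1/sqrt(N) in a population average, while a shared component survives at full amplitude (a generic demonstration with arbitrary N, T, and sigma, not the paper's spiking model):

```python
import numpy as np

# Population averaging of private vs shared noise (all values arbitrary).
rng = np.random.default_rng(4)
N, T, sigma = 200, 10_000, 1.0

private = sigma * rng.standard_normal((T, N))   # independent noise per neuron
shared = sigma * rng.standard_normal(T)         # one sample shared by all

pop_private = private.mean(axis=1)              # averages toward zero
pop_shared = shared                             # identical in every neuron,
                                                # so the average equals it
print(f"single neuron:        std = {sigma:.2f}")
print(f"population, private:  std = {pop_private.std():.3f} (~ sigma/sqrt(N))")
print(f"population, shared:   std = {pop_shared.std():.3f} (~ sigma)")
```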
Affiliation(s)
- Younes Bouhadjar
- Institute of Neuroscience and Medicine (INM-6), & Institute for Advanced Simulation (IAS-6), & JARA BRAIN Institute Structure-Function Relationships (INM-10), Jülich Research Centre, Jülich, Germany
- Peter Grünberg Institute (PGI-7,10), Jülich Research Centre and JARA, Jülich, Germany
- RWTH Aachen University, Aachen, Germany
- Dirk J Wouters
- Institute of Electronic Materials (IWE 2) & JARA-FIT, RWTH Aachen University, Aachen, Germany
- Markus Diesmann
- Institute of Neuroscience and Medicine (INM-6), & Institute for Advanced Simulation (IAS-6), & JARA BRAIN Institute Structure-Function Relationships (INM-10), Jülich Research Centre, Jülich, Germany
- Department of Physics, Faculty 1, & Department of Psychiatry, Psychotherapy, and Psychosomatics, Medical School, RWTH Aachen University, Aachen, Germany
- Tom Tetzlaff
- Institute of Neuroscience and Medicine (INM-6), & Institute for Advanced Simulation (IAS-6), & JARA BRAIN Institute Structure-Function Relationships (INM-10), Jülich Research Centre, Jülich, Germany
7. Wilmes KA, Clopath C. Dendrites help mitigate the plasticity-stability dilemma. Sci Rep 2023; 13:6543. [PMID: 37085642] [PMCID: PMC10121616] [DOI: 10.1038/s41598-023-32410-0]
Abstract
With Hebbian learning ('neurons that fire together wire together'), well-known problems arise. Hebbian plasticity can cause unstable network dynamics and overwrite stored memories. Because the known homeostatic plasticity mechanisms tend to be too slow to combat unstable dynamics, it has been proposed that plasticity must be highly gated and synaptic strengths limited. While solving the issue of stability, gating and limiting plasticity does not solve the stability-plasticity dilemma. We propose that dendrites enable both stable network dynamics and considerable synaptic changes, as they allow the gating of plasticity in a compartment-specific manner. We investigate how gating plasticity influences network stability in plastic balanced spiking networks of neurons with dendrites. We compare how different ways to gate plasticity, namely via modulating excitability, learning rate, and inhibition, increase stability. We investigate how dendritic versus perisomatic gating allows for different amounts of weight changes in stable networks. We suggest that the compartmentalisation of pyramidal cells enables dendritic synaptic changes while maintaining stability. We show that the coupling between dendrite and soma is critical for the plasticity-stability trade-off. Finally, we show that spatially restricted plasticity additionally improves stability.
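A two-compartment caricature of compartment-specific gating (our sketch with assumed coupling, rates, and gates; the paper studies plastic balanced spiking networks): plasticity is gated on in the dendrite and off perisomatically, so dendritic weights can change substantially while the somatic output drifts comparatively little.

```python
import numpy as np

# Dendritic vs perisomatic gating of Hebbian plasticity (values assumed).
rng = np.random.default_rng(5)
n_in, coupling, lr = 50, 0.1, 0.001
w_dend = rng.random(n_in) * 0.1          # synapses onto the dendrite
w_soma = rng.random(n_in) * 0.1          # synapses near the soma
gate_dend, gate_soma = 1.0, 0.0          # compartment-specific plasticity gates
init_dend = w_dend.mean()

rates = []
for _ in range(2000):
    x = (rng.random(n_in) < 0.2).astype(float)   # presynaptic activity
    y = np.tanh(w_soma @ x + coupling * (w_dend @ x))  # weakly coupled soma
    w_dend += gate_dend * lr * y * x             # gated Hebbian updates
    w_soma += gate_soma * lr * y * x
    rates.append(y)

print(f"dendritic weights: {init_dend:.3f} -> {w_dend.mean():.3f}; "
      f"output rate: {np.mean(rates[:200]):.2f} -> {np.mean(rates[-200:]):.2f}")
```

Setting gate_soma = 1.0 instead lets the same learning rate act directly on the somatic input and produces a much larger output drift, consistent with the idea that the dendro-somatic coupling controls the plasticity-stability trade-off.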
Affiliation(s)
- Katharina A Wilmes
- Imperial College London, London, United Kingdom
- University of Bern, Bern, Switzerland
8. Creation of Neuronal Ensembles and Cell-Specific Homeostatic Plasticity through Chronic Sparse Optogenetic Stimulation. J Neurosci 2023; 43:82-92. [PMID: 36400529] [PMCID: PMC9838708] [DOI: 10.1523/jneurosci.1104-22.2022]
Abstract
Cortical computations emerge from the dynamics of neurons embedded in complex cortical circuits. Within these circuits, neuronal ensembles, which represent subnetworks with shared functional connectivity, emerge in an experience-dependent manner. Here we induced ensembles in ex vivo cortical circuits from mice of either sex by differentially activating subpopulations through chronic optogenetic stimulation. We observed a decrease in voltage correlation, and importantly a synaptic decoupling between the stimulated and nonstimulated populations. We also observed a decrease in firing rate during Up-states in the stimulated population. These ensemble-specific changes were accompanied by decreases in intrinsic excitability in the stimulated population, and a decrease in connectivity between stimulated and nonstimulated pyramidal neurons. By incorporating the empirically observed changes in intrinsic excitability and connectivity into a spiking neural network model, we were able to demonstrate that changes in both intrinsic excitability and connectivity accounted for the decreased firing rate, but only changes in connectivity accounted for the observed decorrelation. Our findings help ascertain the mechanisms underlying the ability of chronic patterned stimulation to create ensembles within cortical circuits and, importantly, show that while Up-states are a global network-wide phenomenon, functionally distinct ensembles can preserve their identity during Up-states through differential firing rates and correlations.
Significance Statement: The connectivity and activity patterns of local cortical circuits are shaped by experience. This experience-dependent reorganization of cortical circuits is driven by complex interactions between different local learning rules, external input, and reciprocal feedback between many distinct brain areas. Here we used an ex vivo approach to demonstrate how simple forms of chronic external stimulation can shape local cortical circuits in terms of their correlated activity and functional connectivity. The absence of feedback between different brain areas and full control of external input allowed for a tractable system to study the underlying mechanisms and development of a computational model. Results show that differential stimulation of subpopulations of neurons significantly reshapes cortical circuits and forms subnetworks referred to as neuronal ensembles.
9. Schumm SN, Gabrieli D, Meaney DF. Plasticity impairment alters community structure but permits successful pattern separation in a hippocampal network model. Front Cell Neurosci 2022; 16:977769. [PMID: 36505514] [PMCID: PMC9729278] [DOI: 10.3389/fncel.2022.977769]
Abstract
Patients who suffer from traumatic brain injury (TBI) often complain of learning and memory problems. These symptoms are principally mediated by the hippocampus and its ability to adapt to stimuli, also known as neural plasticity. Therefore, one plausible injury mechanism is plasticity impairment, which currently lacks comprehensive investigation across TBI research. For these studies, we used a computational network model of the hippocampus that includes the dentate gyrus, CA3, and CA1 with neuron-scale resolution. We simulated mild injury through weakened spike-timing-dependent plasticity (STDP), which modulates synaptic weights according to causal spike timing. In preliminary work, we found functional deficits consisting of decreased firing rate and broadband power in areas CA3 and CA1 after STDP impairment. To address structural changes with these studies, we applied modularity analysis to evaluate how STDP impairment modifies community structure in the hippocampal network. We also studied the emergent function of network-based learning and found that impaired networks could acquire conditioned responses after training, but the magnitude of the response was significantly lower. Furthermore, we examined pattern separation, a prerequisite of learning, by entraining two overlapping patterns. Contrary to our initial hypothesis, impaired networks did not exhibit deficits in pattern separation with either population- or rate-based coding. Collectively, these results demonstrate how a mechanism of injury that operates at the synapse regulates circuit function.
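The injury manipulation, weakened STDP, can be sketched with a standard pair-based STDP window whose overall magnitude is scaled down (our reading; the amplitudes and time constant are generic textbook values, not the paper's):

```python
import numpy as np

# Pair-based STDP window with an impairment factor scaling its magnitude.
A_plus, A_minus, tau = 0.01, 0.012, 20.0   # amplitudes and time constant (ms)

def stdp_dw(dt_ms, impairment=1.0):
    """Weight change for spike-time difference dt = t_post - t_pre (ms)."""
    if dt_ms > 0:                                   # pre before post: LTP
        return impairment * A_plus * np.exp(-dt_ms / tau)
    return -impairment * A_minus * np.exp(dt_ms / tau)  # post before pre: LTD

for impairment in (1.0, 0.5):                       # healthy vs mildly injured
    dws = [stdp_dw(dt, impairment) for dt in range(-50, 51) if dt != 0]
    print(f"impairment {impairment}: peak LTP = {max(dws):.4f}, "
          f"peak LTD = {min(dws):.4f}")
```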
Affiliation(s)
- Samantha N. Schumm
- Department of Bioengineering, School of Engineering and Applied Sciences, University of Pennsylvania, Philadelphia, PA, United States
- David Gabrieli
- Department of Bioengineering, School of Engineering and Applied Sciences, University of Pennsylvania, Philadelphia, PA, United States
- David F. Meaney
- Department of Bioengineering, School of Engineering and Applied Sciences, University of Pennsylvania, Philadelphia, PA, United States
- Department of Neurosurgery, Penn Center for Brain Injury and Repair, Perelman School of Medicine, University of Pennsylvania, Philadelphia, PA, United States
10. Miehl C, Onasch S, Festa D, Gjorgjieva J. Formation and computational implications of assemblies in neural circuits. J Physiol 2022. [PMID: 36068723] [DOI: 10.1113/jp282750]
Abstract
In the brain, patterns of neural activity represent sensory information and store it in non-random synaptic connectivity. A prominent theoretical hypothesis states that assemblies, groups of neurons that are strongly connected to each other, are the key computational units underlying perception and memory formation. Compatible with these hypothesised assemblies, experiments have revealed groups of neurons that display synchronous activity, either spontaneously or upon stimulus presentation, and exhibit behavioural relevance. While it remains unclear how assemblies form in the brain, theoretical work has vastly contributed to the understanding of various interacting mechanisms in this process. Here, we review the recent theoretical literature on assembly formation by categorising the involved mechanisms into four components: synaptic plasticity, symmetry breaking, competition and stability. We highlight different approaches and assumptions behind assembly formation and discuss recent ideas of assemblies as the key computational unit in the brain.
Abstract figure legend: Assembly formation. Assemblies are groups of strongly connected neurons formed by the interaction of multiple mechanisms and with vast computational implications. Four interacting components are thought to drive assembly formation: synaptic plasticity, symmetry breaking, competition and stability.
Affiliation(s)
- Christoph Miehl
- Computation in Neural Circuits, Max Planck Institute for Brain Research, 60438 Frankfurt, Germany
- School of Life Sciences, Technical University of Munich, 85354 Freising, Germany
- Sebastian Onasch
- Computation in Neural Circuits, Max Planck Institute for Brain Research, 60438 Frankfurt, Germany
- School of Life Sciences, Technical University of Munich, 85354 Freising, Germany
- Dylan Festa
- Computation in Neural Circuits, Max Planck Institute for Brain Research, 60438 Frankfurt, Germany
- School of Life Sciences, Technical University of Munich, 85354 Freising, Germany
- Julijana Gjorgjieva
- Computation in Neural Circuits, Max Planck Institute for Brain Research, 60438 Frankfurt, Germany
- School of Life Sciences, Technical University of Munich, 85354 Freising, Germany
11. Hofmann M, Mader P. Synaptic Scaling: An Artificial Neural Network Regularization Inspired by Nature. IEEE Trans Neural Netw Learn Syst 2022; 33:3094-3108. [PMID: 33502984] [DOI: 10.1109/tnnls.2021.3050422]
Abstract
Nature has always inspired the human spirit, and scientists have frequently developed new methods based on observations from nature. Recent advances in imaging and sensing technology allow fascinating insights into biological neural processes. With the objective of finding new strategies to enhance the learning capabilities of neural networks, we focus on a phenomenon that is closely related to learning tasks and neural stability in biological neural networks, called homeostatic plasticity. Among the theories that have been developed to describe homeostatic plasticity, synaptic scaling has been found to be the most mature and applicable. We systematically discuss previous studies on the synaptic scaling theory and how they could be applied to artificial neural networks. To this end, we utilize information theory to analytically evaluate how mutual information is affected by synaptic scaling. Based on these analytic findings, we propose two flavors in which synaptic scaling can be applied in the training process of simple and complex, feedforward, and recurrent neural networks. We compare our approach with state-of-the-art regularization techniques on standard benchmarks. We found that the proposed method yields the lowest error in both regression and classification tasks compared to previous regularization approaches, across a wide range of feedforward and recurrent network topologies and data sets.
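One way such a scaling step can enter training, multiplicatively pulling each unit's total incoming weight strength toward a set point after every gradient update, is sketched below (our illustration with assumed constants and a toy regression task; the paper derives its two variants from the information-theoretic analysis mentioned above):

```python
import numpy as np

# Toy regression network trained by gradient descent, with a homeostatic
# synaptic-scaling step after each update (all constants are assumptions).
rng = np.random.default_rng(6)
X = rng.standard_normal((200, 10))
y = X @ rng.standard_normal(10) + 0.1 * rng.standard_normal(200)

W = rng.standard_normal((10, 8)) * 0.3          # input -> hidden weights
v = rng.standard_normal(8) * 0.3                # hidden -> output weights
lr, target, rho = 0.01, 1.5, 0.05               # learning rate, set point, rate

for _ in range(500):
    h = np.tanh(X @ W)                          # forward pass
    err = h @ v - y
    v -= lr * h.T @ err / len(y)                # gradient steps
    dh = np.outer(err, v) * (1 - h**2)
    W -= lr * X.T @ dh / len(y)
    # synaptic scaling: rescale each unit's incoming weights toward the target
    strength = np.abs(W).sum(axis=0)
    W *= 1 + rho * (target - strength) / target

print("per-unit total input strength:", np.round(np.abs(W).sum(axis=0), 2))
```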
12. Bouhadjar Y, Wouters DJ, Diesmann M, Tetzlaff T. Sequence learning, prediction, and replay in networks of spiking neurons. PLoS Comput Biol 2022; 18:e1010233. [PMID: 35727857] [PMCID: PMC9273101] [DOI: 10.1371/journal.pcbi.1010233]
Abstract
Sequence learning, prediction and replay have been proposed to constitute the universal computations performed by the neocortex. The Hierarchical Temporal Memory (HTM) algorithm realizes these forms of computation. It learns sequences in an unsupervised and continuous manner using local learning rules, permits a context specific prediction of future sequence elements, and generates mismatch signals in case the predictions are not met. While the HTM algorithm accounts for a number of biological features such as topographic receptive fields, nonlinear dendritic processing, and sparse connectivity, it is based on abstract discrete-time neuron and synapse dynamics, as well as on plasticity mechanisms that can only partly be related to known biological mechanisms. Here, we devise a continuous-time implementation of the temporal-memory (TM) component of the HTM algorithm, which is based on a recurrent network of spiking neurons with biophysically interpretable variables and parameters. The model learns high-order sequences by means of a structural Hebbian synaptic plasticity mechanism supplemented with a rate-based homeostatic control. In combination with nonlinear dendritic input integration and local inhibitory feedback, this type of plasticity leads to the dynamic self-organization of narrow sequence-specific subnetworks. These subnetworks provide the substrate for a faithful propagation of sparse, synchronous activity, and, thereby, for a robust, context specific prediction of future sequence elements as well as for the autonomous replay of previously learned sequences. By strengthening the link to biology, our implementation facilitates the evaluation of the TM hypothesis based on experimentally accessible quantities. The continuous-time implementation of the TM algorithm permits, in particular, an investigation of the role of sequence timing for sequence learning, prediction and replay. We demonstrate this aspect by studying the effect of the sequence speed on the sequence learning performance and on the speed of autonomous sequence replay.
Essentially all data processed by mammals and many other living organisms is sequential. This holds true for all types of sensory input data as well as motor output activity. Being able to form memories of such sequential data, to predict future sequence elements, and to replay learned sequences is a necessary prerequisite for survival. It has been hypothesized that sequence learning, prediction and replay constitute the fundamental computations performed by the neocortex. The Hierarchical Temporal Memory (HTM) constitutes an abstract powerful algorithm implementing this form of computation and has been proposed to serve as a model of neocortical processing. In this study, we are reformulating this algorithm in terms of known biological ingredients and mechanisms to foster the verifiability of the HTM hypothesis based on electrophysiological and behavioral data. The proposed model learns continuously in an unsupervised manner by biologically plausible, local plasticity mechanisms, and successfully predicts and replays complex sequences. Apart from establishing contact to biology, the study sheds light on the mechanisms determining at what speed we can process sequences and provides an explanation of fast sequence replay observed in the hippocampus and in the neocortex.
Affiliation(s)
- Younes Bouhadjar
- Institute of Neuroscience and Medicine (INM-6), & Institute for Advanced Simulation (IAS-6), & JARA BRAIN Institute Structure-Function Relationships (INM-10), Jülich Research Centre, Jülich, Germany
- Peter Grünberg Institute (PGI-7,10), Jülich Research Centre and JARA, Jülich, Germany
- RWTH Aachen University, Aachen, Germany
- Dirk J. Wouters
- Institute of Electronic Materials (IWE 2) & JARA-FIT, RWTH Aachen University, Aachen, Germany
- Markus Diesmann
- Institute of Neuroscience and Medicine (INM-6), & Institute for Advanced Simulation (IAS-6), & JARA BRAIN Institute Structure-Function Relationships (INM-10), Jülich Research Centre, Jülich, Germany
- Department of Physics, Faculty 1, & Department of Psychiatry, Psychotherapy, and Psychosomatics, Medical School, RWTH Aachen University, Aachen, Germany
- Tom Tetzlaff
- Institute of Neuroscience and Medicine (INM-6), & Institute for Advanced Simulation (IAS-6), & JARA BRAIN Institute Structure-Function Relationships (INM-10), Jülich Research Centre, Jülich, Germany
13. Gu J, Lim S. Unsupervised learning for robust working memory. PLoS Comput Biol 2022; 18:e1009083. [PMID: 35500033] [PMCID: PMC9098088] [DOI: 10.1371/journal.pcbi.1009083]
Abstract
Working memory is a core component of critical cognitive functions such as planning and decision-making. Persistent activity that lasts long after the stimulus offset has been considered a neural substrate for working memory. Attractor dynamics based on network interactions can successfully reproduce such persistent activity. However, it requires a fine-tuning of network connectivity, in particular, to form continuous attractors, which have been suggested for encoding continuous signals in working memory. Here, we investigate whether a specific form of synaptic plasticity rules can mitigate such tuning problems in two representative working memory models, namely, rate-coded and location-coded persistent activity. We consider two prominent types of plasticity rules, differential plasticity correcting the rapid activity changes and homeostatic plasticity regularizing the long-term average of activity, both of which have been proposed to fine-tune the weights in an unsupervised manner. Consistent with the findings of previous works, differential plasticity alone was enough to recover a graded-level persistent activity after perturbations in the connectivity. For the location-coded memory, differential plasticity could also recover persistent activity. However, its pattern can be irregular for different stimulus locations under slow learning speed or large perturbation in the connectivity. On the other hand, homeostatic plasticity shows a robust recovery of smooth spatial patterns under particular types of synaptic perturbations, such as perturbations in incoming synapses onto the entire or local populations. However, homeostatic plasticity was not effective against perturbations in outgoing synapses from local populations. Instead, combining it with differential plasticity recovers location-coded persistent activity for a broader range of perturbations, suggesting compensation between two plasticity rules.
Affiliation(s)
- Jintao Gu
- Neural Science, New York University Shanghai, Shanghai, China
- Sukbin Lim
- Neural Science, New York University Shanghai, Shanghai, China
- NYU-ECNU Institute of Brain and Cognitive Science at NYU Shanghai, Shanghai, China
14. Jaiton V, Rothomphiwat K, Ebeid E, Manoonpong P. Neural Control and Online Learning for Speed Adaptation of Unmanned Aerial Vehicles. Front Neural Circuits 2022; 16:839361. [PMID: 35547643] [PMCID: PMC9082606] [DOI: 10.3389/fncir.2022.839361]
Abstract
Unmanned aerial vehicles (UAVs) are involved in critical tasks such as inspection and exploration. Thus, they have to perform several intelligent functions. Various control approaches have been proposed to implement these functions. Most classical UAV control approaches, such as model predictive control, require a dynamic model to determine the optimal control parameters. Other control approaches use machine learning techniques that require multiple learning trials to obtain the proper control parameters. All these approaches are computationally expensive. Our goal is to develop an efficient control system for UAVs that does not require a dynamic model and allows them to learn control parameters online with only a few trials and inexpensive computations. To achieve this, we developed a neural control method with fast online learning. Neural control is based on a three-neuron network, whereas the online learning algorithm is derived from a neural correlation-based learning principle with predictive and reflexive sensory information. This neural control technique is used here for the speed adaptation of the UAV. The control technique relies on a simple input signal from a compact optical distance measurement sensor that can be converted into predictive and reflexive sensory information for the learning algorithm. Such speed adaptation is a fundamental function that can be used as part of other complex control functions, such as obstacle avoidance. The proposed technique was implemented on a real UAV system. Consequently, the UAV can quickly learn within 3–4 trials to proactively adapt its flying speed to brake at a safe distance from the obstacle or target in the horizontal and vertical planes. This speed adaptation is also robust against wind perturbation. We also demonstrated a combination of speed adaptation and obstacle avoidance for UAV navigations, which is an important intelligent function toward inspection and exploration.
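The correlation-based rule with predictive and reflexive signals can be sketched in a one-dimensional braking toy (our construction from the abstract's description; the dynamics, gains, and sensor thresholds are all assumptions): the gain on the early predictive signal grows whenever the late reflex still gets triggered, so after a few simulated trials the agent brakes before the reflex zone, mirroring the 3-4 trial learning reported above.

```python
# 1-D braking toy for correlation-based learning with predictive and
# reflexive signals (all dynamics and constants are illustrative).
def run_trial(w):
    x, v = 10.0, 2.0                     # distance to obstacle, speed
    prev_reflex, dw, reflex_used = 0.0, 0.0, False
    for _ in range(1000):
        predictive = 1.0 if x < 5.0 else 0.0      # early-warning signal
        reflex = 1.0 if x < 1.0 else 0.0          # late, innate reflex signal
        dw += 0.25 * predictive * (reflex - prev_reflex)  # correlation rule
        prev_reflex = reflex
        reflex_used = reflex_used or reflex > 0
        v = max(v - 0.02 * (w * predictive + 2.0 * reflex), 0.0)
        x -= 0.025 * v
        if v == 0.0:
            break
    return w + dw, x, reflex_used

w = 0.0
for trial in range(5):
    w, stop_x, used = run_trial(w)
    print(f"trial {trial + 1}: gain = {w:.2f}, stopped at x = {stop_x:.2f}, "
          f"reflex triggered = {used}")   # x < 0 means overshoot
```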
Affiliation(s)
- Vatsanai Jaiton
- Bio-Inspired Robotics and Neural Engineering Laboratory, School of Information Science and Technology, Vidyasirimedhi Institute of Science and Technology, Rayong, Thailand
- Kongkiat Rothomphiwat
- Bio-Inspired Robotics and Neural Engineering Laboratory, School of Information Science and Technology, Vidyasirimedhi Institute of Science and Technology, Rayong, Thailand
- Emad Ebeid
- SDU UAS Centre (Unmanned Aerial Systems), The Mærsk Mc-Kinney Møller Institute, University of Southern Denmark, Odense, Denmark
- Poramate Manoonpong
- Bio-Inspired Robotics and Neural Engineering Laboratory, School of Information Science and Technology, Vidyasirimedhi Institute of Science and Technology, Rayong, Thailand
- Embodied AI and Neurorobotics Laboratory, SDU Biorobotics, The Mærsk Mc-Kinney Møller Institute, University of Southern Denmark, Odense, Denmark
15. Schumm SN, Gabrieli D, Meaney DF. Plasticity impairment exposes CA3 vulnerability in a hippocampal network model of mild traumatic brain injury. Hippocampus 2022; 32:231-250. [PMID: 34978378] [DOI: 10.1002/hipo.23402]
Abstract
Proper function of the hippocampus is critical for executing cognitive tasks such as learning and memory. Traumatic brain injury (TBI) and other neurological disorders are commonly associated with cognitive deficits and hippocampal dysfunction. Although there are many existing models of individual subregions of the hippocampus, few models attempt to integrate the primary areas into one system. In this work, we developed a computational model of the hippocampus, including the dentate gyrus, CA3, and CA1. The subregions are represented as an interconnected neuronal network, incorporating well-characterized ex vivo slice electrophysiology into the functional neuron models and well-documented anatomical connections into the network structure. In addition, since plasticity is foundational to the role of the hippocampus in learning and memory as well as necessary for studying adaptation to injury, we implemented spike-timing-dependent plasticity among the synaptic connections. Our model mimics key features of hippocampal activity, including signal frequencies in the theta and gamma bands and phase-amplitude coupling in area CA1. We also studied the effects of spike-timing-dependent plasticity impairment, a potential consequence of TBI, in our model and found that impairment decreases broadband power in CA3 and CA1 and reduces phase coherence between these two subregions, yet phase-amplitude coupling in CA1 remains intact. Altogether, our work demonstrates characteristic hippocampal activity with a scaled network model of spiking neurons and reveals the sensitive balance of plasticity mechanisms in the circuit through one manifestation of mild traumatic injury.
Affiliation(s)
- Samantha N Schumm
- Department of Bioengineering, School of Engineering and Applied Sciences, University of Pennsylvania, Philadelphia, Pennsylvania, USA
- David Gabrieli
- Department of Bioengineering, School of Engineering and Applied Sciences, University of Pennsylvania, Philadelphia, Pennsylvania, USA
- David F Meaney
- Department of Bioengineering, School of Engineering and Applied Sciences, University of Pennsylvania, Philadelphia, Pennsylvania, USA
- Department of Neurosurgery, Penn Center for Brain Injury and Repair, Perelman School of Medicine, University of Pennsylvania, Philadelphia, Pennsylvania, USA
16. Barnes SJ, Keller GB, Keck T. Homeostatic regulation through strengthening of neuronal network-correlated synaptic inputs. eLife 2022; 11:e81958. [PMID: 36515269] [PMCID: PMC9803349] [DOI: 10.7554/eLife.81958]
Abstract
Homeostatic regulation is essential for stable neuronal function. Several synaptic mechanisms of homeostatic plasticity have been described, but the functional properties of synapses involved in homeostasis are unknown. We used longitudinal two-photon functional imaging of dendritic spine calcium signals in visual and retrosplenial cortices of awake adult mice to quantify the sensory deprivation-induced changes in the responses of functionally identified spines. We found that spines whose activity selectively correlated with intrinsic network activity underwent tumor necrosis factor alpha (TNF-α)-dependent homeostatic increases in their response amplitudes, but spines identified as responsive to sensory stimulation did not. We observed an increase in the global sensory-evoked responses following sensory deprivation, despite the fact that the identified sensory inputs did not strengthen. Instead, global sensory-evoked responses correlated with the strength of network-correlated inputs. Our results suggest that homeostatic regulation of global responses is mediated through changes to intrinsic network-correlated inputs rather than changes to identified sensory inputs thought to drive sensory processing.
Affiliation(s)
- Samuel J Barnes
- Department of Brain Sciences, Division of Neuroscience, Imperial College London, Hammersmith Hospital Campus, London, United Kingdom
- UK Dementia Research Institute at Imperial College, London, United Kingdom
- Georg B Keller
- Friedrich Miescher Institute for Biomedical Research, Basel, Switzerland
- Tara Keck
- Department of Neuroscience, Physiology and Pharmacology, University College London, London, United Kingdom
17. Amorim FE, Chapot RL, Moulin TC, Lee JLC, Amaral OB. Memory destabilization during reconsolidation: a consequence of homeostatic plasticity? Learn Mem 2021; 28:371-389. [PMID: 34526382] [DOI: 10.1101/lm.053418.121]
Abstract
Remembering is not a static process: When retrieved, a memory can be destabilized and become prone to modifications. This phenomenon has been demonstrated in a number of brain regions, but the neuronal mechanisms that govern memory destabilization and its boundary conditions remain elusive. Using two distinct computational models that combine Hebbian plasticity and synaptic downscaling, we show that homeostatic plasticity can function as a destabilization mechanism, accounting for behavioral results of protein synthesis inhibition upon reactivation with different re-exposure times. Furthermore, by performing systematic reviews, we identify a series of overlapping molecular mechanisms between memory destabilization and synaptic downscaling, although direct experimental links between both phenomena remain scarce. In light of these results, we propose a theoretical framework where memory destabilization can emerge as an epiphenomenon of homeostatic adaptations prompted by memory retrieval.
Affiliation(s)
- Felippe E Amorim
- Institute of Medical Biochemistry Leopoldo de Meis, Federal University of Rio de Janeiro, Rio de Janeiro 21941-902, Brazil
- Renata L Chapot
- Institute of Medical Biochemistry Leopoldo de Meis, Federal University of Rio de Janeiro, Rio de Janeiro 21941-902, Brazil
- Thiago C Moulin
- Functional Pharmacology Unit, Department of Neuroscience, Uppsala University, Uppsala 751 24, Sweden
- Jonathan L C Lee
- University of Birmingham, School of Psychology, Edgbaston, Birmingham B15 2TT, United Kingdom
- Olavo B Amaral
- Institute of Medical Biochemistry Leopoldo de Meis, Federal University of Rio de Janeiro, Rio de Janeiro 21941-902, Brazil
18. Weidel P, Duarte R, Morrison A. Unsupervised Learning and Clustered Connectivity Enhance Reinforcement Learning in Spiking Neural Networks. Front Comput Neurosci 2021; 15:543872. [PMID: 33746728] [PMCID: PMC7970044] [DOI: 10.3389/fncom.2021.543872]
Abstract
Reinforcement learning is a paradigm that can account for how organisms learn to adapt their behavior in complex environments with sparse rewards. To partition an environment into discrete states, implementations in spiking neuronal networks typically rely on input architectures involving place cells or receptive fields specified ad hoc by the researcher. This is problematic as a model for how an organism can learn appropriate behavioral sequences in unknown environments, as it fails to account for the unsupervised and self-organized nature of the required representations. Additionally, this approach presupposes knowledge on the part of the researcher about how the environment should be partitioned and represented, and scales poorly with the size or complexity of the environment. To address these issues and gain insights into how the brain generates its own task-relevant mappings, we propose a learning architecture that combines unsupervised learning on the input projections with biologically motivated clustered connectivity within the representation layer. This combination allows input features to be mapped to clusters; thus the network self-organizes to produce clearly distinguishable activity patterns that can serve as the basis for reinforcement learning on the output projections. On the basis of the MNIST and Mountain Car tasks, we show that our proposed model performs better than either a comparable unclustered network or a clustered network with static input projections. We conclude that the combination of unsupervised learning and clustered connectivity provides a generic representational substrate suitable for further computation.
Affiliation(s)
- Philipp Weidel
- Institute of Neuroscience and Medicine (INM-6) & Institute for Advanced Simulation (IAS-6) & JARA-Institute Brain Structure-Function Relationship (JBI-1 / INM-10), Research Centre Jülich, Jülich, Germany
- Department of Computer Science 3 - Software Engineering, RWTH Aachen University, Aachen, Germany
- Renato Duarte
- Institute of Neuroscience and Medicine (INM-6) & Institute for Advanced Simulation (IAS-6) & JARA-Institute Brain Structure-Function Relationship (JBI-1 / INM-10), Research Centre Jülich, Jülich, Germany
- Abigail Morrison
- Institute of Neuroscience and Medicine (INM-6) & Institute for Advanced Simulation (IAS-6) & JARA-Institute Brain Structure-Function Relationship (JBI-1 / INM-10), Research Centre Jülich, Jülich, Germany
- Department of Computer Science 3 - Software Engineering, RWTH Aachen University, Aachen, Germany
19. Schubert F, Gros C. Local Homeostatic Regulation of the Spectral Radius of Echo-State Networks. Front Comput Neurosci 2021; 15:587721. [PMID: 33732127] [PMCID: PMC7958921] [DOI: 10.3389/fncom.2021.587721]
Abstract
Recurrent cortical networks provide reservoirs of states that are thought to play a crucial role in sequential information processing in the brain. However, classical reservoir computing requires manual adjustments of global network parameters, particularly of the spectral radius of the recurrent synaptic weight matrix. It is hence not clear if the spectral radius is accessible to biological neural networks. Using random matrix theory, we show that the spectral radius is related to local properties of the neuronal dynamics whenever the overall dynamical state is only weakly correlated. This result allows us to introduce two local homeostatic synaptic scaling mechanisms, termed flow control and variance control, that implicitly drive the spectral radius toward the desired value. For both mechanisms the spectral radius is autonomously adapted while the network receives and processes inputs under working conditions. We demonstrate the effectiveness of the two adaptation mechanisms under different external input protocols. Moreover, we evaluated the network performance after adaptation by training the network to perform a time-delayed XOR operation on binary sequences. As our main result, we found that flow control reliably regulates the spectral radius for different types of input statistics. Precise tuning is, however, negatively affected when interneural correlations are substantial. Furthermore, we found a consistent task performance over a wide range of input strengths/variances. Variance control, however, did not yield the desired spectral radii with the same precision and was less consistent across different input strengths. Given the effectiveness and remarkably simple mathematical form of flow control, we conclude that self-consistent local control of the spectral radius via an implicit adaptation scheme is an interesting and biologically plausible alternative to conventional methods using set-point homeostatic feedback control of neural firing.
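Our reading of the flow-control idea can be sketched as follows (the paper's exact update differs in detail; constants are assumptions): each unit multiplicatively adjusts a local gain so that its squared recurrent input matches R_target^2 times the population's mean squared activity, which for weakly correlated states implicitly drives the spectral radius of the effective weight matrix toward R_target.

```python
import numpy as np

# Local, implicit regulation of the spectral radius (constants assumed).
rng = np.random.default_rng(7)
N, R_target, eps = 200, 0.9, 0.001
W = rng.standard_normal((N, N)) / np.sqrt(N)   # raw recurrent weights
a = np.ones(N)                                 # local gain per neuron
x = rng.uniform(-0.5, 0.5, N)

for _ in range(10_000):
    rec = a * (W @ x)                          # gained recurrent input
    x = np.tanh(rec + 0.5 * rng.standard_normal(N))   # driven dynamics
    # flow rule: match own squared recurrent flow to the population target
    a *= 1.0 + eps * (R_target**2 * np.mean(x**2) - rec**2)

radius = np.abs(np.linalg.eigvals(a[:, None] * W)).max()
print(f"spectral radius after adaptation: {radius:.3f} (target {R_target})")
```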
Affiliation(s)
- Fabian Schubert
- Institute for Theoretical Physics, Goethe University Frankfurt am Main, Frankfurt am Main, Germany
20. Torrado Pacheco A, Bottorff J, Gao Y, Turrigiano GG. Sleep Promotes Downward Firing Rate Homeostasis. Neuron 2021; 109:530-544.e6. [PMID: 33232655] [PMCID: PMC7864886] [DOI: 10.1016/j.neuron.2020.11.001]
Abstract
Homeostatic plasticity is hypothesized to bidirectionally regulate neuronal activity around a stable set point to compensate for learning-related plasticity, but to date only upward firing rate homeostasis (FRH) has been demonstrated in vivo. We combined chronic electrophysiology in freely behaving animals with an eye-reopening paradigm to enhance firing in primary visual cortex (V1) and found that neurons bidirectionally regulate firing rates around an individual set point. Downward FRH did not require N-methyl-D-aspartate receptor (NMDAR) signaling and was associated with homeostatic scaling down of synaptic strengths. Like upward FRH, downward FRH was gated by arousal state but in the opposite direction: it occurred during sleep, not during wake. In contrast, firing rate depression associated with Hebbian plasticity happened independently of sleep and wake. Thus, sleep and wake states temporally segregate upward and downward FRH, which might prevent interference or provide unopposed homeostatic compensation when it is needed most.
Collapse
Affiliation(s)
- Juliet Bottorff
- Department of Biology, Brandeis University, Waltham, MA 02453, USA
- Ya Gao
- Department of Biology, Brandeis University, Waltham, MA 02453, USA
21. Energetics of stochastic BCM type synaptic plasticity and storing of accurate information. J Comput Neurosci 2021; 49:71-106. [PMID: 33528721] [PMCID: PMC8046702] [DOI: 10.1007/s10827-020-00775-0]
Abstract
Excitatory synaptic signaling in cortical circuits is thought to be metabolically expensive. Two fundamental brain functions, learning and memory, are associated with long-term synaptic plasticity, but we know very little about the energetics of these slow biophysical processes. This study investigates the energy required to store information in plastic synapses for an extended version of BCM plasticity with a decay term, stochastic noise, and a nonlinear dependence of the neuron's firing rate on synaptic current (adaptation). It is shown that synaptic weights in this model exhibit bistability. In order to analyze the system analytically, it is reduced to a simple dynamic mean-field model of the population-averaged plastic synaptic current. Next, using the concepts of nonequilibrium thermodynamics, we derive the energy rate (entropy production rate) for plastic synapses and a corresponding Fisher information for coding presynaptic input. That energy, which is of chemical origin, is primarily used for battling fluctuations in the synaptic weights and presynaptic firing rates; it increases steeply with synaptic weights, and more uniformly though nonlinearly with presynaptic firing. At the onset of synaptic bistability, Fisher information and memory lifetime both increase sharply, by a few orders of magnitude, but the plasticity energy rate changes only mildly. This implies that a huge gain in the precision of stored information does not have to cost large amounts of metabolic energy, which suggests that synaptic information is not directly limited by energy consumption. Interestingly, for very weak synaptic noise, such a limit on synaptic coding accuracy is imposed instead by a derivative of the plasticity energy rate with respect to the mean presynaptic firing, and this relationship has a general character that is independent of the plasticity type. An estimate for primate neocortex reveals that the relative metabolic cost of BCM-type synaptic plasticity, as a fraction of neuronal cost related to fast synaptic transmission and spiking, can vary from negligible to substantial, depending on the synaptic noise level and presynaptic firing.
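The plasticity model analyzed here can be sketched with a generic BCM-type rule extended by a decay term and synaptic noise (the sliding-threshold form and all constants are textbook-style assumptions, not the paper's exact equations); the two stable branches of the deterministic part illustrate the bistability mentioned in the abstract:

```python
import numpy as np

# BCM-type rule with decay and synaptic noise (illustrative constants).
rng = np.random.default_rng(8)
dt, eta, lam, noise = 0.01, 0.05, 0.01, 0.02
u = 1.0                          # presynaptic rate (constant drive)
w, theta = 0.5, 0.1              # synaptic weight and sliding threshold

for _ in range(50_000):
    v = u * w                                    # postsynaptic activity
    theta += dt * (v**2 - theta)                 # sliding modification threshold
    drift = eta * u * v * (v - theta) - lam * w  # BCM term plus weight decay
    w = max(w + dt * drift + np.sqrt(dt) * noise * rng.standard_normal(), 0.0)

# With these constants the deterministic part is bistable: w = 0 and a
# potentiated state near w ~ 0.72 are both stable; noise jitters around them.
print(f"final weight: {w:.2f}, threshold: {theta:.2f}")
```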
22
Li KT, Liang J, Zhou C. Gamma Oscillations Facilitate Effective Learning in Excitatory-Inhibitory Balanced Neural Circuits. Neural Plast 2021; 2021:6668175. [PMID: 33542728 PMCID: PMC7840255 DOI: 10.1155/2021/6668175] [Citation(s) in RCA: 4] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/08/2020] [Revised: 12/19/2020] [Accepted: 01/07/2021] [Indexed: 12/26/2022] Open
Abstract
Gamma oscillation in neural circuits is believed to be associated with effective learning in the brain, but the underlying mechanism is unclear. This paper studies how spike-timing-dependent plasticity (STDP), a typical mechanism of learning, interacts with gamma oscillation in neural circuits to shape the network's dynamical properties and structure formation. We study an excitatory-inhibitory (E-I) integrate-and-fire neuronal network with triplet STDP, heterosynaptic plasticity, and transmitter-induced plasticity. Our results show that the performance of plasticity differs across synchronization levels. We find that gamma oscillation is beneficial to synaptic potentiation among stimulated neurons, forming a special network structure in which the sum of excitatory input synaptic strengths is correlated with the sum of inhibitory input synaptic strengths. The circuit maintains E-I balanced input on average, whereas the balance is temporarily broken during learning-induced oscillations. Our study reveals a potential mechanism underlying the benefits of gamma oscillation for learning in biological neural circuits.
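As a pointer to the plasticity rule named here, the sketch below implements the standard triplet STDP rule (Pfister & Gerstner, 2006) at one synapse driven by independent Poisson spike trains; the constants are illustrative, and the paper's heterosynaptic and transmitter-induced terms are omitted:

```python
import numpy as np

# Sketch of the triplet STDP rule (Pfister & Gerstner, 2006) at one synapse
# driven by Poisson pre/post spike trains. Constants are illustrative.
rng = np.random.default_rng(1)
dt, T = 1e-3, 10.0
tau_plus, tau_minus = 16.8e-3, 33.7e-3    # pair-based trace time constants
tau_x, tau_y = 101e-3, 125e-3             # triplet trace time constants
A2p, A2m, A3p, A3m = 5e-3, 7e-3, 6e-3, 2e-4
r1 = r2 = o1 = o2 = 0.0                   # pre (r) and post (o) traces
w = 0.5

for _ in range(int(T / dt)):
    pre = rng.random() < 20 * dt          # 20 Hz presynaptic spikes
    post = rng.random() < 15 * dt         # 15 Hz postsynaptic spikes
    r1 -= dt * r1 / tau_plus; r2 -= dt * r2 / tau_x
    o1 -= dt * o1 / tau_minus; o2 -= dt * o2 / tau_y
    if pre:
        w -= o1 * (A2m + A3m * r2)        # depression (pair + triplet term)
        r1 += 1.0; r2 += 1.0
    if post:
        w += r1 * (A2p + A3p * o2)        # potentiation (pair + triplet term)
        o1 += 1.0; o2 += 1.0

print(f"final weight: {w:.4f}")
```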
Affiliation(s)
- Kwan Tung Li
- Department of Physics, Centre for Nonlinear Studies and Beijing-Hong Kong-Singapore Joint Centre for Nonlinear and Complex Systems (Hong Kong), Institute of Computational and Theoretical Studies, Hong Kong Baptist University, Kowloon Tong, Hong Kong
- Junhao Liang
- Department of Physics, Centre for Nonlinear Studies and Beijing-Hong Kong-Singapore Joint Centre for Nonlinear and Complex Systems (Hong Kong), Institute of Computational and Theoretical Studies, Hong Kong Baptist University, Kowloon Tong, Hong Kong
- Changsong Zhou
- Department of Physics, Centre for Nonlinear Studies and Beijing-Hong Kong-Singapore Joint Centre for Nonlinear and Complex Systems (Hong Kong), Institute of Computational and Theoretical Studies, Hong Kong Baptist University, Kowloon Tong, Hong Kong
23
Auth JM, Nachstedt T, Tetzlaff C. The Interplay of Synaptic Plasticity and Scaling Enables Self-Organized Formation and Allocation of Multiple Memory Representations. Front Neural Circuits 2020; 14:541728. [PMID: 33117130 PMCID: PMC7575689 DOI: 10.3389/fncir.2020.541728] [Citation(s) in RCA: 10] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/10/2020] [Accepted: 08/19/2020] [Indexed: 12/23/2022] Open
Abstract
It is commonly assumed that memories about experienced stimuli are represented by groups of highly interconnected neurons called cell assemblies. This requires allocating and storing information in the neural circuitry, which happens through synaptic weight adaptations at different types of synapses. In general, memory allocation is associated with synaptic changes at feed-forward synapses, while memory storage is linked with adaptation of recurrent connections. It remains, however, largely unknown how memory allocation and storage can be achieved, and how the adaptation of the different synapses involved can be coordinated, to allow for a faithful representation of multiple memories without disruptive interference between them. In this theoretical study, using network simulations and phase space analyses, we show that the interplay between long-term synaptic plasticity and homeostatic synaptic scaling simultaneously organizes the adaptations of feed-forward and recurrent synapses such that a new stimulus forms a new memory and different stimuli are assigned to distinct cell assemblies. The resulting dynamics can reproduce experimental in-vivo data, focusing on how diverse factors, such as neuronal excitability and network connectivity, influence memory formation. Thus, the model presented here suggests that a few fundamental synaptic mechanisms may suffice to implement memory allocation and storage in neural circuitry.
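A minimal sketch of the core interplay invoked here: correlation-based Hebbian growth combined with slower, weight-dependent homeostatic scaling on a single rate-based synapse. The rule form follows the generic combination studied by this group, with placeholder parameters, not the paper's full network model:

```python
# Sketch of Hebbian growth plus slower, weight-dependent homeostatic
# synaptic scaling for one rate-based synapse; parameters are placeholders.
dt, T = 1e-2, 200.0
mu, gamma = 1e-2, 1e-3     # Hebbian rate and (slower) scaling rate
y_target = 1.0             # homeostatic target rate
x, w = 2.0, 0.1            # presynaptic rate and initial weight

for _ in range(int(T / dt)):
    y = w * x                                 # postsynaptic rate (linear neuron)
    hebb = mu * x * y                         # correlation-based growth
    scaling = gamma * (y_target - y) * w**2   # weight-dependent scaling
    w += dt * (hebb + scaling)

print(f"final weight: {w:.3f}, postsynaptic rate: {w * x:.3f}")
```

Note that at the stable fixed point the ongoing Hebbian drive holds the rate above the homeostatic target, so the two mechanisms balance rather than cancel.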
Affiliation(s)
- Johannes Maria Auth
- Department of Computational Neuroscience, Third Institute of Physics, Georg-August-Universität, Göttingen, Germany
- Bernstein Center for Computational Neuroscience, Göttingen, Germany
- Timo Nachstedt
- Department of Computational Neuroscience, Third Institute of Physics, Georg-August-Universität, Göttingen, Germany
- Bernstein Center for Computational Neuroscience, Göttingen, Germany
- Christian Tetzlaff
- Department of Computational Neuroscience, Third Institute of Physics, Georg-August-Universität, Göttingen, Germany
- Bernstein Center for Computational Neuroscience, Göttingen, Germany
24
Krüppel S, Tetzlaff C. The self-organized learning of noisy environmental stimuli requires distinct phases of plasticity. Netw Neurosci 2020; 4:174-199. [PMID: 32166207 PMCID: PMC7055647 DOI: 10.1162/netn_a_00118] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.2] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/31/2019] [Accepted: 12/09/2019] [Indexed: 11/25/2022] Open
Abstract
Along sensory pathways, representations of environmental stimuli become increasingly sparse and expanded. If, additionally, the feed-forward synaptic weights are structured according to the inherent organization of stimuli, the increase in sparseness and expansion leads to a reduction of sensory noise. However, it is unknown how the synapses in the brain form the required structure, especially given the omnipresent noise of environmental stimuli. Here, we employ a combination of synaptic plasticity and intrinsic plasticity, which adapts the excitability of each neuron individually, and present stimuli with an inherent organization to a feed-forward network. We observe that intrinsic plasticity maintains the sparseness of the neural code and thereby allows synaptic plasticity to learn the organization of stimuli in low-noise environments. Nevertheless, even high levels of noise can be handled after a subsequent phase of readaptation of the neuronal excitabilities by intrinsic plasticity. Interestingly, during this phase the synaptic structure has to be maintained. These results demonstrate that learning and recall in the presence of noise require the coordinated interplay between plasticity mechanisms adapting different properties of the neuronal circuit.

Everyday life requires living beings to continuously recognize and categorize perceived stimuli from the environment. To master this task, the representations of these stimuli become increasingly sparse and expanded along the sensory pathways of the brain. In addition, the underlying neuronal network has to be structured according to the inherent organization of the environmental stimuli. However, how the neuronal network learns the required structure even in the presence of noise remains unknown. In this theoretical study, we show that the interplay between synaptic plasticity, controlling the synaptic efficacies, and intrinsic plasticity, adapting the neuronal excitabilities, enables the network to encode the organization of environmental stimuli. It thereby structures the network to correctly categorize stimuli even in the presence of noise. After having encoded the stimuli's organization, consolidating the synaptic structure while keeping the neuronal excitabilities dynamic enables the neuronal system to readapt to arbitrary levels of noise, resulting in near-optimal classification performance for all noise levels. These results provide new insights into the interplay between different plasticity mechanisms and how this interplay enables sensory systems to reliably learn and categorize stimuli from the surrounding environment.
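The sketch below illustrates the division of labor described here, under strong simplifying assumptions (thresholded linear units, a weight-normalization step, and placeholder constants, none taken from the paper): intrinsic plasticity adjusts per-neuron thresholds toward a sparse target activity while a Hebbian rule shapes the feed-forward weights.

```python
import numpy as np

# Sketch: intrinsic plasticity keeps each neuron near a sparse target
# activity while a normalized Hebbian rule structures the feed-forward
# weights. Units, normalization, and constants are simplifying assumptions.
rng = np.random.default_rng(2)
n_in, n_out = 50, 20
W = rng.random((n_out, n_in)) * 0.1     # feed-forward weights
theta = np.zeros(n_out)                 # per-neuron excitability thresholds
rate = np.zeros(n_out)                  # running activity estimate
target, eta_w, eta_th = 0.05, 1e-3, 1e-2

for _ in range(5000):
    x = (rng.random(n_in) < 0.1).astype(float)    # noisy binary stimulus
    y = ((W @ x - theta) > 0).astype(float)       # thresholded response
    W += eta_w * np.outer(y, x)                   # Hebbian growth
    W /= np.maximum(W.sum(axis=1, keepdims=True), 1e-9)  # keep rows bounded
    theta += eta_th * (y - target)                # intrinsic plasticity
    rate += 0.01 * (y - rate)

print(f"mean activity: {rate.mean():.3f} (target {target})")
```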
Affiliation(s)
- Steffen Krüppel
- Department of Computational Neuroscience, Third Institute of Physics - Biophysics, Georg-August-University, Göttingen, Germany
- Christian Tetzlaff
- Department of Computational Neuroscience, Third Institute of Physics - Biophysics, Georg-August-University, Göttingen, Germany
25
Ma Z, Turrigiano GG, Wessel R, Hengen KB. Cortical Circuit Dynamics Are Homeostatically Tuned to Criticality In Vivo. Neuron 2019; 104:655-664.e4. [PMID: 31601510 DOI: 10.1016/j.neuron.2019.08.031] [Citation(s) in RCA: 123] [Impact Index Per Article: 20.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/28/2019] [Revised: 06/26/2019] [Accepted: 08/19/2019] [Indexed: 11/26/2022]
Abstract
Homeostatic mechanisms stabilize neuronal activity in vivo, but whether this process gives rise to balanced network dynamics is unknown. Here, we continuously monitored the statistics of network spiking in visual cortical circuits in freely behaving rats for 9 days. Under control conditions in light and dark, networks were robustly organized around criticality, a regime that maximizes information capacity and transmission. When input was perturbed by visual deprivation, network criticality was severely disrupted and subsequently restored to criticality over 48 h. Unexpectedly, the recovery of excitatory dynamics preceded homeostatic plasticity of firing rates by >30 h. We utilized model investigations to manipulate firing rate homeostasis in a cell-type-specific manner at the onset of visual deprivation. Our results suggest that criticality in excitatory networks is established by inhibitory plasticity and architecture. These data establish that criticality is consistent with a homeostatic set point for visual cortical dynamics and suggest a key role for homeostatic regulation of inhibition.
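As an illustration of the criticality statistic at stake, the sketch below generates a synthetic branching process and estimates its branching parameter (sigma is near 1 at the critical point) with a naive ancestor/descendant ratio; this stands in for, and is far cruder than, the authors' avalanche analysis of recorded activity:

```python
import numpy as np

# Synthetic branching process plus a naive branching-parameter estimate;
# a stand-in for avalanche analysis, not the authors' pipeline.
rng = np.random.default_rng(3)
sigma_true, T = 0.95, 10000
counts = np.zeros(T, dtype=int)
counts[0] = 10
for t in range(1, T):
    # each active unit triggers Poisson(sigma_true) descendants, plus drive
    counts[t] = rng.poisson(sigma_true * counts[t - 1]) + rng.poisson(0.5)

occupied = counts[:-1] > 0
sigma_hat = np.mean(counts[1:][occupied] / counts[:-1][occupied])
# note: the external drive biases this naive ratio estimator slightly upward
print(f"estimated branching parameter: {sigma_hat:.3f} (true {sigma_true})")
```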
Affiliation(s)
- Zhengyu Ma
- Department of Physics, Washington University in St. Louis, St. Louis, MO 63130, USA
- Ralf Wessel
- Department of Physics, Washington University in St. Louis, St. Louis, MO 63130, USA
- Keith B Hengen
- Department of Biology, Washington University in St. Louis, St. Louis, MO 63130, USA
26
Deger M, Seeholzer A, Gerstner W. Multicontact Co-operativity in Spike-Timing-Dependent Structural Plasticity Stabilizes Networks. Cereb Cortex 2019; 28:1396-1415. [PMID: 29300903 PMCID: PMC6041941 DOI: 10.1093/cercor/bhx339] [Citation(s) in RCA: 13] [Impact Index Per Article: 2.2] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/14/2017] [Accepted: 11/30/2017] [Indexed: 12/12/2022] Open
Abstract
Excitatory synaptic connections in the adult neocortex consist of multiple synaptic contacts, almost exclusively formed on dendritic spines. Changes of spine volume, a correlate of synaptic strength, can be tracked in vivo for weeks. Here, we present a combined model of structural and spike-timing–dependent plasticity that explains the multicontact configuration of synapses in adult neocortical networks under steady-state and lesion-induced conditions. Our plasticity rule with Hebbian and anti-Hebbian terms stabilizes both the postsynaptic firing rate and correlations between the pre- and postsynaptic activity at an active synaptic contact. Contacts appear spontaneously at a low rate and disappear if their strength approaches zero. Many presynaptic neurons compete to make strong synaptic connections onto a postsynaptic neuron, whereas the synaptic contacts of a given presynaptic neuron co-operate via postsynaptic firing. We find that co-operation of multiple synaptic contacts is crucial for stable, long-term synaptic memories. In simulations of a simplified network model of barrel cortex, our plasticity rule reproduces whisker-trimming–induced rewiring of thalamocortical and recurrent synaptic connectivity on realistic time scales.
Affiliation(s)
- Moritz Deger
- School of Computer and Communication Sciences and School of Life Sciences, Brain Mind Institute, École Polytechnique Fédérale de Lausanne, 1015 Lausanne EPFL, Switzerland
- Institute for Zoology, Faculty of Mathematics and Natural Sciences, University of Cologne, 50674 Cologne, Germany
- Alexander Seeholzer
- School of Computer and Communication Sciences and School of Life Sciences, Brain Mind Institute, École Polytechnique Fédérale de Lausanne, 1015 Lausanne EPFL, Switzerland
- Wulfram Gerstner
- School of Computer and Communication Sciences and School of Life Sciences, Brain Mind Institute, École Polytechnique Fédérale de Lausanne, 1015 Lausanne EPFL, Switzerland
27
Lappalainen J, Herpich J, Tetzlaff C. A Theoretical Framework to Derive Simple, Firing-Rate-Dependent Mathematical Models of Synaptic Plasticity. Front Comput Neurosci 2019; 13:26. [PMID: 31133837 PMCID: PMC6517541 DOI: 10.3389/fncom.2019.00026] [Citation(s) in RCA: 5] [Impact Index Per Article: 0.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/26/2018] [Accepted: 04/10/2019] [Indexed: 11/13/2022] Open
Abstract
Synaptic plasticity serves as an essential mechanism underlying cognitive processes such as learning and memory. For a better understanding, detailed theoretical models combine the experimental underpinnings of synaptic plasticity and match experimental results. However, these models are mathematically complex, impeding the comprehensive investigation of their link to cognitive processes, which are generally executed at the neuronal network level. Here, we derive a mathematical framework enabling the simplification of such detailed models of synaptic plasticity, facilitating further mathematical analyses. With this framework we obtain a compact, firing-rate-dependent mathematical formulation, which includes the essential dynamics of the detailed model and, thus, of experimentally verified properties of synaptic plasticity. Among other results, by using our framework to abstract the dynamics of two well-established calcium-dependent synaptic plasticity models, we find that the synaptic changes depend on the square of the presynaptic firing rate, in contrast to previous assumptions. Thus, the framework presented here enables the derivation of biologically plausible but simple mathematical models of synaptic plasticity, making it possible to analyze how synaptic dynamics depend on neuronal properties such as the firing rate and to investigate their implications in complex neuronal networks.
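To make the headline result tangible, here is a toy version of a firing-rate-dependent rule in which the weight change scales with the square of the presynaptic rate x; the postsynaptic factor (y - y_theta) is an arbitrary placeholder, not the form derived in the paper:

```python
# Toy firing-rate-dependent rule with the x^2 presynaptic dependence;
# the postsynaptic factor is an illustrative placeholder.
def dw_dt(x, y, eta=1e-3, y_theta=1.0):
    return eta * (x ** 2) * (y - y_theta)

print(dw_dt(x=5.0, y=2.0))   # positive: potentiation above threshold rate
print(dw_dt(x=5.0, y=0.5))   # negative: depression below threshold rate
```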
Affiliation(s)
- Janne Lappalainen
- Department of Computational Neuroscience, Third Institute of Physics-Biophysics, Georg-August-University, Göttingen, Germany
- Juliane Herpich
- Department of Computational Neuroscience, Third Institute of Physics-Biophysics, Georg-August-University, Göttingen, Germany
- Bernstein Center for Computational Neuroscience, Georg-August-University, Göttingen, Germany
- Christian Tetzlaff
- Department of Computational Neuroscience, Third Institute of Physics-Biophysics, Georg-August-University, Göttingen, Germany
- Bernstein Center for Computational Neuroscience, Georg-August-University, Göttingen, Germany
28
Fauth MJ, van Rossum MC. Self-organized reactivation maintains and reinforces memories despite synaptic turnover. eLife 2019; 8:43717. [PMID: 31074745 PMCID: PMC6546393 DOI: 10.7554/elife.43717] [Citation(s) in RCA: 32] [Impact Index Per Article: 5.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/19/2018] [Accepted: 04/30/2019] [Indexed: 01/21/2023] Open
Abstract
Long-term memories are believed to be stored in the synapses of cortical neuronal networks. However, recent experiments report continuous creation and removal of cortical synapses, which raises the question of how memories can survive on such a variable substrate. Here, we study the formation and retention of associative memory in a computational model based on Hebbian cell assemblies in the presence of both synaptic and structural plasticity. During rest periods, such as may occur during sleep, the assemblies reactivate spontaneously, reinforcing memories against ongoing synapse removal and replacement. Brief daily reactivations during rest periods suffice not only to maintain the assemblies, but even to strengthen them and improve pattern completion, consistent with offline memory gains observed experimentally. While the connectivity inside memory representations is strengthened during rest phases, connections in the rest of the network decay and vanish, thus reconciling apparently conflicting hypotheses about the influence of sleep on cortical connectivity.
Affiliation(s)
- Michael Jan Fauth
- School of Informatics, University of Edinburgh, Edinburgh, United Kingdom
- Third Physics Institute, University of Göttingen, Göttingen, Germany
- Mark CW van Rossum
- School of Psychology, University of Nottingham, Nottingham, United Kingdom
- School of Mathematical Sciences, University of Nottingham, Nottingham, United Kingdom
29
Herpich J, Tetzlaff C. Principles underlying the input-dependent formation and organization of memories. Netw Neurosci 2019; 3:606-634. [PMID: 31157312 PMCID: PMC6542621 DOI: 10.1162/netn_a_00086] [Citation(s) in RCA: 8] [Impact Index Per Article: 1.3] [Reference Citation Analysis] [Abstract] [Key Words] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/26/2018] [Accepted: 03/21/2019] [Indexed: 11/29/2022] Open
Abstract
The neuronal system exhibits the remarkable ability to dynamically store and organize incoming information into a web of memory representations (items), which is essential for the generation of complex behaviors. Central to memory function is that such memory items must be (1) discriminated from each other, (2) associated with each other, or (3) brought into a sequential order. However, how these three basic mechanisms are robustly implemented in an input-dependent manner by the underlying complex neuronal and synaptic dynamics is still unknown. Here, we develop a mathematical framework that provides a direct link between the different synaptic mechanisms determining the neuronal and synaptic dynamics of the network, and use it to create a network that emulates the above mechanisms. Combining correlation-based synaptic plasticity and homeostatic synaptic scaling, we demonstrate that these mechanisms enable the reliable formation of sequences and associations between two memory items, but still lack the capability for discrimination. We show that this shortcoming can be removed by additionally considering inhibitory synaptic plasticity. Thus, the framework presented here provides a new, functionally motivated link between different known synaptic mechanisms, leading to the self-organization of fundamental memory mechanisms.
Affiliation(s)
- Juliane Herpich
- Department of Computational Neuroscience, Third Institute of Physics - Biophysics, Georg-August-University, Göttingen, Germany
- Bernstein Center for Computational Neuroscience, Georg-August-University, Göttingen, Germany
- Christian Tetzlaff
- Department of Computational Neuroscience, Third Institute of Physics - Biophysics, Georg-August-University, Göttingen, Germany
- Bernstein Center for Computational Neuroscience, Georg-August-University, Göttingen, Germany
30
Hoke KL, Adkins-Regan E, Bass AH, McCune AR, Wolfner MF. Co-opting evo-devo concepts for new insights into mechanisms of behavioural diversity. J Exp Biol 2019; 222:jeb190058. [PMID: 30988051 DOI: 10.1242/jeb.190058] [Citation(s) in RCA: 28] [Impact Index Per Article: 4.7] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 12/29/2022]
Abstract
We propose that insights from the field of evolutionary developmental biology (or 'evo-devo') provide a framework for an integrated understanding of the origins of behavioural diversity and its underlying mechanisms. Towards that goal, in this Commentary, we frame key questions in behavioural evolution in terms of molecular, cellular and network-level properties with a focus on the nervous system. In this way, we highlight how mechanistic properties central to evo-devo analyses - such as weak linkage, versatility, exploratory mechanisms, criticality, degeneracy, redundancy and modularity - affect neural circuit function and hence the range of behavioural variation that can be filtered by selection. We outline why comparative studies of molecular and neural systems throughout ontogeny will provide novel insights into diversity in neural circuits and behaviour.
Affiliation(s)
- Kim L Hoke
- Department of Biology, Colorado State University, Fort Collins, CO 80523, USA
- Elizabeth Adkins-Regan
- Department of Psychology, Cornell University, Ithaca, NY 14853, USA
- Department of Neurobiology and Behavior, Cornell University, Ithaca, NY 14853, USA
- Andrew H Bass
- Department of Neurobiology and Behavior, Cornell University, Ithaca, NY 14853, USA
- Amy R McCune
- Department of Ecology and Evolutionary Biology, Cornell University, Ithaca, NY 14853, USA
- Mariana F Wolfner
- Department of Molecular Biology and Genetics, Cornell University, Ithaca, NY 14853, USA
31
Goodhill GJ. Theoretical Models of Neural Development. iScience 2018; 8:183-199. [PMID: 30321813 PMCID: PMC6197653 DOI: 10.1016/j.isci.2018.09.017] [Citation(s) in RCA: 10] [Impact Index Per Article: 1.4] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/15/2018] [Revised: 08/06/2018] [Accepted: 09/19/2018] [Indexed: 12/22/2022] Open
Abstract
Constructing a functioning nervous system requires the precise orchestration of a vast array of mechanical, molecular, and neural-activity-dependent cues. Theoretical models can play a vital role in helping to frame quantitative issues, reveal mathematical commonalities between apparently diverse systems, identify what is and what is not possible in principle, and test the abilities of specific mechanisms to explain the data. This review focuses on the progress that has been made over the last decade in our theoretical understanding of neural development.
Affiliation(s)
- Geoffrey J Goodhill
- Queensland Brain Institute and School of Mathematics and Physics, The University of Queensland, St Lucia, QLD 4072, Australia
32
Faghihi F, Moustafa AA. Combined Computational Systems Biology and Computational Neuroscience Approaches Help Develop of Future "Cognitive Developmental Robotics". Front Neurorobot 2017; 11:63. [PMID: 29276486 PMCID: PMC5727420 DOI: 10.3389/fnbot.2017.00063] [Citation(s) in RCA: 4] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/09/2017] [Accepted: 10/24/2017] [Indexed: 11/13/2022] Open
Affiliation(s)
- Faramarz Faghihi
- Department for Cognitive Modeling, Institute for Cognitive and Brain Sciences, Shahid Beheshti University, Tehran, Iran
- Ahmed A Moustafa
- School of Social Sciences and Psychology and Marcs Institute for Brain and Behavior, Western Sydney University, Sydney, NSW, Australia
33
Working Memory Requires a Combination of Transient and Attractor-Dominated Dynamics to Process Unreliably Timed Inputs. Sci Rep 2017; 7:2473. [PMID: 28559576 PMCID: PMC5449410 DOI: 10.1038/s41598-017-02471-z] [Citation(s) in RCA: 11] [Impact Index Per Article: 1.4] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/29/2016] [Accepted: 04/11/2017] [Indexed: 12/20/2022] Open
Abstract
Working memory stores and processes information received as a stream of continuously incoming stimuli. This requires accurate sequencing, and it remains puzzling how this can be reliably achieved by the neuronal system, as our perceptual inputs show a high degree of temporal variability. One hypothesis is that accurate timing is achieved by purely transient neuronal dynamics; by contrast, a second hypothesis states that the underlying network dynamics are dominated by attractor states. In this study, we resolve this contradiction by theoretically investigating the performance of the system using stimuli with differing timing accuracy. Interestingly, only the combination of attractor and transient dynamics enables the network to perform with a low error rate. Further analysis reveals that the transient dynamics of the system are used to process information, while the attractor states store it. The interaction between both types of dynamics yields experimentally testable predictions, and we show that in this way the system can reliably interact with a timing-unreliable Hebbian network representing long-term memory. Thus, this study provides a potential solution to the long-standing problem of the basic neuronal dynamics underlying working memory.
34
Keck T, Toyoizumi T, Chen L, Doiron B, Feldman DE, Fox K, Gerstner W, Haydon PG, Hübener M, Lee HK, Lisman JE, Rose T, Sengpiel F, Stellwagen D, Stryker MP, Turrigiano GG, van Rossum MC. Integrating Hebbian and homeostatic plasticity: the current state of the field and future research directions. Philos Trans R Soc Lond B Biol Sci 2017; 372:20160158. [PMID: 28093552 PMCID: PMC5247590 DOI: 10.1098/rstb.2016.0158] [Citation(s) in RCA: 112] [Impact Index Per Article: 14.0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Accepted: 10/20/2016] [Indexed: 11/12/2022] Open
Abstract
We summarize here the results presented and subsequent discussion from the meeting on Integrating Hebbian and Homeostatic Plasticity at the Royal Society in April 2016. We first outline the major themes and results presented at the meeting. We next provide a synopsis of the outstanding questions that emerged from the discussion at the end of the meeting and finally suggest potential directions of research that we believe are most promising to develop an understanding of how these two forms of plasticity interact to facilitate functional changes in the brain. This article is part of the themed issue 'Integrating Hebbian and homeostatic plasticity'.
Affiliation(s)
- Tara Keck
- Department of Neuroscience, Physiology and Pharmacology, University College London, London, UK
- Lu Chen
- Department of Neurosurgery, Stanford University, Stanford, CA, USA
- Brent Doiron
- Department of Mathematics, University of Pittsburgh, Pittsburgh, PA, USA
- Daniel E Feldman
- Department of Molecular and Cell Biology, University of California, Berkeley, CA, USA
- Kevin Fox
- Division of Neuroscience, University of Cardiff, Cardiff, Wales, UK
- Wulfram Gerstner
- Brain Mind Institute, École Polytechnique Fédérale de Lausanne, Lausanne, Switzerland
- Mark Hübener
- Department of Cellular and Systems Neuroscience, Max Planck Institute of Neurobiology, Martinsried, Bayern, Germany
- Hey-Kyoung Lee
- The Zanvyl Krieger Mind/Brain Institute, Johns Hopkins University, Baltimore, MD, USA
- John E Lisman
- Department of Biology, Brandeis University, Waltham, MA, USA
- Tobias Rose
- Department of Cellular and Systems Neuroscience, Max Planck Institute of Neurobiology, Martinsried, Bayern, Germany
- Frank Sengpiel
- Department of Neuroscience, Physiology and Pharmacology, University College London, London, UK
- Division of Neuroscience, University of Cardiff, Cardiff, Wales, UK
- David Stellwagen
- Centre for Research in Neuroscience, McGill University, Montreal, Quebec, Canada
- Michael P Stryker
- Sandler Neurosciences Center, University of California, San Francisco, CA, USA
35
Keck T, Hübener M, Bonhoeffer T. Interactions between synaptic homeostatic mechanisms: an attempt to reconcile BCM theory, synaptic scaling, and changing excitation/inhibition balance. Curr Opin Neurobiol 2017; 43:87-93. [PMID: 28236778 DOI: 10.1016/j.conb.2017.02.003] [Citation(s) in RCA: 60] [Impact Index Per Article: 7.5] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/13/2016] [Revised: 12/03/2016] [Accepted: 02/01/2017] [Indexed: 11/17/2022]
Abstract
Homeostatic plasticity is proposed to be mediated by synaptic changes, such as synaptic scaling and shifts in the excitation/inhibition balance. These mechanisms are thought to be separate from the Bienenstock, Cooper, Munro (BCM) learning rule, where the threshold for the induction of long-term potentiation and long-term depression slides in response to changes in activity levels. Yet, both sets of mechanisms produce a homeostatic response of a relative increase (or decrease) in strength of excitatory synapses in response to overall activity-level changes. Here we review recent studies, with a focus on in vivo experiments, to re-examine the overlap and differences between these two mechanisms and we suggest how they may interact to facilitate firing-rate homeostasis, while maintaining functional properties of neurons.
Affiliation(s)
- Tara Keck
- Department of Neuroscience, Physiology and Pharmacology, University College London, 21 University Street, London, WC1E 6DE, UK
- Mark Hübener
- Max Planck Institute of Neurobiology, Am Klopferspitz 18, 82152 Martinsried, Germany
- Tobias Bonhoeffer
- Max Planck Institute of Neurobiology, Am Klopferspitz 18, 82152 Martinsried, Germany
36
Li Y, Kulvicius T, Tetzlaff C. Induction and Consolidation of Calcium-Based Homo- and Heterosynaptic Potentiation and Depression. PLoS One 2016; 11:e0161679. [PMID: 27560350 PMCID: PMC4999190 DOI: 10.1371/journal.pone.0161679] [Citation(s) in RCA: 9] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/26/2015] [Accepted: 08/10/2016] [Indexed: 11/19/2022] Open
Abstract
The adaptive mechanisms of homo- and heterosynaptic plasticity play an important role in learning and memory. In order to maintain plasticity-induced changes for longer time scales (up to several days), they have to be consolidated by transferring them from a short-lasting early-phase to a long-lasting late-phase state. The processes underlying this synaptic consolidation are already well known for homosynaptic plasticity; however, it is not clear whether the same processes also enable the induction and consolidation of heterosynaptic plasticity. In this study, by extending a generic calcium-based plasticity model with the processes of synaptic consolidation, we show in simulations that heterosynaptic plasticity can indeed be induced and, furthermore, consolidated by the same underlying processes as homosynaptic plasticity. Furthermore, we show that local diffusion processes can restrict the heterosynaptic effect to a few synapses neighboring the homosynaptically changed ones. Taken together, this generic model reproduces many experimental results on synaptic tagging and consolidation, provides several predictions for heterosynaptic induction and consolidation, and yields insights into the complex interactions between homo- and heterosynaptic plasticity over a broad range of temporal (minutes to days) and spatial (several micrometers) scales.
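For orientation, here is a minimal sketch of the generic calcium-based plasticity rule this paper extends, using threshold constants in the spirit of Graupner and Brunel (2012): the efficacy is potentiated while the calcium trace exceeds a high threshold and depressed while it exceeds a lower one. The paper's consolidation, tagging, and diffusion processes are omitted:

```python
import numpy as np

# Sketch of a generic calcium-based plasticity rule; constants follow the
# spirit of Graupner & Brunel (2012) but are illustrative here.
rng = np.random.default_rng(4)
dt, T = 1e-3, 5.0
tau_ca, tau_w = 20e-3, 150.0        # calcium and efficacy time constants
c_pre, c_post = 0.56, 1.24          # calcium jump per pre/post spike
theta_d, theta_p = 1.0, 1.3         # depression / potentiation thresholds
gamma_d, gamma_p = 330.0, 725.0
ca, w = 0.0, 0.5

for _ in range(int(T / dt)):
    pre = rng.random() < 10 * dt    # 10 Hz Poisson pre spikes
    post = rng.random() < 10 * dt   # 10 Hz Poisson post spikes
    ca += -dt * ca / tau_ca + c_pre * pre + c_post * post
    dw = gamma_p * (1 - w) * (ca > theta_p) - gamma_d * w * (ca > theta_d)
    w += dt * dw / tau_w

print(f"final efficacy: {w:.3f}")
```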
Affiliation(s)
- Yinyun Li
- III. Institute of Physics – Biophysics, Georg-August-University, 37077 Göttingen, Germany
- Bernstein Center for Computational Neuroscience, Georg-August-University, 37077 Göttingen, Germany
- School of System Science, Beijing Normal University, 100875 Beijing, China
- Tomas Kulvicius
- III. Institute of Physics – Biophysics, Georg-August-University, 37077 Göttingen, Germany
- Maersk Mc-Kinney Moller Institute, University of Southern Denmark, 5230 Odense, Denmark
- Christian Tetzlaff
- Bernstein Center for Computational Neuroscience, Georg-August-University, 37077 Göttingen, Germany
- Max Planck Institute for Dynamics and Self-Organization, 37077 Göttingen, Germany
- Department of Neurobiology, Weizmann Institute of Science, 76100 Rehovot, Israel
37
Fauth M, Tetzlaff C. Opposing Effects of Neuronal Activity on Structural Plasticity. Front Neuroanat 2016; 10:75. [PMID: 27445713 PMCID: PMC4923203 DOI: 10.3389/fnana.2016.00075] [Citation(s) in RCA: 52] [Impact Index Per Article: 5.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/08/2015] [Accepted: 06/16/2016] [Indexed: 12/21/2022] Open
Abstract
The connectivity of the brain is continuously adjusted to new environmental influences by several activity-dependent adaptive processes. The most investigated adaptive mechanism is activity-dependent functional or synaptic plasticity, which regulates the transmission efficacy of existing synapses. Another important but less prominently discussed adaptive process is structural plasticity, which changes the connectivity by the formation and deletion of synapses. In this review, we show, based on experimental evidence, that structural plasticity can be classified, similarly to synaptic plasticity, into two categories: (i) Hebbian structural plasticity, which leads to an increase (decrease) in the number of synapses during phases of high (low) neuronal activity, and (ii) homeostatic structural plasticity, which balances these changes by removing and adding synapses. Furthermore, based on experimental and theoretical insights, we argue that each type of structural plasticity fulfills a different function. While Hebbian structural changes enhance memory lifetime, storage capacity, and memory robustness, homeostatic structural plasticity self-organizes the connectivity of the neural network to ensure stability. However, the link between functional synaptic and structural plasticity, as well as the detailed interactions between Hebbian and homeostatic structural plasticity, are more complex. This implies even richer dynamics, requiring further experimental and theoretical investigation.
Affiliation(s)
- Michael Fauth
- Department of Computational Neuroscience, Third Institute of Physics - Biophysics, Georg-August University, Göttingen, Germany
- Bernstein Center for Computational Neuroscience, Göttingen, Germany
- Christian Tetzlaff
- Bernstein Center for Computational Neuroscience, Göttingen, Germany
- Max Planck Institute for Dynamics and Self-Organization, Göttingen, Germany
38
Garrido JA, Luque NR, Tolu S, D’Angelo E. Oscillation-Driven Spike-Timing Dependent Plasticity Allows Multiple Overlapping Pattern Recognition in Inhibitory Interneuron Networks. Int J Neural Syst 2016; 26:1650020. [DOI: 10.1142/s0129065716500209] [Citation(s) in RCA: 33] [Impact Index Per Article: 3.7] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/18/2022]
Abstract
The majority of operations carried out by the brain require learning complex signal patterns for future recognition, retrieval, and reuse. Although learning is thought to depend on multiple forms of long-term synaptic plasticity, how the latter contributes to pattern recognition is still poorly understood. Here, we have used a simple model of afferent excitatory neurons and interneurons with lateral inhibition, reproducing a network topology found in many brain areas, from the cerebellum to cortical columns. When endowed with spike-timing-dependent plasticity (STDP) at the excitatory input synapses and at the inhibitory interneuron–interneuron synapses, the interneurons rapidly learned complex input patterns. Interestingly, induction of plasticity required that the network be entrained into theta-frequency band oscillations, setting the internal phase reference required to drive STDP. Inhibitory plasticity effectively distributed multiple patterns among the available interneurons, thus allowing the simultaneous detection of multiple overlapping patterns. The addition of plasticity in intrinsic excitability made the system more robust, allowing self-adjustment and rescaling in response to a broad range of input patterns. The combination of plasticity in lateral inhibitory connections and homeostatic mechanisms in the inhibitory interneurons optimized mutual information (MI) transfer. The storage of multiple complex patterns in plastic interneuron networks could be critical for the generation of sparse representations of information in excitatory neuron populations falling under their control.
Affiliation(s)
- Jesús A. Garrido
- Department of Computer Architecture and Technology, University of Granada, Periodista Daniel Saucedo Aranda s/n, Granada, 18071, Spain
- Niceto R. Luque
- Institut National de la Santé et de la Recherche Médicale, U968 and Centre National de la Recherche Scientifique, UMR_7210, Institut de la Vision, rue Moreau, 17, Paris, F75012, France
- Sorbonne Universités, Université Pierre et Marie Curie Paris 06, UMR_S 968, Place Jussieu, 4, Paris, F75252, France
- Silvia Tolu
- Center for Playware, Department of Electrical Engineering, Technical University of Denmark, Richard Petersens Plads, Elektrovej, Building 326, Lyngby, Copenhagen, 2800, Denmark
- Egidio D’Angelo
- Department of Brain and Behavioral Sciences, University of Pavia, Via Forlanini, 6, Pavia, I27100, Italy
- Brain Connectivity Center, Istituto Neurologico IRCCS Fondazione Casimiro Mondino, Via Mondino, 2, Pavia, I27100, Italy
39
Plasticity-Driven Self-Organization under Topological Constraints Accounts for Non-random Features of Cortical Synaptic Wiring. PLoS Comput Biol 2016; 12:e1004759. [PMID: 26866369 PMCID: PMC4750861 DOI: 10.1371/journal.pcbi.1004759] [Citation(s) in RCA: 31] [Impact Index Per Article: 3.4] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/21/2015] [Accepted: 01/18/2016] [Indexed: 11/19/2022] Open
Abstract
Understanding the structure and dynamics of cortical connectivity is vital to understanding cortical function. Experimental data strongly suggest that local recurrent connectivity in the cortex is significantly non-random, exhibiting, for example, above-chance bidirectionality and an overrepresentation of certain triangular motifs. Additional evidence suggests a significant distance dependency to connectivity over a local scale of a few hundred microns, and particular patterns of synaptic turnover dynamics, including a heavy-tailed distribution of synaptic efficacies, a power law distribution of synaptic lifetimes, and a tendency for stronger synapses to be more stable over time. Understanding how many of these non-random features simultaneously arise would provide valuable insights into the development and function of the cortex. While previous work has modeled some of the individual features of local cortical wiring, there is no model that begins to comprehensively account for all of them. We present a spiking network model of a rodent Layer 5 cortical slice which, via the interactions of a few simple biologically motivated intrinsic, synaptic, and structural plasticity mechanisms, qualitatively reproduces these non-random effects when combined with simple topological constraints. Our model suggests that mechanisms of self-organization arising from a small number of plasticity rules provide a parsimonious explanation for numerous experimentally observed non-random features of recurrent cortical wiring. Interestingly, similar mechanisms have been shown to endow recurrent networks with powerful learning abilities, suggesting that these mechanisms are central to understanding both structure and function of cortical synaptic wiring.

The problem of how the brain wires itself up has important implications for the understanding of both brain development and cognition. The microscopic structure of the circuits of the adult neocortex, often considered the seat of our highest cognitive abilities, is still poorly understood. Recent experiments have provided a first set of findings on the structural features of these circuits, but it is unknown how these features come about and how they are maintained. Here we present a neural network model that shows how these features might come about. It gives rise to numerous connectivity features, which have been observed in experiments, but never before simultaneously produced by a single model. Our model explains the development of these structural features as the result of a process of self-organization. The results imply that only a few simple mechanisms and constraints are required to produce, at least to the first approximation, various characteristic features of a typical fragment of brain microcircuitry. In the absence of any of these mechanisms, simultaneous production of all desired features fails, suggesting a minimal set of necessary mechanisms for their production.
40
Herpich J, Wörgötter F, Tetzlaff C. Interaction between memories in an abstract mathematical model based on the Hebbian cell assembly hypothesis. BMC Neurosci 2015. [PMCID: PMC4699137 DOI: 10.1186/1471-2202-16-s1-p253] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Track Full Text] [Download PDF] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/10/2022] Open
41
Yger P, Gilson M. Models of Metaplasticity: A Review of Concepts. Front Comput Neurosci 2015; 9:138. [PMID: 26617512 PMCID: PMC4639700 DOI: 10.3389/fncom.2015.00138] [Citation(s) in RCA: 56] [Impact Index Per Article: 5.6] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/02/2015] [Accepted: 10/27/2015] [Indexed: 11/16/2022] Open
Abstract
Part of hippocampal and cortical plasticity is characterized by synaptic modifications that depend on the joint activity of the pre- and post-synaptic neurons. To what extent those changes are determined by the exact spike timing versus the average firing rates is still a matter of debate; this may vary from brain area to brain area, as well as across neuron types. However, it has been robustly observed both in vitro and in vivo that plasticity itself slowly adapts as a function of the dynamical context, a phenomenon commonly referred to as metaplasticity. An alternative concept considers the regulation of groups of synapses with an objective at the neuronal level, for example, maintaining a given average firing rate. In that case, the change in the strength of a particular synapse of the group (e.g., due to Hebbian learning) affects the others' strengths, which has been termed heterosynaptic plasticity. Classically, Hebbian synaptic plasticity is paired in neuron network models with such mechanisms in order to stabilize the activity and/or the weight structure. Here, we present an oriented review that brings together various concepts, from heterosynaptic plasticity to metaplasticity, and show how they interact with Hebbian-type learning. We focus on approaches that are nowadays used to incorporate those mechanisms into state-of-the-art models of spiking plasticity inspired by experimental observations in the hippocampus and cortex. Making the point that metaplasticity is a ubiquitous mechanism acting on top of classical Hebbian learning and promoting the stability of neural function over multiple timescales, we stress the need to incorporate it as a key element in the framework of plasticity models. Bridging theoretical and experimental results suggests a more functional role for metaplasticity mechanisms than simply stabilizing neural activity.
Affiliation(s)
- Pierre Yger
- Sorbonne Université, UPMC Univ Paris 06, UMRS 968, Paris, France
- Institut de la Vision, INSERM, U968, Centre National de la Recherche Scientifique, UMR 7210, Paris, France
- Matthieu Gilson
- Computational Neurosciences Group, Departament de Tecnologies de la Informació i les Comunicacions, Universitat Pompeu Fabra, Barcelona, Spain
42
Grinke E, Tetzlaff C, Wörgötter F, Manoonpong P. Synaptic plasticity in a recurrent neural network for versatile and adaptive behaviors of a walking robot. Front Neurorobot 2015; 9:11. [PMID: 26528176 PMCID: PMC4602151 DOI: 10.3389/fnbot.2015.00011] [Citation(s) in RCA: 20] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/17/2015] [Accepted: 09/22/2015] [Indexed: 11/27/2022] Open
Abstract
Walking animals, such as insects, can effectively perform complex behaviors with little neural computing. For example, they can walk around their environment, escape from corners/deadlocks, and avoid or climb over obstacles. While performing all these behaviors, they can also adapt their movements to deal with an unknown situation. As a consequence, they successfully navigate through their complex environment. These versatile and adaptive abilities are the result of an integration of several ingredients embedded in their sensorimotor loop. Biological studies reveal that these ingredients include neural dynamics, plasticity, sensory feedback, and biomechanics. Generating such versatile and adaptive behaviors for a many-degrees-of-freedom (DOFs) walking robot is a challenging task. Thus, in this study, we present a bio-inspired approach to solve this task. Specifically, the approach combines neural mechanisms with plasticity, exteroceptive sensory feedback, and biomechanics. The neural mechanisms consist of adaptive neural sensory processing and modular neural locomotion control. The sensory processing is based on a small recurrent neural network consisting of two fully connected neurons. Online correlation-based learning with synaptic scaling is applied to adequately change the connections of the network. By doing so, we can effectively exploit neural dynamics (i.e., hysteresis effects and single attractors) in the network to generate different turning angles with short-term memory for a walking robot. The turning information is transmitted as descending steering signals to the neural locomotion control, which translates the signals into motor actions. As a result, the robot can walk around and adapt its turning angle to avoid obstacles in different situations. The adaptation also enables the robot to effectively escape from sharp corners or deadlocks. Using backbone joint control embedded in the locomotion control allows the robot to climb over small obstacles. Consequently, it can successfully explore and navigate complex environments. We first tested our approach in a physical simulation environment and then applied it to our real biomechanical walking robot AMOSII, with 19 DOFs, to adaptively avoid obstacles and navigate in the real world.
Affiliation(s)
- Eduard Grinke
- Bernstein Center for Computational Neuroscience, Third Institute of Physics, Georg-August-Universität Göttingen, Göttingen, Germany
- Christian Tetzlaff
- Bernstein Center for Computational Neuroscience, Third Institute of Physics, Georg-August-Universität Göttingen, Göttingen, Germany
- Department of Neurobiology, Weizmann Institute of Science, Rehovot, Israel
- Florentin Wörgötter
- Bernstein Center for Computational Neuroscience, Third Institute of Physics, Georg-August-Universität Göttingen, Göttingen, Germany
- Poramate Manoonpong
- Embodied AI and Neurorobotics Lab, Center for BioRobotics, The Mærsk Mc-Kinney Møller Institute, University of Southern Denmark, Odense M, Denmark
43
Effenberger F, Jost J, Levina A. Self-organization in Balanced State Networks by STDP and Homeostatic Plasticity. PLoS Comput Biol 2015; 11:e1004420. [PMID: 26335425 PMCID: PMC4559467 DOI: 10.1371/journal.pcbi.1004420] [Citation(s) in RCA: 37] [Impact Index Per Article: 3.7] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/02/2014] [Accepted: 06/30/2015] [Indexed: 11/18/2022] Open
Abstract
Structural inhomogeneities in synaptic efficacies have a strong impact on population response dynamics of cortical networks and are believed to play an important role in their functioning. However, little is known about how such inhomogeneities could evolve by means of synaptic plasticity. Here we present an adaptive model of a balanced neuronal network that combines two different types of plasticity, STDP and synaptic scaling. The plasticity rules yield both long-tailed distributions of synaptic weights and firing rates. Simultaneously, a highly connected subnetwork of driver neurons with strong synapses emerges. Coincident spiking activity of several driver cells can evoke population bursts and driver cells have similar dynamical properties as leader neurons found experimentally. Our model allows us to observe the delicate interplay between structural and dynamical properties of the emergent inhomogeneities. It is simple, robust to parameter changes and able to explain a multitude of different experimental findings in one basic network.

It is widely believed that the structure of neuronal circuits plays a major role in brain functioning. Although the full synaptic connectivity for larger populations is not yet assessable even by current experimental techniques, available data show that neither synaptic strengths nor the number of synapses per neuron are homogeneously distributed. Several studies have found long-tailed distributions of synaptic weights with many weak and a few exceptionally strong synaptic connections, as well as strongly connected cells and subnetworks that may play a decisive role for data processing in neural circuits. Little is known about how inhomogeneities could arise in the developing brain and we hypothesize that there is a self-organizing principle behind their appearance. In this study we show how structural inhomogeneities can emerge by simple synaptic plasticity mechanisms from an initially homogeneous network. We perform numerical simulations and show analytically how a small imbalance in the initial structure is amplified by the synaptic plasticities and their interplay. Our network can simultaneously explain several experimental observations that were previously not linked.
Affiliation(s)
- Felix Effenberger
- Max-Planck-Institute for Mathematics in the Sciences, Leipzig, Germany
- Jürgen Jost
- Max-Planck-Institute for Mathematics in the Sciences, Leipzig, Germany
- Anna Levina
- Max-Planck-Institute for Mathematics in the Sciences, Leipzig, Germany
- Bernstein Center for Computational Neuroscience Göttingen, Göttingen, Germany
44
Tetzlaff C, Dasgupta S, Kulvicius T, Wörgötter F. The Use of Hebbian Cell Assemblies for Nonlinear Computation. Sci Rep 2015; 5:12866. [PMID: 26249242 PMCID: PMC4650703 DOI: 10.1038/srep12866] [Citation(s) in RCA: 26] [Impact Index Per Article: 2.6] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/19/2015] [Accepted: 07/10/2015] [Indexed: 11/25/2022] Open
Abstract
When learning a complex task, our nervous system self-organizes large groups of neurons into coherent dynamic activity patterns. During this, a network with multiple, simultaneously active, and computationally powerful cell assemblies is created. How such ordered structures are formed while preserving the rich diversity of neural dynamics needed for computation is still unknown. Here we show that the combination of synaptic plasticity with the slower process of synaptic scaling achieves (i) the formation of cell assemblies and (ii) an enhanced diversity of neural dynamics, facilitating the learning of complex calculations. Due to synaptic scaling, the dynamics of different cell assemblies do not interfere with each other. As a consequence, this type of self-organization allows executing a difficult, six-degrees-of-freedom manipulation task with a robot, where assemblies need to learn to compute complex nonlinear transforms and, for execution, must cooperate with each other without interference. This mechanism thus permits the self-organization of computationally powerful sub-structures in dynamic networks for behavior control.
Affiliation(s)
- Christian Tetzlaff
- Institute for Physics - Biophysics, Georg-August-University, Friedrich-Hund Platz 1, 37077 Göttingen, Germany
- Bernstein Center for Computational Neuroscience, Georg-August-University, Friedrich-Hund Platz 1, 37077 Göttingen, Germany
- Sakyasingha Dasgupta
- Institute for Physics - Biophysics, Georg-August-University, Friedrich-Hund Platz 1, 37077 Göttingen, Germany
- Bernstein Center for Computational Neuroscience, Georg-August-University, Friedrich-Hund Platz 1, 37077 Göttingen, Germany
- Tomas Kulvicius
- Institute for Physics - Biophysics, Georg-August-University, Friedrich-Hund Platz 1, 37077 Göttingen, Germany
- Bernstein Center for Computational Neuroscience, Georg-August-University, Friedrich-Hund Platz 1, 37077 Göttingen, Germany
- Florentin Wörgötter
- Institute for Physics - Biophysics, Georg-August-University, Friedrich-Hund Platz 1, 37077 Göttingen, Germany
- Bernstein Center for Computational Neuroscience, Georg-August-University, Friedrich-Hund Platz 1, 37077 Göttingen, Germany
45
Fauth M, Wörgötter F, Tetzlaff C. The formation of multi-synaptic connections by the interaction of synaptic and structural plasticity and their functional consequences. PLoS Comput Biol 2015; 11:e1004031. [PMID: 25590330 PMCID: PMC4295841 DOI: 10.1371/journal.pcbi.1004031] [Citation(s) in RCA: 37] [Impact Index Per Article: 3.7] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/10/2014] [Accepted: 11/06/2014] [Indexed: 11/19/2022] Open
Abstract
Cortical connectivity emerges from the permanent interaction between neuronal activity and synaptic as well as structural plasticity. An important experimentally observed feature of this connectivity is the distribution of the number of synapses from one neuron to another, which has been measured in several cortical layers. All of these distributions are bimodal, with one peak at zero and a second one at a small number (3–8) of synapses. In this study, using a probabilistic model of structural plasticity that depends on the synaptic weights, we explore how these distributions can emerge and which functional consequences they have. We find that bimodal distributions arise generically from the interaction of structural plasticity with synaptic plasticity rules that fulfill the following biologically realistic constraints: First, the synaptic weights have to grow with the postsynaptic activity. Second, this growth curve and/or the input-output relation of the postsynaptic neuron has to change sub-linearly (negative curvature). As most neurons show such input-output relations, these constraints can be fulfilled by many biologically reasonable systems. Given such a system, we show that the different activities, which can explain the layer-specific distributions, correspond to experimentally observed activities. Considering these activities as the working point of the system and varying the pre- or postsynaptic stimulation reveals a hysteresis in the number of synapses. As a consequence, the connectivity between two neurons can be controlled by activity but is also safeguarded against overly fast changes. These results indicate that the complex dynamics between activity and plasticity will, already between a pair of neurons, induce a variety of possible stable synaptic distributions, which could support memory mechanisms.

The connectivity between neurons is modified by different mechanisms. On a time scale of minutes to hours one finds synaptic plasticity, whereas mechanisms for structural changes at axons or dendrites may take days. One main factor determining structural changes is the weight of a connection, which, in turn, is adapted by synaptic plasticity. Both mechanisms, synaptic and structural plasticity, are influenced and determined by the activity pattern in the network. Hence, it is important to understand how activity and the different plasticity mechanisms influence each other. In particular, how activity influences rewiring in adult networks is still an open question. We present a model which captures these complex interactions by abstracting structural plasticity with weight-dependent probabilities. This allows for calculating the distribution of the number of synapses between two neurons analytically. We report that biologically realistic connection patterns for different cortical layers generically arise with synaptic plasticity rules in which the synaptic weights grow with postsynaptic activity. The connectivity patterns also lead to different activity levels resembling those found in the different cortical layers. Interestingly, such a system exhibits a hysteresis by which connections remain stable longer than expected, which may add to the stability of information storage in the network.
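A minimal sketch of the paper's central ingredient, weight-dependent structural plasticity: potential synaptic sites between one neuron pair are created at a fixed rate and deleted with a probability that falls off with synaptic weight. The weight dynamics and all rates below are illustrative placeholders, not the paper's fitted model:

```python
import numpy as np

# Weight-dependent structural plasticity for one neuron pair; weaker
# synapses are likelier to be deleted. All rates are placeholders.
rng = np.random.default_rng(5)
n_slots, p_build = 10, 0.01      # potential sites, per-step creation prob.
alive = np.zeros(n_slots, dtype=bool)
w = np.zeros(n_slots)

for _ in range(20000):
    alive |= (~alive) & (rng.random(n_slots) < p_build)   # synapse creation
    # existing weights drift toward a target with some noise
    w[alive] += 0.05 * (1.0 - w[alive]) + 0.1 * rng.standard_normal(alive.sum())
    w = np.clip(w, 0.0, None)
    p_del = 0.02 * np.exp(-5.0 * w)                       # deletion prob. falls with weight
    kill = alive & (rng.random(n_slots) < p_del)
    alive[kill] = False
    w[kill] = 0.0

print(f"synapses in this connection: {alive.sum()} of {n_slots} potential sites")
```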
Affiliation(s)
- Michael Fauth
- Georg-August University Göttingen, Third Institute of Physics, Bernstein Center for Computational Neuroscience, Göttingen, Germany
- Florentin Wörgötter
- Georg-August University Göttingen, Third Institute of Physics, Bernstein Center for Computational Neuroscience, Göttingen, Germany
- Christian Tetzlaff
- Georg-August University Göttingen, Third Institute of Physics, Bernstein Center for Computational Neuroscience, Göttingen, Germany
46
Synaptic plasticity enables adaptive self-tuning critical networks. PLoS Comput Biol 2015; 11:e1004043. [PMID: 25590427 PMCID: PMC4295840 DOI: 10.1371/journal.pcbi.1004043] [Citation(s) in RCA: 37] [Impact Index Per Article: 3.7] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/07/2014] [Accepted: 11/17/2014] [Indexed: 11/19/2022] Open
Abstract
During rest, the mammalian cortex displays spontaneous neural activity. Spiking of single neurons during rest has been described as irregular and asynchronous. In contrast, recent in vivo and in vitro population measures of spontaneous activity, using LFP, EEG, MEG, or fMRI, suggest that the default state of the cortex is critical, manifested by spontaneous, scale-invariant cascades of activity known as neuronal avalanches. Criticality keeps a network poised for optimal information processing, but this view is difficult to reconcile with apparently irregular single-neuron spiking. Here, we simulate a 10,000-neuron, deterministic, plastic network of spiking neurons. We show that a combination of short- and long-term synaptic plasticity enables these networks to exhibit criticality in the face of intrinsic, i.e., self-sustained, asynchronous spiking. Brief external perturbations lead to adaptive, long-term modification of intrinsic network connectivity through long-term excitatory plasticity, whereas long-term inhibitory plasticity enables rapid self-tuning of the network back to a critical state. The critical state is characterized by a branching parameter oscillating around unity, a critical exponent close to -3/2, and a long-tailed distribution of a self-similarity parameter between 0.5 and 1.
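The avalanche statistics invoked here can be illustrated without the full network. The sketch below is an assumption-level toy, not the paper's 10,000-neuron simulation: it draws avalanches from a simple branching process, and at the critical branching parameter sigma = 1, toward which the network is described as self-tuning, avalanche sizes follow an approximate power law with exponent near -3/2.

```python
import numpy as np

rng = np.random.default_rng(0)

def avalanche_size(sigma: float, cap: int = 100_000) -> int:
    # One avalanche of a branching process: each generation spawns
    # Poisson(sigma) offspring per active unit; size = total activations.
    active, size = 1, 1
    while active and size < cap:
        active = rng.poisson(sigma * active)
        size += active
    return size

sizes = np.array([avalanche_size(1.0) for _ in range(20_000)])

# Log-binned histogram slope as a rough estimate of the size exponent
# (close to -1.5 at criticality, steeper once the finite cap bites).
bins = np.logspace(0, 4, 20)
hist, edges = np.histogram(sizes, bins=bins, density=True)
mids = np.sqrt(edges[:-1] * edges[1:])
ok = hist > 0
slope = np.polyfit(np.log10(mids[ok]), np.log10(hist[ok]), 1)[0]
print(f"estimated avalanche-size exponent: {slope:.2f}")
```

Subcritical values (sigma < 1) give exponentially truncated avalanche sizes and supercritical values give runaway events; the self-tuning described in the abstract keeps the network hovering around the sigma ≈ 1 regime.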
47
Guzman-Karlsson MC, Meadows JP, Gavin CF, Hablitz JJ, Sweatt JD. Transcriptional and epigenetic regulation of Hebbian and non-Hebbian plasticity. Neuropharmacology 2014; 80:3-17. [PMID: 24418102 DOI: 10.1016/j.neuropharm.2014.01.001] [Citation(s) in RCA: 56] [Impact Index Per Article: 5.1] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/24/2013] [Revised: 12/30/2013] [Accepted: 01/01/2014] [Indexed: 01/02/2023]
Abstract
The epigenome is uniquely positioned as a point of convergence, integrating multiple intracellular signaling cascades into a cohesive gene expression profile necessary for long-term behavioral change. The last decade of neuroepigenetic research has primarily focused on learning-induced changes in DNA methylation and chromatin modifications. Numerous studies have independently demonstrated the importance of epigenetic modifications in memory formation and retention, as well as in Hebbian plasticity. However, how these mechanisms operate in the context of other forms of plasticity is largely unknown. In this review, we examine evidence for epigenetic regulation of Hebbian plasticity. We then discuss how non-Hebbian forms of plasticity, such as intrinsic plasticity and synaptic scaling, may also be involved in producing the cellular adaptations necessary for learning-related behavioral change. Furthermore, we consider the likely roles of transcriptional and epigenetic mechanisms in regulating these forms of plasticity. In doing so, we aim to expand upon the idea that epigenetic mechanisms are critical regulators of both Hebbian and non-Hebbian forms of plasticity that ultimately drive learning and memory.
Affiliation(s)
- Jarrod P Meadows
- Department of Neurobiology, University of Alabama at Birmingham, Birmingham, AL, USA
- Cristin F Gavin
- Department of Neurobiology, University of Alabama at Birmingham, Birmingham, AL, USA
- John J Hablitz
- Department of Neurobiology, University of Alabama at Birmingham, Birmingham, AL, USA
- J David Sweatt
- Department of Neurobiology, University of Alabama at Birmingham, Birmingham, AL, USA
48
Mechanisms for stable, robust, and adaptive development of orientation maps in the primary visual cortex. J Neurosci 2013; 33:15747-66. [PMID: 24089483 DOI: 10.1523/jneurosci.1037-13.2013] [Citation(s) in RCA: 44] [Impact Index Per Article: 3.7] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/21/2022] Open
Abstract
Development of orientation maps in ferret and cat primary visual cortex (V1) has been shown to be stable, in that the earliest measurable maps are similar in form to the eventual adult map; robust, in that similar maps develop in dark rearing and in a variety of normal visual environments; and yet adaptive, in that the final map pattern reflects the statistics of the specific visual environment. How can these three properties be reconciled? Using mechanistic models of the development of neural connectivity in V1, we show for the first time that realistic stable, robust, and adaptive map development can be achieved by including two low-level mechanisms originally motivated by single-neuron results. Specifically, contrast-gain control in the retinal ganglion cells and the lateral geniculate nucleus reduces variation in the presynaptic drive due to differences in input patterns, while homeostatic plasticity of V1 neuron excitability reduces postsynaptic variability in firing rates. Together these two mechanisms, thought to be applicable across sensory systems in general, lead to biological maps that develop stably and robustly, yet adapt to the visual environment. The modeling results suggest that topographic map stability is a natural outcome of low-level processes of adaptation and normalization. The resulting model is more realistic, simpler, and far more robust, and is thus a good starting point for future studies of cortical map development.
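The two mechanisms can be caricatured in a few lines. The sketch below is a stand-in under assumed functional forms, not the published map-development model: divisive contrast-gain control compresses the spread of presynaptic drive across inputs of very different strength, and slow multiplicative adaptation of the neuron's gain (a stand-in for the model's homeostatic regulation of excitability) holds the time-averaged response at a target rate. All names and parameter values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
N, TARGET, ETA_H, SIGMA = 100, 0.1, 0.005, 0.5
w = np.full(N, 0.01)                 # afferent weights (fixed in this sketch)

gain, avg = 1.0, TARGET
raw, norm = [], []
for _ in range(20_000):
    c = rng.choice([0.2, 1.0, 5.0])               # widely varying input strength
    x = c * rng.random(N)
    raw.append(w @ x)
    x = x / (SIGMA + x.mean())                    # divisive contrast-gain control
    norm.append(w @ x)
    rate = max(0.0, gain * norm[-1])
    avg += 0.01 * (rate - avg)                    # slow estimate of the mean rate
    gain += ETA_H * gain * (1.0 - avg / TARGET)   # homeostatic gain adaptation

def cv(d):
    return np.std(d) / np.mean(d)

print(f"drive CV: raw {cv(raw):.2f} -> normalized {cv(norm):.2f}")
print(f"mean rate {avg:.3f} (target {TARGET}) via homeostatic gain {gain:.2f}")
```

Normalization roughly halves the coefficient of variation of the drive across input strengths, and the homeostatic loop pins the average response at the target, mirroring the pre- and postsynaptic variability reductions the abstract attributes to the two mechanisms.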
49
Zenke F, Hennequin G, Gerstner W. Synaptic plasticity in neural networks needs homeostasis with a fast rate detector. PLoS Comput Biol 2013; 9:e1003330. [PMID: 24244138 PMCID: PMC3828150 DOI: 10.1371/journal.pcbi.1003330] [Citation(s) in RCA: 95] [Impact Index Per Article: 7.9] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/19/2013] [Accepted: 09/25/2013] [Indexed: 01/17/2023] Open
Abstract
Hebbian changes of excitatory synapses are driven by, and further enhance, correlations between pre- and postsynaptic activities. Hence, Hebbian plasticity forms a positive feedback loop that can lead to instability in simulated neural networks. To keep activity at healthy, low levels, plasticity must therefore incorporate homeostatic control mechanisms. We find in numerical simulations of recurrent networks with a realistic triplet-based spike-timing-dependent plasticity rule (triplet STDP) that homeostasis has to detect rate changes on a timescale of seconds to minutes to keep the activity stable. We confirm this result in a generic mean-field formulation of network activity and homeostatic plasticity. Our results strongly suggest the existence of a homeostatic regulatory mechanism that reacts to firing-rate changes on the order of seconds to minutes.
Learning and memory in the brain are thought to be mediated through Hebbian plasticity: when a group of neurons is repetitively active together, their connections get strengthened. This can cause co-activation even in the absence of the stimulus that triggered the change. To avoid runaway behavior, it is important to prevent neurons from forming excessively strong connections. This is achieved by regulatory homeostatic mechanisms that constrain overall activity. Here we study the stability of background activity in a recurrent network model with a plausible Hebbian learning rule and homeostasis. We find that the activity in our model is unstable unless homeostasis reacts to rate changes on a timescale of minutes or faster. Since this timescale is incompatible with most known forms of homeostasis, this implies the existence of a previously unknown, rapid homeostatic regulatory mechanism capable of either gating the rate of plasticity or otherwise adjusting synaptic efficacies on a short timescale.
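The mean-field point, that the rate detector must be fast, can be reproduced with a two-variable toy model. The sketch below is an illustrative reduction under assumed forms, not the paper's equations: Hebbian growth proportional to the squared rate competes with a homeostatic term driven by a detector that low-pass filters the rate with time constant tau_h. The loop is stable only when tau_h is short; with a slow detector the rate runs away and then collapses.

```python
import numpy as np

def simulate(tau_h: float, t_end: float = 300.0, dt: float = 1e-3) -> float:
    # Mean-field toy: rate r tracks the weight w; a detector r_det
    # low-pass filters r and drives the homeostatic (negative) term.
    r_target, eta = 5.0, 1e-3
    w, r_det = 1.0, r_target
    for _ in range(int(t_end / dt)):
        r = 10.0 * w
        r_det += dt / tau_h * (r - r_det)
        # Hebbian positive feedback minus detector-driven homeostasis:
        w += dt * eta * r * (r - r_det**2 / r_target)
        w = min(max(w, 0.0), 5.0)      # keep the toy bounded
    return 10.0 * w

for tau_h in (1.0, 10.0, 100.0):
    print(f"detector tau = {tau_h:5.1f} s -> final rate {simulate(tau_h):5.2f} Hz")
```

Linearizing around the fixed point puts the stability boundary for these (assumed) parameters at tau_h = 1/(50 * eta) = 20 s: the 1 s and 10 s detectors settle at the 5 Hz target, while the 100 s detector destabilizes, echoing the seconds-to-minutes requirement in the abstract.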
Affiliation(s)
- Friedemann Zenke
- School of Computer and Communication Sciences and School of Life Sciences, Brain Mind Institute, Ecole polytechnique fédérale de Lausanne, Lausanne, Switzerland
- Guillaume Hennequin
- Computational and Biological Learning Laboratory, Department of Engineering, University of Cambridge, Cambridge, United Kingdom
- Wulfram Gerstner
- School of Computer and Communication Sciences and School of Life Sciences, Brain Mind Institute, Ecole polytechnique fédérale de Lausanne, Lausanne, Switzerland
50
Tetzlaff C, Kolodziejski C, Timme M, Tsodyks M, Wörgötter F. Synaptic scaling enables dynamically distinct short- and long-term memory formation. PLoS Comput Biol 2013; 9:e1003307. [PMID: 24204240 PMCID: PMC3814677 DOI: 10.1371/journal.pcbi.1003307] [Citation(s) in RCA: 34] [Impact Index Per Article: 2.8] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/03/2013] [Accepted: 09/11/2013] [Indexed: 01/17/2023] Open
Abstract
Memory storage in the brain relies on mechanisms acting on time scales from minutes, for long-term synaptic potentiation, to days, for memory consolidation. During such processes, neural circuits distinguish synapses relevant for long-term storage, which are consolidated, from synapses supporting only short-term storage, which fade. How this integration of time scales and differentiation of synapses are achieved simultaneously remains unclear. Here we show that synaptic scaling - a slow process usually associated with the maintenance of activity homeostasis - combined with synaptic plasticity may achieve both simultaneously, thereby providing a natural separation of short- from long-term storage. The interaction between plasticity and scaling also provides an explanation for an established paradox whereby memory consolidation critically depends on the exact order of learning and recall. These results indicate that scaling may be fundamental for stabilizing memories, providing a dynamic link between early and late memory formation processes.
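In the authors' related modeling work, the combined dynamics take the general form dw/dt = mu*u*v + gamma*(v_T - v)*w^2, a Hebbian term plus weight-dependent synaptic scaling toward a target activity v_T. The sketch below uses that form with an assumed linear activity model (v = u*w) and illustrative parameters, simply to show the division of labor: the Hebbian term alone diverges, while adding the slow scaling term yields a stable weight.

```python
import numpy as np

def evolve(gamma: float, t_end: float = 2000.0, dt: float = 0.01) -> float:
    # Euler-integrate dw/dt = mu*u*v + gamma*(v_target - v)*w**2 with an
    # assumed linear activity model v = u*w; parameters are illustrative.
    mu, v_target, u, w = 0.01, 1.0, 1.0, 0.5
    for _ in range(int(t_end / dt)):
        v = u * w
        w += dt * (mu * u * v + gamma * (v_target - v) * w**2)
        if w > 1e6:                    # Hebbian runaway
            return np.inf
    return w

print("Hebbian term only (gamma = 0):   w ->", evolve(gamma=0.0))
print("Hebbian + synaptic scaling:      w ->", evolve(gamma=5e-4))
```

Because gamma is small, the scaling term shapes the weight on a much slower timescale than the Hebbian term; this separation of time scales is what the abstract exploits to distinguish short- from long-term storage.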
Affiliation(s)
- Christian Tetzlaff
- Faculty of Physics – Biophysics, Georg-August-University, Friedrich-Hund-Platz 1, Göttingen, Germany
- Network Dynamics Group, Max Planck Institute for Dynamics and Self-Organization, Göttingen, Germany
- Bernstein Center for Computational Neuroscience, Georg-August-University, Friedrich-Hund-Platz 1, Göttingen, Germany
- Christoph Kolodziejski
- Network Dynamics Group, Max Planck Institute for Dynamics and Self-Organization, Göttingen, Germany
- Bernstein Center for Computational Neuroscience, Georg-August-University, Friedrich-Hund-Platz 1, Göttingen, Germany
- Faculty of Physics – Nonlinear Dynamics, Georg-August-University, Friedrich-Hund-Platz 1, Göttingen, Germany
- Marc Timme
- Network Dynamics Group, Max Planck Institute for Dynamics and Self-Organization, Göttingen, Germany
- Bernstein Center for Computational Neuroscience, Georg-August-University, Friedrich-Hund-Platz 1, Göttingen, Germany
- Faculty of Physics – Nonlinear Dynamics, Georg-August-University, Friedrich-Hund-Platz 1, Göttingen, Germany
- Misha Tsodyks
- Department of Neurobiology, Weizmann Institute of Science, Rehovot, Israel
- Florentin Wörgötter
- Faculty of Physics – Biophysics, Georg-August-University, Friedrich-Hund-Platz 1, Göttingen, Germany
- Bernstein Center for Computational Neuroscience, Georg-August-University, Friedrich-Hund-Platz 1, Göttingen, Germany