1. Gillett M, Brunel N. Dynamic control of sequential retrieval speed in networks with heterogeneous learning rules. eLife 2024; 12:RP88805. PMID: 39197099; PMCID: PMC11357343; DOI: 10.7554/elife.88805.
Abstract
Temporal rescaling of sequential neural activity has been observed in multiple brain areas during behaviors involving time estimation and motor execution at variable speeds. Temporally asymmetric Hebbian rules have been used in network models to learn and retrieve sequential activity, with characteristics that are qualitatively consistent with experimental observations. However, in these models sequential activity is retrieved at a fixed speed. Here, we investigate the effects of a heterogeneity of plasticity rules on network dynamics. In a model in which neurons differ by the degree of temporal symmetry of their plasticity rule, we find that retrieval speed can be controlled by varying external inputs to the network. Neurons with temporally symmetric plasticity rules act as brakes and tend to slow down the dynamics, while neurons with temporally asymmetric rules act as accelerators of the dynamics. We also find that such networks can naturally generate separate 'preparatory' and 'execution' activity patterns with appropriate external inputs.
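The division of labor described in this abstract, asymmetric terms as accelerators and symmetric terms as brakes, can be illustrated in a minimal rate model. The rule below is a sketch under stated assumptions, not the paper's model; parameter names and values are illustrative, and it only shows how a temporally asymmetric Hebbian term stores sequence order:

```python
import numpy as np

def hebbian_update(rates, eta=0.01, a_sym=0.0, a_asym=1.0):
    """Build a weight matrix from a sequence of rate patterns.

    a_sym weights the temporally symmetric term (co-active at the same
    step, the 'brake' that pulls activity back to the current pattern);
    a_asym weights the asymmetric term (pre leads post by one step, the
    'accelerator' that pushes activity toward the next pattern).
    Names and values are illustrative, not taken from the paper.
    """
    T, N = rates.shape
    W = np.zeros((N, N))
    for t in range(1, T):
        W += a_sym * np.outer(rates[t], rates[t])       # same-time term
        W += a_asym * np.outer(rates[t], rates[t - 1])  # sequential term
    return eta * W / (T - 1)

rng = np.random.default_rng(0)
patterns = rng.standard_normal((5, 50))
W = hebbian_update(patterns)  # purely asymmetric storage

# Probing the recurrent input with pattern 2: it overlaps most strongly
# with pattern 3, i.e. the asymmetric term advances the sequence.
overlaps = patterns @ (W @ patterns[2]) / 50
```

Setting a_sym > 0 adds a component aligned with the current pattern, which is the 'brake' intuition: retrieval speed then depends on the balance between the two terms, here controllable only through fixed coefficients rather than the external inputs the paper studies.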
Affiliation(s)
- Maxwell Gillett
- Department of Neurobiology, Duke University, Durham, United States
- Nicolas Brunel
- Department of Neurobiology, Duke University, Durham, United States
- Department of Physics, Duke University, Durham, United States
2. de Brito CSN, Gerstner W. Learning what matters: Synaptic plasticity with invariance to second-order input correlations. PLoS Comput Biol 2024; 20:e1011844. PMID: 38346073; PMCID: PMC10890752; DOI: 10.1371/journal.pcbi.1011844.
Abstract
Cortical populations of neurons develop sparse representations adapted to the statistics of the environment. To learn efficient population codes, synaptic plasticity mechanisms must differentiate relevant latent features from spurious input correlations, which are omnipresent in cortical networks. Here, we develop a theory for sparse coding and synaptic plasticity that is invariant to second-order correlations in the input. Going beyond classical Hebbian learning, our learning objective explains the functional form of observed excitatory plasticity mechanisms, showing how Hebbian long-term depression (LTD) cancels the sensitivity to second-order correlations so that receptive fields become aligned with features hidden in higher-order statistics. Invariance to second-order correlations enhances the versatility of biologically realistic learning models, supporting optimal decoding from noisy inputs and sparse population coding from spatially correlated stimuli. In a spiking model with triplet spike-timing-dependent plasticity (STDP), we show that individual neurons can learn localized oriented receptive fields, circumventing the need for input preprocessing, such as whitening, or population-level lateral inhibition. The theory advances our understanding of local unsupervised learning in cortical circuits, offers new interpretations of the Bienenstock-Cooper-Munro and triplet STDP models, and assigns a specific functional role to synaptic LTD mechanisms in pyramidal neurons.
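For contrast with the invariance result above, the classical Hebbian baseline is driven entirely by second-order input statistics. The sketch below uses Oja's rule with illustrative parameters (it is not the paper's learning objective): the weight vector aligns with the leading eigenvector of the input covariance, which is precisely the sensitivity the proposed LTD term is designed to cancel:

```python
import numpy as np

def oja_step(w, x, eta=0.01):
    """Classical Oja update: Hebbian growth with implicit normalization.

    Shown here only to illustrate second-order sensitivity: this rule
    converges to the top eigenvector of the input covariance, i.e. it
    is driven entirely by pairwise correlations.
    """
    y = w @ x
    return w + eta * y * (x - y * w)

rng = np.random.default_rng(1)
# Inputs with one dominant correlation direction (covariance diag(4, 1, 0.25))
C = np.diag([4.0, 1.0, 0.25])
X = rng.standard_normal((5000, 3)) @ np.sqrt(C)

w = rng.standard_normal(3)
w /= np.linalg.norm(w)
for x in X:
    w = oja_step(w, x)

# The learned weight aligns (up to sign) with the leading eigenvector (1, 0, 0)
alignment = abs(w[0]) / np.linalg.norm(w)
```

Under the paper's rule, by contrast, such second-order structure would be cancelled so that receptive fields can lock onto features carried by higher-order statistics.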
Affiliation(s)
- Carlos Stein Naves de Brito
- École Polytechnique Fédérale de Lausanne, EPFL, Lausanne, Switzerland
- Champalimaud Research, Champalimaud Centre for the Unknown, Lisbon, Portugal
- Wulfram Gerstner
- École Polytechnique Fédérale de Lausanne, EPFL, Lausanne, Switzerland
3. Miehl C, Onasch S, Festa D, Gjorgjieva J. Formation and computational implications of assemblies in neural circuits. J Physiol 2022. PMID: 36068723; DOI: 10.1113/jp282750.
Abstract
In the brain, patterns of neural activity represent sensory information and store it in non-random synaptic connectivity. A prominent theoretical hypothesis states that assemblies, groups of neurons that are strongly connected to each other, are the key computational units underlying perception and memory formation. Compatible with these hypothesised assemblies, experiments have revealed groups of neurons that display synchronous activity, either spontaneously or upon stimulus presentation, and exhibit behavioural relevance. While it remains unclear how assemblies form in the brain, theoretical work has vastly contributed to the understanding of various interacting mechanisms in this process. Here, we review the recent theoretical literature on assembly formation by categorising the involved mechanisms into four components: synaptic plasticity, symmetry breaking, competition and stability. We highlight different approaches and assumptions behind assembly formation and discuss recent ideas of assemblies as the key computational unit in the brain.

Abstract figure legend: Assembly formation. Assemblies are groups of strongly connected neurons formed by the interaction of multiple mechanisms and with vast computational implications. Four interacting components are thought to drive assembly formation: synaptic plasticity, symmetry breaking, competition and stability.
Affiliation(s)
- Christoph Miehl
- Computation in Neural Circuits, Max Planck Institute for Brain Research, 60438, Frankfurt, Germany
- School of Life Sciences, Technical University of Munich, 85354, Freising, Germany
- Sebastian Onasch
- Computation in Neural Circuits, Max Planck Institute for Brain Research, 60438, Frankfurt, Germany
- School of Life Sciences, Technical University of Munich, 85354, Freising, Germany
- Dylan Festa
- Computation in Neural Circuits, Max Planck Institute for Brain Research, 60438, Frankfurt, Germany
- School of Life Sciences, Technical University of Munich, 85354, Freising, Germany
- Julijana Gjorgjieva
- Computation in Neural Circuits, Max Planck Institute for Brain Research, 60438, Frankfurt, Germany
- School of Life Sciences, Technical University of Munich, 85354, Freising, Germany
4. Wang B, Aljadeff J. Multiplicative Shot-Noise: A New Route to Stability of Plastic Networks. Phys Rev Lett 2022; 129:068101. PMID: 36018633; DOI: 10.1103/physrevlett.129.068101.
Abstract
Fluctuations of synaptic weights, among many other physical, biological, and ecological quantities, are driven by coincident events of two "parent" processes. We propose a multiplicative shot-noise model that can capture the behaviors of a broad range of such natural phenomena, and analytically derive an approximation that accurately predicts its statistics. We apply our results to study the effects of a multiplicative synaptic plasticity rule that was recently extracted from measurements in physiological conditions. Using mean-field theory analysis and network simulations, we investigate how this rule shapes the connectivity and dynamics of recurrent spiking neural networks. The multiplicative plasticity rule is shown to support efficient learning of input stimuli, and it gives a stable, unimodal synaptic-weight distribution with a large fraction of strong synapses. The strong synapses remain stable over long times but do not "run away." Our results suggest that the multiplicative shot-noise offers a new route to understand the tradeoff between flexibility and stability in neural circuits and other dynamic networks.
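A toy version of such a coincidence-driven multiplicative process can be written in a few lines. The event rate, jump size, and relaxation below are illustrative assumptions, not the paper's measured rule; the sketch only shows the defining features: jumps occur at coincidences of two parent Poisson processes, and each jump scales with the current weight:

```python
import numpy as np

def simulate_weight(w0=0.1, rate=2.0, window=0.02, gain=0.05,
                    decay=0.1, T=200.0, dt=0.001, seed=0):
    """Toy multiplicative shot-noise for a single synaptic weight.

    Coincidences of two independent Poisson 'parent' trains (each of
    rate `rate`) occur at rate ~ 2 * rate**2 * window; each coincidence
    multiplies the weight by (1 + gain); between events the weight
    relaxes toward its baseline w0. All values are illustrative.
    """
    rng = np.random.default_rng(seed)
    p_jump = 2.0 * rate * rate * window * dt  # coincidence probability per step
    w = w0
    samples = np.empty(int(round(T / dt)))
    for i in range(samples.size):
        if rng.random() < p_jump:
            w *= 1.0 + gain                   # multiplicative jump
        w -= decay * (w - w0) * dt            # slow relaxation to baseline
        samples[i] = w
    return samples

w_trace = simulate_weight()
# Jumps proportional to w keep the weight positive and produce a skewed
# but stable distribution of synaptic strengths: strong synapses persist
# without running away.
```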
Affiliation(s)
- Bin Wang
- Department of Physics, University of California San Diego, La Jolla, California 92093, USA
- Johnatan Aljadeff
- Department of Neurobiology, University of California San Diego, La Jolla, California 92093, USA
5. Drifting assemblies for persistent memory: Neuron transitions and unsupervised compensation. Proc Natl Acad Sci U S A 2021; 118:2023832118. PMID: 34772802; DOI: 10.1073/pnas.2023832118.
Abstract
Change is ubiquitous in living beings. In particular, the connectome and neural representations can change. Nevertheless, behaviors and memories often persist over long times. In a standard model, associative memories are represented by assemblies of strongly interconnected neurons. For faithful storage these assemblies are assumed to consist of the same neurons over time. Here we propose a contrasting memory model with complete temporal remodeling of assemblies, based on experimentally observed changes of synapses and neural representations. The assemblies drift freely as noisy autonomous network activity and spontaneous synaptic turnover induce neuron exchange. The gradual exchange allows activity-dependent and homeostatic plasticity to conserve the representational structure and keep inputs, outputs, and assemblies consistent. This leads to persistent memory. Our findings explain recent experimental results on temporal evolution of fear memory representations and suggest that memory systems need to be understood in their completeness as individual parts may constantly change.
6. Aljadeff J, Gillett M, Pereira Obilinovic U, Brunel N. From synapse to network: models of information storage and retrieval in neural circuits. Curr Opin Neurobiol 2021; 70:24-33. PMID: 34175521; DOI: 10.1016/j.conb.2021.05.005.
Abstract
The mechanisms of information storage and retrieval in brain circuits are still the subject of debate. It is widely believed that information is stored at least in part through changes in synaptic connectivity in networks that encode this information and that these changes lead in turn to modifications of network dynamics, such that the stored information can be retrieved at a later time. Here, we review recent progress in deriving synaptic plasticity rules from experimental data and in understanding how plasticity rules affect the dynamics of recurrent networks. We show that the dynamics generated by such networks exhibit a large degree of diversity, depending on parameters, similar to experimental observations in vivo during delayed response tasks.
Affiliation(s)
- Johnatan Aljadeff
- Neurobiology Section, Division of Biological Sciences, UC San Diego, USA
- Nicolas Brunel
- Department of Neurobiology, Duke University, USA
- Department of Physics, Duke University, USA
7. Akil AE, Rosenbaum R, Josić K. Balanced networks under spike-time dependent plasticity. PLoS Comput Biol 2021; 17:e1008958. PMID: 33979336; PMCID: PMC8143429; DOI: 10.1371/journal.pcbi.1008958.
Abstract
The dynamics of local cortical networks are irregular, but correlated. Dynamic excitatory-inhibitory balance is a plausible mechanism that generates such irregular activity, but it remains unclear how balance is achieved and maintained in plastic neural networks. In particular, it is not fully understood how plasticity-induced changes in the network affect balance, and in turn, how correlated, balanced activity impacts learning. How do the dynamics of balanced networks change under different plasticity rules? How does correlated spiking activity in recurrent networks change the evolution of weights, their eventual magnitude, and structure across the network? To address these questions, we develop a theory of spike-timing dependent plasticity in balanced networks. We show that balance can be attained and maintained under plasticity-induced weight changes. We find that correlations in the input mildly affect the evolution of synaptic weights. Under certain plasticity rules, we find an emergence of correlations between firing rates and synaptic weights. Under these rules, synaptic weights converge to a stable manifold in weight space with their final configuration dependent on the initial state of the network. Lastly, we show that our framework can also describe the dynamics of plastic balanced networks when subsets of neurons receive targeted optogenetic input.

Animals are able to learn complex tasks through changes in individual synapses between cells. Such changes lead to the coevolution of neural activity patterns and the structure of neural connectivity, but the consequences of these interactions are not fully understood. We consider plasticity in model neural networks which achieve an average balance between the excitatory and inhibitory synaptic inputs to different cells, and display cortical-like, irregular activity. We extend the theory of balanced networks to account for synaptic plasticity and show which rules can maintain balance, and which will drive the network into a different state. This theory of plasticity can provide insights into the relationship between stimuli, network dynamics, and synaptic circuitry.
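The pair-based rule analyzed in this line of work is commonly written as an exponential STDP window. The amplitudes and time constants below are illustrative, not the paper's fitted values; the slight depression bias is a common modelling choice when studying weight stability in balanced networks:

```python
import numpy as np

def stdp_window(dt, a_plus=0.005, a_minus=0.00525,
                tau_plus=0.02, tau_minus=0.02):
    """Pair-based STDP window, dt = t_post - t_pre in seconds.

    Pre-before-post (dt >= 0) potentiates, post-before-pre depresses.
    Amplitudes and time constants are illustrative; a_minus is set
    slightly larger than a_plus so the window integral is negative.
    """
    dt = np.asarray(dt, dtype=float)
    return np.where(dt >= 0,
                    a_plus * np.exp(-dt / tau_plus),
                    -a_minus * np.exp(dt / tau_minus))

# The window's integral sets the drift of the mean weight under
# uncorrelated firing; a slightly negative integral favours stability.
dts = np.arange(-0.2, 0.2, 1e-4)
integral = stdp_window(dts).sum() * 1e-4
```

How correlations beyond this uncorrelated baseline reshape the weight evolution is exactly what the paper's balanced-network theory quantifies.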
Affiliation(s)
- Alan Eric Akil
- Department of Mathematics, University of Houston, Houston, Texas, United States of America
- Robert Rosenbaum
- Department of Applied and Computational Mathematics and Statistics, University of Notre Dame, Notre Dame, Indiana, United States of America
- Interdisciplinary Center for Network Science and Applications, University of Notre Dame, Notre Dame, Indiana, United States of America
- Krešimir Josić
- Department of Mathematics, University of Houston, Houston, Texas, United States of America
- Department of Biology and Biochemistry, University of Houston, Houston, Texas, United States of America
8. Characteristics of sequential activity in networks with temporally asymmetric Hebbian learning. Proc Natl Acad Sci U S A 2020; 117:29948-29958. PMID: 33177232; PMCID: PMC7703604; DOI: 10.1073/pnas.1918674117.
Abstract
Sequential activity is a prominent feature of many neural systems, in multiple behavioral contexts. Here, we investigate how Hebbian rules lead to storage and recall of random sequences of inputs in both rate and spiking recurrent networks. In the case of the simplest (bilinear) rule, we characterize extensively the regions in parameter space that allow sequence retrieval and compute analytically the storage capacity of the network. We show that nonlinearities in the learning rule can lead to sparse sequences and find that sequences maintain robust decoding but display highly labile dynamics to continuous changes in the connectivity matrix, similar to recent observations in hippocampus and parietal cortex.

Sequential activity has been observed in multiple neuronal circuits across species, neural structures, and behaviors. It has been hypothesized that sequences could arise from learning processes. However, it is still unclear whether biologically plausible synaptic plasticity rules can organize neuronal activity to form sequences whose statistics match experimental observations. Here, we investigate temporally asymmetric Hebbian rules in sparsely connected recurrent rate networks and develop a theory of the transient sequential activity observed after learning. These rules transform a sequence of random input patterns into synaptic weight updates. After learning, recalled sequential activity is reflected in the transient correlation of network activity with each of the stored input patterns. Using mean-field theory, we derive a low-dimensional description of the network dynamics and compute the storage capacity of these networks. Multiple temporal characteristics of the recalled sequential activity are consistent with experimental observations. We find that the degree of sparseness of the recalled sequences can be controlled by nonlinearities in the learning rule. Furthermore, sequences maintain robust decoding, but display highly labile dynamics, when synaptic connectivity is continuously modified due to noise or storage of other patterns, similar to recent observations in hippocampus and parietal cortex. Finally, we demonstrate that our results also hold in recurrent networks of spiking neurons with separate excitatory and inhibitory populations.
9. Montangie L, Miehl C, Gjorgjieva J. Autonomous emergence of connectivity assemblies via spike triplet interactions. PLoS Comput Biol 2020; 16:e1007835. PMID: 32384081; PMCID: PMC7239496; DOI: 10.1371/journal.pcbi.1007835.
Abstract
Non-random connectivity can emerge without structured external input driven by activity-dependent mechanisms of synaptic plasticity based on precise spiking patterns. Here we analyze the emergence of global structures in recurrent networks based on a triplet model of spike timing dependent plasticity (STDP), which depends on the interactions of three precisely-timed spikes, and can describe plasticity experiments with varying spike frequency better than the classical pair-based STDP rule. We derive synaptic changes arising from correlations up to third-order and describe them as the sum of structural motifs, which determine how any spike in the network influences a given synaptic connection through possible connectivity paths. This motif expansion framework reveals novel structural motifs under the triplet STDP rule, which support the formation of bidirectional connections and ultimately the spontaneous emergence of global network structure in the form of self-connected groups of neurons, or assemblies. We propose that under triplet STDP assembly structure can emerge without the need for externally patterned inputs or assuming a symmetric pair-based STDP rule common in previous studies. The emergence of non-random network structure under triplet STDP occurs through internally-generated higher-order correlations, which are ubiquitous in natural stimuli and neuronal spiking activity, and important for coding. We further demonstrate how neuromodulatory mechanisms that modulate the shape of the triplet STDP rule or the synaptic transmission function differentially promote structural motifs underlying the emergence of assemblies, and quantify the differences using graph theoretic measures.

Emergent non-random connectivity structures in different brain regions are tightly related to specific patterns of neural activity and support diverse brain functions. For instance, self-connected groups of neurons, known as assemblies, have been proposed to represent functional units in brain circuits and can emerge even without patterned external instruction. Here we investigate the emergence of non-random connectivity in recurrent networks using a particular plasticity rule, triplet STDP, which relies on the interaction of spike triplets and can capture higher-order statistical dependencies in neural activity. We derive the evolution of the synaptic strengths in the network and explore the conditions for the self-organization of connectivity into assemblies. We demonstrate key differences of the triplet STDP rule compared to the classical pair-based rule in terms of how assemblies are formed, including the realistic asymmetric shape and influence of novel connectivity motifs on network plasticity driven by higher-order correlations. Assembly formation depends on the specific shape of the STDP window and synaptic transmission function, pointing towards an important role of neuromodulatory signals on formation of intrinsically generated assemblies.
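A minimal trace-based implementation in the spirit of triplet STDP (Pfister-Gerstner-type models; every parameter value below is illustrative, not from this paper) shows the rule's signature: with the same pre-post lag, higher pairing frequency recruits the triplet term and yields stronger potentiation:

```python
import numpy as np

def run_triplet_stdp(pre_times, post_times, T, dt=0.001,
                     a2_plus=5e-3, a3_plus=6e-3, a2_minus=7e-3,
                     tau_plus=0.017, tau_minus=0.034, tau_y=0.114):
    """Trace-based triplet STDP sketch (illustrative parameters).

    r1: presynaptic trace (pair LTP), o1: fast postsynaptic trace
    (pair LTD), o2: slow postsynaptic trace carrying the triplet term.
    At a post spike, LTP = r1 * (a2_plus + a3_plus * o2_before), so
    potentiation grows when post spikes arrive in quick succession.
    """
    pre = set(int(round(t / dt)) for t in pre_times)
    post = set(int(round(t / dt)) for t in post_times)
    r1 = o1 = o2 = 0.0
    w = 0.0
    for step in range(int(round(T / dt))):
        o2_before = o2                 # slow trace just before this step's spike
        if step in pre:
            w -= a2_minus * o1         # pair LTD: post before pre
            r1 += 1.0
        if step in post:
            w += r1 * (a2_plus + a3_plus * o2_before)  # pair + triplet LTP
            o1 += 1.0
            o2 += 1.0
        r1 -= r1 * dt / tau_plus       # Euler decay of all traces
        o1 -= o1 * dt / tau_minus
        o2 -= o2 * dt / tau_y
    return w

def pairing(freq, n=10, lag=0.01):
    """n pre spikes at `freq` Hz, each followed by a post spike after `lag`."""
    t_pre = np.arange(n) / freq
    return run_triplet_stdp(t_pre, t_pre + lag, T=t_pre[-1] + 1.0)

w_low, w_high = pairing(1.0), pairing(20.0)
# Same number of pairings and the same lag, but high-frequency pairing
# is substantially more potentiating: the frequency dependence that the
# pair-based rule cannot capture.
```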
Affiliation(s)
- Lisandro Montangie
- Computation in Neural Circuits Group, Max Planck Institute for Brain Research, Frankfurt, Germany
- Christoph Miehl
- Computation in Neural Circuits Group, Max Planck Institute for Brain Research, Frankfurt, Germany
- Technical University of Munich, School of Life Sciences, Freising, Germany
- Julijana Gjorgjieva
- Computation in Neural Circuits Group, Max Planck Institute for Brain Research, Frankfurt, Germany
- Technical University of Munich, School of Life Sciences, Freising, Germany
10. Shamir M. Theories of rhythmogenesis. Curr Opin Neurobiol 2019; 58:70-77. PMID: 31408837; DOI: 10.1016/j.conb.2019.07.005.
Abstract
Rhythmogenesis is the process that develops the capacity for rhythmic activity in a non-rhythmic system. Theoretical works suggested a wide array of possible mechanisms for rhythmogenesis ranging from the regulation of cellular properties to top-down control. Here we discuss theories of rhythmogenesis with an emphasis on spike timing-dependent plasticity. We argue that even though the specifics of different mechanisms vary greatly they all share certain key features. Namely, rhythmogenesis can be described as a flow on the phase diagram leading the system into a rhythmic region and stabilizing it on a specific manifold characterized by the desired rhythmic activity. Functionality is retained despite biological diversity by forcing the system into a specific manifold, but allowing fluctuations within that manifold.
Affiliation(s)
- Maoz Shamir
- Department of Physiology and Cell Biology, Faculty of Health Sciences, Department of Physics, Faculty of Natural Sciences, Zlotowski Center for Neuroscience, Ben-Gurion University of the Negev, Be'er-Sheva, Israel
- The Kavli Institute for Theoretical Physics, University of California, Santa Barbara, USA
11. Lappalainen J, Herpich J, Tetzlaff C. A Theoretical Framework to Derive Simple, Firing-Rate-Dependent Mathematical Models of Synaptic Plasticity. Front Comput Neurosci 2019; 13:26. PMID: 31133837; PMCID: PMC6517541; DOI: 10.3389/fncom.2019.00026.
Abstract
Synaptic plasticity serves as an essential mechanism underlying cognitive processes such as learning and memory. To capture these processes in detail, theoretical models combine the experimental underpinnings of synaptic plasticity and match experimental results. However, these models are mathematically complex, impeding a comprehensive investigation of their link to cognitive processes, which are generally executed at the neuronal network level. Here, we derive a mathematical framework enabling the simplification of such detailed models of synaptic plasticity, facilitating further mathematical analyses. With this framework we obtain a compact, firing-rate-dependent mathematical formulation that includes the essential dynamics of the detailed model and, thus, of experimentally verified properties of synaptic plasticity. Among other results, by testing our framework on the dynamics of two well-established calcium-dependent synaptic plasticity models, we find that the synaptic changes depend on the square of the presynaptic firing rate, in contrast to previous assumptions. The here-presented framework thus enables the derivation of biologically plausible but simple mathematical models of synaptic plasticity, making it possible to analyze how synaptic dynamics depend on neuronal properties such as the firing rate and to investigate their implications in complex neuronal networks.
Affiliation(s)
- Janne Lappalainen
- Department of Computational Neuroscience, Third Institute of Physics-Biophysics, Georg-August-University, Göttingen, Germany
- Juliane Herpich
- Department of Computational Neuroscience, Third Institute of Physics-Biophysics, Georg-August-University, Göttingen, Germany
- Bernstein Center for Computational Neuroscience, Georg-August-University, Göttingen, Germany
- Christian Tetzlaff
- Department of Computational Neuroscience, Third Institute of Physics-Biophysics, Georg-August-University, Göttingen, Germany
- Bernstein Center for Computational Neuroscience, Georg-August-University, Göttingen, Germany
12. Damicelli F, Hilgetag CC, Hütt MT, Messé A. Topological reinforcement as a principle of modularity emergence in brain networks. Netw Neurosci 2019; 3:589-605. PMID: 31157311; PMCID: PMC6542620; DOI: 10.1162/netn_a_00085.
Abstract
Modularity is a ubiquitous topological feature of structural brain networks at various scales. Although a variety of potential mechanisms have been proposed, the fundamental principles by which modularity emerges in neural networks remain elusive. We tackle this question with a plasticity model of neural networks derived from a purely topological perspective. Our topological reinforcement model acts by enhancing the topological overlap between nodes, that is, by iteratively allowing connections between non-neighbor nodes with high neighborhood similarity. This rule reliably evolves synthetic random networks toward a modular architecture. The final modular structure reflects initial "proto-modules," making it possible to predict the modules of the evolved graph. Subsequently, we show that this topological selection principle might be biologically implemented as a Hebbian rule. Concretely, we explore a simple model of excitable dynamics, where the plasticity rule acts based on the functional connectivity (co-activations) between nodes. Results produced by the activity-based model are consistent with those from the purely topological rule in terms of the final network configuration and module composition. Our findings suggest that the selective reinforcement of topological overlap may be a fundamental mechanism contributing to modularity emergence in brain networks.
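The reinforcement step can be sketched directly. Using Jaccard similarity as the neighborhood-overlap measure is an illustrative reading, and the density-preserving pruning of the published model is omitted here; the sketch only shows the core selection principle, which closes triangles between nodes that share neighbors:

```python
import numpy as np

def reinforce_step(A):
    """Add one edge between the non-adjacent pair with the highest
    neighborhood overlap (here: Jaccard similarity of neighbor sets).

    A is a symmetric boolean adjacency matrix, modified in place.
    Returns the added pair, or None if no pair has positive overlap.
    The published model also keeps edge density fixed by pruning,
    which this sketch leaves out.
    """
    n = A.shape[0]
    best, pair = 0.0, None
    for i in range(n):
        for j in range(i + 1, n):
            if A[i, j]:
                continue
            common = np.logical_and(A[i], A[j]).sum()
            union = np.logical_or(A[i], A[j]).sum()
            score = common / union if union else 0.0
            if score > best:
                best, pair = score, (i, j)
    if pair is not None:
        A[pair[0], pair[1]] = A[pair[1], pair[0]] = True
    return pair

# Two non-adjacent nodes that share all their neighbors get wired first,
# closing a triangle and thereby sharpening an initial proto-module.
A = np.zeros((5, 5), dtype=bool)
for i, j in [(0, 1), (1, 2), (3, 4)]:
    A[i, j] = A[j, i] = True
added = reinforce_step(A)  # nodes 0 and 2 share neighbor 1
```

Iterating this step (with pruning to hold density fixed, as in the paper) is what drives random graphs toward a modular architecture.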
Affiliation(s)
- Fabrizio Damicelli
- Institute of Computational Neuroscience, University Medical Center Eppendorf, Hamburg University, Hamburg, Germany
- Claus C. Hilgetag
- Institute of Computational Neuroscience, University Medical Center Eppendorf, Hamburg University, Hamburg, Germany
- Department of Health Sciences, Boston University, Boston, Massachusetts, United States of America
- Marc-Thorsten Hütt
- Department of Life Sciences and Chemistry, Jacobs University, Bremen, Germany
- Arnaud Messé
- Institute of Computational Neuroscience, University Medical Center Eppendorf, Hamburg University, Hamburg, Germany
13. Ocker GK, Doiron B. Training and Spontaneous Reinforcement of Neuronal Assemblies by Spike Timing Plasticity. Cereb Cortex 2019; 29:937-951. PMID: 29415191; PMCID: PMC7963120; DOI: 10.1093/cercor/bhy001.
Abstract
The synaptic connectivity of cortex is plastic, with experience shaping the ongoing interactions between neurons. Theoretical studies of spike timing-dependent plasticity (STDP) have focused on either just pairs of neurons or large-scale simulations. A simple analytic account for how fast spike time correlations affect both microscopic and macroscopic network structure is lacking. We develop a low-dimensional mean field theory for STDP in recurrent networks and show the emergence of assemblies of strongly coupled neurons with shared stimulus preferences. After training, this connectivity is actively reinforced by spike train correlations during the spontaneous dynamics. Furthermore, the stimulus coding by cell assemblies is actively maintained by these internally generated spiking correlations, suggesting a new role for noise correlations in neural coding. Assembly formation has often been associated with firing rate-based plasticity schemes; our theory provides an alternative and complementary framework, where fine temporal correlations and STDP form and actively maintain learned structure in cortical networks.
Affiliation(s)
- Gabriel Koch Ocker
- Department of Neuroscience, University of Pittsburgh, Pittsburgh, PA, USA
- Center for the Neural Basis of Cognition, University of Pittsburgh and Carnegie Mellon University, Pittsburgh, PA, USA
- Allen Institute for Brain Science, Seattle, WA, USA
- Brent Doiron
- Center for the Neural Basis of Cognition, University of Pittsburgh and Carnegie Mellon University, Pittsburgh, PA, USA
- Department of Mathematics, University of Pittsburgh, Pittsburgh, PA, USA
14. Rhythmogenesis evolves as a consequence of long-term plasticity of inhibitory synapses. Sci Rep 2018; 8:13050. PMID: 30158555; PMCID: PMC6115462; DOI: 10.1038/s41598-018-31412-7.
Abstract
Brain rhythms are widely believed to reflect numerous cognitive processes. Changes in rhythmicity have been associated with pathological states. However, the mechanism underlying these rhythms remains unknown. Here, we present a theoretical analysis of the evolution of rhythm-generating capabilities in neuronal circuits. We tested the hypothesis that brain rhythms can be acquired via an intrinsic unsupervised learning process of activity-dependent plasticity. Specifically, we focused on spike timing dependent plasticity (STDP) of inhibitory synapses. We detail how rhythmicity can develop via STDP under certain conditions that serve as a natural prediction of the hypothesis. We show how global features of the STDP rule govern and stabilize the resultant rhythmic activity. Finally, we demonstrate how rhythmicity is retained even in the face of synaptic variability. This study suggests a role for inhibitory plasticity that is beyond homeostatic processes.
15
Interplay of multiple pathways and activity-dependent rules in STDP. PLoS Comput Biol 2018; 14:e1006184. PMID: 30106953; PMCID: PMC6112684; DOI: 10.1371/journal.pcbi.1006184.
Abstract
Hebbian plasticity describes a basic mechanism for synaptic plasticity whereby synaptic weights evolve depending on the relative timing of paired activity of the pre- and postsynaptic neurons. Spike-timing-dependent plasticity (STDP) constitutes a central experimental and theoretical synaptic Hebbian learning rule. Various mechanisms, mostly calcium-based, account for the induction and maintenance of STDP. Classically, STDP is assumed to emerge gradually and monotonically as the number of pairings increases. However, non-monotonic STDP accounting for fast associative learning led us to challenge this monotonicity hypothesis and to explore how the existence of multiple plasticity pathways affects the dynamical establishment of plasticity. To account for the distinct forms of STDP that emerge with increasing numbers of pairings, and for the variety of signaling pathways involved, we developed a general class of simple mathematical models of plasticity based on calcium transients, accommodating various calcium-based plasticity mechanisms. These mechanisms can either compete or cooperate in the establishment of long-term potentiation (LTP) and depression (LTD), which emerge depending on past calcium activity. Our model accurately reproduces the striatal STDP that involves endocannabinoid and NMDAR signaling pathways. Moreover, we predict how stimulus frequency alters plasticity, and how triplet rules are affected by the number of pairings. We further investigate the general model with an arbitrary number of pathways and show that, depending on those pathways and their properties, a variety of plasticities may emerge upon variation of the number and/or frequency of pairings, even when the outcome after large numbers of pairings is identical. These findings, built upon a biologically realistic example and generalized to other applications, argue that recording STDP curves at fixed pairing numbers and frequencies is not sufficient to fully describe synaptic plasticity. Considering the whole spectrum of activity-dependent parameters could have a great impact on the description of plasticity and on our understanding of the engram.

The brain's capacity to process information, learn, and store memory relies on synaptic connectivity patterns, which are altered through synaptic plasticity mechanisms. Experimentally, such plasticities were evidenced through protocols involving numerous repetitive stimulations of a given synapse, and were shown to be supported by multiple pathways. Using a simple, biologically grounded mathematical model, we show how the activation timescales and inactivation levels of each pathway interact and alter plasticity in an intricate manner as stimuli are presented. Building upon data from the synapse between cortex and striatum, we show that synaptic changes may revert or re-emerge as stimuli are presented, and we predict specific responses to changes in stimulus frequency or to distinct stimulation patterns. Our general model shows that a given plasticity profile emerging in response to a repetitive stimulation protocol can unfold into various scenarios upon variation of the number of stimulus presentations or patterns, which depends tightly on the underlying activated pathways. Altogether, these results argue that, to better understand learning and memory, single plasticity responses obtained through intensive stimulation do not reveal the complexity of the responses for smaller numbers of presentations, which may have a strong impact on fast learning of stimuli with low numbers of presentations.
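The core idea of calcium-based plasticity models can be sketched minimally: each pairing evokes a calcium transient, and time spent above a low threshold drives depression while time above a higher threshold drives potentiation. All thresholds and rates below are illustrative, not fitted to the striatal data the abstract describes.

```python
import numpy as np

# Minimal calcium-threshold plasticity sketch (in the spirit of calcium-based
# models; parameters are illustrative, not the paper's).
def run_pairings(n_pairings, ca_peak=2.0, tau_ca=20.0, dt=0.1,
                 theta_d=1.0, theta_p=1.6, gamma_d=0.002, gamma_p=0.01, w0=0.5):
    """Synaptic weight after n_pairings identical pre/post pairings."""
    w = w0
    t = np.arange(0.0, 200.0, dt)            # ms, window after one pairing
    ca = ca_peak * np.exp(-t / tau_ca)       # calcium transient per pairing
    for _ in range(n_pairings):
        ltd_drive = dt * np.sum(ca > theta_d)  # time above depression threshold
        ltp_drive = dt * np.sum(ca > theta_p)  # time above potentiation threshold
        w += gamma_p * ltp_drive - gamma_d * ltd_drive
        w = min(max(w, 0.0), 1.0)            # keep the weight bounded
    return w

w_few = run_pairings(5)
w_many = run_pairings(50)
```

With several interacting pathways, each with its own threshold and kinetics, the weight trajectory as pairings accumulate can become non-monotonic, which is the regime the paper explores.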
16
Cacha LA, Ali J, Rizvi ZH, Yupapin PP, Poznanski RR. Nonsynaptic plasticity model of long-term memory engrams. J Integr Neurosci 2018; 16:493-509. PMID: 28891529; DOI: 10.3233/jin-170038.
Abstract
Using the steady-state electrical properties of a non-ohmic dendrite based on cable theory, we derive electrotonic potentials that do not change over time and are localized in space. We hypothesize that clusters of such stationary, local, and permanent pulses are the electrical signatures of enduring memories, imprinted through nonsynaptic plasticity, encoded through epigenetic mechanisms, and decoded through electrotonic processing. We further hypothesize that retrieval of an engram is made possible by integration of these permanently imprinted standing pulses in a neural circuit, through neurotransmission in the extracellular space, as part of conscious recall. This recall acts as a guiding template in the reconsolidation of long-term memories: novelty, characterized by the uncertainty that arises when new fragments of memories reinstate an engram, permits the engram's destabilization by way of nonsynaptic plasticity. Collectively, these findings reinforce the hypothesis that electrotonic processing in non-ohmic dendrites yields insights into permanent electrical signatures that could reflect enduring memories as fragments of long-term memory engrams.
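The cable-theoretic starting point can be illustrated with the simplest passive case (the paper's dendrites are non-ohmic, so this is only the textbook baseline): for a passive semi-infinite cable, a steady voltage applied at the origin decays electrotonically with the length constant.

```python
import numpy as np

# Passive steady-state cable sketch: V(x) = V0 * exp(-x / lam), where lam is
# the electrotonic length constant. Values of v0 and lam are illustrative.
def steady_state_potential(x, v0=1.0, lam=0.5):
    """Electrotonic potential at distance x (mm) along a passive semi-infinite cable."""
    return v0 * np.exp(-np.asarray(x) / lam)

x = np.linspace(0.0, 2.0, 201)     # mm
v = steady_state_potential(x)
v_at_lambda = steady_state_potential(0.5)  # one length constant: 1/e of V0
```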
Affiliation(s)
- L A Cacha: Laser Centre, Ibnu Sina ISIR, Universiti Teknologi Malaysia, Johor Bahru 81310, Malaysia
- J Ali: Laser Centre, Ibnu Sina ISIR, Universiti Teknologi Malaysia, Johor Bahru 81310, Malaysia; Faculty of Science, Universiti Teknologi Malaysia, Johor Bahru 81310, Malaysia
- Z H Rizvi: Laser Centre, Ibnu Sina ISIR, Universiti Teknologi Malaysia, Johor Bahru 81310, Malaysia
- P P Yupapin: Faculty of Electrical & Electronics Engineering, Ton Duc Thang University, Ho Chi Minh City, District 7, Vietnam
- R R Poznanski: Faculty of Biosciences & Medical Engineering, Universiti Teknologi Malaysia, Johor Bahru 81310, Malaysia
17
Min B, Zhou D, Cai D. Effects of firing variability on network structures with spike-timing-dependent plasticity. Front Comput Neurosci 2018; 12:1. PMID: 29410621; PMCID: PMC5787127; DOI: 10.3389/fncom.2018.00001.
Abstract
Synaptic plasticity is believed to be the biological substrate underlying learning and memory. One of the most widespread forms of synaptic plasticity, spike-timing-dependent plasticity (STDP), uses the spike timing information of presynaptic and postsynaptic neurons to induce synaptic potentiation or depression. An open question is how STDP organizes the connectivity patterns in neuronal circuits. Previous studies have placed much emphasis on the role of firing rate in shaping connectivity patterns. Here, we go beyond the firing rate description to develop a self-consistent linear response theory that incorporates the information of both firing rate and firing variability. By decomposing the pairwise spike correlation into one component associated with local direct connections and the other associated with indirect connections, we identify two distinct regimes regarding the network structures learned through STDP. In one regime, the contribution of the direct-connection correlations dominates over that of the indirect-connection correlations in the learning dynamics; this gives rise to a network structure consistent with the firing rate description. In the other regime, the contribution of the indirect-connection correlations dominates in the learning dynamics, leading to a network structure different from the firing rate description. We demonstrate that the heterogeneity of firing variability across neuronal populations induces a temporally asymmetric structure of indirect-connection correlations. This temporally asymmetric structure underlies the emergence of the second regime. Our study provides a new perspective that emphasizes the role of high-order statistics of spiking activity in the spike-correlation-sensitive learning dynamics.
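The decomposition of pairwise covariance into direct-connection and indirect (common-input) components has a simple linear-response caricature. This is a generic linearization with arbitrary numbers, not the paper's self-consistent theory that also tracks firing variability.

```python
import numpy as np

# For linearly interacting units with interaction matrix K and independent
# private noise covariance C0, the long-timescale covariance is
#     C = (I - K)^{-1} C0 (I - K)^{-T}.
# Off-diagonal entries then split into contributions from direct connections
# and from shared (indirect) input.
K = np.array([[0.0, 0.0, 0.0],
              [0.3, 0.0, 0.0],   # direct connection 0 -> 1
              [0.3, 0.0, 0.0]])  # direct connection 0 -> 2
C0 = np.eye(3)                   # independent private input to each unit
B = np.linalg.inv(np.eye(3) - K)
C = B @ C0 @ B.T

direct_cov = C[1, 0]        # covariance inherited from the direct 0 -> 1 link
common_input_cov = C[2, 1]  # units 1 and 2 are unconnected but share input from 0
```

Which of these two components dominates the learning dynamics is exactly what distinguishes the two regimes described in the abstract.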
Affiliation(s)
- Bin Min: Center for Neural Science, Courant Institute of Mathematical Sciences, New York University, New York, NY, United States; NYUAD Institute, New York University Abu Dhabi, Abu Dhabi, United Arab Emirates
- Douglas Zhou: School of Mathematical Sciences, MOE-LSC, Institute of Natural Sciences, Shanghai Jiao Tong University, Shanghai, China
- David Cai: Center for Neural Science, Courant Institute of Mathematical Sciences, New York University, New York, NY, United States; NYUAD Institute, New York University Abu Dhabi, Abu Dhabi, United Arab Emirates; School of Mathematical Sciences, MOE-LSC, Institute of Natural Sciences, Shanghai Jiao Tong University, Shanghai, China
18
Abstract
We expand the theory of Hawkes processes to the nonstationary case, in which the mutually exciting point processes receive time-dependent inputs. We derive an analytical expression for the time-dependent correlations, which can be applied to networks with arbitrary connectivity, and inputs with arbitrary statistics. The expression shows how the network correlations are determined by the interplay between the network topology, the transfer functions relating units within the network, and the pattern and statistics of the external inputs. We illustrate the correlation structure using several examples in which neural network dynamics are modeled as a Hawkes process. In particular, we focus on the interplay between internally and externally generated oscillations and their signatures in the spike and rate correlation functions.
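A mutually exciting (Hawkes) process of the kind analyzed here can be simulated with Ogata's thinning algorithm. The univariate, stationary version below is only an illustration of the process class; parameter values are arbitrary.

```python
import numpy as np

# Ogata thinning for a Hawkes process with intensity
#     lam(t) = mu + sum_{t_k < t} alpha * exp(-(t - t_k) / tau).
def simulate_hawkes(mu=1.0, alpha=0.5, tau=1.0, t_max=500.0, seed=0):
    """Event times of a univariate Hawkes process on [0, t_max]."""
    rng = np.random.default_rng(seed)
    events = []
    t, excess = 0.0, 0.0  # excess = self-excitation above the baseline mu
    while True:
        lam_bar = mu + excess                        # upper bound on intensity
        wait = rng.exponential(1.0 / lam_bar)
        t += wait
        if t >= t_max:
            break
        excess *= np.exp(-wait / tau)                # excitation decays while waiting
        if rng.uniform() * lam_bar <= mu + excess:   # accept w.p. lam(t)/lam_bar
            events.append(t)
            excess += alpha                          # each event adds one kernel
    return np.array(events)

events = simulate_hawkes()
empirical_rate = len(events) / 500.0
# with branching ratio alpha*tau = 0.5, the stationary rate is mu / (1 - 0.5)
predicted_rate = 1.0 / (1.0 - 0.5)
```

The paper's contribution is the nonstationary generalization: replacing the constant baseline `mu` with a time-dependent input and computing the resulting time-dependent correlations analytically.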
Affiliation(s)
- Neta Ravid Tannenbaum: Edmond and Lily Safra Center for Brain Sciences, The Hebrew University of Jerusalem, Jerusalem 9190401, Israel
- Yoram Burak: Edmond and Lily Safra Center for Brain Sciences, The Hebrew University of Jerusalem, Jerusalem 9190401, Israel; Racah Institute of Physics, The Hebrew University of Jerusalem, Jerusalem 9190401, Israel
19
Richter LMA, Gjorgjieva J. Understanding neural circuit development through theory and models. Curr Opin Neurobiol 2017; 46:39-47. DOI: 10.1016/j.conb.2017.07.004.
20
Ocker GK, Hu Y, Buice MA, Doiron B, Josić K, Rosenbaum R, Shea-Brown E. From the statistics of connectivity to the statistics of spike times in neuronal networks. Curr Opin Neurobiol 2017; 46:109-119. PMID: 28863386; DOI: 10.1016/j.conb.2017.07.011.
Abstract
An essential step toward understanding neural circuits is linking their structure and their dynamics. In general, this relationship can be almost arbitrarily complex. Recent theoretical work has, however, begun to identify some broad principles underlying collective spiking activity in neural circuits. The first is that local features of network connectivity can be surprisingly effective in predicting global statistics of activity across a network. The second is that, for the important case of large networks with excitatory-inhibitory balance, correlated spiking persists or vanishes depending on the spatial scales of recurrent and feedforward connectivity. We close by showing how these ideas, together with plasticity rules, can help to close the loop between network structure and activity statistics.
Affiliation(s)
- Yu Hu: Center for Brain Science, Harvard University, United States
- Michael A Buice: Allen Institute for Brain Science, United States; Department of Applied Mathematics, University of Washington, United States
- Brent Doiron: Department of Mathematics, University of Pittsburgh, United States; Center for the Neural Basis of Cognition, Pittsburgh, United States
- Krešimir Josić: Department of Mathematics, University of Houston, United States; Department of Biology and Biochemistry, University of Houston, United States; Department of BioSciences, Rice University, United States
- Robert Rosenbaum: Department of Mathematics, University of Notre Dame, United States
- Eric Shea-Brown: Allen Institute for Brain Science, United States; Department of Applied Mathematics, University of Washington, United States; Department of Physiology and Biophysics, and University of Washington Institute for Neuroengineering, United States
21
Abstract
Recent experimental advances are producing an avalanche of data on both neural connectivity and neural activity. To take full advantage of these two emerging datasets, we need a framework that links them, revealing how collective neural activity arises from the structure of neural connectivity and intrinsic neural dynamics. This problem of structure-driven activity has drawn major interest in computational neuroscience. Existing methods for relating activity and architecture in spiking networks rely on linearizing activity around a central operating point and thus fail to capture the nonlinear responses of individual neurons that are the hallmark of neural information processing. Here, we overcome this limitation and present a new relationship between connectivity and activity in networks of nonlinear spiking neurons by developing a diagrammatic fluctuation expansion based on statistical field theory. We explicitly show how recurrent network structure produces pairwise and higher-order correlated activity, and how nonlinearities impact the networks' spiking activity. Our findings open new avenues to investigating how single-neuron nonlinearities, including those of different cell types, combine with connectivity to shape population activity and function.

Neuronal networks, like many biological systems, exhibit variable activity. This activity is shaped by both the underlying biology of the component neurons and the structure of their interactions. How can we combine knowledge of these two things, that is, models of individual neurons and of their interactions, to predict the statistics of single- and multi-neuron activity? Current approaches rely on linearizing neural activity around a stationary state. In the face of neural nonlinearities, however, these linear methods can fail to predict spiking statistics and even fail to correctly predict whether activity is stable or pathological. Here, we show how to calculate any spike-train cumulant in a broad class of models while systematically accounting for nonlinear effects. We then study a fundamental effect of nonlinear input-rate transfer, the coupling between different orders of spiking statistics, and how this depends on single-neuron and network properties.
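Why linearization can miss such coupling is easy to see in a toy example (an illustration of nonlinear rate transfer, not the paper's field-theoretic expansion): with a nonlinear transfer function, the mean output rate depends on the input fluctuations, a first-order/second-order coupling that a linear theory discards.

```python
import numpy as np

# With a convex transfer phi, E[phi(input)] > phi(E[input]) (Jensen's
# inequality), so input variance shifts the mean output rate. All values
# here are arbitrary illustrations.
rng = np.random.default_rng(0)

def phi(x):
    """Nonlinear input-rate transfer: rectified-quadratic."""
    return np.maximum(x, 0.0) ** 2

mu = 1.0
x_quiet = np.full(100_000, mu)                 # no input fluctuations
x_noisy = mu + rng.normal(0.0, 1.0, 100_000)  # same mean, added fluctuations
rate_quiet = phi(x_quiet).mean()
rate_noisy = phi(x_noisy).mean()
```

A linearization around the operating point `mu` would predict identical mean rates for both inputs; the fluctuation expansion described in the abstract captures the difference systematically.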
Affiliation(s)
- Gabriel Koch Ocker: Allen Institute for Brain Science, Seattle, Washington, United States of America
- Krešimir Josić: Department of Mathematics and Department of Biology and Biochemistry, University of Houston, Houston, Texas, United States of America; Department of BioSciences, Rice University, Houston, Texas, United States of America
- Eric Shea-Brown: Allen Institute for Brain Science, Seattle, Washington, United States of America; Department of Applied Mathematics, University of Washington, Seattle, Washington, United States of America; Department of Physiology and Biophysics, and UW Institute of Neuroengineering, University of Washington, Seattle, Washington, United States of America
- Michael A. Buice: Allen Institute for Brain Science, Seattle, Washington, United States of America; Department of Applied Mathematics, University of Washington, Seattle, Washington, United States of America
22
Murray JM, Escola GS. Learning multiple variable-speed sequences in striatum via cortical tutoring. eLife 2017; 6:e26084. PMID: 28481200; PMCID: PMC5446244; DOI: 10.7554/elife.26084.
Abstract
Sparse, sequential patterns of neural activity have been observed in numerous brain areas during timekeeping and motor sequence tasks. Inspired by such observations, we construct a model of the striatum, an all-inhibitory circuit where sequential activity patterns are prominent, addressing the following key challenges: (i) obtaining control over temporal rescaling of the sequence speed, with the ability to generalize to new speeds; (ii) facilitating flexible expression of distinct sequences via selective activation, concatenation, and recycling of specific subsequences; and (iii) enabling the biologically plausible learning of sequences, consistent with the decoupling of learning and execution suggested by lesion studies showing that cortical circuits are necessary for learning, but that subcortical circuits are sufficient to drive learned behaviors. The same mechanisms that we describe can also be applied to circuits with both excitatory and inhibitory populations, and hence may underlie general features of sequential neural activity pattern generation in the brain.
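Temporal rescaling of a sequence by a gain parameter can be demonstrated in a toy rate chain (an illustration only, not the paper's learned all-inhibitory striatal circuit): each unit charges toward threshold once its predecessor is suprathreshold, and a gain `g` standing in for tonic drive scales the charging rate, so the same sequence replays faster or slower.

```python
import numpy as np

# Toy threshold chain: unit 0 is driven from t = 0; each later unit charges
# at rate g once its predecessor crosses threshold theta.
def sequence_duration(g, n_units=10, theta=1.0, dt=0.01):
    """Time until the last unit in the chain crosses threshold."""
    x = np.zeros(n_units)
    t = 0.0
    while x[-1] < theta:
        above = x >= theta
        drive = np.concatenate(([1.0], above[:-1].astype(float)))
        x += dt * g * drive
        t += dt
    return t

t_slow = sequence_duration(g=0.5)  # roughly n_units * theta / g = 20 time units
t_fast = sequence_duration(g=2.0)  # roughly 5 time units: same sequence, 4x faster
```

The sequence identity (the order of units) is fixed by the chain, while the gain controls only its speed, which is the separation the model above achieves with learned weights.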
Affiliation(s)
- James M Murray: Center for Theoretical Neuroscience, Columbia University, New York, United States
- G Sean Escola: Center for Theoretical Neuroscience, Columbia University, New York, United States
23
Damicelli F, Hilgetag CC, Hütt MT, Messé A. Modular topology emerges from plasticity in a minimalistic excitable network model. Chaos 2017; 27:047406. PMID: 28456166; DOI: 10.1063/1.4979561.
Abstract
Topological features play a major role in the emergence of complex brain network dynamics underlying brain function. Specific topological properties of brain networks, such as their modular organization, have been widely studied in recent years and shown to be ubiquitous across spatial scales and species. However, the mechanisms underlying the generation and maintenance of such features are still unclear. Using a minimalistic network model with excitable nodes and discrete deterministic dynamics, we studied the effects of a local Hebbian plasticity rule on global network topology. We found that, despite the simple model set-up, the plasticity rule was able to reorganize the global network topology into a modular structure. The structural reorganization was accompanied by enhanced correlations between structural and functional connectivity, and the final network organization reflected features of the dynamical model. These findings demonstrate the potential of simple plasticity rules for structuring the topology of brain connectivity.
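The core effect, a local Hebbian rule reorganizing global topology into modules, can be sketched with a minimal co-activation model. The dynamics below (correlated group input rather than the paper's excitable-node model) and all parameters are illustrative assumptions.

```python
import numpy as np

# Minimal Hebbian reorganization sketch: nodes in the same group are
# co-activated; co-active links are strengthened and rows are normalized,
# so within-group ("modular") weights come to dominate.
rng = np.random.default_rng(1)
n, n_steps = 20, 2000
group = np.repeat([0, 1], n // 2)            # two groups of correlated input
w = np.full((n, n), 1.0 / n)                 # initially unstructured weights
np.fill_diagonal(w, 0.0)

for _ in range(n_steps):
    active_group = rng.integers(2)           # one group is co-activated
    x = (group == active_group).astype(float)
    w += 0.01 * np.outer(x, x)               # Hebbian: strengthen co-active pairs
    np.fill_diagonal(w, 0.0)
    w /= w.sum(axis=1, keepdims=True)        # normalization fixes total input

same = np.equal.outer(group, group)
within = w[same & ~np.eye(n, dtype=bool)].mean()
between = w[~same].mean()
```

The normalization step plays the role of the competitive constraint that, in the full model, lets local plasticity sculpt a global modular structure rather than uniform strengthening.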
Affiliation(s)
- Fabrizio Damicelli: Institute of Computational Neuroscience, University Medical Center Hamburg-Eppendorf, Hamburg University, Hamburg, Germany
- Claus C Hilgetag: Institute of Computational Neuroscience, University Medical Center Hamburg-Eppendorf, Hamburg University, Hamburg, Germany
- Marc-Thorsten Hütt: School of Engineering and Science, Jacobs University Bremen, Bremen, Germany
- Arnaud Messé: Institute of Computational Neuroscience, University Medical Center Hamburg-Eppendorf, Hamburg University, Hamburg, Germany