1. Mattera A, Alfieri V, Granato G, Baldassarre G. Chaotic recurrent neural networks for brain modelling: A review. Neural Netw 2025; 184:107079. PMID: 39756119. DOI: 10.1016/j.neunet.2024.107079.
Abstract
Even in the absence of external stimuli, the brain is spontaneously active. Indeed, most cortical activity is internally generated by recurrence. Both theoretical and experimental studies suggest that chaotic dynamics characterize this spontaneous activity. While the precise function of brain chaotic activity is still puzzling, we know that chaos confers many advantages. From a computational perspective, chaos enhances the complexity of network dynamics. From a behavioural point of view, chaotic activity could generate the variability required for exploration. Furthermore, information storage and transfer are maximized at the critical border between order and chaos. Despite these benefits, many computational brain models avoid incorporating spontaneous chaotic activity due to the challenges it poses for learning algorithms. In recent years, however, multiple approaches have been proposed to overcome this limitation. As a result, many different algorithms have been developed, initially within the reservoir computing paradigm. Over time, the field has evolved to increase the biological plausibility and performance of the algorithms, sometimes going beyond the reservoir computing framework. In this review article, we examine the computational benefits of chaos and the unique properties of chaotic recurrent neural networks, with a particular focus on those typically utilized in reservoir computing. We also provide a detailed analysis of the algorithms designed to train chaotic RNNs, tracing their historical evolution and highlighting key milestones in their development. Finally, we explore the applications and limitations of chaotic RNNs for brain modelling, consider their potential broader impacts beyond neuroscience, and outline promising directions for future research.
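The reservoir-computing setting surveyed in this review can be sketched in a few lines: a random rate network with coupling gain g > 1 generates chaotic spontaneous activity (in the sense of classic random rate models), and only a linear readout is trained while the chaotic recurrent weights stay fixed. A minimal sketch, with illustrative parameters and target signal not taken from the review:

```python
import numpy as np

rng = np.random.default_rng(0)
N, g, dt = 300, 2.0, 0.1  # g > 1 puts a random rate network in the chaotic regime

# Random recurrent weights with variance g^2 / N
J = g * rng.standard_normal((N, N)) / np.sqrt(N)

def run(x0, steps):
    """Euler-integrate dx/dt = -x + J @ tanh(x); return firing rates tanh(x)."""
    x = x0.copy()
    rates = np.empty((steps, N))
    for t in range(steps):
        x += dt * (-x + J @ np.tanh(x))
        rates[t] = np.tanh(x)
    return rates

# 1) Sensitivity to initial conditions, the hallmark of chaos:
#    a tiny perturbation grows until the trajectories fully decorrelate.
x0 = rng.standard_normal(N)
r_a = run(x0, 4000)
r_b = run(x0 + 1e-8 * rng.standard_normal(N), 4000)
sep = np.linalg.norm(r_a - r_b, axis=1)

# 2) Reservoir computing: fit only a linear readout by ridge regression,
#    leaving the chaotic recurrent weights untouched.
washout = 500
S = r_a[washout:]
y = np.sin(0.3 * np.arange(len(S)))  # illustrative target signal
w = np.linalg.solve(S.T @ S + 1e-2 * np.eye(N), S.T @ y)
mse = np.mean((S @ w - y) ** 2)
```

Training only the readout sidesteps the difficulty the review highlights: error gradients need not propagate through the chaotic recurrent dynamics.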
Affiliation(s)
- Andrea Mattera
- Institute of Cognitive Sciences and Technology, National Research Council, Via Romagnosi 18a, I-00196, Rome, Italy
- Valerio Alfieri
- Institute of Cognitive Sciences and Technology, National Research Council, Via Romagnosi 18a, I-00196, Rome, Italy; International School of Advanced Studies, Center for Neuroscience, University of Camerino, Via Gentile III Da Varano, 62032, Camerino, Italy
- Giovanni Granato
- Institute of Cognitive Sciences and Technology, National Research Council, Via Romagnosi 18a, I-00196, Rome, Italy
- Gianluca Baldassarre
- Institute of Cognitive Sciences and Technology, National Research Council, Via Romagnosi 18a, I-00196, Rome, Italy
2. Kim SH, Choi H. Inhibitory cell type heterogeneity in a spatially structured mean-field model of V1. bioRxiv [Preprint] 2025:2025.03.13.643046. PMID: 40161661. PMCID: PMC11952513. DOI: 10.1101/2025.03.13.643046.
Abstract
Inhibitory interneurons in the cortex are classified into cell types differing in their morphology, electrophysiology, and connectivity. Although it is known that parvalbumin (PV), somatostatin (SST), and vasoactive intestinal polypeptide-expressing neurons (VIP), the major inhibitory neuron subtypes in the cortex, have distinct modulatory effects on excitatory neurons, how heterogeneous spatial connectivity properties relate to network computations is not well understood. Here, we study the implications of heterogeneous inhibitory neurons on the dynamics and computations of spatially-structured neural networks. We develop a mean-field model of the system in order to systematically examine excitation-inhibition balance, dynamical stability, and cell-type specific gain modulations. The model incorporates three inhibitory cell types and excitatory neurons with distinct connectivity probabilities and recent evidence of long-range spatial projections of SST neurons. Position-dependent firing rate predictions are validated against simulations, and balanced solutions under Gaussian assumptions are derived from scaling arguments. Stability analysis shows that while long-range inhibitory projections in E-I circuits with a homogeneous inhibitory population result in instability, the heterogeneous network maintains stability with long-range SST projections. This suggests that a mixture of short and long-range inhibitions may be key to providing diverse computations while maintaining stability. We further find that conductance-based synaptic transmissions are necessary to reproduce experimentally observed cell-type-specific gain modulations of inhibition by PV and SST neurons. The mechanisms underlying cell-type-specific gain changes are elucidated using linear response theory. Our theoretical approach offers insight into the computational function of cell-type-specific and distance-dependent network structure.
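The cell-type-specific modulations studied here rest on a standard connectivity motif: VIP inhibits SST, and SST inhibits the other populations, so driving VIP disinhibits excitatory cells. A toy four-population rate model illustrates the motif (weights and inputs below are invented for illustration; the paper's model is spatial and mean-field, which this sketch does not capture):

```python
import numpy as np

# Populations ordered E, PV, SST, VIP; entry W[i, j] is the weight from j onto i.
# Signs follow the canonical motif: VIP -| SST, SST -| {E, PV, VIP}, PV -| {E, PV}.
W = np.array([
    [ 0.4, -0.5, -0.4,  0.0],   # onto E
    [ 0.4, -0.3, -0.3,  0.0],   # onto PV
    [ 0.3,  0.0,  0.0, -0.5],   # onto SST
    [ 0.3,  0.0, -0.3,  0.0],   # onto VIP
])

def fixed_point(h, dt=0.01, steps=20000):
    """Euler-integrate threshold-linear rate dynamics dr/dt = -r + [W r + h]_+."""
    r = np.zeros(4)
    for _ in range(steps):
        r += dt * (-r + np.maximum(W @ r + h, 0.0))
    return r

base = fixed_point(np.array([1.0, 1.0, 1.0, 1.0]))
vip_drive = fixed_point(np.array([1.0, 1.0, 1.0, 2.0]))  # extra drive to VIP only
# Extra VIP input suppresses SST and thereby disinhibits E.
```

The same fixed-point logic, extended with distance-dependent connection profiles and conductance-based synapses, is what the paper's mean-field analysis makes tractable.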
Affiliation(s)
- Soon Ho Kim
- School of Mathematics, Georgia Institute of Technology, Atlanta, GA 30332-0160
- Hannah Choi
- School of Mathematics, Georgia Institute of Technology, Atlanta, GA 30332-0160
3. Bagatelas ED, Kavalali ET. Chronic modulation of cAMP signaling elicits synaptic scaling irrespective of activity. iScience 2024; 27:110176. PMID: 38989459. PMCID: PMC11233962. DOI: 10.1016/j.isci.2024.110176.
Abstract
Homeostatic plasticity mechanisms act in a negative feedback manner to stabilize neuronal firing around a set point. Classically, homeostatic synaptic plasticity is elicited via rather drastic manipulation of activity in a neuronal population. Here, we employed a chemogenetic approach to regulate activity via eliciting G protein-coupled receptor (GPCR) signaling in hippocampal neurons to trigger homeostatic synaptic plasticity. We demonstrate that chronic activation of hM4D(Gi) signaling induces mild and transient activity suppression, yet still triggers synaptic upscaling akin to tetrodotoxin (TTX)-induced complete activity suppression. This homeostatic regulation was therefore independent of Gi-signaling regulation of activity, but it was mimicked or occluded by direct manipulation of cyclic AMP (cAMP) signaling in a manner that intersected with the retinoic acid receptor alpha (RARα) signaling pathway. Our data suggest that chemogenetic tools can be used to probe cell-autonomous mechanisms of synaptic scaling in a unique way, operating via direct modulation of second-messenger signaling and bypassing activity regulation.
Affiliation(s)
- Elena D. Bagatelas
- Department of Pharmacology and the Vanderbilt Brain Institute, Vanderbilt University, Nashville, TN 37209, USA
- Ege T. Kavalali
- Department of Pharmacology and the Vanderbilt Brain Institute, Vanderbilt University, Nashville, TN 37209, USA
4. Politi A, Torcini A. A robust balancing mechanism for spiking neural networks. Chaos 2024; 34:041102. PMID: 38639569. DOI: 10.1063/5.0199298.
Abstract
Dynamical balance of excitation and inhibition is usually invoked to explain the irregular low firing activity observed in the cortex. We propose a robust nonlinear balancing mechanism for a random network of spiking neurons, which also works in the absence of strong external currents. Biologically, the mechanism exploits the plasticity of excitatory-excitatory synapses induced by short-term depression. Mathematically, the nonlinear response of the synaptic activity is the key ingredient responsible for the emergence of a stable balanced regime. Our claim is supported by a simple self-consistent analysis accompanied by extensive simulations performed for increasing network sizes. The observed regime is essentially fluctuation-driven and characterized by highly irregular spiking dynamics of all neurons.
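The self-limiting role of short-term depression can be caricatured in a one-population rate model with a Tsodyks-Markram-style depression variable (a toy sketch with invented parameters, not the paper's spiking network): the depression variable x decreases with rate, so the effective recurrent excitation w·u·x weakens exactly when activity grows, bounding the firing rate without fine-tuned external currents.

```python
import numpy as np

def simulate(depression, w=10.0, u=0.5, tau_d=0.5, h=1.0,
             dt=0.01, steps=10000, r_max=100.0):
    """Rate model dr/dt = -r + phi(w*u*x*r + h), with synaptic resources
    dx/dt = (1 - x)/tau_d - u*x*r when depression is enabled (else x = 1)."""
    r, x = 0.5, 1.0
    for _ in range(steps):
        x_eff = x if depression else 1.0
        drive = w * u * x_eff * r + h
        r += dt * (-r + np.clip(drive, 0.0, r_max))   # phi saturates at r_max
        if depression:
            x += dt * ((1.0 - x) / tau_d - u * x * r)
    return r

r_static = simulate(depression=False)   # static synapses: rate runs into saturation
r_depress = simulate(depression=True)   # depression: rate settles at a moderate value
```

With static synapses the recurrent gain w·u exceeds one and the only attractor is the saturation ceiling; with depression the steady state x* = 1/(1 + u·tau_d·r) pulls the effective gain below one at high rates, yielding a stable intermediate rate.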
Affiliation(s)
- Antonio Politi
- Institute for Complex Systems and Mathematical Biology and Department of Physics, University of Aberdeen, Aberdeen AB24 3UE, United Kingdom
- CNR-Consiglio Nazionale delle Ricerche-Istituto dei Sistemi Complessi, via Madonna del Piano 10, 50019 Sesto Fiorentino, Italy
- Alessandro Torcini
- CNR-Consiglio Nazionale delle Ricerche-Istituto dei Sistemi Complessi, via Madonna del Piano 10, 50019 Sesto Fiorentino, Italy
- Laboratoire de Physique Théorique et Modélisation, CY Cergy Paris Université, CNRS UMR 8089, 95302 Cergy-Pontoise cedex, France
- INFN Sezione di Firenze, Via Sansone 1, 50019 Sesto Fiorentino, Italy
5. Doubovikov ED, Serdyukova NA, Greenberg SB, Gascoigne DA, Minhaj MM, Aksenov DP. Electric field effects on brain activity: Implications for epilepsy and burst suppression. Cells 2023; 12:2229. PMID: 37759452. PMCID: PMC10527339. DOI: 10.3390/cells12182229.
Abstract
Electric fields are now considered a major mechanism of epileptiform activity. However, it is not clear if another electrophysiological phenomenon, burst suppression, utilizes the same mechanism for its bursting phase. Thus, the purpose of this study was to compare the role of ephaptic coupling (the recruitment of neighboring cells via electric fields) in generating bursts in epilepsy and burst suppression. We used local injections of the GABA antagonist picrotoxin to elicit epileptic activity and a general anesthetic, sevoflurane, to elicit burst suppression in rabbits. Then, we applied an established computational model of pyramidal cells to simulate neuronal activity in a 3-dimensional grid, with an additional parameter to trigger a suppression phase based on extracellular calcium dynamics. We discovered that coupling via electric fields was sufficient to produce bursting in scenarios where inhibitory control of excitatory neurons was sufficiently low. Under anesthesia conditions, bursting occurs with lower neuronal recruitment than during seizures. Our model predicts that, due to the effect of electric fields, the magnitude of bursts during seizures should be roughly 2-3 times the magnitude of bursts during burst suppression, which is consistent with our in vivo experimental results. The resulting difference in magnitude between bursts during anesthesia and epileptiform bursts reflects the strength of the electric field effect, which suggests that burst suppression and epilepsy share the same ephaptic coupling mechanism.
Affiliation(s)
- Evan D. Doubovikov
- Department of Radiology, NorthShore University HealthSystem, Evanston, IL 60201, USA
- Natalya A. Serdyukova
- Department of Biomedical Engineering, Northwestern University, Evanston, IL 60208, USA
- Department of Pediatrics, NorthShore University HealthSystem, Evanston, IL 60201, USA
- Steven B. Greenberg
- Department of Anesthesiology, NorthShore University HealthSystem, Evanston, IL 60201, USA
- David A. Gascoigne
- Department of Radiology, NorthShore University HealthSystem, Evanston, IL 60201, USA
- Mohammed M. Minhaj
- Department of Anesthesiology, NorthShore University HealthSystem, Evanston, IL 60201, USA
- Daniil P. Aksenov
- Department of Radiology, NorthShore University HealthSystem, Evanston, IL 60201, USA
- Department of Biomedical Engineering, Northwestern University, Evanston, IL 60208, USA
- Department of Anesthesiology, NorthShore University HealthSystem, Evanston, IL 60201, USA
- Pritzker School of Medicine, University of Chicago, Chicago, IL 60637, USA
6. Pérez-Cervera A, Gutkin B, Thomas PJ, Lindner B. A universal description of stochastic oscillators. Proc Natl Acad Sci U S A 2023; 120:e2303222120. PMID: 37432992. PMCID: PMC10629544. DOI: 10.1073/pnas.2303222120.
Abstract
Many systems in physics, chemistry, and biology exhibit oscillations with a pronounced random component. Such stochastic oscillations can emerge via different mechanisms, for example, linear dynamics of a stable focus with fluctuations, limit-cycle systems perturbed by noise, or excitable systems in which random inputs lead to a train of pulses. Despite their diverse origins, the phenomenology of random oscillations can be strikingly similar. Here, we introduce a nonlinear transformation of stochastic oscillators to a complex-valued function Q*(x) that greatly simplifies and unifies the mathematical description of the oscillator's spontaneous activity, its response to an external time-dependent perturbation, and the correlation statistics of different oscillators that are weakly coupled. The function Q*(x) is the eigenfunction of the Kolmogorov backward operator with the least negative (but nonvanishing) eigenvalue λ1 = μ1 + iω1. The resulting power spectrum of the complex-valued function is exactly given by a Lorentzian spectrum with peak frequency ω1 and half-width μ1; its susceptibility with respect to a weak external forcing is given by a simple one-pole filter, centered around ω1; and the cross-spectrum between two coupled oscillators can be easily expressed by a combination of the spontaneous power spectra of the uncoupled systems and their susceptibilities. Our approach makes qualitatively different stochastic oscillators comparable, provides simple characteristics for the coherence of the random oscillation, and gives a framework for the description of weakly coupled oscillators.
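The central claim, a Lorentzian spectrum peaked at ω1 with half-width μ1, can be checked numerically on the simplest case in which the transformed signal obeys linear dynamics: a complex Ornstein-Uhlenbeck process z' = (iω1 - μ1)z + noise (numerical values below are arbitrary, chosen only for the demonstration):

```python
import numpy as np

rng = np.random.default_rng(1)
mu1, omega1 = 0.1, 2.0          # decay rate (half-width) and peak frequency
dt, seg_len, n_seg = 0.01, 8192, 16

# Euler-Maruyama integration of z' = (i*omega1 - mu1) z + complex white noise,
# with the power spectrum estimated as an averaged periodogram over segments.
z = 0.0 + 0.0j
spec = np.zeros(seg_len)
for _ in range(n_seg):
    zs = np.empty(seg_len, dtype=complex)
    for t in range(seg_len):
        z += dt * ((1j * omega1 - mu1) * z) \
             + 0.3 * np.sqrt(dt) * (rng.standard_normal() + 1j * rng.standard_normal())
        zs[t] = z
    spec += np.abs(np.fft.fft(zs)) ** 2 / n_seg

freqs = 2 * np.pi * np.fft.fftfreq(seg_len, d=dt)   # angular frequencies
omega_peak = freqs[np.argmax(spec)]                  # should sit near omega1
```

The estimated spectrum follows a Lorentzian of half-width mu1 around omega1; for a genuinely nonlinear stochastic oscillator, obtaining this simple form is exactly what the transformation to Q*(x) accomplishes.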
Affiliation(s)
- Alberto Pérez-Cervera
- Department of Applied Mathematics, Instituto de Matemática Interdisciplinar, Universidad Complutense de Madrid, Madrid 28040, Spain
- Boris Gutkin
- Group for Neural Theory, LNC2 INSERM U960, Département d'Etudes Cognitives, Ecole Normale Supérieure - Paris Science Letters University, Paris 75005, France
- Peter J. Thomas
- Department of Mathematics, Applied Mathematics, and Statistics, Case Western Reserve University, Cleveland, OH 44106
- Benjamin Lindner
- Bernstein Center for Computational Neuroscience Berlin, Berlin 10115, Germany
- Department of Physics, Humboldt Universität zu Berlin, Berlin 12489, Germany
7. Jeon I, Kim T. Distinctive properties of biological neural networks and recent advances in bottom-up approaches toward a better biologically plausible neural network. Front Comput Neurosci 2023; 17:1092185. PMID: 37449083. PMCID: PMC10336230. DOI: 10.3389/fncom.2023.1092185.
Abstract
Although it may appear infeasible and impractical, building artificial intelligence (AI) using a bottom-up approach based on the understanding of neuroscience is straightforward. The lack of a generalized governing principle for biological neural networks (BNNs) forces us to address this problem by converting piecemeal information on the diverse features of neurons, synapses, and neural circuits into AI. In this review, we describe recent attempts to build a biologically plausible neural network by following neuroscientifically similar strategies of neural network optimization or by implanting the outcome of the optimization, such as the properties of single computational units and the characteristics of the network architecture. In addition, we propose a formalism of the relationship between the set of objectives that neural networks attempt to achieve and neural network classes categorized by how closely their architectural features resemble those of BNNs. This formalism is expected to define the potential roles of top-down and bottom-up approaches for building a biologically plausible neural network and to offer a map to help navigate the gap between neuroscience and AI engineering.
Affiliation(s)
- Taegon Kim
- Brain Science Institute, Korea Institute of Science and Technology, Seoul, Republic of Korea
8. Ekelmans P, Kraynyukovas N, Tchumatchenko T. Targeting operational regimes of interest in recurrent neural networks. PLoS Comput Biol 2023; 19:e1011097. PMID: 37186668. DOI: 10.1371/journal.pcbi.1011097.
Abstract
Neural computations emerge from local recurrent neural circuits or computational units such as cortical columns that comprise hundreds to a few thousand neurons. Continuous progress in connectomics, electrophysiology, and calcium imaging requires tractable spiking network models that can consistently incorporate new information about the network structure and reproduce the recorded neural activity features. However, for spiking networks, it is challenging to predict which connectivity configurations and neural properties can generate fundamental operational states and specific experimentally reported nonlinear cortical computations. Theoretical descriptions of the computational state of cortical spiking circuits are diverse, including the balanced state, where excitatory and inhibitory inputs balance almost perfectly, and the inhibition-stabilized network (ISN) state, where the excitatory part of the circuit is unstable on its own. It remains an open question whether these states can co-exist with experimentally reported nonlinear computations and whether they can be recovered in biologically realistic implementations of spiking networks. Here, we show how to identify spiking network connectivity patterns underlying diverse nonlinear computations such as XOR, bistability, inhibitory stabilization, supersaturation, and persistent activity. We establish a mapping between the stabilized supralinear network (SSN) and spiking activity which allows us to pinpoint the location in parameter space where these activity regimes occur. Notably, we find that biologically-sized spiking networks can have irregular asynchronous activity that does not require strong excitation-inhibition balance or large feedforward input, and we show that the dynamic firing rate trajectories in spiking networks can be precisely targeted without error-driven training algorithms.
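The stabilized supralinear network (SSN) used for the mapping is compact enough to reproduce one of the listed nonlinearities directly. With a supralinear single-neuron transfer function k[z]_+^n, the network response grows supralinearly at weak input yet becomes strongly sublinear at strong input, because recurrent inhibition is recruited faster than excitation. A two-population sketch (parameters below are illustrative, not the fitted values from the paper):

```python
import numpy as np

# SSN rate dynamics: dr/dt = -r + k * [W r + h]_+ ** n, populations (E, I)
k, n = 0.04, 2.0
W = np.array([[1.0, -1.0],    # E <- E, E <- I
              [1.5, -0.5]])   # I <- E, I <- I

def ssn_rates(h, dt=0.001, steps=50000):
    """Euler-integrate the SSN to its fixed point from rest."""
    r = np.zeros(2)
    for _ in range(steps):
        drive = np.maximum(W @ r + h, 0.0)
        r += dt * (-r + k * drive ** n)
    return r

r_lo = ssn_rates(np.array([2.0, 2.0]))
r_mid = ssn_rates(np.array([20.0, 20.0]))
r_hi = ssn_rates(np.array([40.0, 40.0]))
# Supralinear growth from weak to moderate input, near-saturation beyond it.
```

The paper's contribution is to carry exactly this kind of rate-level parameter-space analysis over to spiking networks, pinpointing where regimes such as supersaturation or inhibitory stabilization occur.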
Affiliation(s)
- Pierre Ekelmans
- Theory of Neural Dynamics group, Max Planck Institute for Brain Research, Frankfurt am Main, Germany
- Frankfurt Institute for Advanced Studies, Frankfurt am Main, Germany
- Nataliya Kraynyukovas
- Theory of Neural Dynamics group, Max Planck Institute for Brain Research, Frankfurt am Main, Germany
- Institute of Experimental Epileptology and Cognition Research, Life and Brain Center, Universitätsklinikum Bonn, Bonn, Germany
- Tatjana Tchumatchenko
- Theory of Neural Dynamics group, Max Planck Institute for Brain Research, Frankfurt am Main, Germany
- Institute of Experimental Epileptology and Cognition Research, Life and Brain Center, Universitätsklinikum Bonn, Bonn, Germany
- Institute of Physiological Chemistry, Medical Center of the Johannes Gutenberg-University Mainz, Mainz, Germany
|
9. Engelken R, Ingrosso A, Khajeh R, Goedeke S, Abbott LF. Input correlations impede suppression of chaos and learning in balanced firing-rate networks. PLoS Comput Biol 2022; 18:e1010590. PMID: 36469504. PMCID: PMC9754616. DOI: 10.1371/journal.pcbi.1010590.
Abstract
Neural circuits exhibit complex activity patterns, both spontaneously and evoked by external stimuli. Information encoding and learning in neural circuits depend on how well time-varying stimuli can control spontaneous network activity. We show that in firing-rate networks in the balanced state, external control of recurrent dynamics, i.e., the suppression of internally-generated chaotic variability, strongly depends on correlations in the input. A distinctive feature of balanced networks is that, because common external input is dynamically canceled by recurrent feedback, it is far more difficult to suppress chaos with common input into each neuron than through independent input. To study this phenomenon, we develop a non-stationary dynamic mean-field theory for driven networks. The theory explains how the activity statistics and the largest Lyapunov exponent depend on the frequency and amplitude of the input, recurrent coupling strength, and network size, for both common and independent input. We further show that uncorrelated inputs facilitate learning in balanced networks.
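The core phenomenon, external input suppressing internally generated chaos, can be reproduced in a few lines with a driven random rate network. This is a generic non-balanced sketch with invented parameters: the paper's common-versus-independent contrast additionally requires the balanced E-I architecture that dynamically cancels common input, which this toy model does not include. Strong independent sinusoidal drive saturates the units, so nearby trajectories converge instead of diverging:

```python
import numpy as np

rng = np.random.default_rng(2)
N, g, dt, steps = 200, 1.8, 0.05, 8000   # 400 time units of simulation

J = g * rng.standard_normal((N, N)) / np.sqrt(N)
phases = 2 * np.pi * rng.random(N)       # independent input phase for each neuron

def final_separation(amplitude):
    """Integrate two nearby trajectories under the same drive; return final distance."""
    x = rng.standard_normal(N)
    y = x + 1e-8 * rng.standard_normal(N)
    for t in range(steps):
        inp = amplitude * np.sin(t * dt + phases)   # frequency 1 (arbitrary units)
        x += dt * (-x + J @ np.tanh(x) + inp)
        y += dt * (-y + J @ np.tanh(y) + inp)
    return np.linalg.norm(x - y)

sep_free = final_separation(0.0)    # undriven: chaotic, perturbation grows to order one
sep_driven = final_separation(8.0)  # strong independent drive: perturbation decays
```

In the driven case the units spend most of the cycle in the saturated region of tanh, where the local Jacobian is contracting; the dynamic mean-field theory of the paper quantifies exactly how this suppression depends on input amplitude, frequency, and correlation structure.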
Affiliation(s)
- Rainer Engelken
- Zuckerman Mind, Brain, Behavior Institute, Columbia University, New York, New York, United States of America
- Alessandro Ingrosso
- The Abdus Salam International Centre for Theoretical Physics, Trieste, Italy
- Ramin Khajeh
- Zuckerman Mind, Brain, Behavior Institute, Columbia University, New York, New York, United States of America
- Sven Goedeke
- Neural Network Dynamics and Computation, Institute of Genetics, University of Bonn, Bonn, Germany
- L. F. Abbott
- Zuckerman Mind, Brain, Behavior Institute, Columbia University, New York, New York, United States of America