1
Pietras B, Clusella P, Montbrió E. Low-dimensional model for adaptive networks of spiking neurons. Phys Rev E 2025; 111:014422. PMID: 39972912. DOI: 10.1103/PhysRevE.111.014422.
Abstract
We investigate a large ensemble of quadratic integrate-and-fire neurons with heterogeneous input currents and adaptation variables. Our analysis reveals that, for a specific class of adaptation, termed quadratic spike-frequency adaptation, the high-dimensional system can be exactly reduced to a low-dimensional system of ordinary differential equations, which describes the dynamics of three mean-field variables: the population's firing rate, the mean membrane potential, and a mean adaptation variable. The resulting low-dimensional firing rate equations (FREs) uncover a key generic feature of heterogeneous networks with spike-frequency adaptation: Both the center and width of the distribution of the neurons' firing frequencies are reduced, and this largely promotes the emergence of collective synchronization in the network. Our findings are further supported by the bifurcation analysis of the FREs, which accurately captures the collective dynamics of the spiking neuron network, including phenomena such as collective oscillations, bursting, and macroscopic chaos.
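The reduced system has the three-variable structure described above. Below is a minimal sketch, assuming the standard Lorentzian-ansatz QIF firing-rate equations with a simple subtractive, linearly filtered adaptation variable; the paper's quadratic spike-frequency adaptation leads to different coefficients, so the adaptation law and all parameter values here are illustrative only.

```python
import math

def fre_step(r, v, a, dt, tau=1.0, delta=1.0, eta=-5.0, J=15.0,
             tau_a=10.0, beta=1.0):
    # One Euler step of illustrative firing-rate equations (FREs) for a
    # QIF population: firing rate r, mean voltage v, mean adaptation a.
    # The adaptation enters subtractively in the voltage equation.
    dr = (delta / (math.pi * tau) + 2.0 * r * v) / tau
    dv = (v * v + eta - a + J * tau * r - (math.pi * tau * r) ** 2) / tau
    da = (-a + beta * r) / tau_a
    return r + dt * dr, v + dt * dv, a + dt * da

def simulate(T=100.0, dt=1e-3):
    r, v, a = 0.1, -1.0, 0.0
    for _ in range(int(T / dt)):
        r, v, a = fre_step(r, v, a, dt)
        r = max(r, 0.0)  # the population firing rate cannot be negative
    return r, v, a
```

Depending on the adaptation parameters, such a three-variable system can settle to a fixed point, oscillate, or burst, which is the kind of repertoire the bifurcation analysis in the paper maps out.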
Affiliation(s)
- Bastian Pietras
- Universitat Pompeu Fabra, Neuronal Dynamics Group, Department of Engineering, 08018 Barcelona, Spain
- Pau Clusella
- Universitat Politècnica de Catalunya, EPSEM, Departament de Matemàtiques, 08242 Manresa, Spain
- Ernest Montbrió
- Universitat Pompeu Fabra, Neuronal Dynamics Group, Department of Engineering, 08018 Barcelona, Spain
2
Salfenmoser L, Obermayer K. A framework for optimal control of oscillations and synchrony applied to non-linear models of neural population dynamics. Front Comput Neurosci 2024; 18:1483100. PMID: 39712002. PMCID: PMC11658993. DOI: 10.3389/fncom.2024.1483100.
Abstract
We adapt non-linear optimal control theory (OCT) to control oscillations and network synchrony and apply it to models of neural population dynamics. OCT is a mathematical framework to compute an efficient stimulation for dynamical systems. In its standard formulation, it requires a well-defined reference trajectory as the target state. This requirement, however, may be overly restrictive for oscillatory targets, where the exact trajectory shape might not be relevant. To overcome this limitation, we introduce three alternative cost functionals to target oscillations and synchrony without specification of a reference trajectory. We successfully apply these cost functionals to single-node and network models of neural populations, in which each node is described by either the Wilson-Cowan model or a biophysically realistic high-dimensional mean-field model of exponential integrate-and-fire neurons. We compute efficient control strategies for four different control tasks. First, we drive oscillations from a stable stationary state at a particular frequency. Second, we switch between stationary and oscillatory stable states and find a translational invariance of the state-switching control signals. Third, we switch between in-phase and out-of-phase oscillations in a two-node network, where all cost functionals lead to identical OC signals in the minimum-energy limit. Finally, we (de-) synchronize an (a-) synchronously oscillating six-node network. In this setup, for the desynchronization task, we find very different control strategies for the three cost functionals. The suggested methods represent a toolbox that makes it possible to include oscillatory phenomena in the framework of non-linear OCT without specifying an exact reference trajectory. However, task-specific adjustments of the optimization parameters have to be made to obtain informative results.
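The idea of a trajectory-free synchrony cost can be conveyed with a toy functional: penalize the time-averaged variance of the state across nodes, which vanishes for in-phase nodes without prescribing any particular waveform. The paper's three cost functionals are defined differently; this sketch only illustrates the concept.

```python
import math

def synchrony_cost(signals):
    # Time-averaged variance across nodes: 0 for perfectly in-phase
    # signals, large for desynchronized ones. No reference trajectory
    # is required, only a relation between the nodes.
    n_t = len(signals[0])
    total = 0.0
    for t in range(n_t):
        vals = [s[t] for s in signals]
        m = sum(vals) / len(vals)
        total += sum((x - m) ** 2 for x in vals) / len(vals)
    return total / n_t

dt = 0.01
ts = [i * dt for i in range(1000)]
in_phase = [[math.sin(2 * math.pi * t) for t in ts],
            [math.sin(2 * math.pi * t) for t in ts]]
anti_phase = [[math.sin(2 * math.pi * t) for t in ts],
              [math.sin(2 * math.pi * t + math.pi) for t in ts]]
```

Here `synchrony_cost(in_phase)` is zero while `synchrony_cost(anti_phase)` is large, so minimizing this term drives a network toward synchrony regardless of the oscillation's exact shape.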
Affiliation(s)
- Lena Salfenmoser
- Institute of Software Engineering and Theoretical Computer Science, Technische Universitaet Berlin, Berlin, Germany
- Klaus Obermayer
- Institute of Software Engineering and Theoretical Computer Science, Technische Universitaet Berlin, Berlin, Germany
- Bernstein Center for Computational Neuroscience Berlin, Berlin, Germany
3
Vinci GV, Mattia M. Rosetta stone for the population dynamics of spiking neuron networks. Phys Rev E 2024; 110:034303. PMID: 39425388. DOI: 10.1103/PhysRevE.110.034303.
Abstract
Populations of spiking neuron models have densities of their microscopic variables (e.g., single-cell membrane potentials) whose evolution fully captures the collective dynamics of biological networks, even outside equilibrium. Despite its general applicability, the Fokker-Planck equation governing this evolution is mainly studied within the borders of linear response theory, although alternative spectral expansion approaches offer some advantages in the study of out-of-equilibrium dynamics. This is mainly due to the difficulty of computing the state-dependent coefficients of the expanded system of differential equations. Here, we address this issue by deriving analytic expressions for such coefficients by pairing perturbative solutions of the Fokker-Planck approach with their counterparts from the spectral expansion. A tight relationship emerges between several of these coefficients and the Laplace transform of the interspike interval density (i.e., the distribution of first-passage times). "Coefficients" like the current-to-rate gain function, the eigenvalues of the Fokker-Planck operator, and its eigenfunctions at the boundaries are derived without resorting to integral expressions. For leaky integrate-and-fire neurons, the coupling terms between stationary and nonstationary modes are also worked out, paving the way to accurately characterizing the critical points and relaxation timescales in networks of interacting populations.
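One of the objects referred to here, the stationary current-to-rate gain function of the leaky integrate-and-fire neuron, has a classical integral expression (the Siegert formula) that can be evaluated numerically; the paper's point is precisely that such quantities can also be obtained without integral expressions. A sketch of the integral route for comparison (parameter values are illustrative):

```python
import math

def lif_rate(mu, sigma, tau=0.02, tau_ref=0.002, v_r=10.0, v_th=20.0):
    # Stationary firing rate of an LIF neuron driven by white noise
    # with mean mu and amplitude sigma (mV), via the Siegert formula:
    # 1/rate = tau_ref + tau*sqrt(pi)*int_a^b exp(x^2)(1+erf(x)) dx,
    # with a=(v_r-mu)/sigma, b=(v_th-mu)/sigma. Trapezoidal quadrature.
    a = (v_r - mu) / sigma
    b = (v_th - mu) / sigma
    n = 2000
    h = (b - a) / n
    integral = 0.0
    for i in range(n + 1):
        x = a + i * h
        w = 0.5 if i in (0, n) else 1.0
        integral += w * math.exp(x * x) * (1.0 + math.erf(x))
    integral *= h * math.sqrt(math.pi)
    return 1.0 / (tau_ref + tau * integral)
```

The rate grows monotonically with the mean input and saturates at `1/tau_ref`, the absolute-refractory bound.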
4
Spaeth A, Haussler D, Teodorescu M. Model-agnostic neural mean field with a data-driven transfer function. Neuromorphic Comput Eng 2024; 4:034013. PMID: 39310743. PMCID: PMC11413991. DOI: 10.1088/2634-4386/ad787f.
Abstract
As one of the most complex systems known to science, modeling brain behavior and function is both fascinating and extremely difficult. Empirical data is increasingly available from ex vivo human brain organoids and surgical samples, as well as in vivo animal models, so the problem of modeling the behavior of large-scale neuronal systems is more relevant than ever. The statistical physics concept of a mean-field model offers a tractable way to bridge the gap between single-neuron and population-level descriptions of neuronal activity, by modeling the behavior of a single representative neuron and extending this to the population. However, existing neural mean-field methods typically either take the limit of small interaction sizes, or are applicable only to the specific neuron models for which they were derived. This paper derives a mean-field model by fitting a transfer function called Refractory SoftPlus, which is simple yet applicable to a broad variety of neuron types. The transfer function is fitted numerically to simulated spike time data, and is entirely agnostic to the underlying neuronal dynamics. The resulting mean-field model predicts the response of a network of randomly connected neurons to a time-varying external stimulus with a high degree of accuracy. Furthermore, it enables an accurate approximate bifurcation analysis as a function of the level of recurrent input. This model does not assume large presynaptic rates or small postsynaptic potential size, allowing mean-field models to be developed even for populations with large interaction terms.
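One natural way to combine a SoftPlus curve with a refractory saturation, in the spirit of the Refractory SoftPlus described above, is sketched below. The paper fits its parameterization numerically to simulated spike data, so the functional form, parameter names, and values here are assumptions for illustration, not the fitted transfer function.

```python
import math

def softplus(x):
    # Numerically stable log(1 + e^x)
    return x + math.log1p(math.exp(-x)) if x > 0 else math.log1p(math.exp(x))

def refractory_softplus(mu, gain=1.0, thresh=0.0, scale=20.0, t_ref=0.002):
    # Hypothetical Refractory-SoftPlus-style transfer function: a
    # SoftPlus rate f saturated by an absolute refractory period,
    # r = f / (1 + t_ref * f), so the rate never exceeds 1/t_ref.
    f = scale * softplus(gain * (mu - thresh))
    return f / (1.0 + t_ref * f)
```

The refractory correction gives the transfer function a maximum firing rate without imposing a sigmoid shape at low input, which is the property the abstract highlights over earlier sigmoidal fits.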
Affiliation(s)
- Alex Spaeth
- Electrical and Computer Engineering Department, University of California, Santa Cruz, Santa Cruz, CA, United States of America
- Genomics Institute, University of California, Santa Cruz, Santa Cruz, CA, United States of America
- David Haussler
- Genomics Institute, University of California, Santa Cruz, Santa Cruz, CA, United States of America
- Biomolecular Engineering Department, University of California, Santa Cruz, Santa Cruz, CA, United States of America
- Mircea Teodorescu
- Electrical and Computer Engineering Department, University of California, Santa Cruz, Santa Cruz, CA, United States of America
- Genomics Institute, University of California, Santa Cruz, Santa Cruz, CA, United States of America
- Biomolecular Engineering Department, University of California, Santa Cruz, Santa Cruz, CA, United States of America
5
Qi Y. Moment neural network and an efficient numerical method for modeling irregular spiking activity. Phys Rev E 2024; 110:024310. PMID: 39295055. DOI: 10.1103/PhysRevE.110.024310.
Abstract
Continuous rate-based neural networks have been widely applied to modeling the dynamics of cortical circuits. However, cortical neurons in the brain exhibit irregular spiking activity with complex correlation structures that cannot be captured by the mean firing rate alone. To close this gap, we consider a framework for modeling irregular spiking activity, called the moment neural network, which naturally generalizes rate models to second-order moments and can accurately capture the firing statistics of spiking neural networks. We propose an efficient numerical method that allows for rapid evaluation of moment mappings for neuronal activations without solving the underlying Fokker-Planck equation. This allows simulation of the coupled interactions of mean firing rate and firing variability in large-scale neural circuits while retaining the analytical tractability of continuous rate models. We demonstrate how the moment neural network can explain a range of phenomena, including diverse Fano factors in networks with quenched disorder and the emergence of irregular oscillatory dynamics in excitation-inhibition networks with delay.
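The mean/variance bookkeeping that such a network performs can be sketched with a generic moment mapping: propagate the first two moments of a Gaussian input through a pointwise nonlinearity by three-point Gauss-Hermite quadrature. The actual moment neural network evaluates Fokker-Planck-derived mappings for spiking neurons; this toy version (with `tanh` standing in for the neuronal activation) only illustrates the structure of a second-order moment map.

```python
import math

def moment_map(mu, var, f=math.tanh):
    # Map (mean, variance) of a Gaussian input to (mean, variance) of
    # f(input) using 3-point Gauss-Hermite quadrature: nodes mu and
    # mu +/- sqrt(3)*sigma with weights 2/3, 1/6, 1/6.
    s = math.sqrt(var)
    nodes = [(mu, 2.0 / 3.0),
             (mu - math.sqrt(3.0) * s, 1.0 / 6.0),
             (mu + math.sqrt(3.0) * s, 1.0 / 6.0)]
    m1 = sum(w * f(x) for x, w in nodes)
    m2 = sum(w * f(x) ** 2 for x, w in nodes)
    return m1, max(m2 - m1 * m1, 0.0)
```

Chaining such maps layer by layer is what lets mean rate and variability interact in the network while everything remains deterministic.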
Affiliation(s)
- Yang Qi
- Institute of Science and Technology for Brain-Inspired Intelligence, Fudan University, Shanghai 200433, China
- Key Laboratory of Computational Neuroscience and Brain-Inspired Intelligence (Fudan University), Ministry of Education, Shanghai 200433, China
- MOE Frontiers Center for Brain Science, Fudan University, Shanghai 200433, China
6
Metzner C, Dimulescu C, Kamp F, Fromm S, Uhlhaas PJ, Obermayer K. Exploring global and local processes underlying alterations in resting-state functional connectivity and dynamics in schizophrenia. Front Psychiatry 2024; 15:1352641. PMID: 38414495. PMCID: PMC10897003. DOI: 10.3389/fpsyt.2024.1352641.
Abstract
Introduction: We examined changes in large-scale functional connectivity and temporal dynamics, and their underlying mechanisms, in schizophrenia (ScZ) through measurements of resting-state functional magnetic resonance imaging (rs-fMRI) data and computational modelling. Methods: The rs-fMRI measurements from patients with chronic ScZ (n=38) and matched healthy controls (n=43) were obtained through the public schizConnect repository. Computational models were constructed based on diffusion-weighted MRI scans and fit to the experimental rs-fMRI data. Results: We found decreased large-scale functional connectivity across sensory and association areas and for all functional subnetworks in the ScZ group. Additionally, global synchrony was reduced in patients, while metastability was unaltered. Perturbations of the computational model revealed that decreased global coupling and increased background noise levels both explained the experimentally observed deficits better than local changes to the GABAergic or glutamatergic system. Discussion: The current study suggests that large-scale alterations in ScZ are more likely the result of global rather than local network changes.
Affiliation(s)
- Christoph Metzner
- Neural Information Processing Group, Institute of Software Engineering and Theoretical Computer Science, Technische Universität Berlin, Berlin, Germany
- Department of Child and Adolescent Psychiatry, Charité – Universitätsmedizin Berlin, Berlin, Germany
- School of Physics, Engineering and Computer Science, University of Hertfordshire, Hatfield, United Kingdom
- Cristiana Dimulescu
- Neural Information Processing Group, Institute of Software Engineering and Theoretical Computer Science, Technische Universität Berlin, Berlin, Germany
- Bernstein Center for Computational Neuroscience Berlin, Berlin, Germany
- Fabian Kamp
- Neural Information Processing Group, Institute of Software Engineering and Theoretical Computer Science, Technische Universität Berlin, Berlin, Germany
- Max Planck School of Cognition, Max Planck Institute for Human Cognitive and Brain Science, Leipzig, Germany
- Center for Lifespan Psychology, Max Planck Institute for Human Development, Berlin, Germany
- Sophie Fromm
- Neural Information Processing Group, Institute of Software Engineering and Theoretical Computer Science, Technische Universität Berlin, Berlin, Germany
- Department of Psychiatry and Psychotherapy, Charité – Universitätsmedizin Berlin, Berlin, Germany
- Peter J. Uhlhaas
- Department of Child and Adolescent Psychiatry, Charité – Universitätsmedizin Berlin, Berlin, Germany
- Institute of Neuroscience and Psychology, University of Glasgow, Glasgow, United Kingdom
- Klaus Obermayer
- Neural Information Processing Group, Institute of Software Engineering and Theoretical Computer Science, Technische Universität Berlin, Berlin, Germany
- Bernstein Center for Computational Neuroscience Berlin, Berlin, Germany
7
Spaeth A, Haussler D, Teodorescu M. Model-agnostic neural mean field with the Refractory SoftPlus transfer function. bioRxiv 2024:2024.02.05.579047. PMID: 38370695. PMCID: PMC10871173. DOI: 10.1101/2024.02.05.579047.
Abstract
Due to the complexity of neuronal networks and the nonlinear dynamics of individual neurons, it is challenging to develop a systems-level model which is accurate enough to be useful yet tractable enough to apply. Mean-field models which extrapolate from single-neuron descriptions to large-scale models can be derived from the neuron's transfer function, which gives its firing rate as a function of its synaptic input. However, analytically derived transfer functions are applicable only to the neurons and noise models from which they were originally derived. In recent work, approximate transfer functions have been empirically derived by fitting a sigmoidal curve, which imposes a maximum firing rate and applies only in the diffusion limit, restricting applications. In this paper, we propose an approximate transfer function called Refractory SoftPlus, which is simple yet applicable to a broad variety of neuron types. Refractory SoftPlus activation functions allow the derivation of simple empirically approximated mean-field models using simulation results, which enables prediction of the response of a network of randomly connected neurons to a time-varying external stimulus with a high degree of accuracy. These models also support an accurate approximate bifurcation analysis as a function of the level of recurrent input. Finally, the model works without assuming large presynaptic rates or small postsynaptic potential size, allowing mean-field models to be developed even for populations with large interaction terms.
Affiliation(s)
- Alex Spaeth
- Electrical and Computer Engineering Department, University of California, Santa Cruz, Santa Cruz, CA, United States
- Genomics Institute, University of California, Santa Cruz, Santa Cruz, CA, United States
- David Haussler
- Genomics Institute, University of California, Santa Cruz, Santa Cruz, CA, United States
- Biomolecular Engineering Department, University of California, Santa Cruz, Santa Cruz, CA, United States
- Mircea Teodorescu
- Electrical and Computer Engineering Department, University of California, Santa Cruz, Santa Cruz, CA, United States
- Genomics Institute, University of California, Santa Cruz, Santa Cruz, CA, United States
- Biomolecular Engineering Department, University of California, Santa Cruz, Santa Cruz, CA, United States
8
Felsheim RC, Dietz M. An adaptive leaky-integrate and firing probability model of an electrically stimulated auditory nerve fiber. Trends Hear 2024; 28:23312165241286742. PMID: 39497532. PMCID: PMC11536406. DOI: 10.1177/23312165241286742.
Abstract
Most neural models produce a spiking output and often represent the stochastic nature of the spike generation process via a stochastic output. Nonspiking neural models, on the other hand, predict the probability of a spike occurring in response to a stimulus. We propose a nonspiking model for an electrically stimulated auditory nerve fiber, which not only predicts the total probability of a spike occurring in response to a biphasic pulse but also the distribution of the spike time. Our adaptive leaky-integrate and firing probability (aLIFP) model can account for refractoriness, facilitation, accommodation, and long-term adaptation. All model parameters have been fitted to single cell recordings from electrically stimulated cat auditory nerve fibers. Afterward, the model was validated on recordings from auditory nerve fibers from cats and guinea pigs. The nonspiking nature of the model makes it fast and deterministic while still accounting for the stochastic nature of the spike generation process. Therefore, the relationship between the input to the model or model parameters and the model's output can be observed more directly than with stochastically spiking models.
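The deterministic firing-probability idea can be sketched with a toy model: each biphasic pulse fires with probability given by a Gaussian threshold crossing, and the expected firing raises a threshold that relaxes between pulses, producing adaptation. All update rules and parameters below are illustrative stand-ins, not the aLIFP equations or values fitted by the authors.

```python
import math

def phi(x):
    # Standard normal cumulative distribution function
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def pulse_train_probabilities(amplitudes, theta0=1.0, sigma=0.1,
                              adapt=0.2, decay=0.8):
    # Toy deterministic firing-probability model: pulse of amplitude I
    # fires with probability Phi((I - theta)/sigma); the expected spike
    # raises the threshold (adaptation), which decays back between pulses.
    theta, probs = theta0, []
    for amp in amplitudes:
        p = phi((amp - theta) / sigma)
        probs.append(p)
        theta = theta0 + decay * (theta - theta0) + adapt * p
    return probs
```

Because the output is a probability rather than a sampled spike train, repeated runs are identical, which is exactly the property the abstract emphasizes for relating inputs and parameters to outputs.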
Affiliation(s)
- Rebecca C. Felsheim
- Department of Medical Physics and Acoustics, Carl von Ossietzky Universität Oldenburg, Oldenburg, Germany
- Cluster of Excellence “Hearing4All”, Oldenburg, Germany
- Mathias Dietz
- Department of Medical Physics and Acoustics, Carl von Ossietzky Universität Oldenburg, Oldenburg, Germany
- Cluster of Excellence “Hearing4All”, Oldenburg, Germany
9
Huang CH, Lin CCK. New biophysical rate-based modeling of long-term plasticity in mean-field neuronal population models. Comput Biol Med 2023; 163:107213. PMID: 37413849. DOI: 10.1016/j.compbiomed.2023.107213.
Abstract
The formation of customized neural networks as the basis of brain functions such as receptive field selectivity, learning or memory depends heavily on the long-term plasticity of synaptic connections. However, the current mean-field population models commonly used to simulate large-scale neural network dynamics lack explicit links to the underlying cellular mechanisms of long-term plasticity. In this study, we developed a new mean-field population model, the plastic density-based neural mass model (pdNMM), by incorporating a newly developed rate-based plasticity model based on the calcium control hypothesis into an existing density-based neural mass model. Derivation of the plasticity model was carried out using population density methods. Our results showed that the synaptic plasticity represented by the resulting rate-based plasticity model exhibited Bienenstock-Cooper-Munro-like learning rules. Furthermore, we demonstrated that the pdNMM accurately reproduced previous experimental observations of long-term plasticity, including characteristics of Hebbian plasticity such as longevity, associativity and input specificity, on hippocampal slices, and the formation of receptive field selectivity in the visual cortex. In conclusion, the pdNMM is a novel approach that can confer long-term plasticity to conventional mean-field neuronal population models.
Affiliation(s)
- Chih-Hsu Huang
- Department of Neurology, College of Medicine, National Cheng Kung University, Tainan, Taiwan
- Chou-Ching K Lin
- Department of Neurology, College of Medicine, National Cheng Kung University, Tainan, Taiwan
- Department of Neurology, National Cheng Kung University Hospital, College of Medicine, National Cheng Kung University, Tainan, Taiwan
- Innovation Center of Medical Devices and Technology, National Cheng Kung University Hospital, College of Medicine, National Cheng Kung University, Tainan, Taiwan
- Medical Device Innovation Center, National Cheng Kung University, Tainan, Taiwan
10
Clusella P, Köksal-Ersöz E, Garcia-Ojalvo J, Ruffini G. Comparison between an exact and a heuristic neural mass model with second-order synapses. Biol Cybern 2023; 117:5-19. PMID: 36454267. PMCID: PMC10160168. DOI: 10.1007/s00422-022-00952-7.
Abstract
Neural mass models (NMMs) are designed to reproduce the collective dynamics of neuronal populations. A common framework for NMMs assumes heuristically that the output firing rate of a neural population can be described by a static nonlinear transfer function (NMM1). However, a recent exact mean-field theory for quadratic integrate-and-fire (QIF) neurons challenges this view by showing that the mean firing rate is not a static function of the neuronal state but follows two coupled nonlinear differential equations (NMM2). Here we analyze and compare these two descriptions in the presence of second-order synaptic dynamics. First, we derive the mathematical equivalence between the two models in the infinitely slow synapse limit, i.e., we show that NMM1 is an approximation of NMM2 in this regime. Next, we evaluate the applicability of this limit in the context of realistic physiological parameter values by analyzing the dynamics of models with inhibitory or excitatory synapses. We show that NMM1 fails to reproduce important dynamical features of the exact model, such as the self-sustained oscillations of an inhibitory interneuron QIF network. Furthermore, in the exact model but not in its slow-synapse approximation, stimulation of a pyramidal cell population induces resonant oscillatory activity whose peak frequency and amplitude increase with the self-coupling gain and the external excitatory input. This may play a role in the enhanced response of densely connected networks to weak uniform inputs, such as the electric fields produced by noninvasive brain stimulation.
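The static transfer function that NMM1 uses in the slow-synapse limit follows from setting the time derivatives of the NMM2 equations to zero: for a QIF population with Lorentzian heterogeneity of half-width Delta, the stationary rate has the closed form below. A minimal sketch (tau and Delta values illustrative):

```python
import math

def qif_stationary_rate(I, delta=1.0, tau=1.0):
    # Stationary population rate of the QIF mean field, obtained by
    # zeroing the NMM2 time derivatives: with u = (pi*tau*r)^2 one
    # gets u^2 - I*u - delta^2/4 = 0, hence
    # r = sqrt((I + sqrt(I^2 + delta^2)) / 2) / (pi * tau).
    u = 0.5 * (I + math.sqrt(I * I + delta * delta))
    return math.sqrt(u) / (math.pi * tau)
```

Note the rate is strictly positive even for strongly inhibitory input, a consequence of the Lorentzian tails; the paper's comparison shows where replacing the two coupled ODEs with this static curve breaks down.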
Affiliation(s)
- Pau Clusella
- Department of Medicine and Life Sciences, Universitat Pompeu Fabra, Barcelona Biomedical Research Park, 08003, Barcelona, Spain.
- Elif Köksal-Ersöz
- LTSI - UMR 1099, INSERM, Univ Rennes, Campus Beaulieu, 35000, Rennes, France
- Jordi Garcia-Ojalvo
- Department of Medicine and Life Sciences, Universitat Pompeu Fabra, Barcelona Biomedical Research Park, 08003, Barcelona, Spain
- Giulio Ruffini
- Brain Modeling Department, Neuroelectrics, Av. Tibidabo, 47b, 08035, Barcelona, Spain.
11
Kang L, Ranft J, Hakim V. Beta oscillations and waves in motor cortex can be accounted for by the interplay of spatially structured connectivity and fluctuating inputs. eLife 2023; 12:e81446. PMID: 36917621. PMCID: PMC10112891. DOI: 10.7554/eLife.81446.
Abstract
The beta rhythm (13-30 Hz) is a prominent brain rhythm. Recordings in primates during instructed-delay reaching tasks have shown that different types of traveling waves of oscillatory activity are associated with episodes of beta oscillations in motor cortex during movement preparation. We propose here a simple model of motor cortex based on local excitatory-inhibitory neuronal populations coupled by long-range excitation, where additionally inputs to the motor cortex from other neural structures are represented by stochastic inputs on the different model populations. We show that the model accurately reproduces the statistics of recording data when these external inputs are correlated on a short time scale (25 ms) and have two different components, one that targets the motor cortex locally and another one that targets it in a global and synchronized way. The model reproduces the distribution of beta burst durations, the proportion of the different observed wave types, and wave speeds, which we show not to be linked to axonal propagation speed. When the long-range connectivity or the local input targets are anisotropic, traveling waves are found to preferentially propagate along the axis where connectivity decays the fastest. Different from previously proposed mechanistic explanations, the model suggests that traveling waves in motor cortex are the reflection of the dephasing by external inputs, putatively of thalamic origin, of an oscillatory activity that would otherwise be spatially synchronized by recurrent connectivity.
Affiliation(s)
- Ling Kang
- Laboratoire de Physique de l’Ecole Normale Supérieure, CNRS, Ecole Normale Supérieure, PSL University, Sorbonne Université, Université de Paris, Paris, France
- School of Physics and Electronic Science, East China Normal University, Shanghai, China
- Jonas Ranft
- Institut de Biologie de l’Ecole Normale Supérieure (IBENS), CNRS, Ecole Normale Supérieure, PSL University, Paris, France
- Vincent Hakim
- Laboratoire de Physique de l’Ecole Normale Supérieure, CNRS, Ecole Normale Supérieure, PSL University, Sorbonne Université, Université de Paris, Paris, France
12
Vinci GV, Benzi R, Mattia M. Self-consistent stochastic dynamics for finite-size networks of spiking neurons. Phys Rev Lett 2023; 130:097402. PMID: 36930929. DOI: 10.1103/PhysRevLett.130.097402.
Abstract
Despite the huge number of neurons composing a brain network, ongoing activity of local cell assemblies is intrinsically stochastic. Fluctuations in their instantaneous rate of spike firing ν(t) scale with the size of the assembly and persist in isolated networks, i.e., in the absence of external sources of noise. Although deterministic chaos due to the quenched disorder of the synaptic couplings underlies this seemingly stochastic dynamics, an effective theory for the network dynamics of a finite assembly of spiking neurons is lacking. Here, we fill this gap by extending the so-called population density approach, including an activity- and size-dependent stochastic source in the Fokker-Planck equation for the membrane potential density. The finite-size noise embedded in this stochastic partial differential equation is analytically characterized, leading to a self-consistent and nonperturbative description of ν(t) valid for a wide class of spiking neuron networks. Power spectra of ν(t) are found in excellent agreement with those from detailed simulations both in the linear regime and across a synchronization phase transition, when a size-dependent smearing of the critical dynamics emerges.
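The size dependence of the rate fluctuations can be checked with a toy experiment: for N independent Poisson neurons at rate ν, the binned empirical population rate fluctuates with standard deviation approximately sqrt(ν/(N·Δt)) for ν·Δt << 1, i.e., it shrinks as 1/sqrt(N). This is only the uncoupled baseline of the finite-size noise that the paper builds self-consistently into the population-density equation; all parameters are illustrative.

```python
import random
import statistics

def rate_fluctuations(n_neurons, nu=10.0, dt=0.01, n_bins=500, seed=1):
    # Standard deviation of the binned empirical rate of n_neurons
    # independent Poisson neurons firing at rate nu (Hz), bin width dt (s).
    rng = random.Random(seed)
    p = nu * dt  # per-neuron spike probability per bin (p << 1)
    rates = [sum(1 for _ in range(n_neurons) if rng.random() < p)
             / (n_neurons * dt) for _ in range(n_bins)]
    return statistics.pstdev(rates)
```

Quadrupling the assembly size roughly halves the rate fluctuations, the scaling that makes the stochastic source in the Fokker-Planck equation explicitly size-dependent.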
Affiliation(s)
- Gianni V Vinci
- Natl. Center for Radiation Protection and Computational Physics, Istituto Superiore di Sanità, 00161 Roma, Italy
- PhD Program in Physics, Dept. of Physics, "Tor Vergata" University of Rome, 00133 Roma, Italy
- Roberto Benzi
- Dept. of Physics and INFN, "Tor Vergata" University of Rome, 00133 Roma, Italy
- Centro Ricerche "E. Fermi," 00184, Roma, Italy
- Maurizio Mattia
- Natl. Center for Radiation Protection and Computational Physics, Istituto Superiore di Sanità, 00161 Roma, Italy
13
Salfenmoser L, Obermayer K. Nonlinear optimal control of a mean-field model of neural population dynamics. Front Comput Neurosci 2022; 16:931121. PMID: 35990368. PMCID: PMC9382303. DOI: 10.3389/fncom.2022.931121.
Abstract
We apply the framework of nonlinear optimal control to a biophysically realistic neural mass model, which consists of two mutually coupled populations of deterministic excitatory and inhibitory neurons. External control signals are realized by time-dependent inputs to both populations. Optimality is defined by two alternative cost functions that trade the deviation of the controlled variable from its target value against the “strength” of the control, which is quantified by the integrated 1- and 2-norms of the control signal. We focus on a bistable region in state space where one low- (“down state”) and one high-activity (“up state”) stable fixed points coexist. With methods of nonlinear optimal control, we search for the most cost-efficient control function to switch between both activity states. For a broad range of parameters, we find that cost-efficient control strategies consist of a pulse of finite duration to push the state variables only minimally into the basin of attraction of the target state. This strategy only breaks down once we impose time constraints that force the system to switch on a time scale comparable to the duration of the control pulse. Penalizing control strength via the integrated 1-norm (2-norm) yields control inputs targeting one or both populations. However, whether control inputs to the excitatory or the inhibitory population dominate depends on the location in state space relative to the bifurcation lines. Our study highlights the applicability of nonlinear optimal control to better understand neuronal processing under constraints.
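The reported strategy of pushing the state only minimally into the target basin can be illustrated on the simplest bistable system, dx/dt = x - x³ + u(t), with stable states at x = ±1 and basin boundary at x = 0. This scalar toy stands in for the two-population neural mass model; pulse amplitudes and durations are illustrative.

```python
def controlled_bistable(u_amp, u_dur, x0=-1.0, dt=1e-3, T=20.0):
    # Bistable scalar system dx/dt = x - x^3 + u(t), starting in the
    # "down state" x = -1. A rectangular pulse of amplitude u_amp and
    # duration u_dur either pushes the state past the basin boundary
    # at x = 0 (after which the intrinsic dynamics complete the switch
    # to x = +1) or fails, and the state relaxes back to x = -1.
    x, t = x0, 0.0
    while t < T:
        u = u_amp if t < u_dur else 0.0
        x += dt * (x - x ** 3 + u)
        t += dt
    return x
```

A sufficiently strong pulse completes the switch and a weak one does not; the optimal-control computation in the paper finds, among all switching pulses, the one with minimal cost.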
Collapse
|
14
|
Chen L, Campbell SA. Exact mean-field models for spiking neural networks with adaptation. J Comput Neurosci 2022; 50:445-469. [PMID: 35834100 DOI: 10.1007/s10827-022-00825-9] [Citation(s) in RCA: 10] [Impact Index Per Article: 3.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/22/2022] [Accepted: 06/15/2022] [Indexed: 10/17/2022]
Abstract
Networks of spiking neurons with adaptation have been shown to reproduce a wide range of neural activities, including the emergent population bursting and spike synchrony that underpin brain disorders and normal function. Exact mean-field models derived from spiking neural networks are extremely valuable, as such models can be used to determine how individual neurons and the network they reside within interact to produce macroscopic network behaviours. In this paper, we derive and analyze a set of exact mean-field equations for a neural network with spike-frequency adaptation. Specifically, our model is a network of Izhikevich neurons, where each neuron is modeled by a two-dimensional system consisting of a quadratic integrate-and-fire equation plus an equation which implements spike-frequency adaptation. Previous work deriving a mean-field model for this type of network relied on the assumption of sufficiently slow dynamics of the adaptation variable. However, this approximation did not succeed in establishing an exact correspondence between the macroscopic description and the realistic neural network, especially when the adaptation time constant was not large. The challenge lies in how to achieve a closed set of mean-field equations with the inclusion of the mean-field dynamics of the adaptation variable. We address this problem by using a Lorentzian ansatz combined with the moment closure approach to arrive at a mean-field system in the thermodynamic limit. The resulting macroscopic description is capable of qualitatively and quantitatively describing the collective dynamics of the neural network, including transitions between states where the individual neurons exhibit asynchronous tonic firing and synchronous bursting. We extend the approach to a network of two populations of neurons and discuss the accuracy and efficacy of our mean-field approximations by examining all assumptions that are imposed during the derivation. Numerical bifurcation analysis of our mean-field models reveals bifurcations not previously observed in the models, including a novel mechanism for the emergence of bursting in the network. We anticipate our results will provide a tractable and reliable tool to investigate the underlying mechanisms of brain function and dysfunction from the perspective of computational neuroscience.
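The flavor of such a low-dimensional mean-field description can be illustrated with the widely used quadratic integrate-and-fire firing-rate equations extended by a mean adaptation variable w. This is a sketch under stated assumptions: the linear, rate-driven adaptation form and all parameter values are illustrative choices, not the exact reduced model derived in the paper.

```python
import numpy as np

def simulate_fre(T=100.0, dt=1e-3, delta=1.0, eta=-5.0, J=15.0,
                 beta=1.0, tau_w=10.0):
    """Forward-Euler integration of quadratic integrate-and-fire
    firing-rate equations with a mean adaptation variable w:
        r' = delta/pi + 2*r*v
        v' = v**2 + eta - (pi*r)**2 + J*r - w
        w' = (-w + beta*r) / tau_w
    (illustrative adaptation form and parameters, not the paper's
    exact reduced system)."""
    n = int(T / dt)
    r, v, w = 0.1, -1.0, 0.0
    trace = np.empty((n, 3))
    for i in range(n):
        dr = delta / np.pi + 2.0 * r * v
        dv = v * v + eta - (np.pi * r) ** 2 + J * r - w
        dw = (-w + beta * r) / tau_w
        r, v, w = r + dt * dr, v + dt * dv, w + dt * dw
        trace[i] = (r, v, w)
    return trace

trace = simulate_fre()
print(trace[-1])  # final mean-field state (r, v, w)
```

Three coupled ordinary differential equations replace the full spiking network, which is what makes bifurcation analysis of the collective dynamics tractable.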
Collapse
|
15
|
Jajcay N, Cakan C, Obermayer K. Cross-Frequency Slow Oscillation–Spindle Coupling in a Biophysically Realistic Thalamocortical Neural Mass Model. Front Comput Neurosci 2022; 16:769860. [PMID: 35603132 PMCID: PMC9120371 DOI: 10.3389/fncom.2022.769860] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.7] [Reference Citation Analysis] [Abstract] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/02/2021] [Accepted: 03/28/2022] [Indexed: 11/13/2022] Open
Abstract
Sleep manifests itself by the spontaneous emergence of characteristic oscillatory rhythms, which often time-lock and are implicated in memory formation. Here, we analyze a neural mass model of the thalamocortical loop in which the cortical node can generate slow oscillations (approximately 1 Hz) while its thalamic component can generate fast sleep spindles of σ-band activity (12–15 Hz). We study the dynamics for different coupling strengths between the thalamic and cortical nodes, for different conductance values of the thalamic node's potassium leak and hyperpolarization-activated cation-nonselective currents, and for different parameter regimes of the cortical node. The latter are listed as follows: (1) a low activity (DOWN) state with noise-induced, transient excursions into a high activity (UP) state, (2) an adaptation induced slow oscillation limit cycle with alternating UP and DOWN states, and (3) a high activity (UP) state with noise-induced, transient excursions into the low activity (DOWN) state. During UP states, thalamic spindling is abolished or reduced. During DOWN states, the thalamic node generates sleep spindles, which in turn can cause DOWN to UP transitions in the cortical node. Consequently, this leads to spindle-induced UP state transitions in parameter regime (1), thalamic spindles induced in some but not all DOWN states in regime (2), and thalamic spindles following UP to DOWN transitions in regime (3). The spindle-induced σ-band activity in the cortical node, however, is typically the strongest during the UP state, which follows a DOWN state “window of opportunity” for spindling. When the cortical node is parametrized in regime (3), the model well explains the interactions between slow oscillations and sleep spindles observed experimentally during Non-Rapid Eye Movement sleep. The model is computationally efficient and can be integrated into large-scale modeling frameworks to study spatial aspects like sleep wave propagation.
Collapse
Affiliation(s)
- Nikola Jajcay
- Neural Information Processing Group, Department of Software Engineering and Theoretical Computer Science, Technische Universität Berlin, Berlin, Germany
- Department of Complex Systems, Institute of Computer Science, Czech Academy of Sciences, Prague, Czechia
- Bernstein Center for Computational Neuroscience Berlin, Berlin, Germany
- *Correspondence: Nikola Jajcay
| | - Caglar Cakan
- Neural Information Processing Group, Department of Software Engineering and Theoretical Computer Science, Technische Universität Berlin, Berlin, Germany
- Bernstein Center for Computational Neuroscience Berlin, Berlin, Germany
| | - Klaus Obermayer
- Neural Information Processing Group, Department of Software Engineering and Theoretical Computer Science, Technische Universität Berlin, Berlin, Germany
- Bernstein Center for Computational Neuroscience Berlin, Berlin, Germany
| |
Collapse
|
16
|
Metzner C, Mäki-Marttunen T, Karni G, McMahon-Cole H, Steuber V. The effect of alterations of schizophrenia-associated genes on gamma band oscillations. SCHIZOPHRENIA (HEIDELBERG, GERMANY) 2022; 8:46. [PMID: 35854005 PMCID: PMC9261091 DOI: 10.1038/s41537-022-00255-7] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Subscribe] [Scholar Register] [Received: 08/17/2021] [Accepted: 04/08/2022] [Indexed: 11/30/2022]
Abstract
Abnormalities in the synchronized oscillatory activity of neurons in general and, specifically in the gamma band, might play a crucial role in the pathophysiology of schizophrenia. While these changes in oscillatory activity have traditionally been linked to alterations at the synaptic level, we demonstrate here, using computational modeling, that common genetic variants of ion channels can contribute strongly to this effect. Our model of primary auditory cortex highlights multiple schizophrenia-associated genetic variants that reduce gamma power in an auditory steady-state response task. Furthermore, we show that combinations of several of these schizophrenia-associated variants can produce similar effects as the more traditionally considered synaptic changes. Overall, our study provides a mechanistic link between schizophrenia-associated common genetic variants, as identified by genome-wide association studies, and one of the most robust neurophysiological endophenotypes of schizophrenia.
Collapse
Affiliation(s)
- Christoph Metzner
- Neural Information Processing Group, Institute of Software Engineering and Theoretical Computer Science, Technische Universität Berlin, Berlin, Germany.
- Biocomputation Research Group, School of Physics, Engineering and Computer Science, University of Hertfordshire, Hatfield, United Kingdom.
| | | | - Gili Karni
- Neural Information Processing Group, Institute of Software Engineering and Theoretical Computer Science, Technische Universität Berlin, Berlin, Germany
- Minerva Schools at KGI, San Francisco, CA, USA
| | - Hana McMahon-Cole
- Neural Information Processing Group, Institute of Software Engineering and Theoretical Computer Science, Technische Universität Berlin, Berlin, Germany
- Minerva Schools at KGI, San Francisco, CA, USA
| | - Volker Steuber
- Biocomputation Research Group, School of Physics, Engineering and Computer Science, University of Hertfordshire, Hatfield, United Kingdom
| |
Collapse
|
17
|
Cakan C, Dimulescu C, Khakimova L, Obst D, Flöel A, Obermayer K. Spatiotemporal Patterns of Adaptation-Induced Slow Oscillations in a Whole-Brain Model of Slow-Wave Sleep. Front Comput Neurosci 2022; 15:800101. [PMID: 35095451 PMCID: PMC8790481 DOI: 10.3389/fncom.2021.800101] [Citation(s) in RCA: 10] [Impact Index Per Article: 3.3] [Reference Citation Analysis] [Abstract] [Key Words] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/22/2021] [Accepted: 12/16/2021] [Indexed: 11/13/2022] Open
Abstract
During slow-wave sleep, the brain is in a self-organized regime in which slow oscillations (SOs) between up- and down-states travel across the cortex. While an isolated piece of cortex can produce SOs, the brain-wide propagation of these oscillations is thought to be mediated by the long-range axonal connections. We address the mechanism of how SOs emerge and recruit large parts of the brain using a whole-brain model constructed from empirical connectivity data in which SOs are induced independently in each brain area by a local adaptation mechanism. Using an evolutionary optimization approach, good fits to human resting-state fMRI data and sleep EEG data are found at values of the adaptation strength close to a bifurcation where the model produces a balance between local and global SOs with realistic spatiotemporal statistics. Local oscillations are more frequent, shorter-lasting, and lower in amplitude. Global oscillations spread as waves of silence across the undirected brain graph, traveling from anterior to posterior regions. These traveling waves are caused by heterogeneities in the brain network in which the connection strengths between brain areas determine which areas transition to a down-state first, and thus initiate traveling waves across the cortex. Our results demonstrate the utility of whole-brain models for explaining the origin of large-scale cortical oscillations and how they are shaped by the connectome.
Collapse
Affiliation(s)
- Caglar Cakan
- Department of Software Engineering and Theoretical Computer Science, Technische Universität Berlin, Berlin, Germany
- Bernstein Center for Computational Neuroscience Berlin, Berlin, Germany
| | - Cristiana Dimulescu
- Department of Software Engineering and Theoretical Computer Science, Technische Universität Berlin, Berlin, Germany
- Bernstein Center for Computational Neuroscience Berlin, Berlin, Germany
| | - Liliia Khakimova
- Department of Neurology, University Medicine, Greifswald, Germany
| | - Daniela Obst
- Department of Neurology, University Medicine, Greifswald, Germany
| | - Agnes Flöel
- Department of Neurology, University Medicine, Greifswald, Germany
- German Center for Neurodegenerative Diseases, Greifswald, Germany
| | - Klaus Obermayer
- Department of Software Engineering and Theoretical Computer Science, Technische Universität Berlin, Berlin, Germany
- Bernstein Center for Computational Neuroscience Berlin, Berlin, Germany
| |
Collapse
|
18
|
Osborne H, Deutz L, de Kamps M. Multidimensional Dynamical Systems with Noise. ADVANCES IN EXPERIMENTAL MEDICINE AND BIOLOGY 2022; 1359:159-178. [DOI: 10.1007/978-3-030-89439-9_7] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 11/30/2022]
|
19
|
Cakan C, Jajcay N, Obermayer K. neurolib: A Simulation Framework for Whole-Brain Neural Mass Modeling. Cognit Comput 2021. [DOI: 10.1007/s12559-021-09931-9] [Citation(s) in RCA: 4] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 10/20/2022]
Abstract
neurolib is a computational framework for whole-brain modeling written in Python. It provides a set of neural mass models that represent the average activity of a brain region on a mesoscopic scale. In a whole-brain network model, brain regions are connected with each other based on biologically informed structural connectivity, i.e., the connectome of the brain. neurolib can load structural and functional datasets, set up a whole-brain model, manage its parameters, simulate it, and organize its outputs for later analysis. The activity of each brain region can be converted into a simulated BOLD signal in order to calibrate the model against empirical data from functional magnetic resonance imaging (fMRI). Extensive model analysis is made possible using a parameter exploration module, which allows one to characterize a model’s behavior as a function of changing parameters. An optimization module is provided for fitting models to multimodal empirical data using evolutionary algorithms. neurolib is designed to be extendable and allows for easy implementation of custom neural mass models, offering a versatile platform for computational neuroscientists for prototyping models, managing large numerical experiments, studying the structure–function relationship of brain networks, and for performing in-silico optimization of whole-brain models.
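The whole-brain pattern that neurolib implements, neural mass nodes coupled through a structural connectivity matrix, can be sketched in plain NumPy. This is a generic illustration of the pattern, not neurolib's actual API; the sigmoidal node dynamics, coupling constant, and random "connectome" are stand-in assumptions.

```python
import numpy as np

def simulate_network(C, T=1.0, dt=1e-3, tau=0.02, k=0.5, I_ext=0.45):
    """Toy whole-brain network: each node is a sigmoidal rate unit driven by
    external input and by the other nodes through the connectivity matrix C
    (a generic sketch of the coupling pattern, not neurolib's API)."""
    n_nodes = C.shape[0]
    rng = np.random.default_rng(0)
    x = rng.uniform(0.0, 0.1, n_nodes)          # node activities
    steps = int(T / dt)
    out = np.empty((steps, n_nodes))
    for t in range(steps):
        coupling = k * C @ x                    # connectome-mediated input
        x += dt / tau * (-x + np.tanh(coupling + I_ext))
        out[t] = x
    return out

# random symmetric stand-in "connectome" with zero self-coupling
rng = np.random.default_rng(1)
C = rng.uniform(0, 1, (8, 8))
C = (C + C.T) / 2
np.fill_diagonal(C, 0)
activity = simulate_network(C)
print(activity[-1].round(3))  # final activity of each node
```

In a real application the matrix C would come from diffusion-imaging tractography, and the node model would be one of the framework's neural mass models rather than this toy unit.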
Collapse
|
20
|
Schwalger T. Mapping input noise to escape noise in integrate-and-fire neurons: a level-crossing approach. BIOLOGICAL CYBERNETICS 2021; 115:539-562. [PMID: 34668051 PMCID: PMC8551127 DOI: 10.1007/s00422-021-00899-1] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Figures] [Subscribe] [Scholar Register] [Received: 07/17/2021] [Accepted: 09/27/2021] [Indexed: 06/13/2023]
Abstract
Noise in spiking neurons is commonly modeled by a noisy input current or by generating output spikes stochastically with a voltage-dependent hazard rate ("escape noise"). While input noise lends itself to modeling biophysical noise processes, the phenomenological escape noise is mathematically more tractable. Using the level-crossing theory for differentiable Gaussian processes, we derive an approximate mapping between colored input noise and escape noise in leaky integrate-and-fire neurons. This mapping requires the first-passage-time (FPT) density of an overdamped Brownian particle driven by colored noise with respect to an arbitrarily moving boundary. Starting from the Wiener-Rice series for the FPT density, we apply the second-order decoupling approximation of Stratonovich to the case of moving boundaries and derive a simplified hazard-rate representation that is local in time and numerically efficient. This simplification requires the calculation of the non-stationary auto-correlation function of the level-crossing process: For exponentially correlated input noise (Ornstein-Uhlenbeck process), we obtain an exact formula for the zero-lag auto-correlation as a function of noise parameters, mean membrane potential and its speed, as well as an exponential approximation of the full auto-correlation function. The theory accurately predicts the FPT and interspike interval densities as well as the population activities obtained from simulations with colored input noise and time-dependent stimulus or boundary. The agreement with simulations is strongly enhanced across the sub- and suprathreshold firing regime compared to a first-order decoupling approximation that neglects correlations between level crossings. The second-order approximation also improves upon a previously proposed theory in the subthreshold regime. Depending on a simplicity-accuracy trade-off, all considered approximations represent useful mappings from colored input noise to escape noise, enabling progress in the theory of neuronal population dynamics.
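The escape-noise idea itself is easy to state in code: the membrane voltage evolves deterministically, while spikes are emitted stochastically with a voltage-dependent hazard. A minimal sketch with an exponential hazard (a standard choice; all parameter values are illustrative, and this shows the generic mechanism rather than the mapping derived in the paper):

```python
import numpy as np

def lif_escape_noise(T=1.0, dt=1e-4, tau=0.02, mu=0.9, v_th=1.0,
                     v_reset=0.0, rho0=50.0, dv=0.05, seed=0):
    """Leaky integrate-and-fire neuron with escape noise: noiseless
    subthreshold dynamics, stochastic spiking with exponential hazard
    rho(v) = rho0 * exp((v - v_th) / dv).  Illustrative parameters."""
    rng = np.random.default_rng(seed)
    v, spikes = 0.0, []
    for i in range(int(T / dt)):
        v += dt / tau * (-v + mu)                    # deterministic drift
        hazard = rho0 * np.exp((v - v_th) / dv)      # instantaneous firing rate
        if rng.random() < 1.0 - np.exp(-hazard * dt):  # spike in this bin?
            spikes.append(i * dt)
            v = v_reset
    return np.array(spikes)

spikes = lif_escape_noise()
print(len(spikes), "spikes in 1 s")
```

With mu below threshold, as here, all spiking is hazard-driven; the paper's contribution is how to choose rho(v) so that this process mimics a neuron driven by colored input noise.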
Collapse
Affiliation(s)
- Tilo Schwalger
- Institute of Mathematics, Technical University Berlin, 10623, Berlin, Germany.
- Bernstein Center for Computational Neuroscience Berlin, 10115, Berlin, Germany.
| |
Collapse
|
21
|
Ramlow L, Lindner B. Interspike interval correlations in neuron models with adaptation and correlated noise. PLoS Comput Biol 2021; 17:e1009261. [PMID: 34449771 PMCID: PMC8428727 DOI: 10.1371/journal.pcbi.1009261] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/28/2020] [Revised: 09/09/2021] [Accepted: 07/08/2021] [Indexed: 11/19/2022] Open
Abstract
The generation of neural action potentials (spikes) is random but nevertheless may result in a rich statistical structure of the spike sequence. In particular, contrary to the popular renewal assumption of theoreticians, the intervals between adjacent spikes are often correlated. Experimentally, different patterns of interspike-interval correlations have been observed and computational studies have identified spike-frequency adaptation and correlated noise as the two main mechanisms that can lead to such correlations. Analytical studies have focused on the single cases of either correlated (colored) noise or adaptation currents in combination with uncorrelated (white) noise. For low-pass filtered noise or adaptation, the serial correlation coefficient can be approximated as a single geometric sequence of the lag between the intervals, providing an explanation for some of the experimentally observed patterns. Here we address the problem of interval correlations for a widely used class of models, multidimensional integrate-and-fire neurons subject to a combination of colored and white noise sources and a spike-triggered adaptation current. Assuming weak noise, we derive a simple formula for the serial correlation coefficient, a sum of two geometric sequences, which accounts for a large class of correlation patterns. The theory is confirmed by means of numerical simulations in a number of special cases including the leaky, quadratic, and generalized integrate-and-fire models with colored noise and spike-frequency adaptation. Furthermore we study the case in which the adaptation current and the colored noise share the same time scale, corresponding to a slow stochastic population of adaptation channels; we demonstrate that our theory can account for a nonmonotonic dependence of the correlation coefficient on the channel's time scale. Another application of the theory is a neuron driven by network-noise-like fluctuations (green noise). We also discuss the range of validity of our weak-noise theory and show that by changing the relative strength of white and colored noise sources, we can change the sign of the correlation coefficient. Finally, we apply our theory to a conductance-based model which demonstrates its broad applicability.
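The serial correlation coefficient analyzed here is straightforward to estimate from an interspike-interval sequence. A small helper (a generic empirical estimator, not the paper's weak-noise formula), demonstrated on a renewal sequence, which should be uncorrelated at all lags, and on an artificial long-short alternating sequence, which should have a negative lag-one correlation:

```python
import numpy as np

def serial_correlation(isi, max_lag=5):
    """Estimate the serial correlation coefficient
    rho_k = Cov(T_i, T_{i+k}) / Var(T_i) from an ISI sequence."""
    isi = np.asarray(isi, dtype=float)
    d = isi - isi.mean()
    var = np.mean(d * d)
    return np.array([np.mean(d[:-k] * d[k:]) / var
                     for k in range(1, max_lag + 1)])

rng = np.random.default_rng(0)

# ISIs of a renewal process are uncorrelated at all lags...
renewal = rng.exponential(1.0, 100_000)
print(serial_correlation(renewal).round(3))

# ...while an alternating long-short pattern gives rho_1 < 0
alternating = np.tile([0.5, 1.5], 50_000) + rng.normal(0, 0.05, 100_000)
print(serial_correlation(alternating).round(3))
```

Patterns such as a sum of two geometric sequences, as derived in the paper, would show up directly in the lag dependence of this estimator.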
Collapse
Affiliation(s)
- Lukas Ramlow
- Bernstein Center for Computational Neuroscience Berlin, Berlin, Germany
- Physics Department, Humboldt University zu Berlin, Berlin, Germany
| | - Benjamin Lindner
- Bernstein Center for Computational Neuroscience Berlin, Berlin, Germany
- Physics Department, Humboldt University zu Berlin, Berlin, Germany
| |
Collapse
|
22
|
Pietras B, Gallice N, Schwalger T. Low-dimensional firing-rate dynamics for populations of renewal-type spiking neurons. Phys Rev E 2021; 102:022407. [PMID: 32942450 DOI: 10.1103/physreve.102.022407] [Citation(s) in RCA: 10] [Impact Index Per Article: 2.5] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/20/2020] [Accepted: 06/29/2020] [Indexed: 11/07/2022]
Abstract
The macroscopic dynamics of large populations of neurons can be mathematically analyzed using low-dimensional firing-rate or neural-mass models. However, these models fail to capture spike synchronization effects and nonstationary responses of the population activity to rapidly changing stimuli. Here we derive low-dimensional firing-rate models for homogeneous populations of neurons modeled as time-dependent renewal processes. The class of renewal neurons includes integrate-and-fire models driven by white noise and has been frequently used to model neuronal refractoriness and spike synchronization dynamics. The derivation is based on an eigenmode expansion of the associated refractory density equation, which generalizes previous spectral methods for Fokker-Planck equations to arbitrary renewal models. We find a simple relation between the eigenvalues characterizing the timescales of the firing rate dynamics and the Laplace transform of the interspike interval density, for which explicit expressions are available for many renewal models. Retaining only the first eigenmode already yields a reliable low-dimensional approximation of the firing-rate dynamics that captures spike synchronization effects and fast transient dynamics at stimulus onset. We explicitly demonstrate the validity of our model for a large homogeneous population of Poisson neurons with absolute refractoriness and other renewal models that admit an explicit analytical calculation of the eigenvalues. The eigenmode expansion presented here provides a systematic framework for alternative firing-rate models in computational neuroscience based on spiking neuron dynamics with refractoriness.
Collapse
Affiliation(s)
- Bastian Pietras
- Institute of Mathematics, Technical University Berlin, 10623 Berlin, Germany
- Bernstein Center for Computational Neuroscience Berlin, 10115 Berlin, Germany
| | - Noé Gallice
- Brain Mind Institute, École polytechnique fédérale de Lausanne (EPFL), Station 15, CH-1015 Lausanne, Switzerland
| | - Tilo Schwalger
- Institute of Mathematics, Technical University Berlin, 10623 Berlin, Germany
- Bernstein Center for Computational Neuroscience Berlin, 10115 Berlin, Germany
| |
Collapse
|
23
|
Huang CH, Lin CCK. A novel density-based neural mass model for simulating neuronal network dynamics with conductance-based synapses and membrane current adaptation. Neural Netw 2021; 143:183-197. [PMID: 34157643 DOI: 10.1016/j.neunet.2021.06.009] [Citation(s) in RCA: 4] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/04/2020] [Revised: 04/01/2021] [Accepted: 06/06/2021] [Indexed: 10/21/2022]
Abstract
Despite its success in understanding brain rhythms, the neural mass model, as a low-dimensional mean-field network model, is phenomenological in nature, so it cannot replicate some of the rich repertoire of responses seen in real neuronal tissues. Here, using a colored-synapse population density method, we derived a novel neural mass model, termed density-based neural mass model (dNMM), as the mean-field description of network dynamics of adaptive exponential integrate-and-fire (aEIF) neurons, in which two critical neuronal features, i.e., voltage-dependent conductance-based synaptic interactions and adaptation of firing rate responses, were included. Our results showed that the dNMM was capable of correctly estimating firing rate responses of a neuronal population of aEIF neurons receiving stationary or time-varying excitatory and inhibitory inputs. Finally, it was also able to quantitatively describe the effect of spike-frequency adaptation in the generation of asynchronous irregular activity of excitatory-inhibitory cortical networks. We conclude that in terms of its biological realism and computational efficiency, the dNMM is a suitable candidate to build significantly large-scale network models involving multiple brain areas, where the neuronal population is the smallest dynamic unit.
Collapse
Affiliation(s)
- Chih-Hsu Huang
- Department of Neurology, National Cheng Kung University Hospital, College of Medicine, National Cheng Kung University, Tainan, Taiwan
| | - Chou-Ching K Lin
- Department of Neurology, National Cheng Kung University Hospital, College of Medicine, National Cheng Kung University, Tainan, Taiwan.
| |
Collapse
|
24
|
Novikov N, Zakharov D, Moiseeva V, Gutkin B. Activity Stabilization in a Population Model of Working Memory by Sinusoidal and Noisy Inputs. Front Neural Circuits 2021; 15:647944. [PMID: 33967703 PMCID: PMC8096914 DOI: 10.3389/fncir.2021.647944] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.8] [Reference Citation Analysis] [Abstract] [Key Words] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/30/2020] [Accepted: 03/19/2021] [Indexed: 01/22/2023] Open
Abstract
According to mechanistic theories of working memory (WM), information is retained as stimulus-dependent persistent spiking activity of cortical neural networks. Yet, how this activity is related to changes in the oscillatory profile observed during WM tasks remains a largely open issue. We explore joint effects of input gamma-band oscillations and noise on the dynamics of several firing rate models of WM. The considered models have a metastable active regime, i.e., they demonstrate long-lasting transient post-stimulus firing rate elevation. We start from a single excitatory-inhibitory circuit and demonstrate that either gamma-band or noise input could stabilize the active regime, thus supporting WM retention. We then consider a system of two circuits with excitatory intercoupling. We find that fast coupling allows for better stabilization by common noise compared to independent noise and stronger amplification of this effect by in-phase gamma inputs compared to anti-phase inputs. Finally, we consider a multi-circuit system comprised of two clusters, each containing a group of circuits receiving a common noise input and a group of circuits receiving independent noise. Each cluster is associated with its own local gamma generator, so all its circuits receive gamma-band input in the same phase. We find that gamma-band input differentially stabilizes the activity of the "common-noise" groups compared to the "independent-noise" groups. If the inter-cluster connections are fast, this effect is more pronounced when the gamma-band input is delivered to the clusters in the same phase rather than in the anti-phase. Assuming that the common noise comes from a large-scale distributed WM representation, our results demonstrate that local gamma oscillations can stabilize the activity of the corresponding parts of this representation, with stronger effect for fast long-range connections and synchronized gamma oscillations.
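The metastable rate dynamics underlying such working-memory models can be illustrated with a single self-exciting population that has coexisting low- and high-activity states. In this sketch the sigmoidal gain, self-coupling, and threshold are arbitrary illustrative choices, not the circuits analyzed in the paper:

```python
import numpy as np

def bistable_unit(I, x0=0.0, T=5.0, dt=1e-3, tau=0.1):
    """Rate unit with sigmoidal positive feedback:
        tau * x' = -x + sigmoid(w*x + I - theta).
    Strong self-excitation (w) makes a low- and a high-activity
    state coexist for the same input I (illustrative parameters)."""
    w, theta = 8.0, 4.0
    x = x0
    for _ in range(int(T / dt)):
        x += dt / tau * (-x + 1.0 / (1.0 + np.exp(-(w * x + I - theta))))
    return x

# with the same input, the unit settles in whichever state it started near
print(bistable_unit(I=0.0, x0=0.0))   # relaxes to the low-activity state
print(bistable_unit(I=0.0, x0=1.0))   # relaxes to the high-activity state
```

In the paper's setting the high-activity state is only metastable under noise, and the question is how gamma-band and noise inputs shift the balance between staying in it and escaping from it.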
Collapse
Affiliation(s)
- Nikita Novikov
- Centre for Cognition and Decision Making, HSE University, Moscow, Russia
| | - Denis Zakharov
- Centre for Cognition and Decision Making, HSE University, Moscow, Russia
| | - Victoria Moiseeva
- Centre for Cognition and Decision Making, HSE University, Moscow, Russia
| | - Boris Gutkin
- Centre for Cognition and Decision Making, HSE University, Moscow, Russia
- Group for Neural Theory, LNC2 INSERM U960, Département d'Études Cognitives, École Normale Supérieure, PSL Research University, Paris, France
| |
Collapse
|
25
|
Liang J, Zhou T, Zhou C. Hopf Bifurcation in Mean Field Explains Critical Avalanches in Excitation-Inhibition Balanced Neuronal Networks: A Mechanism for Multiscale Variability. Front Syst Neurosci 2020; 14:580011. [PMID: 33324179 PMCID: PMC7725680 DOI: 10.3389/fnsys.2020.580011] [Citation(s) in RCA: 15] [Impact Index Per Article: 3.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/04/2020] [Accepted: 11/02/2020] [Indexed: 12/14/2022] Open
Abstract
Cortical neural circuits display highly irregular spiking in individual neurons but variably sized collective firing, oscillations and critical avalanches at the population level, all of which have functional importance for information processing. Theoretically, the balance of excitatory and inhibitory inputs is thought to account for spiking irregularity, and critical avalanches may originate from an underlying phase transition. However, the theoretical reconciliation of these multilevel dynamic aspects in neural circuits remains an open question. Herein, we study an excitation-inhibition (E-I) balanced neuronal network with biologically realistic synaptic kinetics. It can maintain irregular spiking dynamics with different levels of synchrony, and critical avalanches emerge near the synchronous transition point. We propose a novel semi-analytical mean-field theory to derive the field equations governing the network macroscopic dynamics. It reveals that the E-I balanced state of the network manifesting irregular individual spiking is characterized by a macroscopic stable state, which can be either a fixed point or a periodic motion, and the transition is predicted by a Hopf bifurcation in the macroscopic field. Furthermore, by analyzing public data, we find the coexistence of irregular spiking and critical avalanches in the spontaneous spiking activities of mouse cortical slice in vitro, indicating the universality of the observed phenomena. Our theory unveils the mechanism that permits complex neural activities in different spatiotemporal scales to coexist and elucidates a possible origin of the criticality of neural systems. It also provides a novel tool for analyzing the macroscopic dynamics of E-I balanced networks and its relationship to the microscopic counterparts, which can be useful for large-scale modeling and computation of cortical dynamics.
Collapse
Affiliation(s)
- Junhao Liang
- Department of Physics, Centre for Nonlinear Studies, Beijing-Hong Kong-Singapore Joint Centre for Nonlinear and Complex Systems, Institute of Computational and Theoretical Studies, Hong Kong Baptist University, Kowloon Tong, Hong Kong
- Key Laboratory of Computational Mathematics, Guangdong Province, and School of Mathematics, Sun Yat-sen University, Guangzhou, China
| | - Tianshou Zhou
- Key Laboratory of Computational Mathematics, Guangdong Province, and School of Mathematics, Sun Yat-sen University, Guangzhou, China
| | - Changsong Zhou
- Department of Physics, Centre for Nonlinear Studies, Beijing-Hong Kong-Singapore Joint Centre for Nonlinear and Complex Systems, Institute of Computational and Theoretical Studies, Hong Kong Baptist University, Kowloon Tong, Hong Kong
- Department of Physics, Zhejiang University, Hangzhou, China
| |
Collapse
|
26
|
Kulkarni A, Ranft J, Hakim V. Synchronization, Stochasticity, and Phase Waves in Neuronal Networks With Spatially-Structured Connectivity. Front Comput Neurosci 2020; 14:569644. [PMID: 33192427 PMCID: PMC7604323 DOI: 10.3389/fncom.2020.569644] [Citation(s) in RCA: 4] [Impact Index Per Article: 0.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/04/2020] [Accepted: 08/18/2020] [Indexed: 01/15/2023] Open
Abstract
Oscillations in the beta/low gamma range (10–45 Hz) are recorded in diverse neural structures. They have successfully been modeled as sparsely synchronized oscillations arising from reciprocal interactions between randomly connected excitatory (E) pyramidal cells and local interneurons (I). The synchronization of spatially distant oscillatory spiking E–I modules has been well-studied in the rate model framework but less so for modules of spiking neurons. Here, we first show that previously proposed modifications of rate models provide a quantitative description of spiking E–I modules of Exponential Integrate-and-Fire (EIF) neurons. This allows us to analyze the dynamical regimes of sparsely synchronized oscillatory E–I modules connected by long-range excitatory interactions, for two modules, as well as for a chain of such modules. For modules with a large number of neurons (> 105), we obtain results similar to previously obtained ones based on the classic deterministic Wilson-Cowan rate model, with the added bonus that the results quantitatively describe simulations of spiking EIF neurons. However, for modules with a moderate (~ 104) number of neurons, stochastic variations in the spike emission of neurons are important and need to be taken into account. On the one hand, they modify the oscillations in a way that tends to promote synchronization between different modules. On the other hand, independent fluctuations on different modules tend to disrupt synchronization. The correlations between distant oscillatory modules can be described by stochastic equations for the oscillator phases that have been intensely studied in other contexts. On shorter distances, we develop a description that also takes into account amplitude modes and that quantitatively accounts for our simulation data. Stochastic dephasing of neighboring modules produces transient phase gradients and the transient appearance of phase waves. 
We propose that these stochastically-induced phase waves provide an explanative framework for the observations of traveling waves in the cortex during beta oscillations.
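The stochastic phase description invoked above can be illustrated with a minimal simulation: a chain of noisy phase oscillators with nearest-neighbour coupling, in which independent noise transiently dephases neighbouring modules and produces local phase gradients. This is a generic sketch, not the authors' model; all parameter values (coupling, noise strength, a 30 Hz natural frequency) are illustrative assumptions.

```python
import numpy as np

def simulate_phase_chain(n=20, coupling=1.0, noise=0.3, omega=2*np.pi*30,
                         dt=1e-3, steps=2000, seed=0):
    """Euler-Maruyama integration of a chain of noisy phase oscillators:
    dphi_i = [omega + K(sin(phi_{i-1}-phi_i) + sin(phi_{i+1}-phi_i))] dt + sigma dW_i.
    Returns the phase trajectory, shape (steps, n)."""
    rng = np.random.default_rng(seed)
    phi = np.zeros(n)
    traj = np.empty((steps, n))
    for t in range(steps):
        # nearest-neighbour coupling with open boundaries
        left = np.roll(phi, 1)
        left[0] = phi[0]           # no neighbour to the left of oscillator 0
        right = np.roll(phi, -1)
        right[-1] = phi[-1]        # no neighbour to the right of the last one
        drift = omega + coupling*(np.sin(left - phi) + np.sin(right - phi))
        phi = phi + drift*dt + noise*np.sqrt(dt)*rng.standard_normal(n)
        traj[t] = phi
    return traj

traj = simulate_phase_chain()
# instantaneous phase differences along the chain (transient phase gradients)
grad = np.diff(traj[-1])
```

Plotting np.sin(traj) as a space-time diagram should reveal the transient, stochastically induced phase gradients travelling along the chain.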
Affiliation(s)
- Anirudh Kulkarni
- Laboratoire de Physique de l'Ecole Normale Supérieure, CNRS, Ecole Normale Supérieure, PSL University, Sorbonne Université, Université de Paris, Paris, France; IBENS, Ecole Normale Supérieure, PSL University, CNRS, INSERM, Paris, France
- Jonas Ranft
- Laboratoire de Physique de l'Ecole Normale Supérieure, CNRS, Ecole Normale Supérieure, PSL University, Sorbonne Université, Université de Paris, Paris, France; IBENS, Ecole Normale Supérieure, PSL University, CNRS, INSERM, Paris, France
- Vincent Hakim
- Laboratoire de Physique de l'Ecole Normale Supérieure, CNRS, Ecole Normale Supérieure, PSL University, Sorbonne Université, Université de Paris, Paris, France
27
René A, Longtin A, Macke JH. Inference of a Mesoscopic Population Model from Population Spike Trains. Neural Comput 2020; 32:1448-1498. [DOI: 10.1162/neco_a_01292] [Citation(s) in RCA: 8] [Impact Index Per Article: 1.6] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/04/2022]
Abstract
Understanding how rich dynamics emerge in neural populations requires models exhibiting a wide range of behaviors while remaining interpretable in terms of connectivity and single-neuron dynamics. However, it has been challenging to fit such mechanistic spiking networks at the single-neuron scale to empirical population data. To close this gap, we propose to fit such data at a mesoscale, using a mechanistic but low-dimensional and, hence, statistically tractable model. The mesoscopic representation is obtained by approximating a population of neurons as multiple homogeneous pools of neurons and modeling the dynamics of the aggregate population activity within each pool. We derive the likelihood of both single-neuron and connectivity parameters given this activity, which can then be used to optimize parameters by gradient ascent on the log likelihood or perform Bayesian inference using Markov chain Monte Carlo (MCMC) sampling. We illustrate this approach using a model of generalized integrate-and-fire neurons for which mesoscopic dynamics have been previously derived and show that both single-neuron and connectivity parameters can be recovered from simulated data. In particular, our inference method extracts posterior correlations between model parameters, which define parameter subsets able to reproduce the data. We compute the Bayesian posterior for combinations of parameters using MCMC sampling and investigate how the approximations inherent in a mesoscopic population model affect the accuracy of the inferred single-neuron parameters.
Affiliation(s)
- Alexandre René
- Department of Physics, University of Ottawa, Ottawa K1N 6N5, Canada; Max Planck Research Group Neural Systems Analysis, Center of Advanced European Studies and Research (caesar), Bonn 53175, Germany; Institute of Neuroscience and Medicine (INM-6), Institute for Advanced Simulation (IAS-6), and JARA-Institute Brain Structure-Function Relationships (INM-10), Jülich Research Centre, Jülich 52425, Germany
- André Longtin
- Department of Physics, University of Ottawa, Ottawa K1N 6N5, Canada; Brain and Mind Research Institute, University of Ottawa, Ottawa K1H 8M5, Canada
- Jakob H. Macke
- Max Planck Research Group Neural Systems Analysis, Center of Advanced European Studies and Research (caesar), Bonn 53175, Germany; Computational Neuroengineering, Department of Electrical and Computer Engineering, Technical University of Munich, Munich 80333, Germany
28
Biophysically grounded mean-field models of neural populations under electrical stimulation. PLoS Comput Biol 2020; 16:e1007822. [PMID: 32324734 PMCID: PMC7200022 DOI: 10.1371/journal.pcbi.1007822] [Citation(s) in RCA: 31] [Impact Index Per Article: 6.2] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/30/2019] [Revised: 05/05/2020] [Accepted: 03/24/2020] [Indexed: 11/19/2022] Open
Abstract
Electrical stimulation of neural systems is a key tool for understanding neural dynamics and ultimately for developing clinical treatments. Many applications of electrical stimulation affect large populations of neurons. However, computational models of large networks of spiking neurons are inherently hard to simulate and analyze. We evaluate a reduced mean-field model of excitatory and inhibitory adaptive exponential integrate-and-fire (AdEx) neurons which can be used to efficiently study the effects of electrical stimulation on large neural populations. The rich dynamical properties of this basic cortical model are described in detail and validated using large network simulations. Bifurcation diagrams reflecting the network's state reveal asynchronous up- and down-states, bistable regimes, and oscillatory regions corresponding to fast excitation-inhibition and slow excitation-adaptation feedback loops. The biophysical parameters of the AdEx neuron can be coupled to an electric field with realistic field strengths which then can be propagated up to the population description. We show how on the edge of bifurcation, direct electrical inputs cause network state transitions, such as turning on and off oscillations of the population rate. Oscillatory input can frequency-entrain and phase-lock endogenous oscillations. Relatively weak electric field strengths on the order of 1 V/m are able to produce these effects, indicating that field effects are strongly amplified in the network. The effects of time-varying external stimulation are well-predicted by the mean-field model, further underpinning the utility of low-dimensional neural mass models.
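The mean-field model evaluated here is built from adaptive exponential integrate-and-fire (AdEx) neurons. As a point of reference, a single AdEx neuron in the standard Brette-Gerstner form can be integrated in a few lines; the stimulation below is a plain current step standing in for a field-induced input, and all parameter values are generic textbook choices rather than those of the paper.

```python
import numpy as np

def simulate_adex(I_ext, dt=1e-4, C=200e-12, gL=10e-9, EL=-70e-3,
                  VT=-50e-3, DeltaT=2e-3, a=2e-9, tau_w=200e-3,
                  b=40e-12, Vr=-58e-3, Vcut=0.0):
    """Forward-Euler integration of one AdEx neuron driven by I_ext(t) (array, A):
        C dV/dt = -gL (V-EL) + gL DeltaT exp((V-VT)/DeltaT) - w + I
        tau_w dw/dt = a (V-EL) - w
    with reset V -> Vr, w -> w + b when V crosses Vcut.
    Returns the voltage trace and spike-time indices."""
    V, w = EL, 0.0
    Vs, spikes = np.empty(len(I_ext)), []
    for i, I in enumerate(I_ext):
        dV = (-gL*(V - EL) + gL*DeltaT*np.exp((V - VT)/DeltaT) - w + I)/C
        dw = (a*(V - EL) - w)/tau_w
        V, w = V + dt*dV, w + dt*dw
        if V >= Vcut:            # spike: reset and increment adaptation
            V = Vr
            w += b
            spikes.append(i)
        Vs[i] = V
    return Vs, spikes

# 2 s step stimulation, loosely standing in for a direct electrical input
I = np.full(20000, 1.0e-9)
Vs, spikes = simulate_adex(I)
```

With roughly 1 nA of drive the model fires repetitively while the adaptation variable w accumulates and slows the rate, which is the single-cell mechanism that the population description coarse-grains.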
29
Schmidt H, Avitabile D. Bumps and oscillons in networks of spiking neurons. CHAOS (WOODBURY, N.Y.) 2020; 30:033133. [PMID: 32237760 DOI: 10.1063/1.5135579] [Citation(s) in RCA: 11] [Impact Index Per Article: 2.2] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 11/06/2019] [Accepted: 03/03/2020] [Indexed: 06/11/2023]
Abstract
We study localized patterns in an exact mean-field description of a spatially extended network of quadratic integrate-and-fire neurons. We investigate conditions for the existence and stability of localized solutions, so-called bumps, and give an analytic estimate for the region of parameter space in which these solutions exist as one or more microscopic network parameters are varied. We develop Galerkin methods for the model equations, which enable numerical bifurcation analysis of stationary and time-periodic spatially extended solutions. We study the emergence of patterns composed of multiple bumps, which are arranged in a snake-and-ladder bifurcation structure if a homogeneous or heterogeneous synaptic kernel is suitably chosen. Furthermore, we examine time-periodic, spatially localized solutions (oscillons) in the presence of external forcing, and in autonomous, recurrently coupled excitatory and inhibitory networks. In both cases, we observe period-doubling cascades leading to chaotic oscillations.
Affiliation(s)
- Helmut Schmidt
- Max Planck Institute for Human Cognitive and Brain Sciences, Stephanstrasse 1a, 04103 Leipzig, Germany
- Daniele Avitabile
- Department of Mathematics, Faculteit der Exacte Wetenschappen, Vrije Universiteit (VU University Amsterdam), De Boelelaan 1081a, 1081 HV Amsterdam, Netherlands; Mathneuro Team, Inria Sophia Antipolis, 2004 Rue des Lucioles, Sophia Antipolis, 06902 Cedex, France
30
Inferring and validating mechanistic models of neural microcircuits based on spike-train data. Nat Commun 2019; 10:4933. [PMID: 31666513 PMCID: PMC6821748 DOI: 10.1038/s41467-019-12572-0] [Citation(s) in RCA: 18] [Impact Index Per Article: 3.0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/25/2018] [Accepted: 09/18/2019] [Indexed: 01/11/2023] Open
Abstract
The interpretation of neuronal spike train recordings often relies on abstract statistical models that allow for principled parameter estimation and model selection but provide only limited insights into underlying microcircuits. In contrast, mechanistic models are useful to interpret microcircuit dynamics, but are rarely quantitatively matched to experimental data due to methodological challenges. Here we present analytical methods to efficiently fit spiking circuit models to single-trial spike trains. Using derived likelihood functions, we statistically infer the mean and variance of hidden inputs, neuronal adaptation properties and connectivity for coupled integrate-and-fire neurons. Comprehensive evaluations on synthetic data, validations using ground truth in-vitro and in-vivo recordings, and comparisons with existing techniques demonstrate that parameter estimation is very accurate and efficient, even for highly subsampled networks. Our methods bridge statistical, data-driven and theoretical, model-based neurosciences at the level of spiking circuits, for the purpose of a quantitative, mechanistic interpretation of recorded neuronal population activity. It is difficult to fit mechanistic, biophysically constrained circuit models to spike train data from in vivo extracellular recordings. Here the authors present analytical methods that enable efficient parameter estimation for integrate-and-fire circuit models and inference of the underlying connectivity structure in subsampled networks.
31
Koren V, Andrei AR, Hu M, Dragoi V, Obermayer K. Reading-out task variables as a low-dimensional reconstruction of neural spike trains in single trials. PLoS One 2019; 14:e0222649. [PMID: 31622346 PMCID: PMC6797168 DOI: 10.1371/journal.pone.0222649] [Citation(s) in RCA: 5] [Impact Index Per Article: 0.8] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/27/2019] [Accepted: 09/03/2019] [Indexed: 11/18/2022] Open
Abstract
We propose a new model of the read-out of spike trains that exploits the multivariate structure of responses of neural ensembles. Assuming the point of view of a read-out neuron that receives synaptic inputs from a population of projecting neurons, synaptic inputs are weighted with a heterogeneous set of weights. We propose that synaptic weights reflect the role of each neuron within the population for the computational task that the network has to solve. In our case, the computational task is discrimination of binary classes of stimuli, and the weights are chosen to maximize the discrimination capacity of the network. We compute synaptic weights as the feature weights of an optimal linear classifier. Once the weights have been learned, they are applied to the spike trains to compute, in real time, the post-synaptic current that modulates the spiking probability of the read-out unit. We apply the model to parallel spike trains from V1 and V4 areas in the behaving macaque (Macaca mulatta), while the animal is engaged in a visual discrimination task with binary classes of stimuli. Reading out spike trains with our model allows the two classes of stimuli to be discriminated, while the population PSTH entirely fails to do so. Splitting neurons into two subpopulations according to the sign of the weight, we show that the population signals of the two functional subnetworks are negatively correlated. Distinguishing the superficial, middle, and deep layers of the cortex, we show that in both V1 and V4 the superficial layers are the most important for discriminating binary classes of stimuli.
Affiliation(s)
- Veronika Koren
- Neural Information Processing Group, Institute of Software Engineering and Theoretical Computer Science, Technische Universität Berlin, Berlin, Germany; Bernstein Center for Computational Neuroscience Berlin, Germany
- Ariana R. Andrei
- Department of Neurobiology and Anatomy, University of Texas Medical School, Houston, Texas, United States of America
- Ming Hu
- Picower Institute for Learning and Memory, Massachusetts Institute of Technology, Cambridge, Massachusetts, United States of America
- Valentin Dragoi
- Department of Neurobiology and Anatomy, University of Texas Medical School, Houston, Texas, United States of America
- Klaus Obermayer
- Neural Information Processing Group, Institute of Software Engineering and Theoretical Computer Science, Technische Universität Berlin, Berlin, Germany; Bernstein Center for Computational Neuroscience Berlin, Germany
32
Mattia M, Biggio M, Galluzzi A, Storace M. Dimensional reduction in networks of non-Markovian spiking neurons: Equivalence of synaptic filtering and heterogeneous propagation delays. PLoS Comput Biol 2019; 15:e1007404. [PMID: 31593569 PMCID: PMC6799936 DOI: 10.1371/journal.pcbi.1007404] [Citation(s) in RCA: 8] [Impact Index Per Article: 1.3] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/02/2018] [Revised: 10/18/2019] [Accepted: 09/16/2019] [Indexed: 11/19/2022] Open
Abstract
Message passing between components of a distributed physical system is non-instantaneous and contributes to determine the time scales of the emerging collective dynamics. In biological neuron networks this is due in part to local synaptic filtering of exchanged spikes, and in part to the distribution of the axonal transmission delays. How differently these two kinds of communication protocols affect the network dynamics is still an open issue due to the difficulties in dealing with the non-Markovian nature of synaptic transmission. Here, we develop a mean-field dimensional reduction yielding an effective Markovian dynamics of the population density of the neuronal membrane potential, valid under the hypothesis of small fluctuations of the synaptic current. Within this limit, the resulting theory allows us to prove the formal equivalence between the two transmission mechanisms, holding for any synaptic time scale, integrate-and-fire neuron model, and spike-emission regime, and for different network states, even when the neuron number is finite. The equivalence holds even for larger fluctuations of the synaptic input if white-noise currents are incorporated to model other possible biological features such as ionic-channel stochasticity.
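A small numerical check makes the claimed equivalence concrete in a linear setting: filtering a rate trace with an exponential synaptic kernel is the same operation as averaging delayed copies of that trace over exponentially distributed propagation delays, because both are convolutions with the same density. The sketch below (illustrative time constants, not the paper's derivation) verifies this to numerical precision.

```python
import numpy as np

tau, dt = 0.01, 1e-4                        # synaptic time constant (s), step (s)
t = np.arange(0.0, 0.5, dt)
rate = 5.0 + 4.0*np.sin(2*np.pi*8*t)**2     # an arbitrary firing-rate trace (Hz)

# (1) synaptic filtering: convolution with a normalized exponential kernel
kernel = np.exp(-t/tau)/tau
filtered = np.convolve(rate, kernel)[:len(t)]*dt

# (2) heterogeneous delays: weighted sum of delayed copies of the rate,
#     with delays distributed with the same exponential density
weights = np.exp(-t/tau)/tau*dt             # discretized Exp(tau) delay density
delayed = np.zeros_like(t)
for d, w in enumerate(weights):
    shifted = np.concatenate([np.full(d, rate[0]), rate[:len(t) - d]])
    delayed += w*shifted

# the two traces coincide up to boundary effects that decay within a few tau
err = np.max(np.abs(filtered[2000:] - delayed[2000:]))
```

After a transient of a few time constants the two traces agree to numerical precision, reflecting the formal identity of the two communication protocols in this linear case.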
33
Schwalger T, Chizhov AV. Mind the last spike - firing rate models for mesoscopic populations of spiking neurons. Curr Opin Neurobiol 2019; 58:155-166. [PMID: 31590003 DOI: 10.1016/j.conb.2019.08.003] [Citation(s) in RCA: 24] [Impact Index Per Article: 4.0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/01/2019] [Accepted: 08/25/2019] [Indexed: 02/07/2023]
Abstract
Heuristic firing rate models are the dominant modeling framework for understanding cortical computations. Despite their success, these models fall short of capturing spike-synchronization effects, linking to biophysical parameters, and describing finite-size fluctuations. In this opinion article, we propose that the refractory density method (RDM), also known as age-structured population dynamics or quasi-renewal theory, yields a powerful theoretical framework for building rate-based models of mesoscopic neural populations from realistic neuron dynamics at the microscopic level. We review recent advances achieved by the RDM to obtain efficient population density equations for networks of generalized integrate-and-fire (GIF) neurons - a class of neuron models that has been successfully fitted to various cell types. The theory not only predicts the nonstationary dynamics of large populations of neurons but also permits an extension to finite-size populations and a systematic reduction to low-dimensional rate dynamics. The new types of rate models will allow a re-examination of models of cortical computations under biological constraints.
Affiliation(s)
- Tilo Schwalger
- Bernstein Center for Computational Neuroscience, 10115 Berlin, Germany; Institut für Mathematik, Technische Universität Berlin, 10623 Berlin, Germany
- Anton V Chizhov
- Ioffe Institute, 194021 Saint-Petersburg, Russia; Sechenov Institute of Evolutionary Physiology and Biochemistry of the Russian Academy of Sciences, 194223 Saint-Petersburg, Russia
34
Ladenbauer J, Obermayer K. Weak electric fields promote resonance in neuronal spiking activity: Analytical results from two-compartment cell and network models. PLoS Comput Biol 2019; 15:e1006974. [PMID: 31009455 PMCID: PMC6476479 DOI: 10.1371/journal.pcbi.1006974] [Citation(s) in RCA: 12] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/03/2018] [Accepted: 03/22/2019] [Indexed: 12/29/2022] Open
Abstract
Transcranial brain stimulation and evidence of ephaptic coupling have sparked strong interests in understanding the effects of weak electric fields on the dynamics of neuronal populations. While their influence on the subthreshold membrane voltage can be biophysically well explained using spatially extended neuron models, mechanistic analyses of neuronal spiking and network activity have remained a methodological challenge. More generally, this challenge applies to phenomena for which single-compartment (point) neuron models are oversimplified. Here we employ a pyramidal neuron model that comprises two compartments, allowing us to distinguish basal-somatic from apical dendritic inputs and accounting for an extracellular field in a biophysically minimalistic way. Using an analytical approach we fit its parameters to reproduce the response properties of a canonical, spatial model neuron and dissect the stochastic spiking dynamics of single cells and large networks. We show that oscillatory weak fields effectively mimic anti-correlated inputs at the soma and dendrite and strongly modulate neuronal spiking activity in a rather narrow frequency band. This effect carries over to coupled populations of pyramidal cells and inhibitory interneurons, boosting network-induced resonance in the beta and gamma frequency bands. Our work contributes a useful theoretical framework for mechanistic analyses of population dynamics going beyond point neuron models, and provides insights on modulation effects of extracellular fields due to the morphology of pyramidal cells. The elongated spatial structure of pyramidal neurons, which possess large apical dendrites, plays an important role for the integration of synaptic inputs and mediates sensitivity to weak extracellular electric fields.
Modeling studies at the population level greatly contribute to our mechanistic understanding but face a methodological challenge because morphologically detailed neuron models are too complex for use in noisy, in-vivo like conditions and large networks in particular. Here we present an analytical approach based on a two-compartment spiking neuron model that can distinguish synaptic inputs at the apical dendrite from those at the somatic region and accounts for an extracellular field in a biophysically minimalistic way. We devised efficient methods to approximate the responses of a spatially more detailed pyramidal neuron model, and to study the spiking dynamics of single neurons and sparsely coupled large networks in the presence of fluctuating inputs. Using these methods we focused on how responses are affected by oscillatory weak fields. Our results suggest that ephaptic coupling may play a mechanistic role for oscillations of population activity and indicate the potential to entrain networks by weak electric stimulation.
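The statement that an oscillatory field mimics anti-correlated somatic and dendritic inputs can already be seen in a passive caricature of a two-compartment neuron: the field drives the difference mode of the two membrane potentials, whose transfer function follows from subtracting the two compartment equations. The sketch below is a deliberately stripped-down, purely subthreshold calculation with made-up conductances; the spiking resonance reported in the paper requires the full model.

```python
import numpy as np

def difference_mode_gain(freqs, C=100e-12, gL=5e-9, gc=25e-9):
    """Amplitude of the difference mode U = Vd - Vs of a passive
    two-compartment neuron driven by anti-correlated currents +I_f (dendrite)
    and -I_f (soma), as a stand-in for an oscillatory extracellular field.
    Subtracting the compartment equations gives
        C dU/dt = -(gL + 2*gc) U + 2*I_f,
    so the frequency-domain gain is |U|/|I_f| = 2/sqrt((gL+2gc)^2 + (w C)^2)."""
    w = 2*np.pi*np.asarray(freqs, dtype=float)
    return 2.0/np.sqrt((gL + 2*gc)**2 + (w*C)**2)

gains = difference_mode_gain([1.0, 10.0, 100.0, 1000.0])
```

In this passive sketch the field-driven mode is simply low-pass (corner frequency (gL + 2 gc)/(2 pi C), about 90 Hz for these assumed values); the narrow-band spiking resonance described in the abstract emerges only once the spike mechanism and network feedback are included.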
Affiliation(s)
- Josef Ladenbauer
- Laboratoire de Neurosciences Cognitives et Computationnelles, École Normale Supérieure - PSL Research University, Paris, France
- Klaus Obermayer
- Department of Software Engineering and Theoretical Computer Science, Technische Universität Berlin, Germany; Bernstein Center for Computational Neuroscience Berlin, Germany
35
di Volo M, Romagnoni A, Capone C, Destexhe A. Biologically Realistic Mean-Field Models of Conductance-Based Networks of Spiking Neurons with Adaptation. Neural Comput 2019; 31:653-680. [DOI: 10.1162/neco_a_01173] [Citation(s) in RCA: 30] [Impact Index Per Article: 5.0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/04/2022]
Abstract
Accurate population models are needed to build very large-scale neural models, but their derivation is difficult for realistic networks of neurons, in particular when nonlinear properties are involved, such as conductance-based interactions and spike-frequency adaptation. Here, we consider such models based on networks of adaptive exponential integrate-and-fire excitatory and inhibitory neurons. Using a master equation formalism, we derive a mean-field model of such networks and compare it to the full network dynamics. The mean-field model is capable of correctly predicting the average spontaneous activity levels in asynchronous irregular regimes similar to in vivo activity. It also captures the transient temporal response of the network to complex external inputs. Finally, the mean-field model is also able to quantitatively describe regimes where high- and low-activity states alternate (up-down state dynamics), leading to slow oscillations. We conclude that such mean-field models are biologically realistic in the sense that they can capture both spontaneous and evoked activity, and they naturally appear as candidates to build very large-scale models involving multiple brain areas.
Affiliation(s)
- Matteo di Volo
- Unité de Neuroscience, Information et Complexité, CNRS FRE 3693, 91198 Gif sur Yvette, France
- Alberto Romagnoni
- Centre de Recherche sur l'inflammation UMR 1149, Inserm-Université Paris Diderot, 75018 Paris, France; Data Team, Departement d'informatique de l'Ecole normale supérieure, CNRS, PSL Research University, 75005 Paris, France; European Institute for Theoretical Neuroscience, 75012 Paris, France
- Cristiano Capone
- European Institute for Theoretical Neuroscience, 75012 Paris, France; INFN Sezione di Roma, Rome 00185, Italy
- Alain Destexhe
- Unité de Neuroscience, Information et Complexité, CNRS FRE 3693, 91198 Gif sur Yvette, France; European Institute for Theoretical Neuroscience, 75012 Paris, France
36
de Kamps M, Lepperød M, Lai YM. Computational geometry for modeling neural populations: From visualization to simulation. PLoS Comput Biol 2019; 15:e1006729. [PMID: 30830903 PMCID: PMC6417745 DOI: 10.1371/journal.pcbi.1006729] [Citation(s) in RCA: 7] [Impact Index Per Article: 1.2] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/24/2018] [Revised: 03/14/2019] [Accepted: 11/26/2018] [Indexed: 11/18/2022] Open
Abstract
The importance of a mesoscopic description level of the brain has now been well established. Rate-based models are widely used, but have limitations. Recently, several extremely efficient population-level methods have been proposed that go beyond the characterization of a population in terms of a single variable. Here, we present a method for simulating neural populations based on two-dimensional (2D) point spiking neuron models that defines the state of the population in terms of a density function over the neural state space. Our method differs from these approaches in that we do not make the diffusion approximation, nor do we reduce the state space to a single dimension (1D). We do not hard code the neural model, but read in a grid describing its state space in the relevant simulation region. Novel models can be studied without even recompiling the code. The method is highly modular: variations of the deterministic neural dynamics and the stochastic process can be investigated independently. Currently, there is a trend to reduce complex high-dimensional neuron models to 2D ones as they offer a rich dynamical repertoire that is not available in 1D, such as limit cycles. We will demonstrate that our method is ideally suited to investigate noise in such systems, replicating results obtained in the diffusion limit and generalizing them to a regime of large jumps. The joint probability density function is much more informative than 1D marginals, and we will argue that the study of 2D systems subject to noise is an important complement to the study of 1D systems.
Affiliation(s)
- Marc de Kamps
- Institute for Artificial and Biological Intelligence, University of Leeds, Leeds, West Yorkshire, United Kingdom
- Mikkel Lepperød
- Institute of Basic Medical Sciences, and Center for Integrative Neuroplasticity, University of Oslo, Oslo, Norway
- Yi Ming Lai
- Institute for Artificial and Biological Intelligence, University of Leeds, Leeds, West Yorkshire, United Kingdom; currently at the School of Mathematical Sciences, University of Nottingham, Nottingham, United Kingdom
37
Beiran M, Ostojic S. Contrasting the effects of adaptation and synaptic filtering on the timescales of dynamics in recurrent networks. PLoS Comput Biol 2019; 15:e1006893. [PMID: 30897092 PMCID: PMC6445477 DOI: 10.1371/journal.pcbi.1006893] [Citation(s) in RCA: 11] [Impact Index Per Article: 1.8] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/19/2018] [Revised: 04/02/2019] [Accepted: 02/19/2019] [Indexed: 11/19/2022] Open
Abstract
Neural activity in awake behaving animals exhibits a vast range of timescales that can be several fold larger than the membrane time constant of individual neurons. Two types of mechanisms have been proposed to explain this conundrum. One possibility is that large timescales are generated by a network mechanism based on positive feedback, but this hypothesis requires fine-tuning of the strength or structure of the synaptic connections. A second possibility is that large timescales in the neural dynamics are inherited from large timescales of underlying biophysical processes, two prominent candidates being intrinsic adaptive ionic currents and synaptic transmission. How the timescales of adaptation or synaptic transmission influence the timescale of the network dynamics has however not been fully explored. To address this question, here we analyze large networks of randomly connected excitatory and inhibitory units with additional degrees of freedom that correspond to adaptation or synaptic filtering. We determine the fixed points of the systems, their stability to perturbations and the corresponding dynamical timescales. Furthermore, we apply dynamical mean field theory to study the temporal statistics of the activity in the fluctuating regime, and examine how the adaptation and synaptic timescales transfer from individual units to the whole population. Our overarching finding is that synaptic filtering and adaptation in single neurons have very different effects at the network level. Unexpectedly, the macroscopic network dynamics do not inherit the large timescale present in adaptive currents. In contrast, the timescales of network activity increase proportionally to the time constant of the synaptic filter. 
Altogether, our study demonstrates that the timescales of different biophysical processes have different effects on the network level, so that the slow processes within individual neurons do not necessarily induce slow activity in large recurrent neural networks.
Affiliation(s)
- Manuel Beiran
- Group for Neural Theory, Laboratoire de Neurosciences Cognitives Computationnelles, Département d'Études Cognitives, École Normale Supérieure, INSERM U960, PSL University, Paris, France
- Srdjan Ostojic
- Group for Neural Theory, Laboratoire de Neurosciences Cognitives Computationnelles, Département d'Études Cognitives, École Normale Supérieure, INSERM U960, PSL University, Paris, France
38
Schmidt H, Avitabile D, Montbrió E, Roxin A. Network mechanisms underlying the role of oscillations in cognitive tasks. PLoS Comput Biol 2018; 14:e1006430. [PMID: 30188889 PMCID: PMC6143269 DOI: 10.1371/journal.pcbi.1006430] [Citation(s) in RCA: 41] [Impact Index Per Article: 5.9] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/06/2018] [Revised: 09/18/2018] [Accepted: 08/13/2018] [Indexed: 11/18/2022] Open
Abstract
Oscillatory activity robustly correlates with task demands during many cognitive tasks. However, not only are the network mechanisms underlying the generation of these rhythms poorly understood, but it is also still unknown to what extent they may play a functional role, as opposed to being a mere epiphenomenon. Here we study the mechanisms underlying the influence of oscillatory drive on network dynamics related to cognitive processing in simple working memory (WM), and memory recall tasks. Specifically, we investigate how the frequency of oscillatory input interacts with the intrinsic dynamics in networks of recurrently coupled spiking neurons to cause changes of state: the neuronal correlates of the corresponding cognitive process. We find that slow oscillations, in the delta and theta band, are effective in activating network states associated with memory recall. On the other hand, faster oscillations, in the beta range, can serve to clear memory states by resonantly driving transient bouts of spike synchrony which destabilize the activity. We leverage a recently derived set of exact mean-field equations for networks of quadratic integrate-and-fire neurons to systematically study the bifurcation structure in the periodically forced spiking network. Interestingly, we find that the oscillatory signals which are most effective in allowing flexible switching between network states are not smooth, pure sinusoids, but rather burst-like, with a sharp onset. We show that such periodic bursts themselves readily arise spontaneously in networks of excitatory and inhibitory neurons, and that the burst frequency can be tuned via changes in tonic drive. Finally, we show that oscillations in the gamma range can actually stabilize WM states which otherwise would not persist.
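The "recently derived set of exact mean-field equations" mentioned above refers to the firing-rate equations for heterogeneous quadratic integrate-and-fire networks of Montbrió, Pazó, and Roxin (Phys. Rev. X, 2015): two coupled ODEs for the population rate r and mean voltage v. A minimal forward-Euler sketch with an optional sinusoidal drive standing in for the oscillatory input studied here (all parameter values are illustrative, not those of the paper):

```python
import numpy as np

def integrate_fre(eta=-5.0, J=15.0, delta=1.0, A=3.0, f=0.0,
                  dt=1e-3, steps=20000):
    """Forward-Euler integration of the exact QIF firing-rate equations
        dr/dt = delta/pi + 2*r*v
        dv/dt = v**2 + eta + J*r + I(t) - (pi*r)**2
    with I(t) = A*sin(2*pi*f*t) for f > 0, or a constant drive A otherwise.
    Returns the population-rate trace."""
    r, v = 0.0, -2.0
    rs = np.empty(steps)
    for i in range(steps):
        drive = A*np.sin(2*np.pi*f*i*dt) if f > 0 else A
        dr = delta/np.pi + 2.0*r*v
        dv = v*v + eta + J*r + drive - (np.pi*r)**2
        r, v = r + dt*dr, v + dt*dv
        rs[i] = r
    return rs

rs = integrate_fre()               # constant drive: relaxes to a steady rate
rs_forced = integrate_fre(f=0.25)  # slow periodic drive: the rate follows it
```

Bifurcation analysis of exactly this kind of periodically forced system is what the paper uses to determine which input frequencies and waveforms switch the network between memory states.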
Affiliation(s)
- Helmut Schmidt
- Centre de Recerca Matemàtica, Campus de Bellaterra Edifici C, 08193 Bellaterra, Barcelona, Spain; Barcelona Graduate School of Mathematics, Campus de Bellaterra Edifici C, 08193 Bellaterra, Barcelona, Spain
- Daniele Avitabile
- School of Mathematical Sciences, University of Nottingham, University Park, Nottingham NG7 2QL, United Kingdom; Inria Sophia Antipolis Méditerranée Research Centre, MathNeuro Team, 2004 route des Lucioles - Boîte Postale 93, 06902 Sophia Antipolis Cedex, France
- Ernest Montbrió
- Center for Brain and Cognition, Department of Information and Communication Technologies, Universitat Pompeu Fabra, C. Ramon Trias Fargas 25-27, 08005 Barcelona, Spain
- Alex Roxin
- Centre de Recerca Matemàtica, Campus de Bellaterra Edifici C, 08193 Bellaterra, Barcelona, Spain; Barcelona Graduate School of Mathematics, Campus de Bellaterra Edifici C, 08193 Bellaterra, Barcelona, Spain
39
Devalle F, Roxin A, Montbrió E. Firing rate equations require a spike synchrony mechanism to correctly describe fast oscillations in inhibitory networks. PLoS Comput Biol 2017; 13:e1005881. [PMID: 29287081 PMCID: PMC5764488 DOI: 10.1371/journal.pcbi.1005881] [Citation(s) in RCA: 51] [Impact Index Per Article: 6.4] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/25/2017] [Revised: 01/11/2018] [Accepted: 11/15/2017] [Indexed: 12/25/2022] Open
Abstract
Recurrently coupled networks of inhibitory neurons robustly generate oscillations in the gamma band. Nonetheless, the corresponding Wilson-Cowan type firing rate equation for such an inhibitory population does not generate such oscillations without an explicit time delay. We show that this discrepancy is due to a voltage-dependent spike-synchronization mechanism inherent in networks of spiking neurons which is not captured by standard firing rate equations. Here we investigate an exact low-dimensional description for a network of heterogeneous canonical Class 1 inhibitory neurons which includes the sub-threshold dynamics crucial for generating synchronous states. In the limit of slow synaptic kinetics the spike-synchrony mechanism is suppressed and the standard Wilson-Cowan equations are formally recovered, as long as external inputs are also slow. However, even in this limit synchronous spiking can be elicited by inputs which fluctuate on the time scale of the membrane time constant of the neurons. Our mean-field equations therefore represent an extension of the standard Wilson-Cowan equations in which spike synchrony is also correctly described. Population models describing the average activity of large neuronal ensembles are a powerful mathematical tool to investigate the principles underlying the cooperative function of large neuronal systems. However, these models do not properly describe the phenomenon of spike synchrony in networks of neurons. In particular, they fail to capture the onset of synchronous oscillations in networks of inhibitory neurons. We show that this limitation is due to a voltage-dependent synchronization mechanism which is naturally present in spiking neuron models but not captured by traditional firing rate equations. Here we investigate a novel set of macroscopic equations which incorporate both firing rate and membrane potential dynamics, and which correctly generate fast inhibition-based synchronous oscillations. In the limit of slow synaptic processing, oscillations are suppressed and the model reduces to an equation formally equivalent to the Wilson-Cowan model.
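The low-dimensional description in this abstract can be illustrated by coupling the QIF firing-rate equations to a first-order synaptic variable. This is a sketch under assumed parameter values (`eta`, `delta`, `J`, and the synaptic time constants are not taken from the paper); its only point is to show where the slow-synapse limit enters, via `tau_s`.

```python
import math

# Sketch of firing-rate equations for an inhibitory QIF network with
# first-order synaptic kinetics. s is the synaptic activation; a large
# tau_s corresponds to the slow-synapse (Wilson-Cowan-like) limit
# discussed in the abstract. Parameter values are illustrative assumptions.
def inhibitory_fre(tau_s, eta=5.0, delta=0.3, J=15.0, tau=1.0,
                   dt=1e-4, T=5.0):
    r, v, s = 0.5, 0.0, 0.5
    trace = []
    for _ in range(round(T / dt)):
        dr = (delta / (math.pi * tau) + 2.0 * r * v) / tau
        dv = (v * v + eta - J * tau * s - (math.pi * tau * r) ** 2) / tau
        ds = (r - s) / tau_s
        r += dt * dr
        v += dt * dv
        s += dt * ds
        trace.append(r)
    return trace

fast = inhibitory_fre(tau_s=0.2)   # fast synaptic kinetics
slow = inhibitory_fre(tau_s=20.0)  # slow-synapse limit
```

Comparing the two rate traces at different `tau_s` values is the kind of experiment the abstract summarizes: the spike-synchrony mechanism only survives when synaptic kinetics are fast relative to the membrane time constant.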
Collapse
Affiliation(s)
- Federico Devalle
- Center for Brain and Cognition, Department of Information and Communication Technologies, Universitat Pompeu Fabra, Barcelona, Spain
- Department of Physics, Lancaster University, Lancaster, United Kingdom
| | - Alex Roxin
- Centre de Recerca Matemàtica, Campus de Bellaterra, Edifici C, Bellaterra, Barcelona, Spain
| | - Ernest Montbrió
- Center for Brain and Cognition, Department of Information and Communication Technologies, Universitat Pompeu Fabra, Barcelona, Spain
| |
Collapse
|
40
|
Schwalger T, Deger M, Gerstner W. Towards a theory of cortical columns: From spiking neurons to interacting neural populations of finite size. PLoS Comput Biol 2017; 13:e1005507. [PMID: 28422957 PMCID: PMC5415267 DOI: 10.1371/journal.pcbi.1005507] [Citation(s) in RCA: 72] [Impact Index Per Article: 9.0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/04/2016] [Revised: 05/03/2017] [Accepted: 04/07/2017] [Indexed: 11/22/2022] Open
Abstract
Neural population equations such as neural mass or field models are widely used to study brain activity on a large scale. However, the relation of these models to the properties of single neurons is unclear. Here we derive an equation for several interacting populations at the mesoscopic scale starting from a microscopic model of randomly connected generalized integrate-and-fire neuron models. Each population consists of 50–2000 neurons of the same type but different populations account for different neuron types. The stochastic population equations that we find reveal how spike-history effects in single-neuron dynamics such as refractoriness and adaptation interact with finite-size fluctuations on the population level. Efficient integration of the stochastic mesoscopic equations reproduces the statistical behavior of the population activities obtained from microscopic simulations of a full spiking neural network model. The theory describes nonlinear emergent dynamics such as finite-size-induced stochastic transitions in multistable networks and synchronization in balanced networks of excitatory and inhibitory neurons. The mesoscopic equations are employed to rapidly integrate a model of a cortical microcircuit consisting of eight neuron types, which allows us to predict spontaneous population activities as well as evoked responses to thalamic input. Our theory establishes a general framework for modeling finite-size neural population dynamics based on single cell and synapse parameters and offers an efficient approach to analyzing cortical circuits and computations. Understanding the brain requires mathematical models on different spatial scales. On the "microscopic" level of nerve cells, neural spike trains can be well predicted by phenomenological spiking neuron models. On a coarse scale, neural activity can be modeled by phenomenological equations that summarize the total activity of many thousands of neurons. Such population models are widely used to model neuroimaging data such as EEG, MEG or fMRI. However, it is largely unknown how large-scale models are connected to an underlying microscale model. Linking the scales is vital for a correct description of rapid changes and fluctuations of the population activity, and is crucial for multiscale brain models. The challenge is to treat realistic spiking dynamics as well as fluctuations arising from the finite number of neurons. We obtained such a link by deriving stochastic population equations on the mesoscopic scale of 100–1000 neurons from an underlying microscopic model. These equations can be efficiently integrated and reproduce results of a microscopic simulation while achieving a high speed-up factor. We expect that our novel population theory on the mesoscopic scale will be instrumental for understanding experimental data on information processing in the brain, and ultimately link microscopic and macroscopic activity patterns.
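The mesoscopic idea of tracking the spike count of a finite population rather than every individual neuron can be caricatured in a few lines. This is a deliberately crude sketch, not the model of the paper: the threshold-linear hazard, the leaky drive variable, and all parameters are assumptions, and the spike-history effects (refractoriness, adaptation) that are central to the actual theory are omitted.

```python
import math
import random

# Crude sketch of a finite-size population simulation: per time step, each of
# the N neurons spikes independently with a probability set by a shared
# hazard rate, so the population activity A(t) fluctuates around its
# mean-field value. All names and parameters are illustrative assumptions.
def mesoscopic_activity(N=500, dt=1e-3, T=2.0, tau=0.02, mu=1.2,
                        gain=5.0, seed=0):
    rng = random.Random(seed)
    h = 0.0                                   # mean drive of the population
    trace = []                                # A(t) = n_spikes / (N * dt)
    for _ in range(round(T / dt)):
        rate = gain * max(h, 0.0)             # simple threshold-linear hazard
        p = 1.0 - math.exp(-rate * dt)        # per-neuron spike probability
        n_spikes = sum(rng.random() < p for _ in range(N))
        trace.append(n_spikes / (N * dt))
        h += dt * (mu - h) / tau              # leaky integration of the drive
    return trace

activity = mesoscopic_activity()
mean_activity = sum(activity) / len(activity)
```

Because the spike count is drawn for the whole population at once, the cost per step is independent of how finely single-neuron dynamics would otherwise have to be resolved, which is the source of the speed-up the abstract mentions; shrinking `N` makes the finite-size fluctuations around the mean-field rate visibly larger.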
Collapse
|