1. Di Geronimo C, Destexhe A, Di Volo M. Biologically realistic mean field model of spiking neural networks with fast and slow inhibitory synapses. J Comput Neurosci 2025. PMID: 40266459. DOI: 10.1007/s10827-025-00904-7.
Abstract
We present a mean field model for a spiking neural network of excitatory and inhibitory neurons with fast GABAA and nonlinear slow GABAB inhibitory conductance-based synapses. This mean field model can predict the spontaneous and evoked response of the network to external stimulation in asynchronous irregular regimes. The model displays theta oscillations for sufficiently strong GABAB conductance. Optogenetic activation of interneurons and an increase of GABAB conductance caused opposite effects on the emergence of gamma oscillations in the model. In agreement with direct numerical simulations of neural networks and with experimental data, the mean field model predicts that an increase of GABAB conductance reduces gamma oscillations. Furthermore, the slow dynamics of GABAB synapses regulate the appearance and duration of transient gamma oscillations, namely gamma bursts, in the mean field model. Finally, we show that nonlinear GABAB synapses play a major role in stabilizing the network against the emergence of epileptic seizures.
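For orientation, a minimal sketch of the conductance-based synaptic formalism at play (a generic fast and a slow inhibitory channel on a common membrane equation; the specific nonlinear GABAB activation used in the paper is not reproduced here):

```latex
C \frac{dV}{dt} = g_L (E_L - V) + g_E(t)\,(E_E - V)
  + g_A(t)\,(E_{\mathrm{GABA_A}} - V) + g_B(t)\,(E_{\mathrm{GABA_B}} - V),
\qquad
\tau_A \frac{dg_A}{dt} = -g_A, \quad \tau_B \frac{dg_B}{dt} = -g_B, \quad \tau_B \gg \tau_A,
```

with presynaptic inhibitory spikes incrementing the fast conductance g_A directly and driving the slow conductance g_B through the nonlinear activation studied in the paper.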
Affiliation(s)
- Claudio Di Geronimo
- Université Claude Bernard Lyon 1, Institut National de la Santé et de la Recherche Médicale, Stem Cell and Brain Research Institute U1208, Bron, France
- Dipartimento di Fisica, Università di Firenze, Via G. Sansone 1, I-50019, Sesto Fiorentino (FI), Italy
- Alain Destexhe
- CNRS, Institute of Neuroscience (NeuroPSI), Paris-Saclay University, Saclay, France
- Matteo Di Volo
- Université Claude Bernard Lyon 1, Institut National de la Santé et de la Recherche Médicale, Stem Cell and Brain Research Institute U1208, Bron, France
2. Paliwal S, Ocker GK, Brinkman BAW. Metastability in networks of nonlinear stochastic integrate-and-fire neurons. arXiv preprint, 2024. arXiv:2406.07445v2. PMID: 38947936. PMCID: PMC11213153.
Abstract
Neurons in the brain continuously process the barrage of sensory inputs they receive from the environment. A wide array of experimental work has shown that the collective activity of neural populations encodes and processes this constant bombardment of information. How these collective patterns of activity depend on single-neuron properties is often unclear. Single-neuron recordings have shown that individual neurons' responses to inputs are nonlinear, which prevents a straightforward extrapolation from single-neuron features to emergent collective states. Here, we use a field-theoretic formulation of a stochastic leaky integrate-and-fire model to study the impact of single-neuron nonlinearities on macroscopic network activity. In this model, a neuron integrates spiking output from other neurons in its membrane voltage and emits spikes stochastically with an intensity depending on the membrane voltage, after which the voltage resets. We show that the interplay between nonlinear spike intensity functions and membrane potential resets can (i) give rise to metastable active firing rate states in recurrent networks and (ii) enhance or suppress mean firing rates and membrane potentials in the same or paradoxically opposite directions.
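A minimal simulation sketch of this model class: voltages integrate recurrent spikes, spikes are emitted stochastically with a voltage-dependent intensity, and the voltage resets after each spike. The softplus intensity, Gaussian coupling matrix, and all parameter values are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
N, T, dt = 200, 2000, 0.1           # neurons, time steps, step size (a.u.)
tau, v_reset = 10.0, 0.0            # membrane time constant, reset voltage
J = rng.normal(0, 1 / np.sqrt(N), size=(N, N))  # random coupling (assumed)

def phi(v):
    """Nonlinear spike intensity; softplus chosen as an example."""
    return np.log1p(np.exp(v))

v = np.zeros(N)
rates = []
for _ in range(T):
    spikes = rng.random(N) < phi(v) * dt             # stochastic spike emission
    v += dt * (-v / tau) + J @ spikes.astype(float)  # leak + recurrent input
    v[spikes] = v_reset                              # reset after spiking
    rates.append(spikes.mean() / dt)

print("mean population rate:", np.mean(rates))
```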
Affiliation(s)
- Siddharth Paliwal
- Graduate Program in Neuroscience, Stony Brook University, Stony Brook, NY, 11794, USA
- Gabriel Koch Ocker
- Department of Mathematics and Statistics, Boston University, Boston, MA, 02215, USA
- Braden A. W. Brinkman
- Department of Neurobiology and Behavior, Stony Brook University, Stony Brook, NY, 11794, USA
3. Qi Y. Moment neural network and an efficient numerical method for modeling irregular spiking activity. Phys Rev E 2024;110:024310. PMID: 39295055. DOI: 10.1103/physreve.110.024310.
Abstract
Continuous rate-based neural networks have been widely applied to modeling the dynamics of cortical circuits. However, cortical neurons in the brain exhibit irregular spiking activity with complex correlation structures that cannot be captured by the mean firing rate alone. To close this gap, we consider a framework for modeling irregular spiking activity, called the moment neural network, which naturally generalizes rate models to second-order moments and can accurately capture the firing statistics of spiking neural networks. We propose an efficient numerical method that allows for rapid evaluation of moment mappings for neuronal activations without solving the underlying Fokker-Planck equation. This allows simulation of the coupled interactions of mean firing rate and firing variability in large-scale neural circuits while retaining the analytical tractability of continuous rate models. We demonstrate how the moment neural network can explain a range of phenomena, including diverse Fano factors in networks with quenched disorder and the emergence of irregular oscillatory dynamics in excitation-inhibition networks with delay.
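A schematic of the moment-mapping idea: each layer propagates a (mean, variance) pair rather than a mean rate alone. The paper derives closed-form moment maps for spiking neurons without solving the Fokker-Planck equation; the Gauss-Hermite stand-in below only illustrates the data flow and is not the paper's method.

```python
import numpy as np

def moment_activation(mu, var, f=np.tanh, n=64):
    """Push a Gaussian N(mu, var) through f; return output mean and variance."""
    x, w = np.polynomial.hermite_e.hermegauss(n)   # probabilists' Hermite nodes
    samples = f(mu + np.sqrt(var) * x)
    m = (w * samples).sum() / w.sum()
    v = (w * (samples - m) ** 2).sum() / w.sum()
    return m, v

W = 0.8                      # illustrative synaptic gain
mu, var = 0.2, 0.05          # input statistics
for layer in range(3):
    mu, var = moment_activation(W * mu, W ** 2 * var)
    print(f"layer {layer}: mean = {mu:.3f}, variance = {var:.3f}")
```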
Affiliation(s)
- Yang Qi
- Institute of Science and Technology for Brain-Inspired Intelligence, Fudan University, Shanghai 200433, China
- Key Laboratory of Computational Neuroscience and Brain-Inspired Intelligence (Fudan University), Ministry of Education, Shanghai 200433, China
- MOE Frontiers Center for Brain Science, Fudan University, Shanghai 200433, China
4. Painchaud V, Desrosiers P, Doyon N. The Determining Role of Covariances in Large Networks of Stochastic Neurons. Neural Comput 2024;36:1121-1162. PMID: 38657971. DOI: 10.1162/neco_a_01656.
Abstract
Biological neural networks are notoriously hard to model due to their stochastic behavior and high dimensionality. We tackle this problem by constructing a dynamical model of both the expectations and covariances of the fractions of active and refractory neurons in the network's populations. We do so by describing the evolution of the states of individual neurons with a continuous-time Markov chain, from which we formally derive a low-dimensional dynamical system. This is done by solving a moment closure problem in a way that is compatible with the nonlinearity and boundedness of the activation function. Our dynamical system captures the behavior of the high-dimensional stochastic model even in cases where the mean-field approximation fails to do so. Taking into account the second-order moments modifies the solutions that would be obtained with the mean-field approximation and can lead to the appearance or disappearance of fixed points and limit cycles. We moreover perform numerical experiments where the mean-field approximation leads to periodically oscillating solutions, while the solutions of the second-order model can be interpreted as an average taken over many realizations of the stochastic model. Altogether, our results highlight the importance of including higher moments when studying stochastic networks and deepen our understanding of correlated neuronal activity.
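To illustrate the general shape of such second-order descriptions, a standard Gaussian moment closure for a generic stochastic system $dx = f(x)\,dt + \sigma\,dW_t$ (not the article's specific refractory-state equations) evolves the mean $\mu$ and covariance $\Sigma$ jointly:

```latex
\frac{d\mu_i}{dt} = f_i(\mu) + \frac{1}{2} \sum_{j,k} \partial_j \partial_k f_i(\mu)\, \Sigma_{jk},
\qquad
\frac{d\Sigma}{dt} = A(\mu)\,\Sigma + \Sigma\,A(\mu)^{\top} + \sigma \sigma^{\top},
\quad A(\mu) = \nabla f(\mu).
```

The covariance feeds back into the mean equation, which is how second-order terms can create or destroy fixed points and limit cycles relative to the first-order mean-field model.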
Affiliation(s)
- Vincent Painchaud
- Department of Mathematics and Statistics, McGill University, Montreal, Québec H3A 0B6, Canada
- Patrick Desrosiers
- Department of Physics, Engineering Physics, and Optics, Université Laval, Quebec City, Québec G1V 0A6, Canada
- CERVO Brain Research Center, Quebec City, Québec G1E 1T2, Canada
- Centre interdisciplinaire en modélisation mathématique de l'Université Laval, Quebec City, Québec G1V 0A6, Canada
- Nicolas Doyon
- Department of Mathematics and Statistics, Université Laval, Quebec City, Québec G1V 0A6, Canada
- CERVO Brain Research Center, Quebec City, Québec G1E 1T2, Canada
- Centre interdisciplinaire en modélisation mathématique de l'Université Laval, Quebec City, Québec G1V 0A6, Canada
5. Ma H, Qi Y, Gong P, Zhang J, Lu WL, Feng J. Self-Organization of Nonlinearly Coupled Neural Fluctuations Into Synergistic Population Codes. Neural Comput 2023;35:1820-1849. PMID: 37725705. DOI: 10.1162/neco_a_01612.
Abstract
Neural activity in the brain exhibits correlated fluctuations that may strongly influence the properties of neural population coding. However, how such correlated neural fluctuations may arise from the intrinsic neural circuit dynamics and subsequently affect the computational properties of neural population activity remains poorly understood. The main difficulty lies in resolving the nonlinear coupling between correlated fluctuations and the overall dynamics of the system. In this study, we investigate the emergence of synergistic neural population codes from the intrinsic dynamics of correlated neural fluctuations in a neural circuit model capturing realistic nonlinear noise coupling of spiking neurons. We show that a rich repertoire of spatial correlation patterns naturally emerges in a bump attractor network, and we reveal the dynamical regime under which the interplay between differential and noise correlations leads to synergistic codes. Moreover, we find that negative correlations may induce stable bound states between two bumps, a phenomenon previously unobserved in firing rate models. These noise-induced effects of bump attractors lead to a number of computational advantages, including enhanced working memory capacity and efficient spatiotemporal multiplexing, and can account for a range of cognitive and behavioral phenomena related to working memory. This study offers a dynamical approach to investigating realistic correlated neural fluctuations, as well as insights into their roles in cortical computations.
Affiliation(s)
- Hengyuan Ma
- Institute of Science and Technology for Brain-Inspired Intelligence, Fudan University, Shanghai 200433, China
- Yang Qi
- Institute of Science and Technology for Brain-Inspired Intelligence, Fudan University, Shanghai 200433, China
- Key Laboratory of Computational Neuroscience and Brain-Inspired Intelligence, Fudan University, Ministry of Education, Shanghai 200433, China
- Pulin Gong
- School of Physics, University of Sydney, Sydney, NSW 2006, Australia
- Jie Zhang
- Institute of Science and Technology for Brain-Inspired Intelligence, Fudan University, Shanghai 200433, China
- Key Laboratory of Computational Neuroscience and Brain-Inspired Intelligence, Fudan University, Ministry of Education, Shanghai 200433, China
- Wen-Lian Lu
- Institute of Science and Technology for Brain-Inspired Intelligence, Fudan University, Shanghai 200433, China
- Key Laboratory of Computational Neuroscience and Brain-Inspired Intelligence, Fudan University, Ministry of Education, Shanghai 200433, China
- Jianfeng Feng
- Institute of Science and Technology for Brain-Inspired Intelligence, Fudan University, Shanghai 200433, China
- Key Laboratory of Computational Neuroscience and Brain-Inspired Intelligence, Fudan University, Ministry of Education, Shanghai 200433, China
- Department of Computer Science, University of Warwick, Coventry, CV4 7AL, U.K.
6. Cimeša L, Ciric L, Ostojic S. Geometry of population activity in spiking networks with low-rank structure. PLoS Comput Biol 2023;19:e1011315. PMID: 37549194. PMCID: PMC10461857. DOI: 10.1371/journal.pcbi.1011315.
Abstract
Recurrent network models are instrumental in investigating how behaviorally relevant computations emerge from collective neural dynamics. A recently developed class of models based on low-rank connectivity provides an analytically tractable framework for understanding how connectivity structure determines the geometry of low-dimensional dynamics and the ensuing computations. Such models, however, lack some fundamental biological constraints; in particular, they represent individual neurons as abstract units that communicate through continuous firing rates rather than discrete action potentials. Here we examine how far the theoretical insights obtained from low-rank rate networks transfer to more biologically plausible networks of spiking neurons. Adding a low-rank structure on top of random excitatory-inhibitory connectivity, we systematically compare the geometry of activity in networks of integrate-and-fire neurons to rate networks with statistically equivalent low-rank connectivity. We show that the mean-field predictions of rate networks allow us to identify low-dimensional dynamics at constant population-average activity in spiking networks, as well as novel nonlinear regimes of activity such as out-of-phase oscillations and slow manifolds. We finally exploit these results to directly build spiking networks that perform nonlinear computations.
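A sketch of the connectivity construction described here: a rank-one structure added on top of a sparse random excitatory-inhibitory matrix. Population sizes, sparsity, and weights are illustrative, not the paper's values.

```python
import numpy as np

rng = np.random.default_rng(1)
N = 1000
n_exc = int(0.8 * N)                 # 80% excitatory neurons (assumed)

# Sparse random EI background: sign of each column set by presynaptic type
p, J_E, J_I = 0.1, 0.04, -0.2
mask = rng.random((N, N)) < p
base = np.where(np.arange(N) < n_exc, J_E, J_I)
J_random = mask * base[None, :]

# Rank-one structure on top, with Gaussian loading vectors m and n
m = rng.normal(0, 1, N)
n = rng.normal(0, 1, N)
J = J_random + np.outer(m, n) / N

# Low-rank theory tracks a latent variable kappa ~ n . r / N, whose
# self-coupling is set by the overlap of the loading vectors:
print("overlap n.m / N =", n @ m / N)
```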
Affiliation(s)
- Ljubica Cimeša
- Laboratoire de Neurosciences Cognitives Computationnelles, Département d’Études Cognitives, École Normale Supérieure, INSERM U960, PSL University, Paris, France
- Lazar Ciric
- Laboratoire de Neurosciences Cognitives Computationnelles, Département d’Études Cognitives, École Normale Supérieure, INSERM U960, PSL University, Paris, France
- Srdjan Ostojic
- Laboratoire de Neurosciences Cognitives Computationnelles, Département d’Études Cognitives, École Normale Supérieure, INSERM U960, PSL University, Paris, France
7. Ekelmans P, Kraynyukovas N, Tchumatchenko T. Targeting operational regimes of interest in recurrent neural networks. PLoS Comput Biol 2023;19:e1011097. PMID: 37186668. DOI: 10.1371/journal.pcbi.1011097.
Abstract
Neural computations emerge from local recurrent neural circuits or computational units such as cortical columns that comprise hundreds to a few thousand neurons. Continuous progress in connectomics, electrophysiology, and calcium imaging requires tractable spiking network models that can consistently incorporate new information about the network structure and reproduce the recorded neural activity features. However, for spiking networks, it is challenging to predict which connectivity configurations and neural properties can generate fundamental operational states and specific experimentally reported nonlinear cortical computations. Theoretical descriptions of the computational state of cortical spiking circuits are diverse, including the balanced state, where excitatory and inhibitory inputs balance almost perfectly, or the inhibition-stabilized network (ISN) state, where the excitatory part of the circuit is unstable. It remains an open question whether these states can co-exist with experimentally reported nonlinear computations and whether they can be recovered in biologically realistic implementations of spiking networks. Here, we show how to identify spiking network connectivity patterns underlying diverse nonlinear computations such as XOR, bistability, inhibitory stabilization, supersaturation, and persistent activity. We establish a mapping between the stabilized supralinear network (SSN) and spiking activity which allows us to pinpoint the location in parameter space where these activity regimes occur. Notably, we find that biologically sized spiking networks can have irregular asynchronous activity that does not require strong excitation-inhibition balance or large feedforward input, and we show that the dynamic firing rate trajectories in spiking networks can be precisely targeted without error-driven training algorithms.
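For reference, a minimal sketch of the stabilized supralinear network (SSN) onto which the spiking activity is mapped: two populations with a supralinear power-law transfer function. The parameter values below are illustrative SSN-style numbers, not those fitted in the paper.

```python
import numpy as np

k, n_exp = 0.04, 2.0                      # gain and exponent of the power law
W = np.array([[1.25, -0.65],              # W[a, b]: weight from b onto a
              [1.20, -0.50]])
h = np.array([10.0, 10.0])                # feedforward input
tau = np.array([20.0, 10.0])              # E and I time constants (ms)

r, dt = np.zeros(2), 0.1
for _ in range(20000):                    # integrate to steady state
    u = W @ r + h
    r += dt / tau * (-r + k * np.clip(u, 0.0, None) ** n_exp)

print("steady-state rates (E, I):", r)
```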
Affiliation(s)
- Pierre Ekelmans
- Theory of Neural Dynamics group, Max Planck Institute for Brain Research, Frankfurt am Main, Germany
- Frankfurt Institute for Advanced Studies, Frankfurt am Main, Germany
- Nataliya Kraynyukovas
- Theory of Neural Dynamics group, Max Planck Institute for Brain Research, Frankfurt am Main, Germany
- Institute of Experimental Epileptology and Cognition Research, Life and Brain Center, Universitätsklinikum Bonn, Bonn, Germany
- Tatjana Tchumatchenko
- Theory of Neural Dynamics group, Max Planck Institute for Brain Research, Frankfurt am Main, Germany
- Institute of Experimental Epileptology and Cognition Research, Life and Brain Center, Universitätsklinikum Bonn, Bonn, Germany
- Institute of Physiological Chemistry, Medical Center of the Johannes Gutenberg University Mainz, Mainz, Germany
8. Clusella P, Köksal-Ersöz E, Garcia-Ojalvo J, Ruffini G. Comparison between an exact and a heuristic neural mass model with second-order synapses. Biol Cybern 2023;117:5-19. PMID: 36454267. PMCID: PMC10160168. DOI: 10.1007/s00422-022-00952-7.
Abstract
Neural mass models (NMMs) are designed to reproduce the collective dynamics of neuronal populations. A common framework for NMMs assumes heuristically that the output firing rate of a neural population can be described by a static nonlinear transfer function (NMM1). However, a recent exact mean-field theory for quadratic integrate-and-fire (QIF) neurons challenges this view by showing that the mean firing rate is not a static function of the neuronal state but follows two coupled nonlinear differential equations (NMM2). Here we analyze and compare these two descriptions in the presence of second-order synaptic dynamics. First, we derive the mathematical equivalence between the two models in the infinitely slow synapse limit, i.e., we show that NMM1 is an approximation of NMM2 in this regime. Next, we evaluate the applicability of this limit in the context of realistic physiological parameter values by analyzing the dynamics of models with inhibitory or excitatory synapses. We show that NMM1 fails to reproduce important dynamical features of the exact model, such as the self-sustained oscillations of an inhibitory interneuron QIF network. Furthermore, in the exact model, but not in its slow-synapse approximation, stimulation of a pyramidal cell population induces resonant oscillatory activity whose peak frequency and amplitude increase with the self-coupling gain and the external excitatory input. This may play a role in the enhanced response of densely connected networks to weak uniform inputs, such as the electric fields produced by noninvasive brain stimulation.
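For reference, the exact mean-field theory referred to here (derived by Montbrió, Pazó, and Roxin for QIF neurons with Lorentzian-distributed excitabilities, center $\bar\eta$ and half-width $\Delta$) reduces the population to a firing rate $r$ and mean membrane potential $v$:

```latex
\tau \dot{r} = \frac{\Delta}{\pi \tau} + 2 r v,
\qquad
\tau \dot{v} = v^2 + \bar{\eta} + I(t) + J \tau s - (\pi \tau r)^2,
```

where the synaptic drive $s$ equals $r$ for instantaneous coupling and is replaced by a second-order linear filter of $r$ in the setting analyzed here.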
Affiliation(s)
- Pau Clusella
- Department of Medicine and Life Sciences, Universitat Pompeu Fabra, Barcelona Biomedical Research Park, 08003, Barcelona, Spain.
- Elif Köksal-Ersöz
- LTSI - UMR 1099, INSERM, Univ Rennes, Campus Beaulieu, 35000, Rennes, France
- Jordi Garcia-Ojalvo
- Department of Medicine and Life Sciences, Universitat Pompeu Fabra, Barcelona Biomedical Research Park, 08003, Barcelona, Spain
- Giulio Ruffini
- Brain Modeling Department, Neuroelectrics, Av. Tibidabo, 47b, 08035, Barcelona, Spain
9. Shi YL, Zeraati R, Levina A, Engel TA. Spatial and temporal correlations in neural networks with structured connectivity. Phys Rev Res 2023;5:013005. PMID: 38938692. PMCID: PMC11210526. DOI: 10.1103/physrevresearch.5.013005.
Abstract
Correlated fluctuations in the activity of neural populations reflect the network's dynamics and connectivity. The temporal and spatial dimensions of neural correlations are interdependent. However, prior theoretical work mainly analyzed correlations in either spatial or temporal domains, oblivious to their interplay. We show that the network dynamics and connectivity jointly define the spatiotemporal profile of neural correlations. We derive analytical expressions for pairwise correlations in networks of binary units with spatially arranged connectivity in one and two dimensions. We find that spatial interactions among units generate multiple timescales in auto- and cross-correlations. Each timescale is associated with fluctuations at a particular spatial frequency, making a hierarchical contribution to the correlations. External inputs can modulate the correlation timescales when spatial interactions are nonlinear, and the modulation effect depends on the operating regime of network dynamics. These theoretical results open new ways to relate connectivity and dynamics in cortical networks via measurements of spatiotemporal neural correlations.
Affiliation(s)
- Yan-Liang Shi
- Cold Spring Harbor Laboratory, Cold Spring Harbor, New York, USA
- Roxana Zeraati
- International Max Planck Research School for the Mechanisms of Mental Function and Dysfunction, University of Tübingen, Tübingen, Germany
- Max Planck Institute for Biological Cybernetics, Tübingen, Germany
- Anna Levina
- Max Planck Institute for Biological Cybernetics, Tübingen, Germany
- Department of Computer Science, University of Tübingen, Tübingen, Germany
- Bernstein Center for Computational Neuroscience Tübingen, Tübingen, Germany
- Tatiana A Engel
- Cold Spring Harbor Laboratory, Cold Spring Harbor, New York, USA
10. Roberts PD, Conour J. Mechanistic modeling as an explanatory tool for clinical treatment of chronic catatonia. Front Pharmacol 2022;13:1025417. PMID: 36438845. PMCID: PMC9682077. DOI: 10.3389/fphar.2022.1025417.
Abstract
Mathematical modeling of neural systems is an effective means to integrate complex information about the brain into a numerical tool that can help explain observations. However, the use of neural models to inform clinical decisions has been limited. In this study, we use a simple model of brain circuitry, the Wilson-Cowan model, to predict changes in a clinical measure for catatonia, the Bush-Francis Catatonia Rating Scale, for use in the clinical treatment of schizophrenia. This computational tool can then be used to better understand mechanisms of action of pharmaceutical treatments, and to fine-tune dosage in individual cases. We present the conditions of clinical care for a residential patient cohort, and describe methods for synthesizing data to demonstrate the functioning of the model. We then show that the model can be used to explain effect sizes of treatments and to estimate outcomes for combinations of medications. We conclude with a demonstration of how this model could be personalized for individual patients to inform ongoing treatment protocols.
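A minimal sketch of the Wilson-Cowan backbone of such a tool. The mapping from (E, I) activity to the Bush-Francis scale and the medication-effect parameters are specific to the paper and not reproduced; the drives P and Q below are illustrative knobs of the kind one could tie to dosage.

```python
import numpy as np

def S(x, a=1.0, theta=4.0):
    """Sigmoid response function."""
    return 1.0 / (1.0 + np.exp(-a * (x - theta)))

def wilson_cowan(P=1.0, Q=0.5, T=500.0, dt=0.05):
    """Integrate the E-I equations to steady state; parameters illustrative."""
    wEE, wEI, wIE, wII = 16.0, 12.0, 15.0, 3.0
    tauE, tauI = 8.0, 8.0
    E = I = 0.1
    for _ in range(int(T / dt)):
        dE = (-E + S(wEE * E - wEI * I + P)) / tauE
        dI = (-I + S(wIE * E - wII * I + Q)) / tauI
        E, I = E + dt * dE, I + dt * dI
    return E, I

for P in (0.5, 1.0, 2.0):       # e.g., scan an excitatory drive parameter
    E, I = wilson_cowan(P=P)
    print(f"P = {P}: steady-state E = {E:.3f}, I = {I:.3f}")
```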
Affiliation(s)
- Patrick D. Roberts
- Amazon Web Services, Portland, OR, United States
- James Conour
- Cascadia Behavioral Healthcare, Portland, OR, United States
11. Gast R, Knösche TR, Schmidt H. Mean-field approximations of networks of spiking neurons with short-term synaptic plasticity. Phys Rev E 2021;104:044310. PMID: 34781468. DOI: 10.1103/physreve.104.044310.
Abstract
Low-dimensional descriptions of spiking neural network dynamics are an effective tool for bridging different scales of organization of brain structure and function. Recent advances in deriving mean-field descriptions for networks of coupled oscillators have sparked the development of a new generation of neural mass models. Of notable interest are mean-field descriptions of all-to-all coupled quadratic integrate-and-fire (QIF) neurons, which have already seen numerous extensions and applications. These extensions include different forms of short-term adaptation considered to play an important role in generating and sustaining dynamic regimes of interest in the brain. It is an open question, however, whether the incorporation of presynaptic forms of synaptic plasticity driven by single neuron activity would still permit the derivation of mean-field equations using the same method. Here we discuss this problem using an established model of short-term synaptic plasticity at the single neuron level, for which we present two different approaches for the derivation of the mean-field equations. We compare these models with a recently proposed mean-field approximation that assumes stochastic spike timings. In general, the latter fails to accurately reproduce the macroscopic activity in networks of deterministic QIF neurons with distributed parameters. We show that the mean-field models we propose provide a more accurate description of the network dynamics, although they are mathematically more involved. Using bifurcation analysis, we find that QIF networks with presynaptic short-term plasticity can express regimes of periodic bursting activity as well as bistable regimes. Together, we provide novel insight into the macroscopic effects of short-term synaptic plasticity in spiking neural networks, as well as two different mean-field descriptions for future investigations of such networks.
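The established single-neuron model referred to here is, in its common Tsodyks-Markram form, a pair of presynaptic variables: available resources $x$ (depression) and utilization $u$ (facilitation), which jointly scale the effective synaptic weight as $J \to J\,u\,x$ and are driven by the presynaptic spike train:

```latex
\dot{x} = \frac{1 - x}{\tau_d} - u\,x \sum_k \delta(t - t_k),
\qquad
\dot{u} = \frac{U - u}{\tau_f} + U\,(1 - u) \sum_k \delta(t - t_k).
```

Deriving mean-field equations then requires handling the correlations between these presynaptic variables and the single-neuron spiking, which is the crux of the problem discussed above.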
Affiliation(s)
- Richard Gast
- Max Planck Institute for Human Cognitive and Brain Sciences, Leipzig, Germany
- Thomas R Knösche
- Max Planck Institute for Human Cognitive and Brain Sciences, Leipzig, Germany
- Helmut Schmidt
- Max Planck Institute for Human Cognitive and Brain Sciences, Leipzig, Germany
12. Critical behaviour of the stochastic Wilson-Cowan model. PLoS Comput Biol 2021;17:e1008884. PMID: 34460811. PMCID: PMC8432901. DOI: 10.1371/journal.pcbi.1008884.
Abstract
Spontaneous brain activity is characterized by bursts and avalanche-like dynamics, with scale-free features typical of critical behaviour. The stochastic version of the celebrated Wilson-Cowan model has been widely studied as a system of spiking neurons reproducing non-trivial features of the neural activity, from avalanche dynamics to oscillatory behaviours. However, to what extent such phenomena are related to the presence of a genuine critical point remains elusive. Here we address this central issue, providing analytical results in the linear approximation and extensive numerical analysis. In particular, we present results supporting the existence of a bona fide critical point, where a second-order-like phase transition occurs, characterized by scale-free avalanche dynamics, scaling with the system size and a diverging relaxation time-scale. Moreover, our study shows that the observed critical behaviour falls within the universality class of the mean-field branching process, where the exponents of the avalanche size and duration distributions are, respectively, 3/2 and 2. We also provide an accurate analysis of the system behaviour as a function of the total number of neurons, focusing on the time correlation functions of the firing rate in a wide range of the parameter space.
13. Pietras B, Gallice N, Schwalger T. Low-dimensional firing-rate dynamics for populations of renewal-type spiking neurons. Phys Rev E 2020;102:022407. PMID: 32942450. DOI: 10.1103/physreve.102.022407.
Abstract
The macroscopic dynamics of large populations of neurons can be mathematically analyzed using low-dimensional firing-rate or neural-mass models. However, these models fail to capture spike synchronization effects and nonstationary responses of the population activity to rapidly changing stimuli. Here we derive low-dimensional firing-rate models for homogeneous populations of neurons modeled as time-dependent renewal processes. The class of renewal neurons includes integrate-and-fire models driven by white noise and has been frequently used to model neuronal refractoriness and spike synchronization dynamics. The derivation is based on an eigenmode expansion of the associated refractory density equation, which generalizes previous spectral methods for Fokker-Planck equations to arbitrary renewal models. We find a simple relation between the eigenvalues characterizing the timescales of the firing rate dynamics and the Laplace transform of the interspike interval density, for which explicit expressions are available for many renewal models. Retaining only the first eigenmode already yields a reliable low-dimensional approximation of the firing-rate dynamics that captures spike synchronization effects and fast transient dynamics at stimulus onset. We explicitly demonstrate the validity of our model for a large homogeneous population of Poisson neurons with absolute refractoriness and other renewal models that admit an explicit analytical calculation of the eigenvalues. The eigenmode expansion presented here provides a systematic framework for alternative firing-rate models in computational neuroscience based on spiking neuron dynamics with refractoriness.
Affiliation(s)
- Bastian Pietras
- Institute of Mathematics, Technical University Berlin, 10623 Berlin, Germany
- Bernstein Center for Computational Neuroscience Berlin, 10115 Berlin, Germany
- Noé Gallice
- Brain Mind Institute, École polytechnique fédérale de Lausanne (EPFL), Station 15, CH-1015 Lausanne, Switzerland
- Tilo Schwalger
- Institute of Mathematics, Technical University Berlin, 10623 Berlin, Germany
- Bernstein Center for Computational Neuroscience Berlin, 10115 Berlin, Germany
14. Stapmanns J, Kühn T, Dahmen D, Luu T, Honerkamp C, Helias M. Self-consistent formulations for stochastic nonlinear neuronal dynamics. Phys Rev E 2020;101:042124. PMID: 32422832. DOI: 10.1103/physreve.101.042124.
Abstract
Neural dynamics is often investigated with tools from bifurcation theory. However, many neuron models are stochastic, mimicking fluctuations in the input from unknown parts of the brain or the spiking nature of signals. Noise changes the dynamics with respect to the deterministic model; in particular classical bifurcation theory cannot be applied. We formulate the stochastic neuron dynamics in the Martin-Siggia-Rose de Dominicis-Janssen (MSRDJ) formalism and present the fluctuation expansion of the effective action and the functional renormalization group (fRG) as two systematic ways to incorporate corrections to the mean dynamics and time-dependent statistics due to fluctuations in the presence of nonlinear neuronal gain. To formulate self-consistency equations, we derive a fundamental link between the effective action in the Onsager-Machlup (OM) formalism, which allows the study of phase transitions, and the MSRDJ effective action, which is computationally advantageous. These results in particular allow the derivation of an OM effective action for systems with non-Gaussian noise. This approach naturally leads to effective deterministic equations for the first moment of the stochastic system; they explain how nonlinearities and noise cooperate to produce memory effects. Moreover, the MSRDJ formulation yields an effective linear system that has identical power spectra and linear response. Starting from the better known loopwise approximation, we then discuss the use of the fRG as a method to obtain self-consistency beyond the mean. We present a new efficient truncation scheme for the hierarchy of flow equations for the vertex functions by adapting the Blaizot, Méndez, and Wschebor approximation from the derivative expansion to the vertex expansion. The methods are presented by means of the simplest possible example of a stochastic differential equation that has generic features of neuronal dynamics.
Affiliation(s)
- Jonas Stapmanns
- Institute of Neuroscience and Medicine (INM-6), Institute for Advanced Simulation (IAS-6), and JARA BRAIN Institute I, Jülich Research Centre, Jülich, Germany
- Institute for Theoretical Solid State Physics, RWTH Aachen University, 52074 Aachen, Germany
- Tobias Kühn
- Institute of Neuroscience and Medicine (INM-6), Institute for Advanced Simulation (IAS-6), and JARA BRAIN Institute I, Jülich Research Centre, Jülich, Germany
- Institute for Theoretical Solid State Physics, RWTH Aachen University, 52074 Aachen, Germany
- David Dahmen
- Institute of Neuroscience and Medicine (INM-6), Institute for Advanced Simulation (IAS-6), and JARA BRAIN Institute I, Jülich Research Centre, Jülich, Germany
- Thomas Luu
- Institut für Kernphysik (IKP-3), Institute for Advanced Simulation (IAS-4), and Jülich Center for Hadron Physics, Jülich Research Centre, Jülich, Germany
- Carsten Honerkamp
- Institute for Theoretical Solid State Physics, RWTH Aachen University, 52074 Aachen, Germany
- JARA-FIT, Jülich Aachen Research Alliance (Fundamentals of Future Information Technology), Germany
- Moritz Helias
- Institute of Neuroscience and Medicine (INM-6), Institute for Advanced Simulation (IAS-6), and JARA BRAIN Institute I, Jülich Research Centre, Jülich, Germany
- Institute for Theoretical Solid State Physics, RWTH Aachen University, 52074 Aachen, Germany
15.
Abstract
The Wilson-Cowan equations represent a landmark in the history of computational neuroscience. Along with the insights Wilson and Cowan offered for neuroscience, they crystallized an approach to modeling neural dynamics and brain function. Although their iconic equations are used in various guises today, the ideas that led to their formulation and the relationship to other approaches are not well known. Here, we give a little context to some of the biological and theoretical concepts that lead to the Wilson-Cowan equations and discuss how to extend beyond them.
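For concreteness, the iconic equations in their original form couple the active fractions $E$ and $I$ of excitatory and inhibitory populations through sigmoidal response functions $S_{e,i}$, with refractory factors $r_{e,i}$ and external drives $P, Q$:

```latex
\tau_e \frac{dE}{dt} = -E + (1 - r_e E)\, S_e\!\big(w_{ee} E - w_{ei} I + P(t)\big),
\qquad
\tau_i \frac{dI}{dt} = -I + (1 - r_i I)\, S_i\!\big(w_{ie} E - w_{ii} I + Q(t)\big).
```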
Affiliation(s)
- Carson C Chow
- Laboratory of Biological Modeling, National Institute of Diabetes and Digestive and Kidney Diseases, National Institutes of Health, Bethesda, Maryland
- Yahya Karimipanah
- Laboratory of Biological Modeling, National Institute of Diabetes and Digestive and Kidney Diseases, National Institutes of Health, Bethesda, Maryland
16. Carlu M, Chehab O, Dalla Porta L, Depannemaecker D, Héricé C, Jedynak M, Köksal Ersöz E, Muratore P, Souihel S, Capone C, Zerlaut Y, Destexhe A, di Volo M. A mean-field approach to the dynamics of networks of complex neurons, from nonlinear Integrate-and-Fire to Hodgkin-Huxley models. J Neurophysiol 2020;123:1042-1051. PMID: 31851573. PMCID: PMC7099478. DOI: 10.1152/jn.00399.2019.
Abstract
We present a mean-field formalism able to predict the collective dynamics of large networks of conductance-based interacting spiking neurons. We apply this formalism to several neuronal models, from the simplest Adaptive Exponential Integrate-and-Fire model to the more complex Hodgkin-Huxley and Morris-Lecar models. We show that the resulting mean-field models are capable of predicting the correct spontaneous activity of both excitatory and inhibitory neurons in asynchronous irregular regimes, typical of cortical dynamics. Moreover, it is possible to quantitatively predict the population response to external stimuli in the form of external spike trains. This mean-field formalism therefore provides a paradigm to bridge the scale between population dynamics and the microscopic complexity of the individual cells' physiology.
NEW & NOTEWORTHY: Population models are a powerful mathematical tool to study the dynamics of neuronal networks and to simulate the brain at macroscopic scales. We present a mean-field model capable of quantitatively predicting the temporal dynamics of a network of complex spiking neuronal models, from Integrate-and-Fire to Hodgkin-Huxley, thus linking population models to neuronal electrophysiology. This opens a perspective on generating biologically realistic mean-field models from electrophysiological recordings.
Affiliation(s)
- M. Carlu
- Department of Integrative and Computational Neuroscience, Paris-Saclay Institute of Neuroscience, Centre National de la Recherche Scientifique, Gif sur Yvette, France
- O. Chehab
- École Normale Supérieure Paris-Saclay, France
- L. Dalla Porta
- Institut d’Investigacions Biomèdiques August Pi i Sunyer, Barcelona, Spain
- D. Depannemaecker
- Department of Integrative and Computational Neuroscience, Paris-Saclay Institute of Neuroscience, Centre National de la Recherche Scientifique, Gif sur Yvette, France
- C. Héricé
- Strathclyde Institute of Pharmacy and Biomedical Sciences, Glasgow, Scotland, United Kingdom
- M. Jedynak
- Université Grenoble Alpes, Grenoble Institut des Neurosciences and Institut National de la Santé et de la Recherche Médicale (INSERM), U1216, France
- E. Köksal Ersöz
- INSERM, U1099, Rennes, France
- MathNeuro Team, Inria Sophia Antipolis Méditerranée, Sophia Antipolis, France
- P. Muratore
- Physics Department, Sapienza University, Rome, Italy
- S. Souihel
- Université Côte d’Azur, Inria Sophia Antipolis Méditerranée, France
- C. Capone
- Department of Integrative and Computational Neuroscience, Paris-Saclay Institute of Neuroscience, Centre National de la Recherche Scientifique, Gif sur Yvette, France
- Y. Zerlaut
- Department of Integrative and Computational Neuroscience, Paris-Saclay Institute of Neuroscience, Centre National de la Recherche Scientifique, Gif sur Yvette, France
- A. Destexhe
- Department of Integrative and Computational Neuroscience, Paris-Saclay Institute of Neuroscience, Centre National de la Recherche Scientifique, Gif sur Yvette, France
- M. di Volo
- Department of Integrative and Computational Neuroscience, Paris-Saclay Institute of Neuroscience, Centre National de la Recherche Scientifique, Gif sur Yvette, France
- Laboratoire de Physique Théorique et Modélisation, Université de Cergy-Pontoise, Cergy-Pontoise, France
17. Goldman JS, Tort-Colet N, di Volo M, Susin E, Bouté J, Dali M, Carlu M, Nghiem TA, Górski T, Destexhe A. Bridging Single Neuron Dynamics to Global Brain States. Front Syst Neurosci 2019;13:75. PMID: 31866837. PMCID: PMC6908479. DOI: 10.3389/fnsys.2019.00075.
Abstract
Biological neural networks produce information backgrounds of multi-scale spontaneous activity that become more complex in brain states displaying higher capacities for cognition, for instance, attentive awake versus asleep or anesthetized states. Here, we review brain state-dependent mechanisms spanning ion channel currents (microscale) to the dynamics of brain-wide, distributed, transient functional assemblies (macroscale). Just as microscopic interactions between molecules underlie the structures formed in macroscopic states of matter, statistical physics allows the dynamics of microscopic neural phenomena to be linked to macroscopic brain dynamics through mesoscopic scales. Beyond spontaneous dynamics, stimuli are observed to evoke collapses of complexity, most remarkably over high-dimensional, asynchronous, irregular background dynamics during consciousness. In contrast, complexity may not be further collapsed beyond the synchrony and regularity characteristic of unconscious spontaneous activity. We propose that the increased dimensionality of spontaneous dynamics during conscious states supports responsiveness, enhancing neural networks' emergent capacity to robustly encode information over multiple scales.
Affiliation(s)
- Jennifer S. Goldman
- Department of Integrative and Computational Neuroscience (ICN), Centre National de la Recherche Scientifique (CNRS), Paris-Saclay Institute of Neuroscience (NeuroPSI), Gif-sur-Yvette, France
- Núria Tort-Colet
- Department of Integrative and Computational Neuroscience (ICN), Centre National de la Recherche Scientifique (CNRS), Paris-Saclay Institute of Neuroscience (NeuroPSI), Gif-sur-Yvette, France
- Matteo di Volo
- Department of Integrative and Computational Neuroscience (ICN), Centre National de la Recherche Scientifique (CNRS), Paris-Saclay Institute of Neuroscience (NeuroPSI), Gif-sur-Yvette, France
- Eduarda Susin
- Department of Integrative and Computational Neuroscience (ICN), Centre National de la Recherche Scientifique (CNRS), Paris-Saclay Institute of Neuroscience (NeuroPSI), Gif-sur-Yvette, France
- Jules Bouté
- Department of Integrative and Computational Neuroscience (ICN), Centre National de la Recherche Scientifique (CNRS), Paris-Saclay Institute of Neuroscience (NeuroPSI), Gif-sur-Yvette, France
- Melissa Dali
- Department of Integrative and Computational Neuroscience (ICN), Centre National de la Recherche Scientifique (CNRS), Paris-Saclay Institute of Neuroscience (NeuroPSI), Gif-sur-Yvette, France
- Mallory Carlu
- Department of Integrative and Computational Neuroscience (ICN), Centre National de la Recherche Scientifique (CNRS), Paris-Saclay Institute of Neuroscience (NeuroPSI), Gif-sur-Yvette, France
- Tomasz Górski
- Department of Integrative and Computational Neuroscience (ICN), Centre National de la Recherche Scientifique (CNRS), Paris-Saclay Institute of Neuroscience (NeuroPSI), Gif-sur-Yvette, France
- Alain Destexhe
- Department of Integrative and Computational Neuroscience (ICN), Centre National de la Recherche Scientifique (CNRS), Paris-Saclay Institute of Neuroscience (NeuroPSI), Gif-sur-Yvette, France
18. Rule ME, Schnoerr D, Hennig MH, Sanguinetti G. Neural field models for latent state inference: Application to large-scale neuronal recordings. PLoS Comput Biol 2019;15:e1007442. PMID: 31682604. PMCID: PMC6855563. DOI: 10.1371/journal.pcbi.1007442.
Abstract
Large-scale neural recording methods now allow us to observe large populations of identified single neurons simultaneously, opening a window into neural population dynamics in living organisms. However, distilling such large-scale recordings to build theories of emergent collective dynamics remains a fundamental statistical challenge. The neural field models of Wilson, Cowan, and colleagues remain the mainstay of mathematical population modeling owing to their interpretable, mechanistic parameters and amenability to mathematical analysis. Inspired by recent advances in biochemical modeling, we develop a method based on moment closure to interpret neural field models as latent state-space point-process models, making them amenable to statistical inference. With this approach we can infer the intrinsic states of neurons, such as active and refractory, solely from spiking activity in large populations. After validating this approach with synthetic data, we apply it to high-density recordings of spiking activity in the developing mouse retina. This confirms the essential role of a long lasting refractory state in shaping spatiotemporal properties of neonatal retinal waves. This conceptual and methodological advance opens up new theoretical connections between mathematical theory and point-process state-space models in neural data analysis.

Developing statistical tools to connect single-neuron activity to emergent collective dynamics is vital for building interpretable models of neural activity. Neural field models relate single-neuron activity to emergent collective dynamics in neural populations, but integrating them with data remains challenging. Recently, latent state-space models have emerged as a powerful tool for constructing phenomenological models of neural population activity. The advent of high-density multi-electrode array recordings now enables us to examine large-scale collective neural activity. We show that classical neural field approaches can yield latent state-space equations and demonstrate that this enables inference of the intrinsic states of neurons from recorded spike trains in large populations.
Affiliation(s)
- Michael E. Rule
- Department of Engineering, University of Cambridge, Cambridge, United Kingdom
- David Schnoerr
- Theoretical Systems Biology, Imperial College London, London, United Kingdom
- Matthias H. Hennig
- Department of Informatics, University of Edinburgh, Edinburgh, United Kingdom
- Guido Sanguinetti
- Department of Informatics, University of Edinburgh, Edinburgh, United Kingdom
19. Ly C, Shew WL, Barreiro AK. Efficient calculation of heterogeneous non-equilibrium statistics in coupled firing-rate models. J Math Neurosci 2019;9:2. PMID: 31073652. PMCID: PMC6509307. DOI: 10.1186/s13408-019-0070-7.
Abstract
Understanding nervous system function requires careful study of transient (non-equilibrium) neural response to rapidly changing, noisy input from the outside world. Such neural response results from dynamic interactions among multiple, heterogeneous brain regions. Realistic modeling of these large networks requires enormous computational resources, especially when high-dimensional parameter spaces are considered. By assuming quasi-steady-state activity, one can neglect the complex temporal dynamics; however, in many cases the quasi-steady-state assumption fails. Here, we develop a new reduction method for a general heterogeneous firing-rate model receiving background correlated noisy inputs that accurately handles highly non-equilibrium statistics and interactions of heterogeneous cells. Our method involves solving an efficient set of nonlinear ODEs, rather than time-consuming Monte Carlo simulations or high-dimensional PDEs, and it captures the entire set of first and second order statistics while allowing significant heterogeneity in all model parameters.
Affiliation(s)
- Cheng Ly
- Department of Statistical Sciences and Operations Research, Virginia Commonwealth University, Richmond, USA
- Woodrow L. Shew
- Department of Physics, University of Arkansas, Fayetteville, USA
20. di Volo M, Romagnoni A, Capone C, Destexhe A. Biologically Realistic Mean-Field Models of Conductance-Based Networks of Spiking Neurons with Adaptation. Neural Comput 2019;31:653-680. DOI: 10.1162/neco_a_01173.
Abstract
Accurate population models are needed to build very large-scale neural models, but their derivation is difficult for realistic networks of neurons, in particular when nonlinear properties are involved, such as conductance-based interactions and spike-frequency adaptation. Here, we consider such models based on networks of adaptive exponential integrate-and-fire excitatory and inhibitory neurons. Using a master equation formalism, we derive a mean-field model of such networks and compare it to the full network dynamics. The mean-field model is capable of correctly predicting the average spontaneous activity levels in asynchronous irregular regimes similar to in vivo activity. It also captures the transient temporal response of the network to complex external inputs. Finally, the mean-field model is also able to quantitatively describe regimes where high- and low-activity states alternate (up-down state dynamics), leading to slow oscillations. We conclude that such mean-field models are biologically realistic in the sense that they can capture both spontaneous and evoked activity, and they naturally appear as candidates to build very large-scale models involving multiple brain areas.
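For reference, the single-cell model underlying these networks, the adaptive exponential integrate-and-fire (AdEx) neuron, combines an exponential spike-initiation term with an adaptation current $w$:

```latex
C \frac{dV}{dt} = -g_L (V - E_L) + g_L \Delta_T\, e^{(V - V_T)/\Delta_T} - w + I_{\mathrm{syn}}(t),
\qquad
\tau_w \frac{dw}{dt} = a\,(V - E_L) - w,
```

with the reset $V \to V_r$, $w \to w + b$ at each spike; the conductance-based synaptic current $I_{\mathrm{syn}}$ carries the excitatory and inhibitory interactions that the master-equation formalism then averages over.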
Affiliation(s)
- Matteo di Volo
- Unité de Neuroscience, Information et Complexité, CNRS FRE 3693, 91198 Gif sur Yvette, France
- Alberto Romagnoni
- Centre de Recherche sur l'Inflammation UMR 1149, Inserm-Université Paris Diderot, 75018 Paris, France
- Data Team, Département d'Informatique de l'École Normale Supérieure, CNRS, PSL Research University, 75005 Paris, France
- European Institute for Theoretical Neuroscience, 75012 Paris, France
- Cristiano Capone
- European Institute for Theoretical Neuroscience, 75012 Paris, France
- INFN Sezione di Roma, Rome 00185, Italy
- Alain Destexhe
- Unité de Neuroscience, Information et Complexité, CNRS FRE 3693, 91198 Gif sur Yvette, France
- European Institute for Theoretical Neuroscience, 75012 Paris, France
21. Zhang J, Shao Y, Rangan AV, Tao L. A coarse-graining framework for spiking neuronal networks: from strongly-coupled conductance-based integrate-and-fire neurons to augmented systems of ODEs. J Comput Neurosci 2019;46:211-232. PMID: 30788694. DOI: 10.1007/s10827-019-00712-w.
Abstract
Homogeneously structured, fluctuation-driven networks of spiking neurons can exhibit a wide variety of dynamical behaviors, ranging from homogeneity to synchrony. We extend our partitioned-ensemble average (PEA) formalism proposed in Zhang et al. (Journal of Computational Neuroscience, 37(1), 81-104, 2014a) to systematically coarse grain the heterogeneous dynamics of strongly coupled, conductance-based integrate-and-fire neuronal networks. The population dynamics models derived here successfully capture the so-called multiple-firing events (MFEs), which emerge naturally in fluctuation-driven networks of strongly coupled neurons. Although these MFEs likely play a crucial role in the generation of the neuronal avalanches observed in vitro and in vivo, the mechanisms underlying these MFEs cannot easily be understood using standard population dynamic models. Using our PEA formalism, we systematically generate a sequence of model reductions, going from Master equations, to Fokker-Planck equations, and finally, to an augmented system of ordinary differential equations. Furthermore, we show that these reductions can faithfully describe the heterogeneous dynamic regimes underlying the generation of MFEs in strongly coupled conductance-based integrate-and-fire neuronal networks.
Affiliation(s)
- Jiwei Zhang
- School of Mathematics and Statistics, Wuhan University, Wuhan, 430072, China
- Hubei Key Laboratory of Computational Science, Wuhan University, Wuhan, 430072, China
- Yuxiu Shao
- Center for Bioinformatics, National Laboratory of Protein Engineering and Plant Genetic Engineering, School of Life Sciences, Peking University, Beijing, 100871, China
- Center for Quantitative Biology, Peking University, Beijing, 100871, China
- Aaditya V Rangan
- Courant Institute of Mathematical Sciences, New York University, New York, NY, USA
- Louis Tao
- Center for Bioinformatics, National Laboratory of Protein Engineering and Plant Genetic Engineering, School of Life Sciences, Peking University, Beijing, 100871, China
- Center for Quantitative Biology, Peking University, Beijing, 100871, China
22. Qiu SW, Chow CC. Finite-size effects for spiking neural networks with spatially dependent coupling. Phys Rev E 2018;98:062414. PMID: 32478211. PMCID: PMC7258138. DOI: 10.1103/physreve.98.062414.
Abstract
We study finite-size fluctuations in a network of spiking deterministic neurons coupled with nonuniform synaptic coupling. We generalize a previously developed theory of finite-size effects for globally coupled neurons with a uniform coupling function. In the uniform coupling case, mean-field theory is well defined by averaging over the network as the number of neurons in the network goes to infinity. However, for nonuniform coupling it is no longer possible to average over the entire network if we are interested in fluctuations at a particular location within the network. We show that if the coupling function approaches a continuous function in the infinite system size limit, then an average over a local neighborhood can be defined such that mean-field theory is well defined for a spatially dependent field. We then use a path-integral formalism to derive a perturbation expansion in the inverse system size around the mean-field limit for the covariance of the input to a neuron (synaptic drive) and firing rate fluctuations due to dynamical deterministic finite-size effects.
Affiliation(s)
- Si-Wei Qiu
- Laboratory of Biological Modeling, National Institute of Diabetes and Digestive and Kidney Diseases (NIDDK), National Institutes of Health (NIH), Bethesda, Maryland 20892, USA
- Carson C Chow
- Laboratory of Biological Modeling, National Institute of Diabetes and Digestive and Kidney Diseases (NIDDK), National Institutes of Health (NIH), Bethesda, Maryland 20892, USA
23. Large deviations for randomly connected neural networks: I. Spatially extended systems. Adv Appl Probab 2018. DOI: 10.1017/apr.2018.42.
Abstract
In a series of two papers, we investigate the large deviations and asymptotic behavior of stochastic models of brain neural networks with random interaction coefficients. In this first paper, we take into account the spatial structure of the brain and consider first the presence of interaction delays that depend on the distance between cells and then the Gaussian random interaction amplitude with a mean and variance that depend on the position of the neurons and scale as the inverse of the network size. We show that the empirical measure satisfies a large deviations principle with a good rate function reaching its minimum at a unique spatially extended probability measure. This result implies an averaged convergence of the empirical measure and a propagation of chaos. The limit is characterized through a complex non-Markovian implicit equation in which the network interaction term is replaced by a nonlocal Gaussian process with a mean and covariance that depend on the statistics of the solution over the whole neural field.
Collapse
|
24
|
O'Neill GC, Tewarie P, Vidaurre D, Liuzzi L, Woolrich MW, Brookes MJ. Dynamics of large-scale electrophysiological networks: A technical review. Neuroimage 2018; 180:559-576. [PMID: 28988134 DOI: 10.1016/j.neuroimage.2017.10.003] [Citation(s) in RCA: 118] [Impact Index Per Article: 16.9] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/09/2017] [Revised: 09/23/2017] [Accepted: 10/02/2017] [Indexed: 12/12/2022] Open
Abstract
For several years it has been argued that neural synchronisation is crucial for cognition. The idea that synchronised temporal patterns between different neural groups carries information above and beyond the isolated activity of these groups has inspired a shift in focus in the field of functional neuroimaging. Specifically, investigation into the activation elicited within certain regions by some stimulus or task has, in part, given way to analysis of patterns of co-activation or functional connectivity between distal regions. Recently, the functional connectivity community has been looking beyond the assumptions of stationarity that earlier work was based on, and has introduced methods to incorporate temporal dynamics into the analysis of connectivity. In particular, non-invasive electrophysiological data (magnetoencephalography/electroencephalography (MEG/EEG)), which provides direct measurement of whole-brain activity and rich temporal information, offers an exceptional window into such (potentially fast) brain dynamics. In this review, we discuss challenges, solutions, and a collection of analysis tools that have been developed in recent years to facilitate the investigation of dynamic functional connectivity using these imaging modalities. Further, we discuss the applications of these approaches in the study of cognition and neuropsychiatric disorders. Finally, we review some existing developments that, by using realistic computational models, pursue a deeper understanding of the underlying causes of non-stationary connectivity.
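The simplest of the dynamic-connectivity estimators covered by reviews like this one is the sliding-window correlation. A minimal sketch on synthetic two-channel data (all signal parameters below are illustrative, not taken from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)
fs, T = 250, 120.0                         # sampling rate (Hz), duration (s)
n = int(fs * T)
t = np.arange(n) / fs

# Two noisy channels that share a common signal only in the middle third.
shared = rng.standard_normal(n)
gate = ((t > T / 3) & (t < 2 * T / 3)).astype(float)
x = rng.standard_normal(n) + gate * shared
y = rng.standard_normal(n) + gate * shared

win, step = int(2.0 * fs), int(0.5 * fs)   # 2 s windows, 0.5 s steps
starts = np.arange(0, n - win, step)
fc = np.array([np.corrcoef(x[s:s + win], y[s:s + win])[0, 1] for s in starts])
print(fc[:3], fc[len(fc) // 2])            # near 0 at the edges, ~0.5 mid-recording
```

A stationary analysis of the same data would report a single diluted correlation and miss the transient coupling entirely, which is the basic motivation for the dynamic methods reviewed here.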
Collapse
Affiliation(s)
- George C O'Neill
- Sir Peter Mansfield Imaging Centre, School of Physics and Astronomy, University of Nottingham, Nottingham, United Kingdom
| | - Prejaas Tewarie
- Sir Peter Mansfield Imaging Centre, School of Physics and Astronomy, University of Nottingham, Nottingham, United Kingdom
| | - Diego Vidaurre
- Oxford Centre for Human Brain Activity, Department of Psychiatry, University of Oxford, Oxford, United Kingdom
| | - Lucrezia Liuzzi
- Sir Peter Mansfield Imaging Centre, School of Physics and Astronomy, University of Nottingham, Nottingham, United Kingdom
| | - Mark W Woolrich
- Oxford Centre for Human Brain Activity, Department of Psychiatry, University of Oxford, Oxford, United Kingdom
| | - Matthew J Brookes
- Sir Peter Mansfield Imaging Centre, School of Physics and Astronomy, University of Nottingham, Nottingham, United Kingdom.
| |
Collapse
|
25
|
Lang E, Stannat W. Finite-Size Effects on Traveling Wave Solutions to Neural Field Equations. JOURNAL OF MATHEMATICAL NEUROSCIENCE 2017; 7:5. [PMID: 28685484 PMCID: PMC5500661 DOI: 10.1186/s13408-017-0048-2] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Grants] [Track Full Text] [Figures] [Subscribe] [Scholar Register] [Received: 06/14/2016] [Accepted: 06/02/2017] [Indexed: 06/07/2023]
Abstract
Neural field equations are used to describe the spatio-temporal evolution of the activity in a network of synaptically coupled populations of neurons in the continuum limit. Their heuristic derivation involves two approximation steps. Under the assumption that each population in the network is large, the activity is described in terms of a population average. The discrete network is then approximated by a continuum. In this article we make the two approximation steps explicit. Extending a model by Bressloff and Newby, we describe the evolution of the activity in a discrete network of finite populations by a Markov chain. In order to determine finite-size effects, that is, deviations from the mean-field limit due to the finite size of the populations in the network, we analyze the fluctuations of this Markov chain and set up an approximating system of diffusion processes. We show that a well-posed stochastic neural field equation with a noise term accounting for finite-size effects on traveling wave solutions is obtained as the strong continuum limit.
Collapse
Affiliation(s)
- Eva Lang
- Institut für Mathematik, Technische Universität Berlin, 10623 Berlin, Germany
- Bernstein Center for Computational Neuroscience, 10115 Berlin, Germany
| | - Wilhelm Stannat
- Institut für Mathematik, Technische Universität Berlin, 10623 Berlin, Germany
- Bernstein Center for Computational Neuroscience, 10115 Berlin, Germany
| |
Collapse
|
26
|
Abstract
We expand the theory of Hawkes processes to the nonstationary case, in which the mutually exciting point processes receive time-dependent inputs. We derive an analytical expression for the time-dependent correlations, which can be applied to networks with arbitrary connectivity, and inputs with arbitrary statistics. The expression shows how the network correlations are determined by the interplay between the network topology, the transfer functions relating units within the network, and the pattern and statistics of the external inputs. We illustrate the correlation structure using several examples in which neural network dynamics are modeled as a Hawkes process. In particular, we focus on the interplay between internally and externally generated oscillations and their signatures in the spike and rate correlation functions.
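As a concrete companion to this abstract, a minimal simulation of a two-unit excitatory Hawkes network with exponential kernels via Ogata thinning, checked against the standard stationary-rate formula for constant input. All parameters are illustrative; the paper's actual contribution, the time-dependent correlation structure, is not reproduced here:

```python
import numpy as np

rng = np.random.default_rng(1)
mu = np.array([0.5, 0.5])                  # baseline (external) input rates
A = np.array([[0.2, 0.6],                  # kernel amplitudes alpha_ij
              [0.0, 0.3]])
beta, T = 2.0, 5000.0                      # kernel decay rate (1/s), total time

t, z, counts = 0.0, np.zeros(2), np.zeros(2)
while True:
    lam_bar = (mu + A @ z).sum()           # valid bound: intensities only decay
    w = rng.exponential(1.0 / lam_bar)
    if t + w > T:
        break
    t += w
    z *= np.exp(-beta * w)                 # relax kernel memory to time t
    lam = mu + A @ z
    if rng.uniform() < lam.sum() / lam_bar:       # Ogata thinning: accept event?
        i = rng.choice(2, p=lam / lam.sum())      # which unit fires
        counts[i] += 1
        z[i] += 1.0                               # unit i's kernel trace jumps
print(counts / T)                                 # empirical rates
print(np.linalg.solve(np.eye(2) - A / beta, mu))  # stationary linear prediction
```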
Collapse
Affiliation(s)
- Neta Ravid Tannenbaum
- Edmond and Lily Safra Center for Brain Sciences, The Hebrew University of Jerusalem, Jerusalem 9190401, Israel
| | - Yoram Burak
- Edmond and Lily Safra Center for Brain Sciences, The Hebrew University of Jerusalem, Jerusalem 9190401, Israel
- Racah Institute of Physics, The Hebrew University of Jerusalem, Jerusalem 9190401, Israel
| |
Collapse
|
27
|
Farkhooi F, Stannat W. Complete Mean-Field Theory for Dynamics of Binary Recurrent Networks. PHYSICAL REVIEW LETTERS 2017; 119:208301. [PMID: 29219356 DOI: 10.1103/physrevlett.119.208301] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Received: 04/28/2017] [Indexed: 06/07/2023]
Abstract
We develop a unified theory that encompasses the macroscopic dynamics of recurrent interactions of binary units within arbitrary network architectures. Using the martingale theory, our mathematical analysis provides a complete description of nonequilibrium fluctuations in networks with a finite size and finite degree of interactions. Our approach allows the investigation of systems for which a deterministic mean-field theory breaks down. To demonstrate this, we uncover a novel dynamic state in which a recurrent network of binary units with statistically inhomogeneous interactions, along with an asynchronous behavior, also exhibits collective nontrivial stochastic fluctuations in the thermodynamical limit.
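For readers wanting to see where a deterministic mean-field description starts to strain, a toy asynchronous (Glauber-type) simulation of a binary network with zero-mean Gaussian random couplings; the network, gain function, and parameters below are illustrative, not the paper's:

```python
import numpy as np

rng = np.random.default_rng(2)
N, steps = 500, 200_000
J = rng.normal(0.0, 1.0 / np.sqrt(N), size=(N, N))  # random couplings
h = -0.5                                             # uniform bias
s = rng.integers(0, 2, N).astype(float)              # binary units in {0, 1}

def f(x):                                            # gain function
    return 1.0 / (1.0 + np.exp(-2.0 * x))

m_trace = []
for k in range(steps):                               # asynchronous updates
    i = rng.integers(N)
    s[i] = float(rng.uniform() < f(J[i] @ s + h))
    if k % N == 0:
        m_trace.append(s.mean())
print(np.mean(m_trace[50:]))   # simulated mean activity
print(f(h))                    # naive mean-field fixed point (couplings average out)
```

The gap between the two printed numbers comes from input fluctuations that the naive deterministic mean-field ignores, which is precisely the regime the complete theory above is built to handle.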
Collapse
Affiliation(s)
- Farzad Farkhooi
- Institut für Mathematik, Technische Universität Berlin, 10623 Berlin, Germany
- Bernstein Center for Computational Neuroscience, 10115 Berlin, Germany
| | - Wilhelm Stannat
- Institut für Mathematik, Technische Universität Berlin, 10623 Berlin, Germany
- Bernstein Center for Computational Neuroscience, 10115 Berlin, Germany
| |
Collapse
|
28
|
Rostami V, Porta Mana P, Grün S, Helias M. Bistability, non-ergodicity, and inhibition in pairwise maximum-entropy models. PLoS Comput Biol 2017; 13:e1005762. [PMID: 28968396 PMCID: PMC5645158 DOI: 10.1371/journal.pcbi.1005762] [Citation(s) in RCA: 12] [Impact Index Per Article: 1.5] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/06/2016] [Revised: 10/17/2017] [Accepted: 09/05/2017] [Indexed: 11/30/2022] Open
Abstract
Pairwise maximum-entropy models have been used in neuroscience to predict the activity of neuronal populations, given only the time-averaged correlations of the neuron activities. This paper provides evidence that the pairwise model, applied to experimental recordings, would produce a bimodal distribution for the population-averaged activity, and for some population sizes the second mode would peak at high activities, which would correspond experimentally to 90% of the neuron population being active within time windows of a few milliseconds. Several problems are connected with this bimodality: 1. The presence of the high-activity mode is unrealistic in view of observed neuronal activity and on neurobiological grounds. 2. Boltzmann learning becomes non-ergodic, hence the pairwise maximum-entropy distribution cannot be found: in fact, Boltzmann learning would produce an incorrect distribution; similarly, common variants of mean-field approximations also produce an incorrect distribution. 3. The Glauber dynamics associated with the model is unrealistically bistable and cannot be used to generate realistic surrogate data. This bimodality problem is first demonstrated for an experimental dataset from 159 neurons in the motor cortex of a macaque monkey. Evidence is then provided that this problem affects typical neural recordings of population sizes of a couple of hundred or more neurons. The cause of the bimodality problem is identified as the inability of standard maximum-entropy distributions with a uniform reference measure to model neuronal inhibition. To eliminate this problem a modified maximum-entropy model is presented, which reflects a basic effect of inhibition in the form of a simple but non-uniform reference measure. This model does not lead to unrealistic bimodalities, can be found with Boltzmann learning, and has an associated Glauber dynamics which incorporates a minimal asymmetric inhibition. Networks of interacting units are ubiquitous in various fields of biology, e.g. gene regulatory networks, neuronal networks, social structures. If a limited set of observables is accessible, maximum-entropy models provide a way to construct a statistical model for such networks, under particular assumptions. The pairwise maximum-entropy model only uses the first two moments among those observables, and can be interpreted as a network with only pairwise interactions. If correlations are on average positive, we show here that the maximum-entropy distribution tends to become bimodal. In the application to neuronal activity this is a problem, because the bimodality is an artefact of the statistical model and not observed in real data. This problem could also affect other fields in biology. We explain here under which conditions bimodality arises and present a solution to the problem by introducing a collective negative feedback, corresponding to a modified maximum-entropy model. This result may point to the existence of a homeostatic mechanism active in the system that is not part of our set of observable units.
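The bimodality can be reproduced in a few lines for the homogeneous case, where the population-activity distribution is available in closed form; the values of h and J below are illustrative, not fitted to the recordings discussed in the paper:

```python
import numpy as np
from scipy.special import gammaln

N = 300
h, J = -3.0, 6.0 / N        # uniform bias and pairwise coupling
K = np.arange(N + 1)        # number of simultaneously active units

# log P(K) up to a constant: binomial multiplicity + pairwise energy
logp = (gammaln(N + 1) - gammaln(K + 1) - gammaln(N - K + 1)
        + h * K + 0.5 * J * K * (K - 1))
p = np.exp(logp - logp.max())
p /= p.sum()

# With positive mean coupling a second, high-activity mode appears even
# though the low-activity mode matches typical recordings.
modes = K[1:-1][(p[1:-1] > p[:-2]) & (p[1:-1] > p[2:])]
print(modes, p[modes])      # two local maxima, one near K ~ 20, one near K ~ 280
```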
Collapse
Affiliation(s)
- Vahid Rostami
- Institute of Neuroscience and Medicine (INM-6) and Institute for Advanced Simulation (IAS-6) and JARA BRAIN Institute I, Jülich Research Centre, Jülich, Germany
| | - PierGianLuca Porta Mana
- Institute of Neuroscience and Medicine (INM-6) and Institute for Advanced Simulation (IAS-6) and JARA BRAIN Institute I, Jülich Research Centre, Jülich, Germany
| | - Sonja Grün
- Institute of Neuroscience and Medicine (INM-6) and Institute for Advanced Simulation (IAS-6) and JARA BRAIN Institute I, Jülich Research Centre, Jülich, Germany
- Theoretical Systems Neurobiology, RWTH Aachen University, Aachen, Germany
| | - Moritz Helias
- Institute of Neuroscience and Medicine (INM-6) and Institute for Advanced Simulation (IAS-6) and JARA BRAIN Institute I, Jülich Research Centre, Jülich, Germany
- Department of Physics, Faculty 1, RWTH Aachen University, Aachen, Germany
| |
Collapse
|
29
|
A theoretical framework for analyzing coupled neuronal networks: Application to the olfactory system. PLoS Comput Biol 2017; 13:e1005780. [PMID: 28968384 PMCID: PMC5638622 DOI: 10.1371/journal.pcbi.1005780] [Citation(s) in RCA: 9] [Impact Index Per Article: 1.1] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/26/2017] [Revised: 10/12/2017] [Accepted: 09/15/2017] [Indexed: 12/27/2022] Open
Abstract
Determining how synaptic coupling within and between regions is modulated during sensory processing is an important topic in neuroscience. Electrophysiological recordings provide detailed information about neural spiking but have traditionally been confined to a particular region or layer of cortex. Here we develop new theoretical methods to study interactions between and within two brain regions, based on experimental measurements of spiking activity simultaneously recorded from the two regions. By systematically comparing experimentally-obtained spiking statistics to (efficiently computed) model spike rate statistics, we identify regions in model parameter space that are consistent with the experimental data. We apply our new technique to dual micro-electrode array in vivo recordings from two distinct regions: olfactory bulb (OB) and anterior piriform cortex (PC). Our analysis predicts that: i) inhibition within the afferent region (OB) has to be weaker than the inhibition within PC, ii) excitation from PC to OB is generally stronger than excitation from OB to PC, iii) excitation from PC to OB and inhibition within PC have to both be relatively strong compared to presynaptic inputs from OB. These predictions are validated in a spiking neural network model of the OB–PC pathway that satisfies the many constraints from our experimental data. We find when the derived relationships are violated, the spiking statistics no longer satisfy the constraints from the data. In principle this modeling framework can be adapted to other systems and be used to investigate relationships between other neural attributes besides network connection strengths. Thus, this work can serve as a guide to further investigations into the relationships of various neural attributes within and across different regions during sensory processing. Sensory processing is known to span multiple regions of the nervous system. However, electrophysiological recordings during sensory processing have traditionally been limited to a single region or brain layer. With recent advances in experimental techniques, recorded spiking activity from multiple regions simultaneously is feasible. However, other important quantities— such as inter-region connection strengths—cannot yet be measured. Here, we develop new theoretical tools to leverage data obtained by recording from two different brain regions simultaneously. We address the following questions: what are the crucial neural network attributes that enable sensory processing across different regions, and how are these attributes related to one another? With a novel theoretical framework to efficiently calculate spiking statistics, we can characterize a high dimensional parameter space that satisfies data constraints. We apply our results to the olfactory system to make specific predictions about effective network connectivity. Our framework relies on incorporating relatively easy-to-measure quantities to predict hard-to-measure interactions across multiple brain regions. Because this work is adaptable to other systems, we anticipate it will be a valuable tool for analysis of other larger scale brain recordings.
Collapse
|
30
|
Barreiro AK, Ly C. Practical approximation method for firing-rate models of coupled neural networks with correlated inputs. Phys Rev E 2017; 96:022413. [PMID: 28950506 DOI: 10.1103/physreve.96.022413] [Citation(s) in RCA: 5] [Impact Index Per Article: 0.6] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/13/2017] [Indexed: 01/18/2023]
Abstract
Rapid experimental advances now enable simultaneous electrophysiological recording of neural activity at single-cell resolution across large regions of the nervous system. Models of this neural network activity will necessarily increase in size and complexity, thus increasing the computational cost of simulating them and the challenge of analyzing them. Here we present a method to approximate the activity and firing statistics of a general firing rate network model (of the Wilson-Cowan type) subject to noisy correlated background inputs. The method requires solving a system of transcendental equations and is fast compared to Monte Carlo simulations of coupled stochastic differential equations. We implement the method with several examples of coupled neural networks and show that the results are quantitatively accurate even with moderate coupling strengths and an appreciable amount of heterogeneity in many parameters. This work should be useful for investigating how various neural attributes qualitatively affect the spiking statistics of coupled neural networks.
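A minimal sketch in the spirit of the method described above: solve the self-consistency equations for noisy coupled firing-rate units by Gaussian quadrature instead of Monte Carlo. The network, transfer function, and parameters are illustrative, not the paper's:

```python
import numpy as np
from scipy.optimize import fsolve

W = np.array([[0.0, -1.2],            # E <- E, E <- I coupling
              [1.5,  0.0]])           # I <- E, I <- I
I_ext = np.array([1.0, 0.8])
sigma = np.array([0.5, 0.5])          # std of noisy background input
xk, wk = np.polynomial.hermite_e.hermegauss(40)   # Gauss-Hermite nodes/weights

def F(x):                             # threshold-quadratic transfer function
    return np.where(x > 0, x**2, 0.0)

def mean_rate(mu, sig):               # <F(mu + sig*xi)> with xi ~ N(0, 1)
    return (F(mu[:, None] + sig[:, None] * xk[None, :]) @ wk) / np.sqrt(2 * np.pi)

def residual(nu):                     # nu = <F(W nu + I_ext + noise)>
    return nu - mean_rate(W @ nu + I_ext, sigma)

nu0 = fsolve(residual, x0=np.ones(2))
print(nu0)                            # self-consistent firing rates
```

A Monte Carlo simulation of the corresponding stochastic differential equations should fluctuate around these values; the quadrature solve is orders of magnitude faster, which is the practical point of such approximation methods.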
Collapse
Affiliation(s)
- Andrea K Barreiro
- Department of Mathematics, Southern Methodist University, P.O. Box 750235, Dallas, Texas 75275, USA
| | - Cheng Ly
- Department of Statistical Sciences and Operations Research, Virginia Commonwealth University, 1015 Floyd Avenue, Richmond, Virginia 23284, USA
| |
Collapse
|
31
|
Ocker GK, Hu Y, Buice MA, Doiron B, Josić K, Rosenbaum R, Shea-Brown E. From the statistics of connectivity to the statistics of spike times in neuronal networks. Curr Opin Neurobiol 2017; 46:109-119. [PMID: 28863386 DOI: 10.1016/j.conb.2017.07.011] [Citation(s) in RCA: 34] [Impact Index Per Article: 4.3] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/08/2017] [Revised: 07/21/2017] [Accepted: 07/27/2017] [Indexed: 10/19/2022]
Abstract
An essential step toward understanding neural circuits is linking their structure and their dynamics. In general, this relationship can be almost arbitrarily complex. Recent theoretical work has, however, begun to identify some broad principles underlying collective spiking activity in neural circuits. The first is that local features of network connectivity can be surprisingly effective in predicting global statistics of activity across a network. The second is that, for the important case of large networks with excitatory-inhibitory balance, correlated spiking persists or vanishes depending on the spatial scales of recurrent and feedforward connectivity. We close by showing how these ideas, together with plasticity rules, can help to close the loop between network structure and activity statistics.
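The workhorse behind results like these is the linear-response formula for pairwise covariances: total covariances follow from the connectivity by dressing intrinsic noise with the propagator (I - W)^-1. A toy sketch with an illustrative random network:

```python
import numpy as np

rng = np.random.default_rng(3)
N = 100
W = (rng.random((N, N)) < 0.1) * 0.08         # sparse excitatory weights
W -= (rng.random((N, N)) < 0.1) * 0.12        # plus sparse inhibition
np.fill_diagonal(W, 0.0)
r = np.full(N, 5.0)                           # baseline rates (spk/s)
D = np.diag(r)                                # Poisson-like intrinsic noise

P = np.linalg.inv(np.eye(N) - W)              # dressed propagator (I - W)^-1
C = P @ D @ P.T                               # predicted long-time covariances
print(C[0, 0], C[0, 1])
```

Expanding P = I + W + W^2 + ... shows how direct connections, shared inputs, and longer paths each contribute to pairwise correlations, which is exactly how local connectivity statistics come to predict global activity statistics.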
Collapse
Affiliation(s)
| | - Yu Hu
- Center for Brain Science, Harvard University, United States
| | - Michael A Buice
- Allen Institute for Brain Science, United States; Department of Applied Mathematics, University of Washington, United States
| | - Brent Doiron
- Department of Mathematics, University of Pittsburgh, United States; Center for the Neural Basis of Cognition, Pittsburgh, United States
| | - Krešimir Josić
- Department of Mathematics, University of Houston, United States; Department of Biology and Biochemistry, University of Houston, United States; Department of BioSciences, Rice University, United States
| | - Robert Rosenbaum
- Department of Mathematics, University of Notre Dame, United States
| | - Eric Shea-Brown
- Allen Institute for Brain Science, United States; Department of Applied Mathematics, University of Washington, United States; Department of Physiology and Biophysics, and University of Washington Institute for Neuroengineering, United States.
| |
Collapse
|
32
|
A stochastic-field description of finite-size spiking neural networks. PLoS Comput Biol 2017; 13:e1005691. [PMID: 28787447 PMCID: PMC5560761 DOI: 10.1371/journal.pcbi.1005691] [Citation(s) in RCA: 17] [Impact Index Per Article: 2.1] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/30/2016] [Revised: 08/17/2017] [Accepted: 07/20/2017] [Indexed: 11/19/2022] Open
Abstract
Neural network dynamics are governed by the interaction of spiking neurons. Stochastic aspects of single-neuron dynamics propagate up to the network level and shape the dynamical and informational properties of the population. Mean-field models of population activity disregard the finite-size stochastic fluctuations of network dynamics and thus offer a deterministic description of the system. Here, we derive a stochastic partial differential equation (SPDE) describing the temporal evolution of the finite-size refractory density, which represents the proportion of neurons in a given refractory state at any given time. The population activity—the density of active neurons per unit time—is easily extracted from this refractory density. The SPDE includes finite-size effects through a two-dimensional Gaussian white noise that acts both in time and along the refractory dimension. For an infinite number of neurons the standard mean-field theory is recovered. A discretization of the SPDE along its characteristic curves allows direct simulations of the activity of large but finite spiking networks; this constitutes the main advantage of our approach. Linearizing the SPDE with respect to the deterministic asynchronous state allows the theoretical investigation of finite-size activity fluctuations. In particular, analytical expressions for the power spectrum and autocorrelation of activity fluctuations are obtained. Moreover, our approach can be adapted to incorporate multiple interacting populations and quasi-renewal single-neuron dynamics. In the brain, information about stimuli is encoded in the timing of action potentials produced by neurons. An understanding of this neural code is facilitated by the use of a well-established method called mean-field theory. Over the last two decades or so, mean-field theory has brought an important added value to the study of emergent properties of neural circuits. Nonetheless, in the mean-field framework, the thermodynamic limit has to be taken, that is, to postulate the number of neurons to be infinite. Doing so, small fluctuations are neglected, and the randomness so present at the cellular level disappears from the description of the circuit dynamics. The origin and functional implications of variability at the network scale are ongoing questions of interest in neuroscience. It is therefore crucial to go beyond the mean-field approach and to propose a description that fully entails the stochastic aspects of network dynamics. In this manuscript, we address this issue by showing that the dynamics of finite-size networks can be represented by stochastic partial differential equations.
Collapse
|
33
|
Abstract
Recent experimental advances are producing an avalanche of data on both neural connectivity and neural activity. To take full advantage of these two emerging datasets we need a framework that links them, revealing how collective neural activity arises from the structure of neural connectivity and intrinsic neural dynamics. This problem of structure-driven activity has drawn major interest in computational neuroscience. Existing methods for relating activity and architecture in spiking networks rely on linearizing activity around a central operating point and thus fail to capture the nonlinear responses of individual neurons that are the hallmark of neural information processing. Here, we overcome this limitation and present a new relationship between connectivity and activity in networks of nonlinear spiking neurons by developing a diagrammatic fluctuation expansion based on statistical field theory. We explicitly show how recurrent network structure produces pairwise and higher-order correlated activity, and how nonlinearities impact the networks' spiking activity. Our findings open new avenues to investigating how single-neuron nonlinearities, including those of different cell types, combine with connectivity to shape population activity and function. Neuronal networks, like many biological systems, exhibit variable activity. This activity is shaped by both the underlying biology of the component neurons and the structure of their interactions. How can we combine knowledge of these two things, that is, models of individual neurons and of their interactions, to predict the statistics of single- and multi-neuron activity? Current approaches rely on linearizing neural activity around a stationary state. In the face of neural nonlinearities, however, these linear methods can fail to predict spiking statistics and even fail to correctly predict whether activity is stable or pathological. Here, we show how to calculate any spike train cumulant in a broad class of models, while systematically accounting for nonlinear effects. We then study a fundamental effect of nonlinear input-rate transfer, the coupling between different orders of spiking statistics, and how this depends on single-neuron and network properties.
Collapse
Affiliation(s)
- Gabriel Koch Ocker
- Allen Institute for Brain Science, Seattle, Washington, United States of America
| | - Krešimir Josić
- Department of Mathematics and Department of Biology and Biochemistry, University of Houston, Houston, Texas, United States of America
- Department of BioSciences, Rice University, Houston, Texas, United States of America
| | - Eric Shea-Brown
- Allen Institute for Brain Science, Seattle, Washington, United States of America
- Department of Applied Mathematics, University of Washington, Seattle, Washington, United States of America
- Department of Physiology and Biophysics, and UW Institute of Neuroengineering, University of Washington, Seattle, Washington, United States of America
| | - Michael A. Buice
- Allen Institute for Brain Science, Seattle, Washington, United States of America
- Department of Applied Mathematics, University of Washington, Seattle, Washington, United States of America
| |
Collapse
|
34
|
Kühn T, Helias M. Locking of correlated neural activity to ongoing oscillations. PLoS Comput Biol 2017; 13:e1005534. [PMID: 28604771 PMCID: PMC5484611 DOI: 10.1371/journal.pcbi.1005534] [Citation(s) in RCA: 5] [Impact Index Per Article: 0.6] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/30/2016] [Revised: 06/26/2017] [Accepted: 04/26/2017] [Indexed: 02/01/2023] Open
Abstract
Population-wide oscillations are ubiquitously observed in mesoscopic signals of cortical activity. In these network states a global oscillatory cycle modulates the propensity of neurons to fire. Synchronous activation of neurons has been hypothesized to be a separate channel for processing information in the brain. A salient question is therefore whether and how oscillations interact with spike synchrony, and to what extent these channels can be considered separate. Experiments indeed showed that correlated spiking co-modulates with the static firing rate and is also tightly locked to the phase of beta-oscillations. While the dependence of correlations on the mean rate is well understood in feed-forward networks, it remains unclear why and by which mechanisms correlations tightly lock to an oscillatory cycle. Here we demonstrate that such correlated activation of pairs of neurons is qualitatively explained by periodically-driven random networks. We identify the mechanisms by which covariances depend on a driving periodic stimulus. Mean-field theory combined with linear response theory yields closed-form expressions for the cyclostationary mean activities and pairwise zero-time-lag covariances of binary recurrent random networks. Two distinct mechanisms cause time-dependent covariances: the modulation of the susceptibility of single neurons (via the external input and network feedback) and the time-varying variances of single unit activities. For some parameters, the effectively inhibitory recurrent feedback leads to resonant covariances even if mean activities show non-resonant behavior. Our analytical results open the question of time-modulated synchronous activity to a quantitative analysis.
Collapse
Affiliation(s)
- Tobias Kühn
- Institute of Neuroscience and Medicine (INM-6) and Institute for Advanced Simulation (IAS-6) and JARA BRAIN Institute I, Jülich Research Centre, Jülich, Germany
| | - Moritz Helias
- Institute of Neuroscience and Medicine (INM-6) and Institute for Advanced Simulation (IAS-6) and JARA BRAIN Institute I, Jülich Research Centre, Jülich, Germany
- Department of Physics, Faculty 1, RWTH Aachen University, Aachen, Germany
| |
Collapse
|
35
|
Hahne J, Dahmen D, Schuecker J, Frommer A, Bolten M, Helias M, Diesmann M. Integration of Continuous-Time Dynamics in a Spiking Neural Network Simulator. Front Neuroinform 2017; 11:34. [PMID: 28596730 PMCID: PMC5442232 DOI: 10.3389/fninf.2017.00034] [Citation(s) in RCA: 17] [Impact Index Per Article: 2.1] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/22/2016] [Accepted: 05/01/2017] [Indexed: 01/21/2023] Open
Abstract
Contemporary modeling approaches to the dynamics of neural networks include two important classes of models: biologically grounded spiking neuron models and functionally inspired rate-based units. We present a unified simulation framework that supports the combination of the two for multi-scale modeling, enables the quantitative validation of mean-field approaches by spiking network simulations, and provides an increase in reliability by usage of the same simulation code and the same network model specifications for both model classes. While most spiking simulations rely on the communication of discrete events, rate models require time-continuous interactions between neurons. Exploiting the conceptual similarity to the inclusion of gap junctions in spiking network simulations, we arrive at a reference implementation of instantaneous and delayed interactions between rate-based models in a spiking network simulator. The separation of rate dynamics from the general connection and communication infrastructure ensures flexibility of the framework. In addition to the standard implementation we present an iterative approach based on waveform-relaxation techniques to reduce communication and increase performance for large-scale simulations of rate-based models with instantaneous interactions. Finally we demonstrate the broad applicability of the framework by considering various examples from the literature, ranging from random networks to neural-field models. The study provides the prerequisite for interactions between rate-based and spiking models in a joint simulation.
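The waveform-relaxation idea mentioned here can be demonstrated without any simulator infrastructure: integrate each unit over the whole interval using the other unit's trajectory from the previous sweep, and iterate to convergence. A minimal sketch for two linear rate units with instantaneous coupling (parameters illustrative; this is not NEST code):

```python
import numpy as np

dt, steps, tau = 1e-3, 2000, 0.05
t = np.arange(steps) * dt
w12, w21 = 0.5, -0.8                     # mutual coupling weights
drive = np.sin(2 * np.pi * 5 * t)        # external input to unit 1

def integrate(inp):                      # tau * x' = -x + inp, x(0) = 0
    x = np.zeros(steps)
    for k in range(steps - 1):
        x[k + 1] = x[k] + dt * (-x[k] + inp[k]) / tau
    return x

x1, x2 = np.zeros(steps), np.zeros(steps)
for it in range(20):                     # waveform-relaxation sweeps
    x1_new = integrate(drive + w12 * x2) # uses x2 from the previous sweep only
    x2_new = integrate(w21 * x1)
    err = max(np.abs(x1_new - x1).max(), np.abs(x2_new - x2).max())
    x1, x2 = x1_new, x2_new
    if err < 1e-10:
        break
print(it, err)                           # converges in a handful of sweeps
```

Because trajectories are exchanged only between sweeps, the two units could live on different processes and communicate in large batches, which is the performance argument made in the paper.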
Collapse
Affiliation(s)
- Jan Hahne
- School of Mathematics and Natural Sciences, Bergische Universität Wuppertal, Wuppertal, Germany
| | - David Dahmen
- Institute of Neuroscience and Medicine (INM-6), Institute for Advanced Simulation (IAS-6), JARA BRAIN Institute I, Jülich Research Centre, Jülich, Germany
| | - Jannis Schuecker
- Institute of Neuroscience and Medicine (INM-6), Institute for Advanced Simulation (IAS-6), JARA BRAIN Institute I, Jülich Research Centre, Jülich, Germany
| | - Andreas Frommer
- School of Mathematics and Natural Sciences, Bergische Universität Wuppertal, Wuppertal, Germany
| | - Matthias Bolten
- School of Mathematics and Natural Sciences, Bergische Universität Wuppertal, Wuppertal, Germany
| | - Moritz Helias
- Institute of Neuroscience and Medicine (INM-6), Institute for Advanced Simulation (IAS-6), JARA BRAIN Institute I, Jülich Research Centre, Jülich, Germany
- Department of Physics, Faculty 1, RWTH Aachen University, Aachen, Germany
| | - Markus Diesmann
- Institute of Neuroscience and Medicine (INM-6), Institute for Advanced Simulation (IAS-6), JARA BRAIN Institute I, Jülich Research Centre, Jülich, Germany
- Department of Physics, Faculty 1, RWTH Aachen University, Aachen, Germany
- Department of Psychiatry, Psychotherapy and Psychosomatics, Medical Faculty, RWTH Aachen University, Aachen, Germany
| |
Collapse
|
36
|
Towards a theory of cortical columns: From spiking neurons to interacting neural populations of finite size. PLoS Comput Biol 2017; 13:e1005507. [PMID: 28422957 PMCID: PMC5415267 DOI: 10.1371/journal.pcbi.1005507] [Citation(s) in RCA: 72] [Impact Index Per Article: 9.0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/04/2016] [Revised: 05/03/2017] [Accepted: 04/07/2017] [Indexed: 11/22/2022] Open
Abstract
Neural population equations such as neural mass or field models are widely used to study brain activity on a large scale. However, the relation of these models to the properties of single neurons is unclear. Here we derive an equation for several interacting populations at the mesoscopic scale starting from a microscopic model of randomly connected generalized integrate-and-fire neuron models. Each population consists of 50–2000 neurons of the same type but different populations account for different neuron types. The stochastic population equations that we find reveal how spike-history effects in single-neuron dynamics such as refractoriness and adaptation interact with finite-size fluctuations on the population level. Efficient integration of the stochastic mesoscopic equations reproduces the statistical behavior of the population activities obtained from microscopic simulations of a full spiking neural network model. The theory describes nonlinear emergent dynamics such as finite-size-induced stochastic transitions in multistable networks and synchronization in balanced networks of excitatory and inhibitory neurons. The mesoscopic equations are employed to rapidly integrate a model of a cortical microcircuit consisting of eight neuron types, which allows us to predict spontaneous population activities as well as evoked responses to thalamic input. Our theory establishes a general framework for modeling finite-size neural population dynamics based on single cell and synapse parameters and offers an efficient approach to analyzing cortical circuits and computations. Understanding the brain requires mathematical models on different spatial scales. On the “microscopic” level of nerve cells, neural spike trains can be well predicted by phenomenological spiking neuron models. On a coarse scale, neural activity can be modeled by phenomenological equations that summarize the total activity of many thousands of neurons. Such population models are widely used to model neuroimaging data such as EEG, MEG or fMRI data. However, it is largely unknown how large-scale models are connected to an underlying microscale model. Linking the scales is vital for a correct description of rapid changes and fluctuations of the population activity, and is crucial for multiscale brain models. The challenge is to treat realistic spiking dynamics as well as fluctuations arising from the finite number of neurons. We obtained such a link by deriving stochastic population equations on the mesoscopic scale of 100–1000 neurons from an underlying microscopic model. These equations can be efficiently integrated and reproduce results of a microscopic simulation while achieving a high speed-up factor. We expect that our novel population theory on the mesoscopic scale will be instrumental for understanding experimental data on information processing in the brain, and ultimately link microscopic and macroscopic activity patterns.
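A heavily stripped-down sketch of the finite-N ingredient in such mesoscopic equations: per-time-step spike counts are drawn binomially from an escape-noise hazard, so fluctuations shrink as 1/sqrt(N). Refractoriness and spike-history effects, central to the paper, are omitted, and all parameters are illustrative:

```python
import numpy as np

rng = np.random.default_rng(4)
dt, steps, tau = 1e-3, 5000, 0.02         # time step (s), duration, membrane tau
N = 400                                   # population size
J, mu = -2.0, 1.2                         # inhibitory self-coupling, external drive

def hazard(u):                            # exponential escape-noise hazard (1/s)
    return 10.0 * np.exp(2.0 * (u - 1.0))

u, A_trace = 0.0, []
for _ in range(steps):
    p = 1.0 - np.exp(-hazard(u) * dt)     # per-neuron spike probability
    n_spk = rng.binomial(N, p)            # finite-size fluctuations enter here
    A = n_spk / (N * dt)                  # population activity (spikes/s)
    u += dt * (-u + mu + J * tau * A) / tau
    A_trace.append(A)
print(np.mean(A_trace), np.std(A_trace))  # std shrinks roughly as 1/sqrt(N)
```

Rerunning with larger N drives the activity toward the deterministic mean-field trajectory, illustrating why a stochastic mesoscopic description is needed for populations of 50 to a few thousand neurons.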
Collapse
|
37
|
Bressloff PC, Ermentrout B, Faugeras O, Thomas PJ. Stochastic Network Models in Neuroscience: A Festschrift for Jack Cowan. Introduction to the Special Issue. JOURNAL OF MATHEMATICAL NEUROSCIENCE 2016; 6:4. [PMID: 27043152 PMCID: PMC4820414 DOI: 10.1186/s13408-016-0036-y] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.2] [Reference Citation Analysis] [Abstract] [Grants] [Track Full Text] [Figures] [Subscribe] [Scholar Register] [Received: 03/18/2016] [Accepted: 03/18/2016] [Indexed: 06/05/2023]
Abstract
Jack Cowan's remarkable career has spanned, and molded, the development of neuroscience as a quantitative and mathematical discipline combining deep theoretical contributions, rigorous mathematical work and groundbreaking biological insights. The Banff International Research Station hosted a workshop in his honor, on Stochastic Network Models of Neocortex, July 17-24, 2014. This accompanying Festschrift celebrates Cowan's contributions by assembling current research in stochastic phenomena in neural networks. It combines historical perspectives with new results including applications to epilepsy, path-integral methods, stochastic synchronization, higher-order correlation analysis, and pattern formation in visual cortex.
Collapse
Affiliation(s)
- Paul C. Bressloff
- Department of Mathematics, University of Utah, 155 South 1400 East, Salt Lake City, UT 84112 USA
| | - Bard Ermentrout
- Department of Mathematics, University of Pittsburgh, Pittsburgh, PA 15260 USA
| | - Olivier Faugeras
- INRIA and LJAD, University of Nice-Sophia-Antipolis, Nice, France
| | - Peter J. Thomas
- Department of Mathematics, Applied Mathematics and Statistics, Case Western Reserve University, 10900 Euclid Avenue, Cleveland, OH 44106-7058 USA
| |
Collapse
|
38
|
Pessa E. Neural Network Models. NATURE-INSPIRED COMPUTING 2016:368-395. [DOI: 10.4018/978-1-5225-0788-8.ch015] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 01/04/2025]
Abstract
Artificial Neural Network (ANN) models gained wide popularity owing to a number of claimed advantages, such as biological plausibility, tolerance of errors or noise in the input data, and a learning ability that allows adaptation to environmental constraints. Notwithstanding the fact that most of these advantages are not unique to ANNs, engineers, psychologists and neuroscientists have made extensive use of ANN models in a large number of scientific investigations. In most cases, however, these models have been introduced to provide optimization tools more useful than the ones commonly used in traditional Optimization Theory. Unfortunately, precisely this successful performance of ANN models in optimization tasks produced a widespread neglect of the true – and important – objectives pursued by the first promoters of these models. These objectives can be briefly summarized by the manifesto of connectionist psychology, which states that mental processes are nothing but macroscopic phenomena emerging from the cooperative interaction of a large number of microscopic knowledge units. This statement – wholly in line with the goal of statistical mechanics – can readily be extended to other processes beyond the mental ones, including social, economic, and, in general, organizational ones. This chapter has therefore been designed to answer a number of related questions, such as: are ANN models able to guarantee the occurrence of this sort of emergence? How can the occurrence of this emergence be empirically detected? How can the emergence produced by ANN models be controlled? In what sense could ANN emergence offer a new paradigm for the explanation of macroscopic phenomena? Answering these questions leads the chapter to focus on less popular ANNs, such as recurrent networks, rather than more popular models such as perceptrons, and on less used units, such as spiking neurons, rather than McCulloch-Pitts neurons. Moreover, the chapter mentions a number of strategies for detecting emergence, useful for researchers performing computer simulations of ANN behaviours. Among these strategies are the reduction of ANN models to continuous models, such as neural field models or neural mass models, recourse to the methods of Network Theory, and the use of techniques borrowed from Statistical Physics, like the one based on the Renormalization Group. Of course, owing to space (and mathematical expertise) requirements, most mathematical details of the proposed arguments are omitted; for more information, the reader is referred to the cited literature.
Collapse
|
39
|
Klinshov V, Franović I. Mean-field dynamics of a random neural network with noise. PHYSICAL REVIEW. E, STATISTICAL, NONLINEAR, AND SOFT MATTER PHYSICS 2015; 92:062813. [PMID: 26764750 DOI: 10.1103/physreve.92.062813] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.2] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Received: 07/08/2015] [Indexed: 06/05/2023]
Abstract
We consider a network of randomly coupled rate-based neurons influenced by external and internal noise. We derive a second-order stochastic mean-field model for the network dynamics and use it to analyze the stability and bifurcations in the thermodynamic limit, as well as to study the fluctuations due to the finite-size effect. It is demonstrated that the two types of noise have substantially different impact on the network dynamics. While both sources of noise give rise to stochastic fluctuations in the case of the finite-size network, only the external noise affects the stationary activity levels of the network in the thermodynamic limit. We compare the theoretical predictions with the direct simulation results and show that they agree for large enough network sizes and for parameter domains sufficiently away from bifurcations.
Collapse
Affiliation(s)
- Vladimir Klinshov
- Institute of Applied Physics of the Russian Academy of Sciences, 46 Ulyanov Street, 603950 Nizhny Novgorod, Russia
| | - Igor Franović
- Scientific Computing Laboratory, Institute of Physics Belgrade, University of Belgrade, Pregrevica 118, 11080 Belgrade, Serbia
| |
Collapse
|
40
|
Scalability of Asynchronous Networks Is Limited by One-to-One Mapping between Effective Connectivity and Correlations. PLoS Comput Biol 2015; 11:e1004490. [PMID: 26325661 PMCID: PMC4556689 DOI: 10.1371/journal.pcbi.1004490] [Citation(s) in RCA: 31] [Impact Index Per Article: 3.1] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/18/2014] [Accepted: 08/05/2015] [Indexed: 11/19/2022] Open
Abstract
Network models are routinely downscaled compared to nature in terms of numbers of nodes or edges because of a lack of computational resources, often without explicit mention of the limitations this entails. While reliable methods have long existed to adjust parameters such that the first-order statistics of network dynamics are conserved, here we show that limitations already arise if also second-order statistics are to be maintained. The temporal structure of pairwise averaged correlations in the activity of recurrent networks is determined by the effective population-level connectivity. We first show that in general the converse is also true and explicitly mention degenerate cases when this one-to-one relationship does not hold. The one-to-one correspondence between effective connectivity and the temporal structure of pairwise averaged correlations implies that network scalings should preserve the effective connectivity if pairwise averaged correlations are to be held constant. Changes in effective connectivity can even push a network from a linearly stable to an unstable, oscillatory regime and vice versa. On this basis, we derive conditions for the preservation of both mean population-averaged activities and pairwise averaged correlations under a change in numbers of neurons or synapses in the asynchronous regime typical of cortical networks. We find that mean activities and correlation structure can be maintained by an appropriate scaling of the synaptic weights, but only over a range of numbers of synapses that is limited by the variance of external inputs to the network. Our results therefore show that the reducibility of asynchronous networks is fundamentally limited.
Collapse
|
41
|
Asymptotic Description of Neural Networks with Correlated Synaptic Weights. ENTROPY 2015. [DOI: 10.3390/e17074701] [Citation(s) in RCA: 8] [Impact Index Per Article: 0.8] [Reference Citation Analysis] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 11/16/2022]
|
42
|
Chow CC, Buice MA. Path integral methods for stochastic differential equations. JOURNAL OF MATHEMATICAL NEUROSCIENCE 2015; 5:8. [PMID: 25852983 PMCID: PMC4385267 DOI: 10.1186/s13408-015-0018-5] [Citation(s) in RCA: 27] [Impact Index Per Article: 2.7] [Reference Citation Analysis] [Abstract] [Track Full Text] [Figures] [Subscribe] [Scholar Register] [Received: 02/12/2015] [Accepted: 02/13/2015] [Indexed: 06/04/2023]
Abstract
Stochastic differential equations (SDEs) have multiple applications in mathematical neuroscience and are notoriously difficult. Here, we give a self-contained pedagogical review of perturbative field theoretic and path integral methods to calculate moments of the probability density function of SDEs. The methods can be extended to high dimensional systems such as networks of coupled neurons and even deterministic systems with quenched disorder.
Collapse
Affiliation(s)
- Carson C. Chow
- Mathematical Biology Section, Laboratory of Biological Modeling, NIDDK, NIH, Bethesda, MD 20892 USA
| | - Michael A. Buice
- Mathematical Biology Section, Laboratory of Biological Modeling, NIDDK, NIH, Bethesda, MD 20892 USA
| |
Collapse
|
43
|
Bressloff PC. Path-integral methods for analyzing the effects of fluctuations in stochastic hybrid neural networks. JOURNAL OF MATHEMATICAL NEUROSCIENCE 2015; 5:4. [PMID: 25852979 PMCID: PMC4385107 DOI: 10.1186/s13408-014-0016-z] [Citation(s) in RCA: 10] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Figures] [Subscribe] [Scholar Register] [Received: 09/03/2014] [Accepted: 12/11/2014] [Indexed: 06/04/2023]
Abstract
We consider applications of path-integral methods to the analysis of a stochastic hybrid model representing a network of synaptically coupled spiking neuronal populations. The state of each local population is described in terms of two stochastic variables, a continuous synaptic variable and a discrete activity variable. The synaptic variables evolve according to piecewise-deterministic dynamics describing, at the population level, synapses driven by spiking activity. The dynamical equations for the synaptic currents are only valid between jumps in spiking activity, and the latter are described by a jump Markov process whose transition rates depend on the synaptic variables. We assume a separation of time scales between fast spiking dynamics with time constant τ_a and slower synaptic dynamics with time constant τ. This naturally introduces a small positive parameter ϵ = τ_a/τ, which can be used to develop various asymptotic expansions of the corresponding path-integral representation of the stochastic dynamics. First, we derive a variational principle for maximum-likelihood paths of escape from a metastable state (large deviations in the small noise limit ϵ → 0). We then show how the path integral provides an efficient method for obtaining a diffusion approximation of the hybrid system for small ϵ. The resulting Langevin equation can be used to analyze the effects of fluctuations within the basin of attraction of a metastable state, that is, ignoring the effects of large deviations. We illustrate this by using the Langevin approximation to analyze the effects of intrinsic noise on pattern formation in a spatially structured hybrid network. In particular, we show how noise enlarges the parameter regime over which patterns occur, in an analogous fashion to PDEs. Finally, we carry out an ϵ-loop expansion of the path integral, and use this to derive corrections to voltage-based mean-field equations, analogous to the modified activity-based equations generated from a neural master equation.
Collapse
Affiliation(s)
- Paul C. Bressloff
- Department of Mathematics, University of Utah, 155 South 1400 East, Salt Lake City, UT 84112 USA
| |
Collapse
|
44
|
Urdapilleta E, Samengo I. Effects of spike-triggered negative feedback on receptive-field properties. J Comput Neurosci 2015; 38:405-25. [PMID: 25601482 DOI: 10.1007/s10827-014-0546-0] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/08/2014] [Accepted: 12/30/2014] [Indexed: 11/29/2022]
Abstract
Sensory neurons are often described in terms of a receptive field, that is, a linear kernel through which stimuli are filtered before they are further processed. If information transmission is assumed to proceed in a feedforward cascade, the receptive field may be interpreted as the external stimulus' profile maximizing neuronal output. The nervous system, however, contains many feedback loops, and sensory neurons filter more currents than the ones representing the transduced external stimulus. Some of the additional currents are generated by the output activity of the neuron itself, and therefore constitute feedback signals. By means of a time-frequency analysis of the input/output transformation, here we show how feedback modifies the receptive field. The model is applicable to various types of feedback processes, from spike-triggered intrinsic conductances to inhibitory synaptic inputs from nearby neurons. We distinguish between the intrinsic receptive field (filtering all input currents) and the effective receptive field (filtering only external stimuli). Whereas the intrinsic receptive field summarizes the biophysical properties of the neuron associated to subthreshold integration and spike generation, only the effective receptive field can be interpreted as the external stimulus' profile maximizing neuronal output. We demonstrate that spike-triggered feedback shifts low-pass filtering towards band-pass processing, transforming integrator neurons into resonators. For strong feedback, a sharp resonance in the spectral neuronal selectivity may appear. Our results provide a unified framework to interpret a collection of previous experimental studies where specific feedback mechanisms were shown to modify the filtering properties of neurons.
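The low-pass-to-band-pass effect described here can be seen directly in the frequency domain: with an intrinsic filter H and a negative-feedback kernel F, the effective filter is H_eff = H / (1 + H F). A sketch with illustrative first-order kernels (not the paper's fitted filters):

```python
import numpy as np

f = np.linspace(0.1, 100, 1000)           # frequency (Hz)
w = 2j * np.pi * f
tau_m, tau_fb, g = 0.02, 0.1, 5.0         # membrane and feedback time constants, gain

H = 1.0 / (1.0 + w * tau_m)               # intrinsic low-pass filter
F = g / (1.0 + w * tau_fb)                # slow negative-feedback kernel
H_eff = H / (1.0 + H * F)                 # closed-loop (effective) filter

print(f[np.argmax(np.abs(H))])            # low-pass: peak at the lowest frequency
print(f[np.argmax(np.abs(H_eff))])        # band-pass: peak shifts to f > 0
```

Increasing the gain g sharpens the spectral peak, the analogue of the sharp resonance the authors report for strong feedback.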
Collapse
Affiliation(s)
- Eugenio Urdapilleta
- Física Estadística e Interdisciplinaria, Centro Atómico Bariloche, Av. E. Bustillo Km 9.500, S. C. de Bariloche, (8400), Río Negro, Argentina
| | | |
Collapse
|
45
|
Stochastic representations of ion channel kinetics and exact stochastic simulation of neuronal dynamics. J Comput Neurosci 2014; 38:67-82. [PMID: 25408289 DOI: 10.1007/s10827-014-0528-2] [Citation(s) in RCA: 24] [Impact Index Per Article: 2.2] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/11/2014] [Revised: 08/18/2014] [Accepted: 09/03/2014] [Indexed: 10/24/2022]
Abstract
In this paper we provide two representations for stochastic ion channel kinetics, and compare the performance of exact simulation with a commonly used numerical approximation strategy. The first representation we present is a random time change representation, popularized by Thomas Kurtz, with the second being analogous to a "Gillespie" representation. Exact stochastic algorithms are provided for the different representations, which are preferable to either (a) fixed time step or (b) piecewise constant propensity algorithms, which still appear in the literature. As examples, we provide versions of the exact algorithms for the Morris-Lecar conductance based model, and detail the error induced, both in a weak and a strong sense, by the use of approximate algorithms on this model. We include ready-to-use implementations of the random time change algorithm in both XPP and Matlab. Finally, through the consideration of parametric sensitivity analysis, we show how the representations presented here are useful in the development of further computational methods. The general representations and simulation strategies provided here are known in other parts of the sciences, but less so in the present setting.
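For concreteness, a minimal "Gillespie"-style exact simulation for the simplest case, a population of two-state channels with constant rates; the paper's implementations cover the Morris-Lecar model with voltage-dependent rates, which this sketch does not attempt. Rates are illustrative:

```python
import numpy as np

rng = np.random.default_rng(5)
alpha, beta = 40.0, 100.0               # opening / closing rates (1/s)
N, n_open = 100, 0                      # channel count, currently open
t, T = 0.0, 20.0
ts, ns = [0.0], [0]

while True:
    a_open = alpha * (N - n_open)       # propensity of a closed -> open event
    a_close = beta * n_open             # propensity of an open -> closed event
    a_tot = a_open + a_close
    t += rng.exponential(1.0 / a_tot)   # exact exponential waiting time
    if t > T:
        break
    n_open += 1 if rng.uniform() < a_open / a_tot else -1
    ts.append(t); ns.append(n_open)

occ = np.diff(ts + [T]) @ np.array(ns) / T    # time-weighted mean open count
print(occ, N * alpha / (alpha + beta))        # compare with equilibrium ~28.6
```

Unlike fixed-time-step schemes, every event time here is sampled from its exact distribution, which is the sense in which such algorithms are "exact".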
Collapse
|
46
|
Robinson PA. Determination of effective brain connectivity from functional connectivity using propagator-based interferometry and neural field theory with application to the corticothalamic system. PHYSICAL REVIEW. E, STATISTICAL, NONLINEAR, AND SOFT MATTER PHYSICS 2014; 90:042712. [PMID: 25375528 DOI: 10.1103/physreve.90.042712] [Citation(s) in RCA: 6] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 02/24/2014] [Indexed: 06/04/2023]
Abstract
It is shown how to compute both direct and total effective connection matrices (deCMs and teCMs), which embody the strengths of neural connections between regions, from correlation-based functional CMs using propagator-based interferometry, a method that stems from geophysics and acoustics, coupled with the recent identification of deCMs and teCMs with bare and dressed propagators, respectively. The approach incorporates excitatory and inhibitory connections, multiple structures and populations, and measurement effects. The propagator is found for a generalized scalar wave equation derived from neural field theory, and expressed in terms of neural activity correlations and covariances, and wave damping rates. It is then related to correlation matrices that are commonly used to express functional and effective connectivities in the brain. The results are illustrated in analytically tractable test cases.
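The bare-versus-dressed propagator identity at the heart of this approach has a transparent finite-matrix toy version: the total effective connectivity sums contributions from all directed paths of direct connections, and the direct matrix is recovered by undoing the dressing. A sketch (the 3-node matrix is illustrative; the paper works with continuous neural fields and measurement effects):

```python
import numpy as np

A = np.array([[0.0, 0.4, 0.0],        # direct effective connectivity (deCM)
              [0.0, 0.0, 0.3],
              [0.2, 0.0, 0.0]])
T = np.linalg.inv(np.eye(3) - A) - np.eye(3)      # teCM: A + A^2 + A^3 + ...
A_rec = np.eye(3) - np.linalg.inv(np.eye(3) + T)  # invert the dressing
print(np.allclose(A, A_rec))          # True: deCM recovered from teCM
```

The geometric series converges whenever the spectral radius of A is below one, the matrix analogue of the wave-damping condition in the field-theoretic treatment.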
Collapse
Affiliation(s)
- P A Robinson
- School of Physics, University of Sydney, New South Wales 2006, Australia; Center for Integrative Brain Function, University of Sydney, New South Wales 2006, Australia; Brain Dynamics Center, Westmead Millennium Institute, Darcy Rd, Westmead, New South Wales 2145, Australia; Cooperative Research Center for Alertness, Safety, and Productivity, University of Sydney, New South Wales 2006, Australia; and Neurosleep, 431 Glebe Point Rd., Glebe, New South Wales 2037, Australia
| |
Collapse
|
47
|
Bressloff PC, Newby JM. Path integrals and large deviations in stochastic hybrid systems. PHYSICAL REVIEW. E, STATISTICAL, NONLINEAR, AND SOFT MATTER PHYSICS 2014; 89:042701. [PMID: 24827272 DOI: 10.1103/physreve.89.042701] [Citation(s) in RCA: 27] [Impact Index Per Article: 2.5] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Received: 01/23/2014] [Indexed: 06/03/2023]
Abstract
We construct a path-integral representation of solutions to a stochastic hybrid system, consisting of one or more continuous variables evolving according to a piecewise-deterministic dynamics. The differential equations for the continuous variables are coupled to a set of discrete variables that satisfy a continuous-time Markov process, which means that the differential equations are only valid between jumps in the discrete variables. Examples of stochastic hybrid systems arise in biophysical models of stochastic ion channels, motor-driven intracellular transport, gene networks, and stochastic neural networks. We use the path-integral representation to derive a large deviation action principle for a stochastic hybrid system. Minimizing the associated action functional with respect to the set of all trajectories emanating from a metastable state (assuming that such a minimization scheme exists) then determines the most probable paths of escape. Moreover, evaluating the action functional along a most probable path generates the so-called quasipotential used in the calculation of mean first passage times. We illustrate the theory by considering the optimal paths of escape from a metastable state in a bistable neural network.
Collapse
Affiliation(s)
- Paul C Bressloff
- Department of Mathematics, University of Utah, 155 South 1400 East, Salt Lake City, Utah 84112, USA
| | - Jay M Newby
- Mathematical Biosciences Institute, Ohio State University, Columbus, Ohio 43210, USA
| |
Collapse
|
48
|
The correlation structure of local neuronal networks intrinsically results from recurrent dynamics. PLoS Comput Biol 2014; 10:e1003428. [PMID: 24453955 PMCID: PMC3894226 DOI: 10.1371/journal.pcbi.1003428] [Citation(s) in RCA: 57] [Impact Index Per Article: 5.2] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/09/2013] [Accepted: 11/22/2013] [Indexed: 11/19/2022] Open
Abstract
Correlated neuronal activity is a natural consequence of network connectivity and of shared inputs to pairs of neurons, but the task-dependent modulation of correlations in relation to behavior also hints at a functional role. Correlations influence the gain of postsynaptic neurons, the amount of information encoded in the population activity and decoded by readout neurons, and synaptic plasticity. Further, they affect the power and spatial reach of extracellular signals such as the local field potential. A theory of correlated neuronal activity that accounts for recurrent connectivity as well as fluctuating external sources is currently lacking. In particular, it is unclear how the recently found mechanism of active decorrelation by negative feedback on the population level affects the network response to externally applied correlated stimuli. Here, we present such an extension of the theory of correlations in stochastic binary networks. We show that (1) for homogeneous external input, the structure of correlations is mainly determined by the local recurrent connectivity, (2) homogeneous external inputs provide an additive, unspecific contribution to the correlations, (3) inhibitory feedback effectively decorrelates neuronal activity, even if neurons receive identical external inputs, and (4) identical synaptic input statistics to excitatory and to inhibitory cells increase intrinsically generated fluctuations and pairwise correlations. We further demonstrate how the accuracy of mean-field predictions can be improved by self-consistently including correlations. As a byproduct, we show that the cancellation of correlations between the summed inputs to pairs of neurons does not originate from the fast tracking of external input, but from the suppression of fluctuations on the population level by the local network. This suppression is a necessary constraint, but it is not sufficient to determine the structure of correlations; specifically, the structure observed at finite network size differs from the prediction based on perfect tracking, even though perfect tracking implies suppression of population fluctuations.

The co-occurrence of action potentials in pairs of neurons within short time intervals has been known for a long time. Such synchronous events can appear time-locked to an animal's behavior, and theoretical considerations also argue for a functional role of synchrony. Early theoretical work tried to explain correlated activity by neurons transmitting common fluctuations due to shared inputs; this, however, overestimates correlations. Recently, the recurrent connectivity of cortical networks was shown to be responsible for the observed low baseline correlations. Two different explanations were given. One argues that excitatory and inhibitory population activities closely follow the external inputs to the network, so that their effects on a pair of cells mutually cancel. The other relies on negative recurrent feedback to suppress fluctuations in the population activity, equivalent to small correlations. In a biological neuronal network one expects both external inputs and recurrence to affect correlated activity. The present work extends the theoretical framework of correlations to include both contributions and explains their qualitative differences. Moreover, the study shows that the fast-tracking and recurrent-feedback arguments are not equivalent; only the latter correctly predicts the cell-type-specific correlations.
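The decorrelating effect of negative population-level feedback can be reproduced in a linear-rate stand-in (an illustrative sketch with hypothetical parameters, not the binary-network formalism of the paper): for dx = (W - I) x dt plus noise with covariance D, the stationary covariance solves a Lyapunov equation, and inhibition-dominated recurrence suppresses the correlations induced by a shared input.

```python
import numpy as np
from scipy.linalg import solve_continuous_lyapunov

# Illustrative linear-rate stand-in (not the paper's binary-network
# formalism): for dx = (W - I) x dt + noise with covariance D, the
# stationary covariance C solves (W - I) C + C (W - I)^T = -D.
def covariance(W, D):
    A = W - np.eye(W.shape[0])
    return solve_continuous_lyapunov(A, -D)

def mean_correlation(C):
    s = np.sqrt(np.diag(C))
    R = C / np.outer(s, s)
    return R[~np.eye(len(C), dtype=bool)].mean()

n_exc, n_inh, w, c = 80, 20, 0.1, 0.1
n = n_exc + n_inh
D = np.eye(n) + c * np.ones((n, n))  # private noise plus a shared component
# inhibition-dominated recurrence: net negative feedback on the population mean
W = np.hstack([np.full((n, n_exc), w), np.full((n, n_inh), -6 * w)])
print("shared input, no recurrence      :", mean_correlation(covariance(np.zeros((n, n)), D)))
print("shared input, inhibitory feedback:", mean_correlation(covariance(W, D)))
```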
49
Buice MA, Chow CC. Generalized activity equations for spiking neural network dynamics. Front Comput Neurosci 2013; 7:162. [PMID: 24298252 PMCID: PMC3829481 DOI: 10.3389/fncom.2013.00162] [Citation(s) in RCA: 8] [Impact Index Per Article: 0.7] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/29/2013] [Accepted: 10/23/2013] [Indexed: 11/25/2022] Open
Abstract
Much progress has been made in uncovering the computational capabilities of spiking neural networks. However, spiking neurons will always be more expensive to simulate than rate neurons because of the inherent disparity in time scales: the spike duration is much shorter than the inter-spike interval, which in turn is much shorter than any learning time scale. In numerical analysis, this is a classic stiff problem. Spiking neurons are also much more difficult to study analytically. One possible approach to making spiking networks more tractable is to augment mean field activity models with some information about spiking correlations. For example, such a generalized activity model could self-consistently carry information about spiking rates and correlations between spikes. Here, we show how this can be accomplished by constructing a complete formal probabilistic description of the network and then expanding around a small parameter, such as the inverse of the number of neurons in the network. The mean field theory of the system gives a rate-like description. The first-order terms in the perturbation expansion keep track of covariances.
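Schematically, and in our own illustrative notation rather than the paper's exact equations, such a generalized activity model pairs a rate equation carrying a covariance correction with a covariance equation driven by an order-1/N source:

```latex
% Schematic moment truncation (illustrative notation, not the paper's
% exact equations); C_{jk} ~ O(1/N) corrects the mean field rate a_i.
\begin{align}
\dot{a}_i &= -a_i + \phi\Big(\sum_j w_{ij} a_j\Big)
  + \frac{1}{2}\,\phi''\Big(\sum_j w_{ij} a_j\Big)\sum_{j,k} w_{ij} w_{ik}\, C_{jk} ,\\
\dot{C}_{ij} &= -2\,C_{ij}
  + \phi'\Big(\sum_l w_{il} a_l\Big)\sum_k w_{ik}\, C_{kj}
  + \phi'\Big(\sum_l w_{jl} a_l\Big)\sum_k w_{jk}\, C_{ik}
  + \frac{\delta_{ij}}{N}\, B_i(a).
\end{align}
```

Here $\phi$ is a gain function and $B_i(a)$ a single-neuron noise source; at zeroth order the first line reduces to a standard rate model, and the $C$ terms are the first-order covariance corrections described in the abstract.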
Affiliation(s)
- Michael A. Buice
- Modeling, Analysis and Theory Team, Allen Institute for Brain Science, Seattle, WA, USA
- Carson C. Chow
- Laboratory of Biological Modeling, NIDDK, National Institutes of Health, Bethesda, MD, USA
50
Dipoppa M, Gutkin BS. Correlations in background activity control persistent state stability and allow execution of working memory tasks. Front Comput Neurosci 2013; 7:139. [PMID: 24155714 PMCID: PMC3801087 DOI: 10.3389/fncom.2013.00139] [Citation(s) in RCA: 14] [Impact Index Per Article: 1.2] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/10/2013] [Accepted: 09/25/2013] [Indexed: 11/17/2022] Open
Abstract
Working memory (WM) requires selective information gating, active information maintenance, and rapid active updating. Hence performing a WM task demands rapid and controlled transitions between neural persistent activity and the resting state. We propose that changes in the correlations of neural activity provide a mechanism for the required WM operations. As a proof of principle, we implement sustained activity and WM in recurrently coupled spiking networks whose neurons receive excitatory random background activity, with background correlations induced by a common noise source. We first characterize how the level of background correlations controls the stability of the persistent state. With sufficiently high correlations, the sustained state becomes practically unstable, so it cannot be initiated by a transient stimulus. We exploit this in WM models implementing the delayed match-to-sample task by flexibly modulating the correlation level during different phases of the task. The modulation sets the network in different working regimes: more disposed to gate in a signal or to clear the memory. We examine how the correlations affect the ability of the network to perform the task when distractors are present. We show that in a winner-take-all version of the model, where two populations cross-inhibit, correlations make distractor blocking robust. In a version of the model without cross-inhibition, we show that appropriate modulation of correlation levels is sufficient to block distractor access while leaving the relevant memory trace intact. These findings can form the basis for a new paradigm of how correlations are flexibly controlled by cortical circuits to execute WM operations.
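As a cartoon of the proposed mechanism (a toy sketch with hypothetical parameters, not the paper's spiking network): treat the memory as a bistable variable whose summed background input comes from N sources with pairwise correlation c, so the input standard deviation grows like sqrt(1 + c(N-1)); raising c alone can then destroy persistence.

```python
import numpy as np

# Toy sketch: a bistable "memory" variable r sits in the up well of
# f(r) = r - r^3. Background input from N sources with pairwise
# correlation c gives an input s.d. ~ sqrt(1 + c (N - 1)); increasing c
# destabilizes the persistent (up) state. Parameters are hypothetical.
def up_state_occupancy(c, N=100, T=200.0, dt=0.01, seed=2):
    rng = np.random.default_rng(seed)
    sigma = 0.15 * np.sqrt(1.0 + c * (N - 1))
    steps = int(T / dt)
    r, up_time = 1.0, 0
    for _ in range(steps):
        r += (r - r**3) * dt + sigma * np.sqrt(dt) * rng.standard_normal()
        up_time += r > 0.0
    return up_time / steps

for c in (0.0, 0.05, 0.2):
    print(f"c = {c:.2f} -> fraction of time in up state: {up_state_occupancy(c):.2f}")
```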
Affiliation(s)
- Mario Dipoppa
- Departement d'Etudes Cognitives, Ecole Normale Superieure, Group for Neural Theory, Laboratoire des Neurosciences Cognitives, INSERM U960, Paris, France; Ecole Doctorale Cerveau Cognition Comportement, Université Pierre et Marie Curie, Paris, France