1. Liang T, Brinkman BAW. Statistically inferred neuronal connections in subsampled neural networks strongly correlate with spike train covariances. Phys Rev E 2024; 109:044404. PMID: 38755896. DOI: 10.1103/physreve.109.044404.
Abstract
Statistically inferred neuronal connections from observed spike train data are often skewed from ground truth by factors such as model mismatch, unobserved neurons, and limited data. Spike train covariances, sometimes referred to as "functional connections," are often used as a proxy for the connections between pairs of neurons, but reflect statistical relationships between neurons, not anatomical connections. Moreover, covariances are not causal: spiking activity is correlated in both the past and the future, whereas neurons respond only to synaptic inputs in the past. Connections inferred by maximum likelihood estimation, by contrast, can be constrained to be causal. Nevertheless, we show in this work that the inferred connections in spontaneously active networks modeled by stochastic leaky integrate-and-fire networks strongly correlate with the covariances between neurons, and may reflect noncausal relationships, when many neurons are unobserved or when neurons are weakly coupled. This phenomenon occurs across different network structures, including random networks and balanced excitatory-inhibitory networks. We use a combination of simulations and a mean-field analysis with fluctuation corrections to elucidate the relationships between spike train covariances, inferred synaptic filters, and ground-truth connections in partially observed networks.
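The weak-coupling correlation between activity covariances and symmetrized ground-truth connections can be illustrated with a sketch far simpler than the paper's stochastic leaky integrate-and-fire model: a linear-rate network driven by white noise, with all parameter values chosen for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)
N, T, dt = 20, 200_000, 0.01

# Ground-truth random connectivity, weak coupling, no self-connections.
g = 0.3
J = g * rng.normal(size=(N, N)) / np.sqrt(N)
np.fill_diagonal(J, 0.0)

# Linear-rate caricature of a noisy network: dx = (-x + J x) dt + dW.
noise = np.sqrt(dt) * rng.normal(size=(T, N))
x = np.zeros(N)
X = np.empty((T, N))
for t in range(T):
    x = x + dt * (-x + J @ x) + noise[t]
    X[t] = x

C = np.cov(X[T // 10:].T)        # activity covariances ("functional connections")
off = ~np.eye(N, dtype=bool)

# To first order in J, the off-diagonal covariances follow (J + J^T)/4, so
# covariances track the *symmetrized* couplings and lose causal direction.
r = np.corrcoef(C[off], (J + J.T)[off])[0, 1]
print(f"corr(covariances, symmetrized couplings) = {r:.2f}")
```

In this linear toy model the stationary covariance solves a Lyapunov equation, and the leading-order expansion makes the covariance-coupling correlation explicit; the abstract's point is that maximum-likelihood-inferred filters can end up tracking the same symmetrized quantity when subsampling is severe.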
Affiliation(s)
- Tong Liang
  - Department of Physics and Astronomy, Stony Brook University, Stony Brook, New York 11794, USA
  - Department of Neurobiology and Behavior, Stony Brook University, Stony Brook, New York 11794, USA
- Braden A W Brinkman
  - Department of Neurobiology and Behavior, Stony Brook University, Stony Brook, New York 11794, USA
2. Papo D, Buldú JM. Does the brain behave like a (complex) network? I. Dynamics. Phys Life Rev 2024; 48:47-98. PMID: 38145591. DOI: 10.1016/j.plrev.2023.12.006.
Abstract
Graph theory is now becoming a standard tool in system-level neuroscience. However, endowing observed brain anatomy and dynamics with a complex network structure does not entail that the brain actually works as a network. Asking whether the brain behaves as a network means asking whether network properties count. From the viewpoint of neurophysiology and, possibly, of brain physics, the most substantial issues a network structure may be instrumental in addressing relate to the influence of network properties on brain dynamics and to whether these properties ultimately explain some aspects of brain function. Here, we address the dynamical implications of complex network structure, examining which aspects and scales of brain activity may be understood to genuinely behave as a network. To do so, we first define the meaning of networkness and analyse some of its implications. We then examine ways in which brain anatomy and dynamics can be endowed with a network structure and discuss possible ways in which network structure may be shown to represent a genuine organisational principle of brain activity, rather than just a convenient description of its anatomy and dynamics.
Affiliation(s)
- D Papo
  - Department of Neuroscience and Rehabilitation, Section of Physiology, University of Ferrara, Ferrara, Italy
  - Center for Translational Neurophysiology, Fondazione Istituto Italiano di Tecnologia, Ferrara, Italy
- J M Buldú
  - Complex Systems Group & G.I.S.C., Universidad Rey Juan Carlos, Madrid, Spain
3. Shmakov S, Littlewood PB. Coalescence of limit cycles in the presence of noise. Phys Rev E 2024; 109:024220. PMID: 38491679. DOI: 10.1103/physreve.109.024220.
Abstract
Complex dynamical systems may exhibit multiple steady states, including time-periodic limit cycles, where the final trajectory depends on initial conditions. With tuning of parameters, limit cycles can proliferate or merge at an exceptional point. Here we ask how dynamics in the vicinity of such a bifurcation are influenced by noise. A pitchfork bifurcation can be used to induce this coalescence behavior. We model a limit cycle with the normal form of the Hopf oscillator, couple it to the pitchfork, and investigate the resulting dynamical system in the presence of noise. We show that the generating functional for the averages of the dynamical variables factorizes between the pitchfork and the oscillator. The statistical properties of the pitchfork in the presence of noise in its various regimes are investigated and a scaling theory is developed for the correlation and response functions, including a possible symmetry-breaking field. The analysis is done by perturbative calculations as well as numerical means. Finally, observables illustrating the coupling of a system with a limit cycle to a pitchfork are discussed and the phase-phase correlations are shown to exhibit nondiffusive behavior with universal scaling.
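A minimal numerical sketch of this setup, with illustrative (not the paper's) parameter values: a noisy pitchfork variable whose state sets the growth rate of a Hopf normal-form oscillator, integrated with the Euler-Maruyama scheme.

```python
import numpy as np

rng = np.random.default_rng(1)
dt, T = 1e-3, 200_000
a, omega, sigma = 1.0, 2.0, 0.1

# Precomputed Euler-Maruyama noise increments.
eta_y = sigma * np.sqrt(dt) * rng.normal(size=T)
eta_z = sigma * np.sqrt(dt) * (rng.normal(size=T) + 1j * rng.normal(size=T))

# Pitchfork variable y sets the growth rate of the complex Hopf amplitude z.
y, z = 0.5, 0.1 + 0.0j
ys = np.empty(T)
zs = np.empty(T, dtype=complex)
for t in range(T):
    y += dt * (a * y - y**3) + eta_y[t]
    z += dt * ((y + 1j * omega) * z - abs(z)**2 * z) + eta_z[t]
    ys[t], zs[t] = y, z

# For a > 0 the pitchfork settles near +sqrt(a) here (it starts on that
# branch), and the oscillator relaxes onto a noisy limit cycle of radius
# approximately sqrt(y).
print(f"mean |y| = {np.abs(ys[T//2:]).mean():.2f}, "
      f"mean |z| = {np.abs(zs[T//2:]).mean():.2f}")
```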
Affiliation(s)
- Sergei Shmakov
  - James Franck Institute and Department of Physics, The University of Chicago, Chicago, Illinois 60637, USA
- Peter B Littlewood
  - James Franck Institute and Department of Physics, The University of Chicago, Chicago, Illinois 60637, USA
  - School of Physics and Astronomy, University of St Andrews, St Andrews KY16 9AJ, United Kingdom
4. Crosser JT, Brinkman BAW. Applications of information geometry to spiking neural network activity. Phys Rev E 2024; 109:024302. PMID: 38491696. DOI: 10.1103/physreve.109.024302.
Abstract
The space of possible behaviors that complex biological systems may exhibit is unimaginably vast, and these systems often appear to be stochastic, whether due to variable noisy environmental inputs or intrinsically generated chaos. The brain is a prominent example of a biological system with complex behaviors. The number of possible patterns of spikes emitted by a local brain circuit is combinatorially large, although the brain may not make use of all of them. Understanding which of these possible patterns are actually used by the brain, and how those sets of patterns change as properties of neural circuitry change is a major goal in neuroscience. Recently, tools from information geometry have been used to study embeddings of probabilistic models onto a hierarchy of model manifolds that encode how model outputs change as a function of their parameters, giving a quantitative notion of "distances" between outputs. We apply this method to a network model of excitatory and inhibitory neural populations to understand how the competition between membrane and synaptic response timescales shapes the network's information geometry. The hyperbolic embedding allows us to identify the statistical parameters to which the model behavior is most sensitive, and demonstrate how the ranking of these coordinates changes with the balance of excitation and inhibition in the network.
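The idea of ranking parameters by sensitivity can be illustrated on a far simpler system than the paper's spiking network: for a toy two-parameter model, the eigendecomposition of the Fisher information matrix separates stiff from sloppy parameter directions. The model and all values below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

# Toy two-parameter model y(t; θ) = θ1 * exp(-θ2 * t): the Fisher information
# matrix J^T J (unit Gaussian observation noise) ranks parameter combinations
# from sloppy (small eigenvalue) to stiff (large eigenvalue).
t = np.linspace(0.0, 5.0, 50)
theta1, theta2 = 1.0, 0.5

jac = np.column_stack([
    np.exp(-theta2 * t),                # dy/dθ1
    -theta1 * t * np.exp(-theta2 * t),  # dy/dθ2
])
fim = jac.T @ jac
evals, evecs = np.linalg.eigh(fim)      # ascending: sloppy first, stiff last

print("Fisher eigenvalues (sloppy -> stiff):", np.round(evals, 3))
```

The stiff eigenvector identifies the parameter combination to which model output is most sensitive, which is the kind of ranking the abstract describes for its excitatory-inhibitory network parameters.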
Affiliation(s)
- Jacob T Crosser
  - Department of Applied Mathematics and Statistics, Stony Brook University, Stony Brook, New York 11794, USA
  - Department of Neurobiology and Behavior, Stony Brook University, Stony Brook, New York 11794, USA
- Braden A W Brinkman
  - Department of Applied Mathematics and Statistics, Stony Brook University, Stony Brook, New York 11794, USA
  - Department of Neurobiology and Behavior, Stony Brook University, Stony Brook, New York 11794, USA
5. Giuliano ME, Combi B, dell'Erba MG, Sánchez AD. Perturbative computation of nonlinear harvesting through a path integral approach. Phys Rev E 2024; 109:014210. PMID: 38366396. DOI: 10.1103/physreve.109.014210.
Abstract
Statistical field theories provide powerful tools to study complex dynamical systems. In this work those tools are used to analyze the dynamics of a kinetic energy harvester, which is modeled by a system of coupled stochastic nonlinear differential equations and driven by colored noise. Using the Martin-Siggia-Rose response fields we analytically approach the problem through path integrals in the phase space and represent the moments that correspond to physical observables through Feynman diagrams. This analysis method is tested by comparing the solution to the linear case with previous analytical results. Through a perturbative expansion, we calculate how the nonlinearity affects the harvested energy to first order, supporting the results with numerical simulations.
Affiliation(s)
- Martín E Giuliano
  - IFIMAR-CONICET, Universidad Nacional de Mar del Plata, 7600 Mar del Plata, Argentina
- Bruno Combi
  - IFIMAR-CONICET, Universidad Nacional de Mar del Plata, 7600 Mar del Plata, Argentina
- Matías G dell'Erba
  - IFIMAR-CONICET, Universidad Nacional de Mar del Plata, 7600 Mar del Plata, Argentina
- Alejandro D Sánchez
  - IFIMAR-CONICET, Universidad Nacional de Mar del Plata, 7600 Mar del Plata, Argentina
6. Gallo M, Magaletti F, Georgoulas A, Marengo M, De Coninck J, Casciola CM. A nanoscale view of the origin of boiling and its dynamics. Nat Commun 2023; 14:6428. PMID: 37833270. PMCID: PMC10576093. DOI: 10.1038/s41467-023-41959-3.
Abstract
In this work, we present a dynamical theory of boiling based on fluctuating hydrodynamics and the diffuse interface approach. The model is able to describe boiling from the stochastic nucleation up to the macroscopic bubble dynamics. It covers, with a modest computational cost, the mesoscale area from nano to micrometers, where most of the controversial observations related to the phenomenon originate. In particular, the role of wettability in the macroscopic observables of boiling is elucidated. In addition, by comparing the ideal case of boiling on ultra-smooth surfaces with a chemically heterogeneous wall, our results will definitively shed light on the puzzling low onset temperatures measured in experiments. Sporadic nanometric spots of hydrophobic wettability will be shown to be enough to trigger the nucleation at low superheat, significantly reducing the temperature of boiling onset, in line with experimental results. The proposed mesoscale approach constitutes the missing link between macroscopic approaches and molecular dynamics simulations and will open a breakthrough pathway toward accurate understanding and prediction.
Affiliation(s)
- Mirko Gallo
  - Sapienza University of Rome, Rome, Italy
  - School of Architecture, Technology and Engineering, University of Brighton, Lewes Road, Brighton, UK
- Francesco Magaletti
  - School of Architecture, Technology and Engineering, University of Brighton, Lewes Road, Brighton, UK
- Anastasios Georgoulas
  - School of Architecture, Technology and Engineering, University of Brighton, Lewes Road, Brighton, UK
- Marco Marengo
  - School of Architecture, Technology and Engineering, University of Brighton, Lewes Road, Brighton, UK
  - Dept. of Civil Engineering and Architecture, University of Pavia, Pavia, Italy
- Joel De Coninck
  - School of Architecture, Technology and Engineering, University of Brighton, Lewes Road, Brighton, UK
7. Balick DJ. A field theoretic approach to non-equilibrium population genetics in the strong selection regime. bioRxiv 2023:2023.01.16.524324. Preprint. PMID: 36711507. PMCID: PMC9882232. DOI: 10.1101/2023.01.16.524324.
Abstract
Natural populations are virtually never observed in equilibrium, yet equilibrium approximations comprise the majority of our understanding of population genetics. Using standard tools from statistical physics, a formalism is presented that re-expresses the stochastic equations describing allelic evolution as a partition functional over all possible allelic trajectories ('paths') governed by selection, mutation, and drift. A perturbative field theory is developed for strong additive selection, relevant to disease variation, that facilitates the straightforward computation of closed-form approximations for time-dependent moments of the allele frequency distribution across a wide range of non-equilibrium scenarios; examples are presented for constant population size, exponential growth, bottlenecks, and oscillatory size, all of which align well to simulations and break down just above the drift barrier. Equilibration times are computed and, even for static population size, generically extend beyond the order 1/s timescale associated with exponential frequency decay. Though the mutation load is largely robust to variable population size, perturbative drift-based corrections to the deterministic trajectory are readily computed. Under strong selection, the variance of a new mutation's frequency (related to homozygosity) is dominated by drift-driven dynamics and a transient increase in variance often occurs prior to equilibrating. The excess kurtosis over skew squared is roughly constant (i.e., independent of selection, provided 2Ns ≳ 5) for static population size, and thus potentially sensitive to deviation from equilibrium. These insights highlight the value of such closed-form approximations, naturally generated from Feynman diagrams in a perturbative field theory, which can simply and accurately capture the parameter dependences describing a variety of non-equilibrium population genetic phenomena of interest.
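The dynamics the field theory approximates can be sketched by direct simulation: a Wright-Fisher model with additive selection and drift, tracking the mean frequency of new mutations. This is a brute-force stand-in for the paper's closed-form perturbative results, with illustrative parameter values.

```python
import numpy as np

rng = np.random.default_rng(2)
N, s, reps, gens = 1000, 0.02, 2000, 200   # 2Ns = 40: strong selection

# Wright-Fisher dynamics with additive selection against the mutant allele:
# a deterministic selection step followed by binomial resampling (drift).
x = np.full(reps, 1.0 / (2 * N))           # new mutations start at 1/(2N)
traj = np.empty((gens, reps))
for gen in range(gens):
    x_sel = x * (1 - s) / (1 - s * x)      # selection step
    x = rng.binomial(2 * N, x_sel, size=reps) / (2 * N)
    traj[gen] = x

# Under strong selection the mean frequency decays roughly as exp(-s t),
# the order-1/s timescale mentioned in the abstract.
mean_traj = traj.mean(axis=1)
print(f"mean frequency after {gens} generations: {mean_traj[-1]:.2e}")
```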
Affiliation(s)
- Daniel J Balick
  - Department of Biomedical Informatics, Harvard Medical School, Boston, MA
  - Division of Genetics, Brigham and Women's Hospital, Harvard Medical School, Boston, MA
8. Peng X, Lin W. Complex Dynamics of Noise-Perturbed Excitatory-Inhibitory Neural Networks With Intra-Correlative and Inter-Independent Connections. Front Physiol 2022; 13:915511. PMID: 35812336. PMCID: PMC9263264. DOI: 10.3389/fphys.2022.915511.
Abstract
Real neural systems usually contain two types of neurons, i.e., excitatory neurons and inhibitory ones. Analytical and numerical interpretation of the dynamics induced by different types of interactions among neurons of the two types is beneficial to understanding the physiological functions of the brain. Here, we articulate a model of noise-perturbed random neural networks containing both excitatory and inhibitory (E&I) populations. In particular, both intra-correlatively and inter-independently connected neurons in the two populations are taken into account, which differs from most existing E&I models, which consider only independently connected neurons. By employing the typical mean-field theory, we obtain an equivalent two-dimensional system with an input of a stationary Gaussian process. Investigating the stationary autocorrelation functions of the obtained system, we analytically find the parameter conditions under which synchronized behaviors between the two populations emerge. Taking the maximal Lyapunov exponent as an index, we also find different critical values of the coupling strength coefficients for the chaotic excitatory neurons and for the chaotic inhibitory ones. Interestingly, we reveal that noise is able to suppress chaotic dynamics of random neural networks having neurons in two populations, while an appropriate amount of correlation in the intra-coupling strengths can enhance the occurrence of chaos. Finally, we also detect a previously reported phenomenon in which a parameter region corresponds to neither linearly stable nor chaotic dynamics; the size of this region, however, crucially depends on the populations' parameters.
Affiliation(s)
- Xiaoxiao Peng
  - Shanghai Center for Mathematical Sciences, School of Mathematical Sciences, and LMNS, Fudan University, Shanghai, China
  - Research Institute of Intelligent Complex Systems and Center for Computational Systems Biology, Fudan University, Shanghai, China
- Wei Lin
  - Shanghai Center for Mathematical Sciences, School of Mathematical Sciences, and LMNS, Fudan University, Shanghai, China
  - Research Institute of Intelligent Complex Systems and Center for Computational Systems Biology, Fudan University, Shanghai, China
  - State Key Laboratory of Medical Neurobiology, MOE Frontiers Center for Brain Science, and Institutes of Brain Science, Fudan University, Shanghai, China
9. Fagerholm ED, Foulkes WMC, Gallero-Salas Y, Helmchen F, Moran RJ, Friston KJ, Leech R. Estimating anisotropy directly via neural timeseries. J Comput Neurosci 2022; 50:241-249. PMID: 35182268. PMCID: PMC9035010. DOI: 10.1007/s10827-021-00810-8.
Abstract
An isotropic dynamical system is one that looks the same in every direction, i.e., if we imagine standing somewhere within an isotropic system, we would not be able to differentiate between different lines of sight. Conversely, anisotropy is a measure of the extent to which a system deviates from perfect isotropy, with larger values indicating greater discrepancies between the structure of the system along its axes. Here, we derive the form of a generalised scalable (mechanically similar) discretized field theoretic Lagrangian that allows for levels of anisotropy to be directly estimated via timeseries of arbitrary dimensionality. We generate synthetic data for both isotropic and anisotropic systems and, by using Bayesian model inversion and reduction, show that we can discriminate between the two datasets - thereby demonstrating proof of principle. We then apply this methodology to murine calcium imaging data collected in rest and task states, showing that anisotropy can be estimated directly from different brain states and cortical regions in an empirical in vivo biological setting. We hope that this theoretical foundation, together with the methodology and publicly available MATLAB code, will provide an accessible way for researchers to obtain new insight into the structural organization of neural systems in terms of how scalable neural regions grow - both ontogenetically during the development of an individual organism, as well as phylogenetically across species.
Affiliation(s)
- Erik D Fagerholm
  - Department of Neuroimaging, King's College London, London, United Kingdom
- W M C Foulkes
  - Department of Physics, Imperial College London, London, United Kingdom
- Yasir Gallero-Salas
  - Brain Research Institute, University of Zürich, Zürich, Switzerland
  - Neuroscience Center Zürich, Zürich, Switzerland
- Fritjof Helmchen
  - Brain Research Institute, University of Zürich, Zürich, Switzerland
  - Neuroscience Center Zürich, Zürich, Switzerland
- Rosalyn J Moran
  - Department of Neuroimaging, King's College London, London, United Kingdom
- Karl J Friston
  - Wellcome Centre for Human Neuroimaging, University College London, London, United Kingdom
- Robert Leech
  - Department of Neuroimaging, King's College London, London, United Kingdom
10. Krishnamurthy K, Can T, Schwab DJ. Theory of Gating in Recurrent Neural Networks. Phys Rev X 2022; 12:011011. PMID: 36545030. PMCID: PMC9762509. DOI: 10.1103/physrevx.12.011011.
Abstract
Recurrent neural networks (RNNs) are powerful dynamical models, widely used in machine learning (ML) and neuroscience. Prior theoretical work has focused on RNNs with additive interactions. However, gating, i.e., multiplicative interactions, is ubiquitous in real neurons and is also the central feature of the best-performing RNNs in ML. Here, we show that gating offers flexible control of two salient features of the collective dynamics: (i) timescales and (ii) dimensionality. The gate controlling timescales leads to a novel marginally stable state, where the network functions as a flexible integrator. Unlike previous approaches, gating permits this important function without parameter fine-tuning or special symmetries. Gates also provide a flexible, context-dependent mechanism to reset the memory trace, thus complementing the memory function. The gate modulating the dimensionality can induce a novel, discontinuous chaotic transition, where inputs push a stable system to strong chaotic activity, in contrast to the typically stabilizing effect of inputs. At this transition, unlike in additive RNNs, the proliferation of critical points (topological complexity) is decoupled from the appearance of chaotic dynamics (dynamical complexity). The rich dynamics are summarized in phase diagrams, thus providing ML practitioners with a map for principled parameter initialization choices.
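To fix ideas, here is a deliberately minimal caricature of a gated rate network: a single state-dependent gate multiplicatively scales the update, playing the role of the timescale gate discussed above. The architecture and parameters are illustrative assumptions, much simpler than the gated model analyzed in the paper.

```python
import numpy as np

rng = np.random.default_rng(3)
N, g, dt = 200, 1.5, 0.1

J = g * rng.normal(size=(N, N)) / np.sqrt(N)   # recurrent couplings
Jz = rng.normal(size=(N, N)) / np.sqrt(N)      # couplings into the gate

def step(x):
    """One Euler step of a rate network whose update is scaled by a gate z."""
    z = 1.0 / (1.0 + np.exp(-Jz @ np.tanh(x)))  # gate in (0, 1), state-dependent
    return x + dt * z * (-x + J @ np.tanh(x))   # z multiplicatively sets timescale

x = rng.normal(size=N)
for _ in range(2000):
    x = step(x)
print(f"activity scale after transient: {np.linalg.norm(x) / np.sqrt(N):.2f}")
```

Where z is saturated near 0 the local dynamics freeze (long effective timescales); near 1 they follow the ordinary additive update, which is the control knob the abstract describes.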
Affiliation(s)
- Kamesh Krishnamurthy
  - Joseph Henry Laboratories of Physics and PNI, Princeton University, Princeton, New Jersey 08544, USA
- Tankut Can
  - Institute for Advanced Study, Princeton, New Jersey 08540, USA
- David J. Schwab
  - Initiative for Theoretical Sciences, Graduate Center, CUNY, New York, New York 10016, USA
11. van Meegen A, Kühn T, Helias M. Large-Deviation Approach to Random Recurrent Neuronal Networks: Parameter Inference and Fluctuation-Induced Transitions. Phys Rev Lett 2021; 127:158302. PMID: 34678014. DOI: 10.1103/physrevlett.127.158302.
Abstract
We here unify the field-theoretical approach to neuronal networks with large deviations theory. For a prototypical random recurrent network model with continuous-valued units, we show that the effective action is identical to the rate function and derive the latter using field theory. This rate function takes the form of a Kullback-Leibler divergence which enables data-driven inference of model parameters and calculation of fluctuations beyond mean-field theory. Lastly, we expose a regime with fluctuation-induced transitions between mean-field solutions.
Affiliation(s)
- Alexander van Meegen
  - Institute of Neuroscience and Medicine (INM-6) and Institute for Advanced Simulation (IAS-6) and JARA-Institute Brain Structure-Function Relationships (INM-10), Jülich Research Centre, 52428 Jülich, Germany
  - Institute of Zoology, University of Cologne, 50674 Cologne, Germany
- Tobias Kühn
  - Institute of Neuroscience and Medicine (INM-6) and Institute for Advanced Simulation (IAS-6) and JARA-Institute Brain Structure-Function Relationships (INM-10), Jülich Research Centre, 52428 Jülich, Germany
  - Department of Physics, Faculty 1, RWTH Aachen University, 52074 Aachen, Germany
  - Laboratoire de Physique de l'Ecole Normale Supérieure, ENS, Université PSL, CNRS, Sorbonne Université, Université de Paris, F-75005 Paris, France
- Moritz Helias
  - Institute of Neuroscience and Medicine (INM-6) and Institute for Advanced Simulation (IAS-6) and JARA-Institute Brain Structure-Function Relationships (INM-10), Jülich Research Centre, 52428 Jülich, Germany
  - Department of Physics, Faculty 1, RWTH Aachen University, 52074 Aachen, Germany
12. Fagerholm ED, Foulkes WMC, Friston KJ, Moran RJ, Leech R. Rendering neuronal state equations compatible with the principle of stationary action. J Math Neurosci 2021; 11:10. PMID: 34386910. PMCID: PMC8360977. DOI: 10.1186/s13408-021-00108-0.
Abstract
The principle of stationary action is a cornerstone of modern physics, providing a powerful framework for investigating dynamical systems found in classical mechanics through to quantum field theory. However, computational neuroscience, despite its heavy reliance on concepts in physics, is anomalous in this regard as its main equations of motion are not compatible with a Lagrangian formulation and hence with the principle of stationary action. Taking the Dynamic Causal Modelling (DCM) neuronal state equation as an instructive archetype of the first-order linear differential equations commonly found in computational neuroscience, we show that it is possible to make certain modifications to this equation to render it compatible with the principle of stationary action. Specifically, we show that a Lagrangian formulation of the DCM neuronal state equation is facilitated using a complex dependent variable, an oscillatory solution, and a Hermitian intrinsic connectivity matrix. We first demonstrate proof of principle by using Bayesian model inversion to show that both the original and modified models can be correctly identified via in silico data generated directly from their respective equations of motion. We then provide motivation for adopting the modified models in neuroscience by using three different types of publicly available in vivo neuroimaging datasets, together with open source MATLAB code, to show that the modified (oscillatory) model provides a more parsimonious explanation for some of these empirical timeseries. It is our hope that this work will, in combination with existing techniques, allow people to explore the symmetries and associated conservation laws within neural systems - and to exploit the computational expediency facilitated by direct variational techniques.
Affiliation(s)
- W M C Foulkes
  - Department of Physics, Imperial College London, London, UK
- Karl J Friston
  - Wellcome Centre for Human Neuroimaging, University College London, London, UK
- Rosalyn J Moran
  - Department of Neuroimaging, King's College London, London, UK
- Robert Leech
  - Department of Neuroimaging, King's College London, London, UK
13. Symmetric and Asymmetric Diffusions through Age-Varying Mixed-Species Stand Parameters. Symmetry (Basel) 2021. DOI: 10.3390/sym13081457.
Abstract
(1) Background: This paper deals with unevenly aged, whole-stand models from mixed-effect parameters diffusion processes and Voronoi diagram points of view and concentrates on the mixed-species stands in Lithuania. We focus on the Voronoi diagram of potentially available areas to tree positions as the measure of the competition effect of individual trees and the tree diameter at breast height to relate their evolution through time. (2) Methods: We consider a bivariate hybrid mixed-effect parameters stochastic differential equation for the parameterization of the diameter and available polygon area at age to ensure a proper description of the link between them during the age (time) span of a forest stand. In this study, the Voronoi diagram was used as a mathematical tool for the quantitative characterization of inter-tree competition. (3) Results: The newly derived model considers bivariate correlated observations, tree diameter, and polygon area arising from a particular stand and enables defining equations for calculating diameter, polygon-area, and stand-density predictions and forecasts. (4) Conclusions: From a statistical point of view, the newly developed models produced acceptable statistical measures of predictions and forecasts. All the results were implemented in the Maple computer algebra system.
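The "potentially available area" competition measure in (2) can be sketched without specialized geometry libraries: a Monte Carlo estimate of Voronoi cell areas by nearest-stem assignment. Tree positions and plot size below are made up for illustration; the paper computes exact Voronoi tessellations from measured stem maps.

```python
import numpy as np

rng = np.random.default_rng(4)
side, n_trees = 50.0, 30
trees = rng.uniform(0, side, size=(n_trees, 2))   # stem positions, 50 m x 50 m plot

# Monte Carlo estimate of each tree's Voronoi ("potentially available") area:
# scatter sample points over the plot and assign each to its nearest stem.
pts = rng.uniform(0, side, size=(200_000, 2))
d2 = ((pts[:, None, :] - trees[None, :, :]) ** 2).sum(axis=2)
owner = d2.argmin(axis=1)
areas = np.bincount(owner, minlength=n_trees) / len(pts) * side**2

print(f"mean cell area = {areas.mean():.1f} m^2 "
      f"(plot area / n_trees = {side**2 / n_trees:.1f} m^2)")
```

Each tree's estimated cell area could then be paired with its diameter at breast height as the bivariate observation that the stochastic differential equation model parameterizes.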
14. Kieninger S, Keller BG. Path probability ratios for Langevin dynamics-Exact and approximate. J Chem Phys 2021; 154:094102. PMID: 33685138. DOI: 10.1063/5.0038408.
Abstract
Path reweighting is a principally exact method to estimate dynamic properties from biased simulations, provided that the path probability ratio matches the stochastic integrator used in the simulation. Previously reported path probability ratios match the Euler-Maruyama scheme for overdamped Langevin dynamics. Since molecular dynamics simulations use Langevin dynamics rather than overdamped Langevin dynamics, this severely impedes the application of path reweighting methods. Here, we derive the path probability ratio M_L for Langevin dynamics propagated by a variant of the Langevin Leapfrog integrator. This new path probability ratio allows for exact reweighting of Langevin dynamics propagated by this integrator. We also show that a previously derived approximate path probability ratio M_approx differs from the exact M_L only by O(ξ^4 Δt^4) and thus yields highly accurate dynamic reweighting results. (Δt is the integration time step, and ξ is the collision rate.) The results are tested, and the efficiency of path reweighting is explored using butane as an example.
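The flavor of path reweighting can be sketched for the simpler overdamped Euler-Maruyama case the abstract mentions (not the Langevin-leapfrog ratio M_L derived in the paper). A useful consistency check on a correctly derived ratio is its normalization over simulated paths, E_sim[M] = 1. Potentials and parameters below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(5)
dt, D, n_steps, n_paths = 1e-3, 1.0, 100, 5000

force_sim = lambda x: -x - 0.5   # biased simulation force (constant bias added)
force_tgt = lambda x: -x         # target (unbiased) force

x = np.zeros(n_paths)
logM = np.zeros(n_paths)
for _ in range(n_steps):
    dx = force_sim(x) * dt + np.sqrt(2 * D * dt) * rng.normal(size=n_paths)
    # Euler-Maruyama log path-probability ratio, accumulated along each path:
    # ln M = sum [ -(dx - f_tgt dt)^2 + (dx - f_sim dt)^2 ] / (4 D dt).
    logM += (-(dx - force_tgt(x) * dt) ** 2
             + (dx - force_sim(x) * dt) ** 2) / (4 * D * dt)
    x += dx

M = np.exp(logM)
# Normalization check: for paths generated by the simulation dynamics,
# the reweighting factors should average to one.
print(f"E_sim[M] = {M.mean():.3f}")
```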
Affiliation(s)
- S Kieninger
  - Department of Biology, Chemistry, Pharmacy, Freie Universität Berlin, Arnimallee 22, D-14195 Berlin, Germany
- B G Keller
  - Department of Biology, Chemistry, Pharmacy, Freie Universität Berlin, Arnimallee 22, D-14195 Berlin, Germany
15. Stapmanns J, Kühn T, Dahmen D, Luu T, Honerkamp C, Helias M. Self-consistent formulations for stochastic nonlinear neuronal dynamics. Phys Rev E 2020; 101:042124. PMID: 32422832. DOI: 10.1103/physreve.101.042124.
Abstract
Neural dynamics is often investigated with tools from bifurcation theory. However, many neuron models are stochastic, mimicking fluctuations in the input from unknown parts of the brain or the spiking nature of signals. Noise changes the dynamics with respect to the deterministic model; in particular, classical bifurcation theory cannot be applied. We formulate the stochastic neuron dynamics in the Martin-Siggia-Rose de Dominicis-Janssen (MSRDJ) formalism and present the fluctuation expansion of the effective action and the functional renormalization group (fRG) as two systematic ways to incorporate corrections to the mean dynamics and time-dependent statistics due to fluctuations in the presence of nonlinear neuronal gain. To formulate self-consistency equations, we derive a fundamental link between the effective action in the Onsager-Machlup (OM) formalism, which allows the study of phase transitions, and the MSRDJ effective action, which is computationally advantageous. These results in particular allow the derivation of an OM effective action for systems with non-Gaussian noise. This approach naturally leads to effective deterministic equations for the first moment of the stochastic system; they explain how nonlinearities and noise cooperate to produce memory effects. Moreover, the MSRDJ formulation yields an effective linear system that has identical power spectra and linear response. Starting from the better known loopwise approximation, we then discuss the use of the fRG as a method to obtain self-consistency beyond the mean. We present a new efficient truncation scheme for the hierarchy of flow equations for the vertex functions by adapting the Blaizot, Méndez, and Wschebor approximation from the derivative expansion to the vertex expansion. The methods are presented by means of the simplest possible example of a stochastic differential equation that has generic features of neuronal dynamics.
Affiliation(s)
- Jonas Stapmanns, Institute of Neuroscience and Medicine (INM-6) and Institute for Advanced Simulation (IAS-6) and JARA BRAIN Institute I, Jülich Research Centre, Jülich, Germany; Institute for Theoretical Solid State Physics, RWTH Aachen University, 52074 Aachen, Germany
- Tobias Kühn, Institute of Neuroscience and Medicine (INM-6) and Institute for Advanced Simulation (IAS-6) and JARA BRAIN Institute I, Jülich Research Centre, Jülich, Germany; Institute for Theoretical Solid State Physics, RWTH Aachen University, 52074 Aachen, Germany
- David Dahmen, Institute of Neuroscience and Medicine (INM-6) and Institute for Advanced Simulation (IAS-6) and JARA BRAIN Institute I, Jülich Research Centre, Jülich, Germany
- Thomas Luu, Institut für Kernphysik (IKP-3), Institute for Advanced Simulation (IAS-4) and Jülich Center for Hadron Physics, Jülich Research Centre, Jülich, Germany
- Carsten Honerkamp, Institute for Theoretical Solid State Physics, RWTH Aachen University, 52074 Aachen, Germany; JARA-FIT, Jülich Aachen Research Alliance-Fundamentals of Future Information Technology, Germany
- Moritz Helias, Institute of Neuroscience and Medicine (INM-6) and Institute for Advanced Simulation (IAS-6) and JARA BRAIN Institute I, Jülich Research Centre, Jülich, Germany; Institute for Theoretical Solid State Physics, RWTH Aachen University, 52074 Aachen, Germany
16.
Abstract
The Wilson-Cowan equations represent a landmark in the history of computational neuroscience. Along with the insights Wilson and Cowan offered for neuroscience, they crystallized an approach to modeling neural dynamics and brain function. Although their iconic equations are used in various guises today, the ideas that led to their formulation and the relationship to other approaches are not well known. Here, we give a little context to some of the biological and theoretical concepts that led to the Wilson-Cowan equations and discuss how to extend beyond them.
Affiliation(s)
- Carson C Chow, Laboratory of Biological Modeling, National Institute of Diabetes and Digestive and Kidney Diseases, National Institutes of Health, Bethesda, Maryland
- Yahya Karimipanah, Laboratory of Biological Modeling, National Institute of Diabetes and Digestive and Kidney Diseases, National Institutes of Health, Bethesda, Maryland
17. Vastola JJ, Holmes WR. Chemical Langevin equation: A path-integral view of Gillespie's derivation. Phys Rev E 2020; 101:032417. [PMID: 32289899] [DOI: 10.1103/physreve.101.032417]
Abstract
In 2000, Gillespie rehabilitated the chemical Langevin equation (CLE) by describing two conditions that must be satisfied for it to yield a valid approximation of the chemical master equation (CME). In this work, we construct an original path-integral description of the CME and show how applying Gillespie's two conditions to it directly leads to a path-integral equivalent to the CLE. We compare this approach to the path-integral equivalent of a large system size derivation and show that they are qualitatively different. In particular, both approaches involve converting many sums into many integrals, and the difference between the two methods is essentially the difference between using the Euler-Maclaurin formula and using Riemann sums. Our results shed light on how path integrals can be used to conceptualize coarse-graining biochemical systems and are readily generalizable.
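The chemical Langevin equation discussed above can be sketched for the simplest reaction network. The birth-death system, rate constants, and integration parameters below are hypothetical choices for illustration, not taken from the paper.

```python
import numpy as np

# Illustrative sketch (not from the paper): the chemical Langevin equation
# (CLE) for a birth-death process with birth rate b and per-capita death
# rate d. Propensities: a1(x) = b (birth), a2(x) = d*x (death), giving
#   dx = (b - d*x) dt + sqrt(b) dW1 - sqrt(d*x) dW2.
# The chemical master equation has a Poisson stationary law with mean b/d.

rng = np.random.default_rng(1)
b, d = 50.0, 0.5                 # hypothetical rate constants
dt, n_steps, n_traj = 1e-3, 5000, 200

x = np.full(n_traj, b / d)       # start at the deterministic fixed point
for _ in range(n_steps):
    birth = np.sqrt(b * dt) * rng.normal(size=n_traj)
    death = np.sqrt(np.maximum(d * x, 0.0) * dt) * rng.normal(size=n_traj)
    x = x + (b - d * x) * dt + birth - death
    x = np.maximum(x, 0.0)       # keep copy numbers nonnegative

mean_cle = x.mean()              # should sit near the CME mean b/d = 100
```

The separate noise terms per reaction channel follow Gillespie's form of the CLE; the nonnegativity clamp is a pragmatic regularization, one of the known practical caveats of the CLE at low copy numbers.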
Affiliation(s)
- John J Vastola, Department of Physics and Astronomy, Vanderbilt University, Nashville, Tennessee, USA; Quantitative Systems Biology Center, Vanderbilt University, Nashville, Tennessee 37235, USA
- William R Holmes, Department of Physics and Astronomy, Vanderbilt University, Nashville, Tennessee, USA; Quantitative Systems Biology Center, Vanderbilt University, Nashville, Tennessee 37235, USA; Department of Mathematics, Vanderbilt University, Nashville, Tennessee 37235, USA
18.
Abstract
Parallel recordings of motor cortex show weak pairwise correlations on average but a wide dispersion across cells. This observation runs counter to the prevailing notion that optimal information processing requires networks to operate at a critical point, entailing strong correlations. We here reconcile this apparent contradiction by showing that the observed structure of correlations is consistent with network models that operate close to a critical point of a different nature than previously considered: dynamics that is dominated by inhibition yet nearly unstable due to heterogeneous connectivity. Our findings provide a different perspective on criticality in neural systems: network topology and heterogeneity endow the brain with two complementary substrates for critical dynamics of largely different complexities.

Cortical networks that have been found to operate close to a critical point exhibit joint activations of large numbers of neurons. However, in motor cortex of the awake macaque monkey, we observe very different dynamics: massively parallel recordings of 155 single-neuron spiking activities show weak fluctuations on the population level. This a priori suggests that motor cortex operates in a noncritical regime, which in models, has been found to be suboptimal for computational performance. However, here, we show the opposite: The large dispersion of correlations across neurons is the signature of a second critical regime. This regime exhibits a rich dynamical repertoire hidden from macroscopic brain signals but essential for high performance in such concepts as reservoir computing. An analytical link between the eigenvalue spectrum of the dynamics, the heterogeneity of connectivity, and the dispersion of correlations allows us to assess the closeness to the critical point.
19. Martí D, Brunel N, Ostojic S. Correlations between synapses in pairs of neurons slow down dynamics in randomly connected neural networks. Phys Rev E 2018; 97:062314. [PMID: 30011528] [DOI: 10.1103/physreve.97.062314]
Abstract
Networks of randomly connected neurons are among the most popular models in theoretical neuroscience. The connectivity between neurons in the cortex is, however, not fully random; the simplest and most prominent deviation from randomness found in experimental data is the overrepresentation of bidirectional connections among pyramidal cells. Using numerical and analytical methods, we investigate the effects of partially symmetric connectivity on the dynamics in networks of rate units. We consider the two dynamical regimes exhibited by random neural networks: the weak-coupling regime, where the firing activity decays to a single fixed point unless the network is stimulated, and the strong-coupling or chaotic regime, characterized by internally generated fluctuating firing rates. In the weak-coupling regime, we compute analytically, for an arbitrary degree of symmetry, the autocorrelation of network activity in the presence of external noise. In the chaotic regime, we perform simulations to determine the timescale of the intrinsic fluctuations. In both cases, symmetry increases the characteristic asymptotic decay time of the autocorrelation function and therefore slows down the dynamics in the network.
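The partially symmetric connectivity the abstract studies can be constructed by interpolating between symmetric and antisymmetric Gaussian matrices. The sketch below is a generic construction, not the paper's exact parametrization (their normalization may differ); the size, gain, and symmetry value are hypothetical.

```python
import numpy as np

# Sketch of partially symmetric random connectivity: mix a symmetric and an
# antisymmetric Gaussian matrix so that corr(J_ij, J_ji) = eta, with eta in
# [-1, 1]. eta = 0 gives fully asymmetric coupling, eta = 1 fully symmetric.

rng = np.random.default_rng(2)
N, g, eta = 400, 1.0, 0.4        # hypothetical size, gain, symmetry degree

A, B = rng.normal(size=(N, N)), rng.normal(size=(N, N))
S = (A + A.T) / np.sqrt(2.0)     # symmetric part, unit variance off-diagonal
T = (B - B.T) / np.sqrt(2.0)     # antisymmetric part, unit variance off-diagonal
J = g / np.sqrt(N) * (np.sqrt((1 + eta) / 2) * S + np.sqrt((1 - eta) / 2) * T)

# Empirical symmetry: correlation of J_ij with J_ji over off-diagonal pairs.
iu = np.triu_indices(N, k=1)
sym = np.corrcoef(J[iu], J.T[iu])[0, 1]   # should be close to eta
```

With this mixing, cov(J_ij, J_ji) = (1 + eta)/2 - (1 - eta)/2 = eta while each entry keeps unit variance (times g²/N), so the measured reciprocal correlation recovers eta.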
Affiliation(s)
- Daniel Martí, Laboratoire de Neurosciences Cognitives, Inserm UMR No. 960, Ecole Normale Supérieure, PSL Research University, 75230 Paris, France
- Nicolas Brunel, Department of Statistics and Department of Neurobiology, University of Chicago, Chicago, Illinois 60637, USA; Department of Neurobiology and Department of Physics, Duke University, Durham, North Carolina 27710, USA
- Srdjan Ostojic, Laboratoire de Neurosciences Cognitives, Inserm UMR No. 960, Ecole Normale Supérieure, PSL Research University, 75230 Paris, France
20. Prüstel T, Meier-Schellersheim M. Unified path integral approach to theories of diffusion-influenced reactions. Phys Rev E 2017; 96:022151. [PMID: 28950598] [DOI: 10.1103/physreve.96.022151]
Abstract
Building on mathematical similarities between quantum mechanics and theories of diffusion-influenced reactions, we develop a general approach for computational modeling of diffusion-influenced reactions that is capable of capturing not only the classical Smoluchowski picture but also alternative theories, as is here exemplified by a volume reactivity model. In particular, we prove the path decomposition expansion of various Green's functions describing the irreversible and reversible reaction of an isolated pair of molecules. To this end, we exploit a connection between boundary value and interaction potential problems with δ- and δ′-function perturbation. We employ a known path-integral-based summation of a perturbation series to derive a number of exact identities relating propagators and survival probabilities satisfying different boundary conditions in a unified and systematic manner. Furthermore, we show how the path decomposition expansion represents the propagator as a product of three factors in the Laplace domain that correspond to quantities figuring prominently in stochastic spatially resolved simulation algorithms. This analysis will thus be useful for the interpretation of current algorithms and for the design of future ones. Finally, we discuss the relation between the general approach and the theory of Brownian functionals and calculate the mean residence time for the case of irreversible and reversible reactions.
Affiliation(s)
- Thorsten Prüstel, Computational Biology Section, Laboratory of Systems Biology, National Institute of Allergy and Infectious Diseases, National Institutes of Health, Bethesda, Maryland 20892, USA
- Martin Meier-Schellersheim, Computational Biology Section, Laboratory of Systems Biology, National Institute of Allergy and Infectious Diseases, National Institutes of Health, Bethesda, Maryland 20892, USA
21.
Abstract
Recent experimental advances are producing an avalanche of data on both neural connectivity and neural activity. To take full advantage of these two emerging datasets we need a framework that links them, revealing how collective neural activity arises from the structure of neural connectivity and intrinsic neural dynamics. This problem of structure-driven activity has drawn major interest in computational neuroscience. Existing methods for relating activity and architecture in spiking networks rely on linearizing activity around a central operating point and thus fail to capture the nonlinear responses of individual neurons that are the hallmark of neural information processing. Here, we overcome this limitation and present a new relationship between connectivity and activity in networks of nonlinear spiking neurons by developing a diagrammatic fluctuation expansion based on statistical field theory. We explicitly show how recurrent network structure produces pairwise and higher-order correlated activity, and how nonlinearities impact the networks' spiking activity. Our findings open new avenues to investigating how single-neuron nonlinearities—including those of different cell types—combine with connectivity to shape population activity and function.

Neuronal networks, like many biological systems, exhibit variable activity. This activity is shaped by both the underlying biology of the component neurons and the structure of their interactions. How can we combine knowledge of these two things—that is, models of individual neurons and of their interactions—to predict the statistics of single- and multi-neuron activity? Current approaches rely on linearizing neural activity around a stationary state. In the face of neural nonlinearities, however, these linear methods can fail to predict spiking statistics and even fail to correctly predict whether activity is stable or pathological. Here, we show how to calculate any spike train cumulant in a broad class of models, while systematically accounting for nonlinear effects. We then study a fundamental effect of nonlinear input-rate transfer, namely coupling between different orders of spiking statistics, and how this depends on single-neuron and network properties.
Affiliation(s)
- Gabriel Koch Ocker, Allen Institute for Brain Science, Seattle, Washington, United States of America
- Krešimir Josić, Department of Mathematics and Department of Biology and Biochemistry, University of Houston, Houston, Texas, United States of America; Department of BioSciences, Rice University, Houston, Texas, United States of America
- Eric Shea-Brown, Allen Institute for Brain Science, Seattle, Washington, United States of America; Department of Applied Mathematics, University of Washington, Seattle, Washington, United States of America; Department of Physiology and Biophysics, and UW Institute of Neuroengineering, University of Washington, Seattle, Washington, United States of America
- Michael A. Buice, Allen Institute for Brain Science, Seattle, Washington, United States of America; Department of Applied Mathematics, University of Washington, Seattle, Washington, United States of America
22. Bressloff PC, Ermentrout B, Faugeras O, Thomas PJ. Stochastic Network Models in Neuroscience: A Festschrift for Jack Cowan. Introduction to the Special Issue. J Math Neurosci 2016; 6:4. [PMID: 27043152] [PMCID: PMC4820414] [DOI: 10.1186/s13408-016-0036-y]
Abstract
Jack Cowan's remarkable career has spanned, and molded, the development of neuroscience as a quantitative and mathematical discipline combining deep theoretical contributions, rigorous mathematical work and groundbreaking biological insights. The Banff International Research Station hosted a workshop in his honor, on Stochastic Network Models of Neocortex, July 17-24, 2014. This accompanying Festschrift celebrates Cowan's contributions by assembling current research in stochastic phenomena in neural networks. It combines historical perspectives with new results including applications to epilepsy, path-integral methods, stochastic synchronization, higher-order correlation analysis, and pattern formation in visual cortex.
Affiliation(s)
- Paul C. Bressloff, Department of Mathematics, University of Utah, 155 South 1400 East, Salt Lake City, UT 84112 USA
- Bard Ermentrout, Department of Mathematics, University of Pittsburgh, Pittsburgh, PA 15260 USA
- Olivier Faugeras, INRIA and LJAD, University of Nice-Sophia-Antipolis, Nice, France
- Peter J. Thomas, Department of Mathematics, Applied Mathematics and Statistics, Case Western Reserve University, 10900 Euclid Avenue, Cleveland, OH 44106-7058 USA
23. Brinkman BAW, LeBlanc MP, Uhl JT, Ben-Zion Y, Dahmen KA. Probabilistic model of waiting times between large failures in sheared media. Phys Rev E 2016; 93:013003. [PMID: 26871148] [DOI: 10.1103/physreve.93.013003]
Abstract
Using a probabilistic approximation of a mean-field mechanistic model of sheared systems, we analytically calculate the statistical properties of large failures under slow shear loading. For general shear F(t), the distribution of waiting times between large system-spanning failures is a generalized exponential distribution, ρ_T(t) = λ(F(t)) P(F(t)) exp[-∫_0^t dτ λ(F(τ)) P(F(τ))], where λ(F(t)) is the rate of small event occurrences at stress F(t) and P(F(t)) is the probability that a small event triggers a large failure. We study the behavior of this distribution as a function of fault properties, such as heterogeneity or shear rate. Because the probabilistic model accommodates any stress loading F(t), it is particularly useful for modeling experiments designed to understand how different forms of shear loading or stress perturbations impact the waiting-time statistics of large failures. As examples, we study how periodic perturbations or fluctuations on top of a linear shear stress increase impact the waiting-time distribution.
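The quoted waiting-time density is an inhomogeneous-exponential (hazard-rate) form, which can be checked numerically. The loading F(t), rate λ(F), and triggering probability P(F) below are hypothetical functional forms chosen only to illustrate the formula, not the paper's fitted models.

```python
import numpy as np

# Numerical check of rho_T(t) = r(t) exp(-int_0^t r(tau) dtau), with
# r(t) = lambda(F(t)) * P(F(t)). When the cumulative rate diverges, the
# density integrates to one, as any waiting-time distribution must.

t = np.linspace(0.0, 200.0, 200001)
F = 0.1 * t                          # slow linear shear loading (assumed form)
lam = 1.0 + 0.5 * F                  # small-event rate at stress F (assumed)
P = 1.0 - np.exp(-0.2 * F)           # triggering probability at F (assumed)
r = lam * P                          # hazard rate of large failures

# Cumulative hazard by trapezoidal integration, then the density itself.
seg = (r[1:] + r[:-1]) / 2.0 * np.diff(t)
cum = np.concatenate(([0.0], np.cumsum(seg)))
rho = r * np.exp(-cum)

norm = np.sum((rho[1:] + rho[:-1]) / 2.0 * np.diff(t))  # should be close to 1
```

Because the cumulative hazard here grows without bound, the survival factor exp(-∫r) decays to zero and the trapezoidal normalization lands on one to within quadrature error.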
Affiliation(s)
- Braden A W Brinkman, Department of Physics, University of Illinois at Urbana-Champaign, Illinois 61801, USA
- Michael P LeBlanc, Department of Physics, University of Illinois at Urbana-Champaign, Illinois 61801, USA
- Jonathan T Uhl, Department of Physics, University of Illinois at Urbana-Champaign, Illinois 61801, USA; Department of Earth Sciences, University of Southern California, Los Angeles, California 90089-0740, USA
- Yehuda Ben-Zion, Department of Earth Sciences, University of Southern California, Los Angeles, California 90089-0740, USA
- Karin A Dahmen, Department of Physics, University of Illinois at Urbana-Champaign, Illinois 61801, USA