1
Pazó D. Discontinuous transition to chaos in a canonical random neural network. Phys Rev E 2024; 110:014201. PMID: 39161016. DOI: 10.1103/physreve.110.014201. Received 01/17/2024; accepted 06/11/2024.
Abstract
We study a paradigmatic random recurrent neural network introduced by Sompolinsky, Crisanti, and Sommers (SCS). In the infinite-size limit, this system exhibits a direct transition from a homogeneous rest state to chaotic behavior, with the Lyapunov exponent gradually increasing from zero. We generalize the SCS model by considering odd saturating nonlinear transfer functions, beyond the usual choice ϕ(x)=tanh x. A discontinuous transition to chaos occurs whenever the slope of ϕ at 0 is a local minimum [i.e., for ϕ'''(0)>0]. Chaos then appears abruptly, through an attractor-repeller fold, so the Lyapunov exponent remains bounded away from zero at the birth of chaos.
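The SCS dynamics described above can be sketched numerically. The following is a minimal illustration, not the paper's implementation: the network size, gain, Euler scheme, and the choice ϕ = tanh are all assumptions made here for demonstration (the paper's point concerns alternative odd saturating ϕ with ϕ'''(0) > 0).

```python
import numpy as np

def simulate_scs(N=200, g=2.0, dt=0.05, steps=2000, phi=np.tanh, seed=0):
    """Euler-integrate the SCS rate equations
        dx_i/dt = -x_i + g * sum_j J_ij * phi(x_j),
    with i.i.d. couplings J_ij ~ N(0, 1/N)."""
    rng = np.random.default_rng(seed)
    J = rng.normal(0.0, 1.0 / np.sqrt(N), size=(N, N))
    x = rng.normal(0.0, 0.1, size=N)
    traj = np.empty((steps, N))
    for t in range(steps):
        x = x + dt * (-x + g * J @ phi(x))
        traj[t] = x
    return traj

traj = simulate_scs(g=2.0)   # above the g = 1 instability: sustained activity
rest = simulate_scs(g=0.5)   # below it: decay to the homogeneous rest state
print(np.abs(traj[-1]).mean(), np.abs(rest[-1]).mean())
```

Swapping `phi` for another odd saturating function is how one would explore the discontinuous transition reported in the paper.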
2
Papo D, Buldú JM. Does the brain behave like a (complex) network? I. Dynamics. Phys Life Rev 2024; 48:47-98. PMID: 38145591. DOI: 10.1016/j.plrev.2023.12.006. Received 12/08/2023; accepted 12/10/2023.
Abstract
Graph theory is now becoming a standard tool in system-level neuroscience. However, endowing observed brain anatomy and dynamics with a complex network structure does not entail that the brain actually works as a network. Asking whether the brain behaves as a network means asking whether network properties count. From the viewpoint of neurophysiology and, possibly, of brain physics, the most substantial issues a network structure may be instrumental in addressing relate to the influence of network properties on brain dynamics and to whether these properties ultimately explain some aspects of brain function. Here, we address the dynamical implications of complex network structure, examining which aspects and scales of brain activity may be understood to genuinely behave as a network. To do so, we first define the meaning of networkness and analyse some of its implications. We then examine ways in which brain anatomy and dynamics can be endowed with a network structure and discuss possible ways in which network structure may be shown to represent a genuine organisational principle of brain activity, rather than just a convenient description of its anatomy and dynamics.
Affiliation(s)
- D Papo
- Department of Neuroscience and Rehabilitation, Section of Physiology, University of Ferrara, Ferrara, Italy; Center for Translational Neurophysiology, Fondazione Istituto Italiano di Tecnologia, Ferrara, Italy.
- J M Buldú
- Complex Systems Group & G.I.S.C., Universidad Rey Juan Carlos, Madrid, Spain
3
Hutt A, Trotter D, Pariz A, Valiante TA, Lefebvre J. Diversity-induced trivialization and resilience of neural dynamics. Chaos 2024; 34:013147. PMID: 38285722. DOI: 10.1063/5.0165773. Received 06/30/2023; accepted 01/01/2024.
Abstract
Heterogeneity is omnipresent across all living systems. Diversity enriches the dynamical repertoire of these systems but remains challenging to reconcile with their manifest robustness and dynamical persistence over time, a fundamental feature called resilience. To better understand the mechanism underlying resilience in neural circuits, we considered a nonlinear network model and extracted the relationship between excitability heterogeneity and resilience. To measure resilience, we quantified the number of stationary states of this network and how they are affected by various control parameters. We analyzed, both analytically and numerically, gradient and nongradient systems modeled as nonlinear sparse neural networks evolving over long timescales. Our analysis shows that neuronal heterogeneity quenches the number of stationary states while decreasing the susceptibility to bifurcations: a phenomenon known as trivialization. Heterogeneity was found to implement a homeostatic control mechanism enhancing network resilience to changes in network size and connection probability by quenching the system's dynamic volatility.
Affiliation(s)
- Axel Hutt
- MLMS, MIMESIS, Université de Strasbourg, CNRS, Inria, ICube, 67000 Strasbourg, France
- Daniel Trotter
- Department of Physics, University of Ottawa, Ottawa, Ontario K1N 6N5, Canada
- Krembil Brain Institute, University Health Network, Toronto, Ontario M5T 0S8, Canada
- Aref Pariz
- Krembil Brain Institute, University Health Network, Toronto, Ontario M5T 0S8, Canada
- Department of Biology, University of Ottawa, Ottawa, Ontario K1N 6N5, Canada
- Taufik A Valiante
- Krembil Brain Institute, University Health Network, Toronto, Ontario M5T 0S8, Canada
- Department of Electrical and Computer Engineering, Institute of Medical Science, Institute of Biomedical Engineering, Division of Neurosurgery, Department of Surgery, CRANIA (Center for Advancing Neurotechnological Innovation to Application), Max Planck-University of Toronto Center for Neural Science and Technology, University of Toronto, Toronto, Ontario M5S 3G8, Canada
- Jérémie Lefebvre
- Department of Physics, University of Ottawa, Ottawa, Ontario K1N 6N5, Canada
- Krembil Brain Institute, University Health Network, Toronto, Ontario M5T 0S8, Canada
- Department of Biology, University of Ottawa, Ottawa, Ontario K1N 6N5, Canada
- Department of Mathematics, University of Toronto, Toronto, Ontario M5S 2E4, Canada
4
Ros V, Roy F, Biroli G, Bunin G, Turner AM. Generalized Lotka-Volterra Equations with Random, Nonreciprocal Interactions: The Typical Number of Equilibria. Phys Rev Lett 2023; 130:257401. PMID: 37418712. DOI: 10.1103/physrevlett.130.257401. Received 01/19/2023; accepted 05/31/2023.
Abstract
We compute the typical number of equilibria of the generalized Lotka-Volterra equations describing species-rich ecosystems with random, nonreciprocal interactions using the replicated Kac-Rice method. We characterize the multiple-equilibria phase by determining the average abundance and similarity between equilibria as a function of their diversity (i.e., of the number of coexisting species) and of the variability of the interactions. We show that linearly unstable equilibria are dominant, and that the typical number of equilibria differs from the average number.
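A toy integration of generalized Lotka-Volterra dynamics with random, nonreciprocal (a_ij independent of a_ji) interactions gives a feel for the model class. All parameter values and the Gaussian scaling below are illustrative assumptions, not the paper's setup:

```python
import numpy as np

def glv_equilibrium(N=50, mu=-0.5, sigma=0.3, dt=0.01, steps=20000, seed=1):
    """Euler-integrate dN_i/dt = N_i * (1 - N_i + sum_j a_ij N_j) with i.i.d.
    Gaussian interactions a_ij (nonreciprocal: no symmetry between ij and ji)."""
    rng = np.random.default_rng(seed)
    a = rng.normal(mu / N, sigma / np.sqrt(N), size=(N, N))
    np.fill_diagonal(a, 0.0)
    n = rng.uniform(0.5, 1.5, size=N)
    for _ in range(steps):
        n = n + dt * n * (1.0 - n + a @ n)
        n = np.clip(n, 0.0, None)   # abundances must stay non-negative
    return n

n_star = glv_equilibrium()
print("surviving species:", int((n_star > 1e-6).sum()), "of", n_star.size)
```

At larger `sigma` such runs stop settling onto a single equilibrium, which is the multiple-equilibria phase the paper counts analytically.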
Affiliation(s)
- Valentina Ros
- Université Paris-Saclay, CNRS, LPTMS, 91405 Orsay, France
- Felix Roy
- Laboratoire de Physique de l'Ecole Normale Supérieure, ENS, Université PSL, CNRS, Sorbonne Université, Université de Paris, F-75005 Paris, France
- Giulio Biroli
- Laboratoire de Physique de l'Ecole Normale Supérieure, ENS, Université PSL, CNRS, Sorbonne Université, Université de Paris, F-75005 Paris, France
- Guy Bunin
- Department of Physics, Technion-Israel Institute of Technology, Haifa 32000, Israel
- Ari M Turner
- Department of Physics, Technion-Israel Institute of Technology, Haifa 32000, Israel
5
Evaluating the statistical similarity of neural network activity and connectivity via eigenvector angles. Biosystems 2023; 223:104813. PMID: 36460172. DOI: 10.1016/j.biosystems.2022.104813. Received 05/19/2022; revised 11/15/2022; accepted 11/15/2022.
Abstract
Neural systems are networks, and comparisons between multiple networks are a prevalent task in many research scenarios. In this study, we construct a statistical test for the comparison of matrices representing pairwise aspects of neural networks, in particular the correlation between spiking activity and connectivity. The "eigenangle test" quantifies the similarity of two matrices by the angles between their ranked eigenvectors. We calibrate the behavior of the test for use with correlation matrices using stochastic models of correlated spiking activity and demonstrate how it compares to classical two-sample tests, such as the Kolmogorov-Smirnov distance, in that it also evaluates structural aspects of pairwise measures. Furthermore, the principle of the eigenangle test can be applied to compare the similarity of adjacency matrices of certain types of networks. Thus, the approach can be used to quantitatively explore the relationship between connectivity and activity with the same metric. By applying the eigenangle test to the comparison of connectivity matrices and correlation matrices of a random balanced network model before and after a specific synaptic rewiring intervention, we gauge the influence of connectivity features on the correlated activity. Potential applications of the eigenangle test include simulation experiments, model validation, and data analysis.
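A simplified reading of the eigenangle idea, comparing two symmetric matrices by the angles between eigenvectors paired by eigenvalue rank, can be written in a few lines. This sketch omits the calibration and null distribution that make the published version a proper statistical test:

```python
import numpy as np

def eigenangles(A, B):
    """Angles between rank-matched eigenvectors of two symmetric matrices."""
    _, va = np.linalg.eigh(A)            # eigenvalues in ascending order
    _, vb = np.linalg.eigh(B)
    # pair i-th largest with i-th largest; eigenvector sign is arbitrary
    cos = np.abs(np.sum(va[:, ::-1] * vb[:, ::-1], axis=0))
    return np.arccos(np.clip(cos, 0.0, 1.0))

rng = np.random.default_rng(0)
X = rng.normal(size=(5, 200))            # 5 channels, 200 samples
C1 = np.corrcoef(X)
C2 = np.corrcoef(X + 0.1 * rng.normal(size=X.shape))
ang_same = eigenangles(C1, C1)
ang_pert = eigenangles(C1, C2)
print(ang_same, ang_pert)
```

Identical matrices give all-zero angles; a small perturbation gives small but nonzero ones.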
6
O'Byrne J, Jerbi K. How critical is brain criticality? Trends Neurosci 2022; 45:820-837. PMID: 36096888. DOI: 10.1016/j.tins.2022.08.007. Received 06/01/2022; revised 07/27/2022; accepted 08/10/2022.
Abstract
Criticality is the singular state of complex systems poised at the brink of a phase transition between order and randomness. Such systems display remarkable information-processing capabilities, evoking the compelling hypothesis that the brain may itself be critical. This foundational idea is now drawing renewed interest thanks to high-density data and converging cross-disciplinary knowledge. Together, these lines of inquiry have shed light on the intimate link between criticality, computation, and cognition. Here, we review these emerging trends in criticality neuroscience, highlighting new data pertaining to the edge of chaos and near-criticality, and making a case for the distance to criticality as a useful metric for probing cognitive states and mental illness. This unfolding progress in the field contributes to establishing criticality theory as a powerful mechanistic framework for studying emergent function and its efficiency in both biological and artificial neural networks.
Affiliation(s)
- Jordan O'Byrne
- Cognitive and Computational Neuroscience Lab, Psychology Department, University of Montreal, Montreal, Quebec, Canada
- Karim Jerbi
- Cognitive and Computational Neuroscience Lab, Psychology Department, University of Montreal, Montreal, Quebec, Canada; MILA (Quebec Artificial Intelligence Institute), Montreal, Quebec, Canada; UNIQUE Center (Quebec Neuro-AI Research Center), Montreal, Quebec, Canada
7
Wardak A, Gong P. Extended Anderson Criticality in Heavy-Tailed Neural Networks. Phys Rev Lett 2022; 129:048103. PMID: 35939004. DOI: 10.1103/physrevlett.129.048103. Received 12/15/2021; revised 05/08/2022; accepted 07/05/2022.
Abstract
We investigate the emergence of complex dynamics in networks with heavy-tailed connectivity by developing a non-Hermitian random matrix theory. We uncover the existence of an extended critical regime of spatially multifractal fluctuations between the quiescent and active phases. This multifractal critical phase combines features of localization and delocalization and differs from the edge of chaos in classical networks by the appearance of universal hallmarks of Anderson criticality over an extended region in phase space. We show that the rich nonlinear response properties of the extended critical regime can account for a variety of neural dynamics such as the diversity of timescales, providing a computational advantage for persistent classification in a reservoir setting.
Affiliation(s)
- Asem Wardak
- School of Physics, University of Sydney, New South Wales 2006, Australia
- Pulin Gong
- School of Physics, University of Sydney, New South Wales 2006, Australia
8
Peng X, Lin W. Complex Dynamics of Noise-Perturbed Excitatory-Inhibitory Neural Networks With Intra-Correlative and Inter-Independent Connections. Front Physiol 2022; 13:915511. PMID: 35812336. PMCID: PMC9263264. DOI: 10.3389/fphys.2022.915511. Received 04/08/2022; accepted 05/09/2022.
Abstract
Real neural systems usually contain two types of neurons: excitatory and inhibitory. Analytical and numerical characterization of the dynamics induced by the different types of interactions among these neurons is beneficial to understanding the physiological functions of the brain. Here, we articulate a model of noise-perturbed random neural networks containing both excitatory and inhibitory (E&I) populations. In particular, we take into account neurons that are intra-correlatively and inter-independently connected across the two populations, in contrast to most existing E&I models, which consider only independently connected neurons. Employing the typical mean-field theory, we obtain an equivalent two-dimensional system driven by a stationary Gaussian process. Investigating the stationary autocorrelation functions of the obtained system, we analytically find the parameter conditions under which synchronized behaviors between the two populations emerge. Taking the maximal Lyapunov exponent as an index, we also find different critical values of the coupling-strength coefficients for chaotic excitatory neurons and for chaotic inhibitory ones. Interestingly, we reveal that noise is able to suppress chaotic dynamics in random neural networks with two populations, while an appropriate amount of correlation in the intra-coupling strengths can enhance the occurrence of chaos. Finally, we also detect a previously reported parameter region that corresponds to neither linearly stable nor chaotic dynamics; the size of this region, however, depends crucially on the populations' parameters.
Affiliation(s)
- Xiaoxiao Peng
- Shanghai Center for Mathematical Sciences, School of Mathematical Sciences, and LMNS, Fudan University, Shanghai, China
- Research Institute of Intelligent Complex Systems and Center for Computational Systems Biology, Fudan University, Shanghai, China
- Wei Lin
- Shanghai Center for Mathematical Sciences, School of Mathematical Sciences, and LMNS, Fudan University, Shanghai, China
- Research Institute of Intelligent Complex Systems and Center for Computational Systems Biology, Fudan University, Shanghai, China
- State Key Laboratory of Medical Neurobiology, MOE Frontiers Center for Brain Science, and Institutes of Brain Science, Fudan University, Shanghai, China
9
Malekan A, Saber S, Saberi AA. Exact finite-size scaling for the random-matrix representation of bond percolation on square lattice. Chaos 2022; 32:023112. PMID: 35232057. DOI: 10.1063/5.0079323. Received 11/19/2021; accepted 01/20/2022.
Abstract
We report on the exact treatment of a random-matrix representation of a bond-percolation model on a square lattice in two dimensions with occupation probability p. The percolation problem is mapped onto a random complex matrix composed of two random real-valued matrices of elements +1 and -1 with probability p and 1-p, respectively. We find that the onset of the percolation transition can be detected by the emergence of power-law divergences due to the coalescence of the two largest extreme eigenvalues in the thermodynamic limit. We develop a universal finite-size scaling law that fully characterizes the scaling behavior of the extreme eigenvalue's fluctuations in terms of a set of universal scaling exponents and amplitudes. Using the relative entropy as an index of the disparity between the distributions of the first- and second-largest extreme eigenvalues, we show that its minimum underlies the scaling framework. Our study may provide an inroad for developing new methods and algorithms with diverse applications in machine learning, complex systems, and statistical physics.
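The construction described in the abstract is easy to reproduce at small size. The following sketch assumes the simplest reading (a complex matrix M1 + iM2 with i.i.d. ±1 entries) and just extracts the two largest eigenvalue moduli whose coalescence signals the transition; details of the paper's mapping may differ:

```python
import numpy as np

def extreme_eigs(L=40, p=0.5, seed=3):
    """Two largest eigenvalue moduli of A = M1 + i*M2, where M1 and M2 have
    i.i.d. entries equal to +1 with probability p and -1 otherwise."""
    rng = np.random.default_rng(seed)
    M1 = np.where(rng.random((L, L)) < p, 1.0, -1.0)
    M2 = np.where(rng.random((L, L)) < p, 1.0, -1.0)
    lam = np.sort(np.abs(np.linalg.eigvals(M1 + 1j * M2)))[::-1]
    return lam[0], lam[1]

for p in (0.1, 0.5, 0.9):
    l1, l2 = extreme_eigs(p=p)
    print(f"p={p}: two largest |eigenvalues| = {l1:.2f}, {l2:.2f}, gap = {l1 - l2:.2f}")
```

Averaging the gap over many realizations and sizes is what would expose the finite-size scaling the paper derives exactly.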
Affiliation(s)
- Azadeh Malekan
- Department of Physics, University of Tehran, P. O. Box 14395-547, Tehran, Iran
- Sina Saber
- Department of Physics, University of Tehran, P. O. Box 14395-547, Tehran, Iran
- Abbas Ali Saberi
- Department of Physics, University of Tehran, P. O. Box 14395-547, Tehran, Iran
10
Krishnamurthy K, Can T, Schwab DJ. Theory of Gating in Recurrent Neural Networks. Phys Rev X 2022; 12:011011. PMID: 36545030. PMCID: PMC9762509. DOI: 10.1103/physrevx.12.011011.
Abstract
Recurrent neural networks (RNNs) are powerful dynamical models widely used in machine learning (ML) and neuroscience. Prior theoretical work has focused on RNNs with additive interactions. However, gating (i.e., multiplicative interactions) is ubiquitous in real neurons and is also the central feature of the best-performing RNNs in ML. Here, we show that gating offers flexible control of two salient features of the collective dynamics: (i) timescales and (ii) dimensionality. The gate controlling timescales leads to a novel marginally stable state, in which the network functions as a flexible integrator. Unlike previous approaches, gating permits this important function without parameter fine-tuning or special symmetries. Gates also provide a flexible, context-dependent mechanism to reset the memory trace, thus complementing the memory function. The gate modulating dimensionality can induce a novel, discontinuous chaotic transition, where inputs push a stable system into strong chaotic activity, in contrast to the typically stabilizing effect of inputs. At this transition, unlike in additive RNNs, the proliferation of critical points (topological complexity) is decoupled from the appearance of chaotic dynamics (dynamical complexity). The rich dynamics are summarized in phase diagrams, providing ML practitioners with a map for principled parameter initialization.
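A toy gated rate network makes the notion of multiplicative interactions concrete. This is not the paper's model: the single sigmoidal gate, the gain, and the Euler step below are illustrative assumptions:

```python
import numpy as np

def gated_step(h, J, Jz, dt=0.1, g=2.0):
    """One Euler step of dh/dt = z(h) * (-h + g*tanh(J h)), where the update
    gate z multiplies (hence 'gates') the rate of change of the state."""
    z = 1.0 / (1.0 + np.exp(-Jz @ h))    # gate values in (0, 1)
    return h + dt * z * (-h + g * np.tanh(J @ h))

rng = np.random.default_rng(4)
N = 100
J = rng.normal(0.0, 1.0 / np.sqrt(N), (N, N))
Jz = rng.normal(0.0, 1.0 / np.sqrt(N), (N, N))
h = rng.normal(0.0, 0.5, N)
for _ in range(500):
    h = gated_step(h, J, Jz)
print("final mean |h|:", np.abs(h).mean())
```

When the gate saturates near zero the effective time constant diverges, which is the mechanism behind the marginally stable integrator state described above.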
Affiliation(s)
- Kamesh Krishnamurthy
- Joseph Henry Laboratories of Physics and PNI, Princeton University, Princeton, New Jersey 08544, USA
- Tankut Can
- Institute for Advanced Study, Princeton, New Jersey 08540, USA
- David J. Schwab
- Initiative for Theoretical Sciences, Graduate Center, CUNY, New York, New York 10016, USA
11
Abstract
We consider a nonlinear autonomous system of [Formula: see text] degrees of freedom randomly coupled by both relaxational ("gradient") and nonrelaxational ("solenoidal") random interactions. We show that with increased interaction strength, such systems generically undergo an abrupt transition from a trivial phase portrait with a single stable equilibrium into a topologically nontrivial regime of "absolute instability" where equilibria are on average exponentially abundant, but typically, all of them are unstable, unless the dynamics is purely gradient. When interactions increase even further, the stable equilibria eventually become on average exponentially abundant unless the interaction is purely solenoidal. We further calculate the mean proportion of equilibria that have a fixed fraction of unstable directions.
12
Curado EMF, Melgar NB, Nobre FD. External Stimuli on Neural Networks: Analytical and Numerical Approaches. Entropy 2021; 23:e23081034. PMID: 34441174. PMCID: PMC8393424. DOI: 10.3390/e23081034. Received 07/08/2021; revised 08/03/2021; accepted 08/05/2021.
Abstract
Based on the behavior of living beings, which react mostly to external stimuli, we introduce a neural-network model that uses external patterns as a fundamental tool for the process of recognition. In this proposal, external stimuli appear as an additional field, and basins of attraction, representing memories, arise in accordance with this new field. This is in contrast to the more common attractor neural networks, where memories are attractors inside well-defined basins of attraction. We show that this procedure considerably increases the storage capabilities of the neural network; this property is illustrated with the standard Hopfield model, for which the recognition capacity of our model may be enlarged, typically, by a factor of 10². The primary challenge consists in calibrating the influence of the external stimulus in order to attenuate the noise generated by memories that are not correlated with the external pattern. The system is analyzed primarily through numerical simulations; however, since analytical calculations are possible for the Hopfield model, the agreement between the two approaches can be tested, and matching results are indicated in some cases. We also show that the present proposal exhibits a crucial attribute of living beings: the ability to react promptly to changes in the external environment. Additionally, we illustrate that this new approach may significantly enlarge the recognition capacity of neural networks in various situations, with correlated and non-correlated memories, as well as diluted, symmetric, or asymmetric interactions (synapses). This demonstrates that it can be implemented easily on a wide diversity of models.
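The role of an external-stimulus field in a Hopfield network can be sketched as follows. The coupling strength `lam`, the pattern load, and the synchronous update rule are illustrative choices made here, not the paper's calibration:

```python
import numpy as np

rng = np.random.default_rng(5)
N, P = 100, 20                                   # load P/N = 0.2, above the
xi = rng.choice([-1, 1], size=(P, N))            # classical ~0.14N capacity
W = (xi.T @ xi) / N
np.fill_diagonal(W, 0.0)                         # Hebbian weights, no self-coupling

def recall(s, h_ext=None, lam=0.0, sweeps=20):
    """Synchronous updates s <- sign(W s + lam * h_ext); the external
    stimulus enters as an additional field, as in the abstract."""
    for _ in range(sweeps):
        field = W @ s + (lam * h_ext if h_ext is not None else 0.0)
        s = np.where(field >= 0, 1, -1)
    return s

probe = xi[0] * np.where(rng.random(N) < 0.2, -1, 1)     # 20% corrupted cue
m_plain = (recall(probe) @ xi[0]) / N
m_stim = (recall(probe, h_ext=xi[0], lam=0.5) @ xi[0]) / N
print(f"overlap without stimulus: {m_plain:.2f}, with stimulus: {m_stim:.2f}")
```

Above capacity, plain recall is unreliable; the stimulus field biases the dynamics toward the matching pattern, which is the storage-enhancement effect the abstract quantifies.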
13
Reis AS, Brugnago EL, Caldas IL, Batista AM, Iarosz KC, Ferrari FAS, Viana RL. Suppression of chaotic bursting synchronization in clustered scale-free networks by an external feedback signal. Chaos 2021; 31:083128. PMID: 34470231. DOI: 10.1063/5.0056672. Received 05/12/2021; accepted 08/02/2021.
Abstract
Oscillatory activities in the brain, detected by electroencephalograms, exhibit identifiable synchronization patterns. These synchronized neuronal activities are related to cognitive processes. Additionally, experimental studies of neuronal rhythms have shown synchronous oscillations in brain disorders. Mathematical modeling of networks has been used to mimic these neuronal synchronizations. Indeed, networks with scale-free properties have been identified in some regions of the cortex. To investigate these brain synchronizations, we focus on neuronal synchronization in a network of coupled scale-free networks. The networks are connected according to the topological organization of the structural cortical regions of the human brain. The neuronal dynamics are given by the Rulkov model, a two-dimensional iterated map that can generate quiescence, tonic spiking, and bursting. Depending on the parameters, we identify synchronous behavior among the neurons in the clustered networks. We aim to suppress neuronal burst synchronization by applying an external perturbation as a function of the mean field of the membrane potential. We found that this suppression method performs better than the time-delayed feedback method applied to the same neuronal network model.
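For reference, the Rulkov map used as the node dynamics is a two-variable iteration. The parameter values below are illustrative of the bursting regime; the paper's network adds coupling and the feedback perturbation on top of this:

```python
import numpy as np

def rulkov(alpha=4.5, sigma=0.0, mu=0.001, steps=5000, x0=-1.0, y0=-3.0):
    """Iterate the chaotic Rulkov map
        x[n+1] = alpha / (1 + x[n]^2) + y[n]
        y[n+1] = y[n] - mu * (x[n] + 1) + mu * sigma,
    where x is the fast (membrane-potential-like) variable and y drifts slowly."""
    x, y = x0, y0
    xs = np.empty(steps)
    for n in range(steps):
        x, y = alpha / (1.0 + x * x) + y, y - mu * (x + 1.0) + mu * sigma
        xs[n] = x
    return xs

xs = rulkov()
print("x range:", xs.min(), xs.max())
```

The slow variable y pushes x in and out of the spiking region, producing the quiescence/spiking/bursting repertoire mentioned above.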
Affiliation(s)
- Adriane S Reis
- Physics Institute, University of São Paulo, 05508-090 São Paulo, SP, Brazil
- Eduardo L Brugnago
- Physics Department, Federal University of Paraná, 81531-980 Curitiba, PR, Brazil
- Iberê L Caldas
- Physics Institute, University of São Paulo, 81531-980 São Paulo, SP, Brazil
- Antonio M Batista
- Department of Mathematics and Statistics, State University of Ponta Grossa, 84030-900 Ponta Grossa, PR, Brazil
- Kelly C Iarosz
- Faculty of Telêmaco Borba, 84266-010 Telêmaco Borba, PR, Brazil
- Fabiano A S Ferrari
- Institute of Engineering, Science and Technology, Federal University of the Valleys of Jequitinhonha and Mucuri, 39803-371 Janaúba, MG, Brazil
- Ricardo L Viana
- Physics Department, Federal University of Paraná, 81531-980 Curitiba, PR, Brazil
14
An emergent autonomous flow for mean-field spin glasses. Probab Theory Relat Fields 2021. DOI: 10.1007/s00440-021-01040-w.
15
Belga Fedeli S, Fyodorov YV, Ipsen JR. Nonlinearity-generated resilience in large complex systems. Phys Rev E 2021; 103:022201. PMID: 33736106. DOI: 10.1103/physreve.103.022201. Received 09/02/2020; accepted 12/08/2020.
Abstract
We consider a generic nonlinear extension of May's 1972 model, including all higher-order terms in the expansion around the chosen fixed point (placed at the origin) with random Gaussian coefficients. The ensuing analysis reveals that as long as the origin remains stable, it is surrounded by a "resilience gap": there are no other fixed points within a radius r_{*}>0, and the system is therefore expected to be resilient to a typical initial displacement small in comparison to r_{*}. The radius r_{*} is shown to vanish at the same threshold where the origin loses local stability, revealing a mechanism by which systems close to the tipping point become less resilient. We also find that beyond the resilience radius, the number of fixed points in a ball surrounding the original equilibrium grows exponentially with N, making the system's dynamics highly sensitive to sufficiently large displacements from the origin.
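The local-stability side of this picture reduces, at linear order, to May's classic random-Jacobian criterion, which is trivial to check numerically (the size and gain values below are arbitrary choices for illustration):

```python
import numpy as np

def max_real_part(N=300, g=0.5, seed=6):
    """Largest real part of the Jacobian J = -I + (g/sqrt(N)) * X at the
    origin, with X an i.i.d. standard Gaussian matrix: the origin is locally
    stable when this is negative, i.e. (for large N) when g < 1."""
    rng = np.random.default_rng(seed)
    J = -np.eye(N) + g * rng.normal(0.0, 1.0 / np.sqrt(N), (N, N))
    return np.linalg.eigvals(J).real.max()

print(max_real_part(g=0.5))   # below threshold: origin stable
print(max_real_part(g=1.5))   # above threshold: origin unstable
```

The paper's contribution is the nonlinear part: the radius r_* of the fixed-point-free neighborhood shrinks to zero exactly as this spectral edge crosses zero.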
Affiliation(s)
- S Belga Fedeli
- Department of Mathematics, King's College London, London WC2R 2LS, England, United Kingdom
- Y V Fyodorov
- Department of Mathematics, King's College London, London WC2R 2LS, England, United Kingdom; L. D. Landau Institute for Theoretical Physics, Semenova 1a, 142432 Chernogolovka, Russia
- J R Ipsen
- ARC Centre of Excellence for Mathematical and Statistical Frontiers, School of Mathematics and Statistics, The University of Melbourne, 3010 Parkville, VIC, Australia
16
Morrison CL, Greenwood PE, Ward LM. Plastic systemic inhibition controls amplitude while allowing phase pattern in a stochastic neural field model. Phys Rev E 2021; 103:032311. PMID: 33862754. DOI: 10.1103/physreve.103.032311. Received 10/09/2020; accepted 02/19/2021.
Abstract
We investigate oscillatory phase pattern formation and amplitude control for a linearized stochastic neural field model by simulating Mexican-hat-coupled stochastic processes. We find, for several choices of parameters, that spatial pattern formation in the temporal phases of the coupled processes occurs if and only if their amplitudes are allowed to grow unrealistically large. Stimulated by recent work on homeostatic inhibitory plasticity, we introduce static and plastic (adaptive) systemic inhibitory mechanisms to keep the amplitudes stochastically bounded. We find that systems with static inhibition exhibit bounded amplitudes but no sustained phase patterns. With plastic systemic inhibition, on the other hand, the resulting systems exhibit both bounded amplitudes and sustained phase patterns. These results demonstrate that plastic inhibitory mechanisms in neural field models can dynamically control amplitudes while allowing patterns of phase synchronization to develop. Similar mechanisms of plastic systemic inhibition could play a role in regulating oscillatory functioning in the brain.
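The amplitude-control mechanism can be caricatured in a ring network with difference-of-Gaussians (Mexican-hat) coupling. Everything below (kernel widths, noise level, the scalar inhibition variable k and its learning rate) is an illustrative assumption, far simpler than the paper's model:

```python
import numpy as np

def mean_amplitude(plastic, N=64, steps=4000, dt=0.01, eta=2.0, target=1.0, seed=7):
    """Linear stochastic units on a ring, coupled by a Mexican-hat kernel
    scaled to be linearly unstable; if `plastic`, a global inhibition k
    grows whenever the mean amplitude exceeds `target`."""
    rng = np.random.default_rng(seed)
    idx = np.arange(N)
    d = np.abs(np.subtract.outer(idx, idx))
    d = np.minimum(d, N - d)                               # distance on the ring
    W = np.exp(-d**2 / (2 * 3.0**2)) - 0.5 * np.exp(-d**2 / (2 * 9.0**2))
    W *= 1.5 / np.linalg.eigvals(W).real.max()             # most unstable rate: +0.5
    x = np.zeros(N)
    k = 0.0
    for _ in range(steps):
        x = x + dt * (-x + W @ x - k * x) + 0.1 * np.sqrt(dt) * rng.normal(size=N)
        if plastic:
            k = max(0.0, k + eta * dt * (np.abs(x).mean() - target))
    return np.abs(x).mean()

print("plastic:", mean_amplitude(True), " static (k=0):", mean_amplitude(False))
```

Without adaptation the unstable mode grows without bound, mirroring the "unrealistically large" amplitudes noted above; the plastic inhibition clamps the amplitude near the target while leaving the spatial structure of the coupling untouched.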
Affiliation(s)
- Conor L Morrison
- Department of Statistics, University of British Columbia, Vancouver, British Columbia, Canada V6T 1Z4
- Priscilla E Greenwood
- Department of Mathematics, University of British Columbia, Vancouver, British Columbia, Canada V6T 1Z2
- Lawrence M Ward
- Department of Psychology and Djavad Mowafaghian Centre for Brain Health, 2136 West Mall, University of British Columbia, Vancouver, British Columbia, Canada V6T 1Z4
17
Scale free topology as an effective feedback system. PLoS Comput Biol 2020; 16:e1007825. PMID: 32392249. PMCID: PMC7241857. DOI: 10.1371/journal.pcbi.1007825. Received 09/03/2019; revised 05/21/2020; accepted 03/26/2020.
Abstract
Biological networks are often heterogeneous in their connectivity pattern, with degree distributions featuring a heavy tail of highly connected hubs. The implications of this heterogeneity on dynamical properties are a topic of much interest. Here we show that interpreting topology as a feedback circuit can provide novel insights on dynamics. Based on the observation that in finite networks a small number of hubs have a disproportionate effect on the entire system, we construct an approximation by lumping these nodes into a single effective hub, which acts as a feedback loop with the rest of the nodes. We use this approximation to study dynamics of networks with scale-free degree distributions, focusing on their probability of convergence to fixed points. We find that the approximation preserves convergence statistics over a wide range of settings. Our mapping provides a parametrization of scale free topology which is predictive at the ensemble level and also retains properties of individual realizations. Specifically, outgoing hubs have an organizing role that can drive the network to convergence, in analogy to suppression of chaos by an external drive. In contrast, incoming hubs have no such property, resulting in a marked difference between the behavior of networks with outgoing vs. incoming scale free degree distribution. Combining feedback analysis with mean field theory predicts a transition between convergent and divergent dynamics which is corroborated by numerical simulations. Furthermore, they highlight the effect of a handful of outlying hubs, rather than of the connectivity distribution law as a whole, on network dynamics. Nature abounds with complex networks of interacting elements—from the proteins in our cells, through neural networks in our brains, to species interacting in ecosystems. In all of these fields, the relation between network structure and dynamics is an important research question. 
A recurring feature of natural networks is their heterogeneous structure: individual elements exhibit a huge diversity of connectivity patterns, which complicates the understanding of network dynamics. To address this problem, we devised a simplified approximation for complex structured networks which captures their dynamical properties. Separating out the largest “hubs”—a small number of nodes with disproportionately high connectivity—we represent them by a single node linked to the rest of the network. This enables us to borrow concepts from control theory, where a system’s output is linked back to itself forming a feedback loop. In this analogy, hubs in heterogeneous networks implement a feedback circuit with the rest of the network. The analogy reveals how these hubs can coordinate the network and drive it more easily towards stable states. Our approach enables analyzing dynamical properties of heterogeneous networks, which is difficult to achieve with existing techniques. It is potentially applicable to many fields where heterogeneous networks are important.
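A minimal sketch of the hub-lumping idea described above, under stated assumptions: the Pareto out-degree construction, the coupling scale, and the lumping rule (summed outgoing weights, averaged incoming weights onto the effective hub) are our illustrative choices, not the paper's exact prescription.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 200

# Heavy-tailed out-degrees (a hypothetical Pareto construction)
out_deg = np.minimum(3 * (1 + rng.pareto(1.5, N)).astype(int), N - 1)
J = np.zeros((N, N))
for j in range(N):
    targets = rng.choice(np.delete(np.arange(N), j), size=out_deg[j], replace=False)
    J[targets, j] = rng.normal(0, 1.0 / np.sqrt(out_deg.mean()), size=out_deg[j])

def converges(J, T=200.0, dt=0.1):
    """Euler-integrate dx/dt = -x + J @ tanh(x); report whether activity settles."""
    x = rng.normal(0, 1, J.shape[0])
    for _ in range(int(T / dt)):
        x += dt * (-x + J @ np.tanh(x))
    v = -x + J @ np.tanh(x)            # residual velocity at the end of the run
    return np.linalg.norm(v) / np.sqrt(J.shape[0]) < 1e-3

# Lump the k largest outgoing hubs into a single effective feedback node
k = 5
hubs = np.argsort(out_deg)[-k:]
rest = np.setdiff1d(np.arange(N), hubs)
M = len(rest)
J_red = np.zeros((M + 1, M + 1))
J_red[:M, :M] = J[np.ix_(rest, rest)]
J_red[:M, M] = J[np.ix_(rest, hubs)].sum(axis=1)    # effective hub -> rest
J_red[M, :M] = J[np.ix_(hubs, rest)].mean(axis=0)   # rest -> effective hub

full, reduced = converges(J), converges(J_red)
```

The reduced system has one node standing in for all outgoing hubs, making the feedback-loop reading of the topology explicit.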
Collapse
|
18
|
Ipsen JR, Peterson ADH. Consequences of Dale's law on the stability-complexity relationship of random neural networks. Phys Rev E 2020; 101:052412. [PMID: 32575310 DOI: 10.1103/physreve.101.052412] [Citation(s) in RCA: 4] [Impact Index Per Article: 0.8] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/29/2019] [Accepted: 04/08/2020] [Indexed: 06/11/2023]
Abstract
In the study of randomly connected neural network dynamics there is a phase transition from a simple state with few equilibria to a complex state characterized by the number of equilibria growing exponentially with the neuron population. Such phase transitions are often used to describe pathological brain state transitions observed in neurological diseases such as epilepsy. In this paper we investigate how more realistic heterogeneous network structures affect these phase transitions using techniques from random matrix theory. Specifically, we parametrize the network structure according to Dale's law and use the Kac-Rice formalism to compute the change in the number of equilibria when a phase transition occurs. We also examine the case where the network is not balanced between excitation and inhibition, causing outliers to appear in the eigenspectrum. This enables us to compute the effects of different heterogeneous network connectivities on brain state transitions, which can provide insights into pathological brain dynamics.
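A short numerical illustration of the eigenspectrum outlier mentioned above, assuming a simple Dale's-law parametrization (all outgoing weights of a neuron share a sign) with a deliberately unbalanced excitatory fraction; the sizes and scales are our choices.

```python
import numpy as np

rng = np.random.default_rng(1)
N, f_exc = 400, 0.8               # 80% excitatory columns: unbalanced on purpose

# Dale's law: all outgoing weights of a neuron (one column of J) share a sign
signs = np.where(np.arange(N) < int(f_exc * N), 1.0, -1.0)
J = np.abs(rng.normal(0, 1.0 / np.sqrt(N), (N, N))) * signs[None, :]

eig = np.linalg.eigvals(J)
moduli = np.sort(np.abs(eig))
outlier, bulk_radius = moduli[-1], moduli[-2]   # unbalance pushes one eigenvalue out
```

With a balanced network (equal mean excitation and inhibition) the outlier disappears and all eigenvalues stay inside the bulk disc.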
Collapse
Affiliation(s)
- J R Ipsen
- ARC Centre of Excellence for Mathematical and Statistical Frontiers, School of Mathematics and Statistics, University of Melbourne, 3010 Parkville, Victoria, Australia
| | - A D H Peterson
- Graeme Clarke Institute, University of Melbourne, 3053 Carlton, Victoria, Australia and Department of Medicine, St. Vincent's Hospital, University of Melbourne, 3065 Fitzroy, Victoria, Australia
| |
Collapse
|
19
|
Gosti G, Folli V, Leonetti M, Ruocco G. Beyond the Maximum Storage Capacity Limit in Hopfield Recurrent Neural Networks. ENTROPY 2019; 21:e21080726. [PMID: 33267440 PMCID: PMC7515255 DOI: 10.3390/e21080726] [Citation(s) in RCA: 11] [Impact Index Per Article: 1.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Subscribe] [Scholar Register] [Received: 07/03/2019] [Accepted: 07/23/2019] [Indexed: 11/16/2022]
Abstract
In a neural network, an autapse is a particular kind of synapse that links a neuron onto itself. Autapses are almost always excluded from both artificial and biological neural network models. Moreover, redundant or similar stored states tend to interact destructively. This paper shows how autapses, together with stable-state redundancy, can improve the storage capacity of a recurrent neural network. Recent research shows how, in an N-node Hopfield neural network with autapses, the number of stored patterns (P) is not limited to the well-known bound 0.14N, as it is for networks without autapses. More precisely, it describes how, as the number of stored patterns increases well over the 0.14N threshold, for P much greater than N, the retrieval error asymptotically approaches a value below unity. Consequently, the reduction of retrieval errors allows a number of stored memories that largely exceeds what was previously considered possible. Unfortunately, soon after, new results showed that, in the thermodynamic limit, given a network with autapses in this high-storage regime, the basin of attraction of the stored memories shrinks to a single state. This means that, for each stable state associated with a stored memory, even a single bit error in the initial pattern would lead the system to a stationary state associated with a different memory state. This limits the potential use of this kind of Hopfield network as an associative memory. This paper presents a strategy to overcome this limitation by improving the error-correcting characteristics of the Hopfield neural network. The proposed strategy allows us to form what we call an absorbing neighborhood of states surrounding each stored memory: a set of states within a given Hamming distance of a network state, which is absorbing because, in the long-time limit, states inside it are absorbed by stable states in the set.
We show that this strategy allows the network to store an exponential number of memory patterns, each surrounded by an absorbing neighborhood whose size grows exponentially.
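A minimal Hopfield sketch with autapses, as background to the abstract above: Hebbian couplings with the diagonal kept, and retrieval from a probe inside a small Hamming neighborhood of a stored pattern. The low-load parameters (P/N = 0.1) are our choice for illustration, not the paper's high-storage regime.

```python
import numpy as np

rng = np.random.default_rng(2)
N, P = 100, 10
patterns = rng.choice([-1.0, 1.0], size=(P, N))

# Hebbian couplings; the diagonal (the autapses) is deliberately NOT zeroed
W = patterns.T @ patterns / N

def recall(x, max_steps=50):
    """Deterministic synchronous updates until a fixed point or the step limit."""
    for _ in range(max_steps):
        x_new = np.where(W @ x >= 0, 1.0, -1.0)
        if np.array_equal(x_new, x):
            break
        x = x_new
    return x

# Corrupt a stored pattern within a small Hamming distance and check retrieval
probe = patterns[0].copy()
probe[rng.choice(N, 3, replace=False)] *= -1
overlap = float(recall(probe) @ patterns[0] / N)
```

An overlap near 1 means the corrupted probe fell inside the basin of the stored memory, the property whose shrinkage the absorbing-neighborhood strategy is designed to counteract.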
Collapse
Affiliation(s)
- Giorgio Gosti
- Center for Life Nanoscience, Istituto Italiano di Tecnologia, Viale Regina Elena 291, 00161 Rome, Italy
- Correspondence:
| | - Viola Folli
- Center for Life Nanoscience, Istituto Italiano di Tecnologia, Viale Regina Elena 291, 00161 Rome, Italy
| | - Marco Leonetti
- Center for Life Nanoscience, Istituto Italiano di Tecnologia, Viale Regina Elena 291, 00161 Rome, Italy
- CNR NANOTEC-Institute of Nanotechnology c/o Campus Ecotekne, University of Salento, Via Monteroni, 73100 Lecce, Italy
| | - Giancarlo Ruocco
- Center for Life Nanoscience, Istituto Italiano di Tecnologia, Viale Regina Elena 291, 00161 Rome, Italy
- Department of Physics, Sapienza University of Rome, Piazzale Aldo Moro 5, 00185 Rome, Italy
| |
Collapse
|
20
|
Dahmen D, Grün S, Diesmann M, Helias M. Second type of criticality in the brain uncovers rich multiple-neuron dynamics. Proc Natl Acad Sci U S A 2019; 116:13051-13060. [PMID: 31189590 PMCID: PMC6600928 DOI: 10.1073/pnas.1818972116] [Citation(s) in RCA: 44] [Impact Index Per Article: 7.3] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 12/03/2022] Open
Abstract
Cortical networks that have been found to operate close to a critical point exhibit joint activations of large numbers of neurons. However, in motor cortex of the awake macaque monkey, we observe very different dynamics: massively parallel recordings of 155 single-neuron spiking activities show weak fluctuations on the population level. This a priori suggests that motor cortex operates in a noncritical regime, which in models, has been found to be suboptimal for computational performance. However, here, we show the opposite: The large dispersion of correlations across neurons is the signature of a second critical regime. This regime exhibits a rich dynamical repertoire hidden from macroscopic brain signals but essential for high performance in such concepts as reservoir computing. An analytical link between the eigenvalue spectrum of the dynamics, the heterogeneity of connectivity, and the dispersion of correlations allows us to assess the closeness to the critical point.
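A toy version of the dispersion-of-correlations signature discussed above, under stated assumptions: a linear rate network with i.i.d. Gaussian couplings below the critical point, driven by unit white noise, whose stationary covariance solves a Lyapunov equation (solved here via a Kronecker-product linear system; network size and gain are our choices).

```python
import numpy as np

rng = np.random.default_rng(3)
N, g = 60, 0.7                    # g < 1: below the critical point (assumed)

J = rng.normal(0, g / np.sqrt(N), (N, N))
A = -np.eye(N) + J                # linearized dynamics dx/dt = A x + white noise

# Stationary covariance C solves the Lyapunov equation A C + C A^T = -2 I
K = np.kron(A, np.eye(N)) + np.kron(np.eye(N), A)
C = np.linalg.solve(K, (-2.0 * np.eye(N)).ravel()).reshape(N, N)

sd = np.sqrt(np.diag(C))
corr = C / np.outer(sd, sd)
off = corr[~np.eye(N, dtype=bool)]
mean_corr, dispersion = float(off.mean()), float(off.std())
```

The population-averaged correlation stays small while the cross-neuron dispersion is finite; as g approaches 1 the dispersion grows, which is the closeness-to-criticality readout the abstract describes.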
Collapse
Affiliation(s)
- David Dahmen
- Institute of Neuroscience and Medicine (INM-6), Jülich Research Centre, 52425 Jülich, Germany;
- Institute for Advanced Simulation (IAS-6), Jülich Research Centre, 52425 Jülich, Germany
- JARA Institute Brain Structure-Function Relationships (INM-10), Jülich-Aachen Research Alliance, Jülich Research Centre, 52425 Jülich, Germany
| | - Sonja Grün
- Institute of Neuroscience and Medicine (INM-6), Jülich Research Centre, 52425 Jülich, Germany
- Institute for Advanced Simulation (IAS-6), Jülich Research Centre, 52425 Jülich, Germany
- JARA Institute Brain Structure-Function Relationships (INM-10), Jülich-Aachen Research Alliance, Jülich Research Centre, 52425 Jülich, Germany
- Theoretical Systems Neurobiology, RWTH Aachen University, 52056 Aachen, Germany
| | - Markus Diesmann
- Institute of Neuroscience and Medicine (INM-6), Jülich Research Centre, 52425 Jülich, Germany
- Institute for Advanced Simulation (IAS-6), Jülich Research Centre, 52425 Jülich, Germany
- JARA Institute Brain Structure-Function Relationships (INM-10), Jülich-Aachen Research Alliance, Jülich Research Centre, 52425 Jülich, Germany
- Department of Psychiatry, Psychotherapy and Psychosomatics, School of Medicine, RWTH Aachen University, 52074 Aachen, Germany
- Department of Physics, Faculty 1, RWTH Aachen University, 52062 Aachen, Germany
| | - Moritz Helias
- Institute of Neuroscience and Medicine (INM-6), Jülich Research Centre, 52425 Jülich, Germany
- Institute for Advanced Simulation (IAS-6), Jülich Research Centre, 52425 Jülich, Germany
- JARA Institute Brain Structure-Function Relationships (INM-10), Jülich-Aachen Research Alliance, Jülich Research Centre, 52425 Jülich, Germany
- Department of Physics, Faculty 1, RWTH Aachen University, 52062 Aachen, Germany
| |
Collapse
|
21
|
Martí D, Brunel N, Ostojic S. Correlations between synapses in pairs of neurons slow down dynamics in randomly connected neural networks. Phys Rev E 2018; 97:062314. [PMID: 30011528 DOI: 10.1103/physreve.97.062314] [Citation(s) in RCA: 30] [Impact Index Per Article: 4.3] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/28/2017] [Indexed: 01/11/2023]
Abstract
Networks of randomly connected neurons are among the most popular models in theoretical neuroscience. The connectivity between neurons in the cortex is, however, not fully random; the simplest and most prominent deviation from randomness found in experimental data is the overrepresentation of bidirectional connections among pyramidal cells. Using numerical and analytical methods, we investigate the effects of partially symmetric connectivity on the dynamics in networks of rate units. We consider the two dynamical regimes exhibited by random neural networks: the weak-coupling regime, where the firing activity decays to a single fixed point unless the network is stimulated, and the strong-coupling or chaotic regime, characterized by internally generated fluctuating firing rates. In the weak-coupling regime, we compute analytically, for an arbitrary degree of symmetry, the autocorrelation of network activity in the presence of external noise. In the chaotic regime, we perform simulations to determine the timescale of the intrinsic fluctuations. In both cases, symmetry increases the characteristic asymptotic decay time of the autocorrelation function and therefore slows down the dynamics in the network.
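The slowing-down mechanism above can be visualized through the elliptic law: partial symmetry stretches the eigenvalue cloud along the real axis, so the slowest mode decays more slowly. A sketch under a standard construction (symmetric plus antisymmetric Gaussian parts; the symmetry level 0.6 is our choice):

```python
import numpy as np

rng = np.random.default_rng(4)
N = 500

def partially_symmetric(eta):
    """Gaussian couplings with Corr(J_ij, J_ji) = eta and Var(J_ij) = 1/N."""
    a, b = rng.normal(0, 1, (N, N)), rng.normal(0, 1, (N, N))
    S = (a + a.T) / np.sqrt(2)                  # symmetric part
    K = (b - b.T) / np.sqrt(2)                  # antisymmetric part
    return (np.sqrt((1 + eta) / 2) * S + np.sqrt((1 - eta) / 2) * K) / np.sqrt(N)

J0, J6 = partially_symmetric(0.0), partially_symmetric(0.6)
iu = np.triu_indices(N, 1)
emp = float(np.corrcoef(J6[iu], J6.T[iu])[0, 1])   # measured symmetry degree

# Elliptic law: the real semi-axis grows as 1 + eta, so the slowest linear mode
# decays with time constant ~ 1 / (1 - max Re(lambda)): symmetry slows dynamics
lam0 = float(np.linalg.eigvals(J0).real.max())
lam6 = float(np.linalg.eigvals(J6).real.max())
```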
Collapse
Affiliation(s)
- Daniel Martí
- Laboratoire de Neurosciences Cognitives, Inserm UMR No. 960, Ecole Normale Supérieure, PSL Research University, 75230 Paris, France
| | - Nicolas Brunel
- Department of Statistics and Department of Neurobiology, University of Chicago, Chicago, Illinois 60637, USA.,Department of Neurobiology and Department of Physics, Duke University, Durham, North Carolina 27710, USA
| | - Srdjan Ostojic
- Laboratoire de Neurosciences Cognitives, Inserm UMR No. 960, Ecole Normale Supérieure, PSL Research University, 75230 Paris, France
| |
Collapse
|
22
|
Gonzalez-Dominguez J, Martin MJ. MPIGeneNet: Parallel Calculation of Gene Co-Expression Networks on Multicore Clusters. IEEE/ACM TRANSACTIONS ON COMPUTATIONAL BIOLOGY AND BIOINFORMATICS 2018; 15:1732-1737. [PMID: 29028205 DOI: 10.1109/tcbb.2017.2761340] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.4] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 06/07/2023]
Abstract
In this work, we present MPIGeneNet, a parallel tool that applies Pearson's correlation and Random Matrix Theory to construct gene co-expression networks. It is based on the state-of-the-art sequential tool RMTGeneNet, which provides networks with high robustness and sensitivity at the expense of relatively long runtimes for large-scale input datasets. MPIGeneNet returns the same results as RMTGeneNet but improves the memory management, reduces the I/O cost, and accelerates the two most computationally demanding steps of co-expression network construction by exploiting the compute capabilities of common multicore CPU clusters. Our performance evaluation on two different systems using three typical input datasets shows that MPIGeneNet is significantly faster than RMTGeneNet. As an example, our tool is up to 175.41 times faster on a cluster with eight nodes, each one containing two 12-core Intel Haswell processors. The source code of MPIGeneNet, as well as a reference manual, are available at https://sourceforge.net/projects/mpigenenet/.
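The core computation being parallelized above is simple to state: a gene-by-gene Pearson correlation matrix, thresholded into an adjacency matrix. A small single-process sketch on synthetic data (the hard threshold 0.4 is our illustrative choice; RMTGeneNet selects the threshold via Random Matrix Theory):

```python
import numpy as np

rng = np.random.default_rng(5)
genes, samples = 50, 30
expr = rng.normal(0, 1, (genes, samples))
expr[:10] += rng.normal(0, 1, samples)  # shared signal -> a correlated module (toy data)

R = np.corrcoef(expr)                   # gene-by-gene Pearson correlation matrix
np.fill_diagonal(R, 0.0)
adj = (np.abs(R) > 0.4).astype(int)     # hard threshold (illustrative, not RMT-chosen)
```

The first ten genes share a common signal, so they form a densely connected module in the resulting co-expression network while background genes remain sparsely linked.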
Collapse
|
23
|
Baity-Jesi M, Achard-de Lustrac A, Biroli G. Activated dynamics: An intermediate model between the random energy model and the p-spin model. Phys Rev E 2018; 98:012133. [PMID: 30110833 DOI: 10.1103/physreve.98.012133] [Citation(s) in RCA: 12] [Impact Index Per Article: 1.7] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/04/2018] [Indexed: 11/07/2022]
Abstract
To study the activated dynamics of mean-field glasses, which takes place on timescales of order exp(N), where N is the system size, we introduce a new model, the correlated random energy model (CREM), that allows for a smooth interpolation between the REM and the p-spin models. We study numerically and analytically the CREM in the intermediate regime between REM and p-spin. We fully characterize its energy landscape, which is like a golf course but, at variance with the REM, has metabasins (or holes) containing several configurations. We find that an effective description for the dynamics, in terms of traps, emerges, provided that one identifies metabasins in the CREM with configurations in the trap model.
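The trap-model picture invoked above can be sketched in a few lines: exponentially distributed trap depths give power-law trapping times, and for low enough temperature the deepest trap visited dominates the total time (activated, aging dynamics). Parameters are our illustrative choices; this is Bouchaud's trap model, not the CREM itself.

```python
import numpy as np

rng = np.random.default_rng(6)
beta, n_traps, n_steps = 1.5, 1000, 20000   # beta > 1: the aging regime

# Trap depths E >= 0 with exponential density -> trapping times tau = exp(beta * E),
# which are power-law distributed with exponent 1/beta < 1
E = rng.exponential(1.0, n_traps)
tau = np.exp(beta * E)

# Renewal dynamics: after each escape, jump to a uniformly random trap
visits = rng.integers(0, n_traps, n_steps)
times = tau[visits]
total = float(times.cumsum()[-1])

# Heavy tail: the single deepest trap visited dominates the total elapsed time
dominance = float(times.max() / total)
```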
Collapse
Affiliation(s)
- Marco Baity-Jesi
- Department of Chemistry, Columbia University, New York, New York 10027, USA
| | | | - Giulio Biroli
- Institut de Physique Théorique, Université Paris Saclay, CEA, CNRS, F-91191 Gif-sur-Yvette, France and Laboratoire de Physique Statistique, École Normale Supérieure, PSL Research University, 24 rue Lhomond, 75005 Paris, France
| |
Collapse
|
24
|
Synchronization transition in neuronal networks composed of chaotic or non-chaotic oscillators. Sci Rep 2018; 8:8370. [PMID: 29849108 PMCID: PMC5976724 DOI: 10.1038/s41598-018-26730-9] [Citation(s) in RCA: 17] [Impact Index Per Article: 2.4] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/20/2017] [Accepted: 05/11/2018] [Indexed: 12/20/2022] Open
Abstract
Chaotic dynamics has been shown in the dynamics of neurons and neural networks, in experimental data and numerical simulations. Theoretical studies have proposed an underlying role of chaos in neural systems. Nevertheless, whether chaotic neural oscillators make a significant contribution to network behaviour and whether the dynamical richness of neural networks is sensitive to the dynamics of isolated neurons still remain open questions. We investigated synchronization transitions in heterogeneous neural networks of neurons connected by electrical coupling in a small-world topology. The nodes in our model are oscillatory neurons that – when isolated – can exhibit either chaotic or non-chaotic behaviour, depending on conductance parameters. We found that the heterogeneity of firing rates and firing patterns makes a greater contribution than chaos to the steepness of the synchronization transition curve. We also show that chaotic dynamics of the isolated neurons do not always make a visible difference in the transition to full synchrony. Moreover, macroscopic chaos is observed regardless of the nature of the individual neurons' dynamics. However, performing a Functional Connectivity Dynamics analysis, we show that chaotic nodes can promote what is known as multi-stable behaviour, where the network dynamically switches between a number of different semi-synchronized, metastable states.
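A stripped-down version of the synchronization-transition setup above, with assumptions stated plainly: Kuramoto phase oscillators stand in for the conductance-based neurons, on a hand-rolled Watts-Strogatz small-world graph; all sizes and coupling values are our choices.

```python
import numpy as np

rng = np.random.default_rng(7)
N, k, p = 100, 4, 0.1        # ring with k neighbors (k/2 per side), rewiring prob p

A = np.zeros((N, N), dtype=bool)
for i in range(N):
    for d in range(1, k // 2 + 1):
        j = (i + d) % N
        if rng.random() < p:                 # rewire this edge to a random target
            j = int(rng.integers(0, N))
            while j == i or A[i, j]:
                j = int(rng.integers(0, N))
        A[i, j] = A[j, i] = True

omega = rng.normal(0, 0.5, N)                # heterogeneous natural frequencies

def order_parameter(K, steps=3000, dt=0.05):
    """Kuramoto order parameter |<exp(i*theta)>| after transient integration."""
    theta = rng.uniform(0, 2 * np.pi, N)
    for _ in range(steps):
        diff = np.sin(theta[None, :] - theta[:, None])   # sin(theta_j - theta_i)
        theta = theta + dt * (omega + (K / k) * (A * diff).sum(axis=1))
    return float(np.abs(np.exp(1j * theta).mean()))

r_weak, r_strong = order_parameter(0.05), order_parameter(3.0)
```

Sweeping the coupling between these two values traces out the synchronization transition curve whose steepness the study analyses.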
Collapse
|
25
|
Effect of dilution in asymmetric recurrent neural networks. Neural Netw 2018; 104:50-59. [PMID: 29705670 DOI: 10.1016/j.neunet.2018.04.003] [Citation(s) in RCA: 14] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/20/2017] [Revised: 04/04/2018] [Accepted: 04/08/2018] [Indexed: 11/22/2022]
Abstract
We study with numerical simulation the possible limit behaviors of synchronous discrete-time deterministic recurrent neural networks composed of N binary neurons as a function of a network's level of dilution and asymmetry. The network dilution measures the fraction of neuron couples that are connected, and the network asymmetry measures to what extent the underlying connectivity matrix is asymmetric. For each given neural network, we study the dynamical evolution of all the different initial conditions, thus characterizing the full dynamical landscape without imposing any learning rule. Because of the deterministic dynamics, each trajectory converges to an attractor, which can be either a fixed point or a limit cycle. These attractors form the set of all the possible limit behaviors of the neural network. For each network we then determine the convergence times, the limit cycles' lengths, the number of attractors, and the sizes of the attractors' basins. We show that there are two network structures that maximize the number of possible limit behaviors. The first optimal network structure is fully connected and symmetric. On the contrary, the second optimal network structure is highly sparse and asymmetric. The latter optimum is similar to what is observed in various biological neuronal circuits. These observations lead us to hypothesize that, independently of any given learning model, an efficient and effective biological network that stores a number of limit behaviors close to its maximum capacity tends to develop a connectivity structure similar to one of the optimal networks we found.
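The exhaustive attractor census described above is easy to reproduce at tiny size, where all 2^N initial states can be enumerated. A sketch with assumed conventions (±1 couplings, whole couples removed together by dilution, ties broken toward -1):

```python
import numpy as np
from itertools import product

rng = np.random.default_rng(8)
N = 8                                      # small enough to enumerate all 2^N states

def make_net(dilution, symmetric):
    J = rng.choice([-1.0, 1.0], (N, N))
    if symmetric:
        U = np.triu(J, 1)
        J = U + U.T                        # symmetric couplings, zero diagonal
    np.fill_diagonal(J, 0.0)
    keep = np.triu(rng.random((N, N)) >= dilution, 1)
    keep = keep | keep.T                   # dilute whole couples at once
    J[~keep] = 0.0
    return J

def step(J, s):
    return np.where(J @ s > 0, 1.0, -1.0)  # deterministic synchronous update

def count_attractors(J):
    attractors = set()
    for bits in product((-1.0, 1.0), repeat=N):
        seen, s, t = {}, np.array(bits), 0
        while tuple(s) not in seen:
            seen[tuple(s)] = t
            s, t = step(J, s), t + 1
        first = seen[tuple(s)]
        # a limit cycle (or fixed point) is identified by its sorted state set
        attractors.add(tuple(sorted(st for st, ti in seen.items() if ti >= first)))
    return len(attractors)

n_dense_sym = count_attractors(make_net(0.0, True))
n_sparse_asym = count_attractors(make_net(0.7, False))
```

At N = 8 a single realization is noisy; the paper's comparison between the two optimal structures requires averaging over many realizations and larger N.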
Collapse
|
26
|
Livi L, Bianchi FM, Alippi C. Determination of the Edge of Criticality in Echo State Networks Through Fisher Information Maximization. IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS 2018; 29:706-717. [PMID: 28092580 DOI: 10.1109/tnnls.2016.2644268] [Citation(s) in RCA: 21] [Impact Index Per Article: 3.0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 06/06/2023]
Abstract
It is a widely accepted fact that the computational capability of recurrent neural networks (RNNs) is maximized on the so-called "edge of criticality." Once the network operates in this configuration, it performs efficiently on a specific application, in terms of both (1) low prediction error and (2) high short-term memory capacity. Since the behavior of recurrent networks is strongly influenced by the particular input signal driving the dynamics, a universal, application-independent method for determining the edge of criticality is still missing. In this paper, we aim at addressing this issue by proposing a theoretically motivated, unsupervised method based on Fisher information for determining the edge of criticality in RNNs. It is proved that Fisher information is maximized for (finite-size) systems operating in such critical regions. However, Fisher information is notoriously difficult to compute and requires the analytic form of the probability density function ruling the system behavior. This paper takes advantage of a recently developed nonparametric estimator of the Fisher information matrix and provides a method to determine the critical region of echo state networks (ESNs), a particular class of recurrent networks. The considered control parameters, which indirectly affect the ESN performance, are explored to identify those configurations lying on the edge of criticality and, as such, maximizing Fisher information and computational performance. Experimental results on benchmarks and real-world data demonstrate the effectiveness of the proposed method.
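For context on the control parameters mentioned above: in ESN practice, the reservoir is typically rescaled to a target spectral radius, with values just below 1 placing the network near the edge of criticality. A minimal sketch (the target 0.95, network size, and input sequence are our assumptions, not the paper's settings):

```python
import numpy as np

rng = np.random.default_rng(9)
N = 300

W = rng.normal(0, 1.0, (N, N))
W *= 0.95 / np.max(np.abs(np.linalg.eigvals(W)))   # spectral radius 0.95: near-critical

u = rng.normal(0, 1.0, 400)                        # a shared scalar input sequence

def run(x):
    for ut in u:
        x = np.tanh(W @ x + ut)                    # input broadcast to all units
    return x

# Echo state property: different initial states forget their past under the same input
d = float(np.linalg.norm(run(rng.normal(0, 1, N)) - run(rng.normal(0, 1, N))))
```

The Fisher-information method of the paper automates the search over such control parameters instead of fixing the spectral radius by hand.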
Collapse
|
27
|
Dynamical complexity and computation in recurrent neural networks beyond their fixed point. Sci Rep 2018; 8:3319. [PMID: 29463810 PMCID: PMC5820323 DOI: 10.1038/s41598-018-21624-2] [Citation(s) in RCA: 10] [Impact Index Per Article: 1.4] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/12/2017] [Accepted: 02/07/2018] [Indexed: 01/19/2023] Open
Abstract
Spontaneous activity found in neural networks usually results in a reduction of computational performance. As a consequence, artificial neural networks are often operated at the edge of chaos, where the network is stable yet highly susceptible to input information. Surprisingly, regular spontaneous dynamics in neural networks beyond their resting state possess a high degree of spatio-temporal synchronization, a situation that can also be found in biological neural networks. Characterizing information preservation via complexity indices, we show how spatial synchronization allows rRNNs to reduce the negative impact of regular spontaneous dynamics on their computational performance.
Collapse
|
28
|
An Investigation of the Dynamical Transitions in Harmonically Driven Random Networks of Firing-Rate Neurons. Cognit Comput 2017; 9:351-363. [PMID: 28680506 PMCID: PMC5487873 DOI: 10.1007/s12559-017-9464-6] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/01/2016] [Accepted: 03/21/2017] [Indexed: 11/16/2022]
Abstract
Continuous-time recurrent neural networks are widely used as models of neural dynamics and also have applications in machine learning. But their dynamics are not yet well understood, especially when they are driven by external stimuli. In this article, we study the response of stable and unstable networks to different harmonically oscillating stimuli by varying a parameter ρ, the ratio between the timescale of the network and the stimulus, and use the dimensionality of the network’s attractor as an estimate of the complexity of this response. Additionally, we propose a novel technique for exploring the stationary points and locally linear dynamics of these networks in order to understand the origin of input-dependent dynamical transitions. Attractors in both stable and unstable networks show a peak in dimensionality for intermediate values of ρ, with the latter consistently showing a higher dimensionality than the former, which exhibit a resonance-like phenomenon. We explain changes in the dimensionality of a network’s dynamics in terms of changes in the underlying structure of its vector field by analysing stationary points. Furthermore, we uncover the coexistence of underlying attractors with various geometric forms in unstable networks. As ρ is increased, our visualisation technique shows the network passing through a series of phase transitions with its trajectory taking on a sequence of qualitatively distinct figure-of-eight, cylinder, and spiral shapes. These findings bring us one step closer to a comprehensive theory of this important class of neural networks by revealing the subtle structure of their dynamics under different conditions.
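A sketch of the dimensionality estimate used above, with assumptions stated: we drive a random tanh rate network with a sinusoid whose frequency is set by a parameter rho, and score the attractor's dimensionality by the participation ratio of the PCA spectrum of the trajectory. Network size, gain, and the uniform drive are our illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(10)
N, g = 100, 1.2
J = rng.normal(0, g / np.sqrt(N), (N, N))

def attractor_dimension(rho, steps=4000, dt=0.05, burn=2000):
    """Participation ratio (sum lam)^2 / sum lam^2 of the trajectory's PCA spectrum."""
    x = rng.normal(0, 1, N)
    samples = []
    for t in range(steps):
        drive = np.sin(rho * t * dt)          # harmonic stimulus at frequency rho
        x = x + dt * (-x + J @ np.tanh(x) + drive)
        if t >= burn:
            samples.append(x.copy())
    lam = np.linalg.eigvalsh(np.cov(np.array(samples), rowvar=False))
    return float(lam.sum() ** 2 / (lam ** 2).sum())

dims = {rho: attractor_dimension(rho) for rho in (0.1, 1.0, 10.0)}
```

Scanning rho more finely reproduces the qualitative shape described in the abstract: a peak in dimensionality at intermediate timescale ratios.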
Collapse
|
29
|
Mastrogiuseppe F, Ostojic S. Intrinsically-generated fluctuating activity in excitatory-inhibitory networks. PLoS Comput Biol 2017; 13:e1005498. [PMID: 28437436 PMCID: PMC5421821 DOI: 10.1371/journal.pcbi.1005498] [Citation(s) in RCA: 43] [Impact Index Per Article: 5.4] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/13/2016] [Revised: 05/08/2017] [Accepted: 04/04/2017] [Indexed: 12/05/2022] Open
Abstract
Recurrent networks of non-linear units display a variety of dynamical regimes depending on the structure of their synaptic connectivity. A particularly remarkable phenomenon is the appearance of strongly fluctuating, chaotic activity in networks of deterministic, but randomly connected rate units. How this type of intrinsically generated fluctuation appears in more realistic networks of spiking neurons has been a long-standing question. To ease the comparison between rate and spiking networks, recent works investigated the dynamical regimes of randomly connected rate networks with segregated excitatory and inhibitory populations, and firing rates constrained to be positive. These works derived general dynamical mean field (DMF) equations describing the fluctuating dynamics, but solved these equations only in the case of purely inhibitory networks. Using a simplified excitatory-inhibitory architecture in which DMF equations are more easily tractable, here we show that the presence of excitation qualitatively modifies the fluctuating activity compared to purely inhibitory networks. In the presence of excitation, intrinsically generated fluctuations induce a strong increase in mean firing rates, a phenomenon that is much weaker in purely inhibitory networks. Excitation moreover induces two different fluctuating regimes: for moderate overall coupling, recurrent inhibition is sufficient to stabilize fluctuations; for strong coupling, firing rates are stabilized solely by the upper bound imposed on activity, even if inhibition is stronger than excitation. These results extend to more general network architectures, and to rate networks receiving noisy inputs mimicking spiking activity. Finally, we show that signatures of the second dynamical regime appear in networks of integrate-and-fire neurons.
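A minimal simulation of the setting above, with loudly labeled assumptions: segregated excitatory and inhibitory populations, a positive and bounded transfer function (a simple clip, standing in for the paper's transfer function), and population sizes, connection probability, and synaptic scales chosen by us for illustration.

```python
import numpy as np

rng = np.random.default_rng(12)
N_E, N_I = 200, 50                 # hypothetical sizes; inhibition stronger per synapse
N = N_E + N_I

J = np.zeros((N, N))
J[:, :N_E] = rng.binomial(1, 0.1, (N, N_E)) * 1.0 / np.sqrt(N_E)    # excitatory columns
J[:, N_E:] = -rng.binomial(1, 0.1, (N, N_I)) * 5.0 / np.sqrt(N_E)   # inhibitory columns

phi = lambda x: np.clip(x, 0.0, 1.0)   # positive, saturating transfer function (assumed)

x = rng.normal(0, 1, N)
dt, rates = 0.1, []
for t in range(4000):
    x += dt * (-x + J @ phi(x))
    if t >= 2000:
        rates.append(phi(x).mean())
mean_rate = float(np.mean(rates))
```

Tracking `mean_rate` while scaling the overall coupling reveals the two regimes the abstract distinguishes: inhibition-stabilized fluctuations at moderate coupling and saturation-stabilized rates at strong coupling.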
Collapse
Affiliation(s)
- Francesca Mastrogiuseppe
- Laboratoire de Neurosciences Cognitives, INSERM U960, École Normale Supérieure - PSL Research University, Paris, France
- Laboratoire de Physique Statistique, CNRS UMR 8550, École Normale Supérieure - PSL Research University, Paris, France
| | - Srdjan Ostojic
- Laboratoire de Neurosciences Cognitives, INSERM U960, École Normale Supérieure - PSL Research University, Paris, France
| |
Collapse
|
30
|
Folli V, Leonetti M, Ruocco G. On the Maximum Storage Capacity of the Hopfield Model. Front Comput Neurosci 2017; 10:144. [PMID: 28119595 PMCID: PMC5222833 DOI: 10.3389/fncom.2016.00144] [Citation(s) in RCA: 15] [Impact Index Per Article: 1.9] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/19/2016] [Accepted: 12/20/2016] [Indexed: 12/02/2022] Open
Abstract
Recurrent neural networks (RNNs) have traditionally been of great interest for their capacity to store memories. In past years, several works have been devoted to determining the maximum storage capacity of RNNs, especially for the case of the Hopfield network, the most popular kind of RNN. Analyzing the thermodynamic limit of the statistical properties of the Hamiltonian corresponding to the Hopfield neural network, it has been shown in the literature that the retrieval errors diverge when the number of stored memory patterns (P) exceeds a fraction (≈ 14%) of the network size N. In this paper, we study the storage performance of a generalized Hopfield model, where the diagonal elements of the connection matrix are allowed to be different from zero. We investigate this model at finite N. We give an analytical expression for the number of retrieval errors and show that, by increasing the number of stored patterns over a certain threshold, the errors start to decrease and reach values below unity for P ≫ N. We demonstrate that the strongest trade-off between efficiency and effectiveness relies on the number of patterns (P) that are stored in the network by appropriately fixing the connection weights. When P ≫ N and the diagonal elements of the adjacency matrix are not forced to be zero, the optimal storage capacity is obtained with a number of stored memories much larger than previously reported. This theory paves the way to the design of RNNs with high storage capacity that are able to retrieve the desired pattern without distortions.
Collapse
Affiliation(s)
- Viola Folli
- Center for Life Nanoscience, Istituto Italiano di Tecnologia, Rome, Italy
| | - Marco Leonetti
- Center for Life Nanoscience, Istituto Italiano di Tecnologia, Rome, Italy
| | - Giancarlo Ruocco
- Center for Life Nanoscience, Istituto Italiano di Tecnologia, Rome, Italy; Department of Physics, Sapienza University of Rome, Rome, Italy
| |
Collapse
|
31
|
Luçon E, Stannat W. Transition from Gaussian to non-Gaussian fluctuations for mean-field diffusions in spatial interaction. ANN APPL PROBAB 2016. [DOI: 10.1214/16-aap1194] [Citation(s) in RCA: 12] [Impact Index Per Article: 1.3] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/19/2022]
|
32
|
Bimbard C, Ledoux E, Ostojic S. Instability to a heterogeneous oscillatory state in randomly connected recurrent networks with delayed interactions. Phys Rev E 2016; 94:062207. [PMID: 28085410 DOI: 10.1103/physreve.94.062207] [Citation(s) in RCA: 5] [Impact Index Per Article: 0.6] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/26/2016] [Indexed: 06/06/2023]
Abstract
Oscillatory dynamics are ubiquitous in biological networks. Possible sources of oscillations are well understood in low-dimensional systems but have not been fully explored in high-dimensional networks. Here we study large networks consisting of randomly coupled rate units. We identify a type of bifurcation in which a continuous part of the eigenvalue spectrum of the linear stability matrix crosses the instability line at nonzero frequency. This bifurcation occurs when the interactions are delayed and partially antisymmetric and leads to a heterogeneous oscillatory state in which oscillations are apparent in the activity of individual units but not on the population-average level.
Collapse
Affiliation(s)
- Célian Bimbard
- Laboratoire des Systèmes Perceptifs, Équipe Audition, CNRS UMR 8248, École Normale Supérieure, Paris, France
| | - Erwan Ledoux
- Group for Neural Theory, Laboratoire de Neurosciences Cognitives, INSERM U960, École Normale Supérieure-PSL Research University, Paris, France
| | - Srdjan Ostojic
- Group for Neural Theory, Laboratoire de Neurosciences Cognitives, INSERM U960, École Normale Supérieure-PSL Research University, Paris, France
| |
Collapse
|
33
|
Prakash R, Pandey A. Saturation of number variance in embedded random-matrix ensembles. Phys Rev E 2016; 93:052225. [PMID: 27300898 DOI: 10.1103/physreve.93.052225] [Citation(s) in RCA: 4] [Impact Index Per Article: 0.4] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/09/2015] [Indexed: 11/07/2022]
Abstract
We study fluctuation properties of embedded random-matrix ensembles of noninteracting particles. For ensembles of systems of two noninteracting particles, we find that, unlike the spectra of classical random matrices, the correlation functions are nonstationary. In the locally stationary region of the spectra, we study the number variance and the spacing distributions. The spacing distributions follow Poisson statistics, which is a key signature of uncorrelated spectra. The number variance varies linearly, as in the Poisson case, for short correlation lengths, but a kind of regularization occurs for large correlation lengths and the number variance approaches saturation values. These results are known in the study of integrable systems but are demonstrated here for the first time in random matrix theory. We conjecture that the interacting-particle cases, which exhibit the characteristics of classical random matrices for short correlation lengths, will also show saturation effects for large correlation lengths.
Affiliation(s)
- Ravi Prakash
- School of Physical Sciences, Jawaharlal Nehru University, New Delhi 110067, India
- Akhilesh Pandey
- School of Physical Sciences, Jawaharlal Nehru University, New Delhi 110067, India
34
Abstract
We study a system of [Formula: see text] degrees of freedom coupled via a smooth homogeneous Gaussian vector field with both gradient and divergence-free components. In the absence of coupling, the system relaxes exponentially to an equilibrium at rate μ. We show that, as the ratio of the coupling strength to the relaxation rate increases, the system undergoes an abrupt transition from a topologically trivial phase portrait with a single equilibrium to a topologically nontrivial regime characterized by an exponential number of equilibria, the vast majority of which are expected to be unstable. It is suggested that this picture provides a global view of the nature of the May-Wigner instability transition originally discovered by local linear stability analysis.
35
Martiniani S, Schrenk KJ, Stevenson JD, Wales DJ, Frenkel D. Turning intractable counting into sampling: Computing the configurational entropy of three-dimensional jammed packings. Phys Rev E 2016; 93:012906. [PMID: 26871142 DOI: 10.1103/physreve.93.012906] [Citation(s) in RCA: 16] [Impact Index Per Article: 1.8] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/15/2015] [Indexed: 06/05/2023]
Abstract
We present a numerical calculation of the total number of disordered jammed configurations Ω of N repulsive, three-dimensional spheres in a fixed volume V. To make these calculations tractable, we increase the computational efficiency of the approach of Xu et al. [Phys. Rev. Lett. 106, 245502 (2011)10.1103/PhysRevLett.106.245502] and Asenjo et al. [Phys. Rev. Lett. 112, 098002 (2014)10.1103/PhysRevLett.112.098002] and we extend the method to allow computation of the configurational entropy as a function of pressure. The approach that we use computes the configurational entropy by sampling the absolute volume of basins of attraction of the stable packings in the potential energy landscape. We find a surprisingly strong correlation between the pressure of a configuration and the volume of its basin of attraction in the potential energy landscape. This relation is well described by a power law. Our methodology to compute the number of minima in the potential energy landscape should be applicable to a wide range of other enumeration problems in statistical physics, string theory, cosmology, and machine learning that aim to find the distribution of the extrema of a scalar cost function that depends on many degrees of freedom.
Affiliation(s)
- Stefano Martiniani
- Department of Chemistry, University of Cambridge, Lensfield Road, Cambridge CB2 1EW, United Kingdom
- K Julian Schrenk
- Department of Chemistry, University of Cambridge, Lensfield Road, Cambridge CB2 1EW, United Kingdom
- Jacob D Stevenson
- Department of Chemistry, University of Cambridge, Lensfield Road, Cambridge CB2 1EW, United Kingdom
- Microsoft Research Ltd, 21 Station Road, Cambridge CB1 2FB, United Kingdom
- David J Wales
- Department of Chemistry, University of Cambridge, Lensfield Road, Cambridge CB2 1EW, United Kingdom
- Daan Frenkel
- Department of Chemistry, University of Cambridge, Lensfield Road, Cambridge CB2 1EW, United Kingdom
36
Wainrib G, Galtier M. Regular graphs maximize the variability of random neural networks. PHYSICAL REVIEW. E, STATISTICAL, NONLINEAR, AND SOFT MATTER PHYSICS 2015; 92:032802. [PMID: 26465523 DOI: 10.1103/physreve.92.032802] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 02/17/2014] [Indexed: 06/05/2023]
Abstract
In this work we study the dynamics of systems composed of numerous interacting elements interconnected through a random weighted directed graph, such as models of random neural networks. We develop a theoretical approach that combines the classical mean-field theory originally developed in the context of dynamical spin-glass models with the heterogeneous mean-field theory developed to study epidemic propagation on graphs. Our main result is that, surprisingly, increasing the variance of the in-degree distribution does not lead to more variable dynamical behavior; on the contrary, the most variable behaviors are obtained in the regular-graph setting. We further study how the dynamical complexity of the attractors is influenced by the statistical properties of the in-degree distribution.
Affiliation(s)
- Gilles Wainrib
- Ecole Normale Supérieure, Département d'Informatique, équipe DATA, Paris, France
- Mathieu Galtier
- European Institute for Theoretical Neuroscience, Paris, France
37
Harish O, Hansel D. Asynchronous Rate Chaos in Spiking Neuronal Circuits. PLoS Comput Biol 2015; 11:e1004266. [PMID: 26230679 PMCID: PMC4521798 DOI: 10.1371/journal.pcbi.1004266] [Citation(s) in RCA: 48] [Impact Index Per Article: 4.8] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/22/2014] [Accepted: 04/03/2015] [Indexed: 01/25/2023] Open
Abstract
The brain exhibits temporally complex patterns of activity with features similar to those of chaotic systems. Theoretical studies over the last twenty years have described various computational advantages for such regimes in neuronal systems. Nevertheless, it still remains unclear whether chaos requires specific cellular properties or network architectures, or whether it is a generic property of neuronal circuits. We investigate the dynamics of networks of excitatory-inhibitory (EI) spiking neurons with random sparse connectivity operating in the regime of balance of excitation and inhibition. Combining Dynamical Mean-Field Theory with numerical simulations, we show that chaotic, asynchronous firing rate fluctuations emerge generically for sufficiently strong synapses. Two different mechanisms can lead to these chaotic fluctuations. One mechanism relies on slow I-I inhibition which gives rise to slow subthreshold voltage and rate fluctuations. The decorrelation time of these fluctuations is proportional to the time constant of the inhibition. The second mechanism relies on the recurrent E-I-E feedback loop. It requires slow excitation but the inhibition can be fast. In the corresponding dynamical regime all neurons exhibit rate fluctuations on the time scale of the excitation. Another feature of this regime is that the population-averaged firing rate is substantially smaller in the excitatory population than in the inhibitory population. This is not necessarily the case in the I-I mechanism. Finally, we discuss the neurophysiological and computational significance of our results.
Affiliation(s)
- Omri Harish
- Center for Neurophysics, Physiology and Pathologies, CNRS UMR8119 and Institute of Neuroscience and Cognition, Université Paris Descartes, Paris, France
- David Hansel
- Center for Neurophysics, Physiology and Pathologies, CNRS UMR8119 and Institute of Neuroscience and Cognition, Université Paris Descartes, Paris, France
- The Alexander Silberman Institute of Life Sciences, The Hebrew University of Jerusalem, Jerusalem, Israel
38
Muir DR, Mrsic-Flogel T. Eigenspectrum bounds for semirandom matrices with modular and spatial structure for neural networks. PHYSICAL REVIEW. E, STATISTICAL, NONLINEAR, AND SOFT MATTER PHYSICS 2015; 91:042808. [PMID: 25974548 DOI: 10.1103/physreve.91.042808] [Citation(s) in RCA: 7] [Impact Index Per Article: 0.7] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 09/01/2014] [Indexed: 06/04/2023]
Abstract
The eigenvalue spectrum of the matrix of directed weights defining a neural network model is informative of several stability and dynamical properties of network activity. Existing results for eigenspectra of sparse asymmetric random matrices neglect spatial or other constraints in determining entries in these matrices, and so are of partial applicability to cortical-like architectures. Here we examine a parameterized class of networks that are defined by sparse connectivity, with connection weighting modulated by physical proximity (i.e., asymmetric Euclidean random matrices), modular network partitioning, and functional specificity within the excitatory population. We present a set of analytical constraints that apply to the eigenvalue spectra of associated weight matrices, highlighting the relationship between connectivity rules and classes of network dynamics.
Affiliation(s)
- Dylan R Muir
- Biozentrum, University of Basel, 4056 Basel, Switzerland
39
Mehta D, Hauenstein JD, Niemerg M, Simm NJ, Stariolo DA. Energy landscape of the finite-size mean-field 2-spin spherical model and topology trivialization. PHYSICAL REVIEW. E, STATISTICAL, NONLINEAR, AND SOFT MATTER PHYSICS 2015; 91:022133. [PMID: 25768484 DOI: 10.1103/physreve.91.022133] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.2] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Received: 10/08/2014] [Indexed: 06/04/2023]
Abstract
Motivated by the recently observed phenomenon of topology trivialization of potential energy landscapes (PELs) for several statistical mechanics models, we perform a numerical study of the finite-size 2-spin spherical model using both numerical polynomial homotopy continuation and a reformulation via non-Hermitian matrices. The continuation approach computes all of the complex stationary points of this model while the matrix approach computes the real stationary points. Using these methods, we compute the average number of stationary points while changing the topology of the PEL as well as the variance. Histograms of these stationary points are presented along with an analysis regarding the complex stationary points. This work connects topology trivialization to two different branches of mathematics: algebraic geometry and catastrophe theory, which is fertile ground for further interdisciplinary research.
Affiliation(s)
- Dhagash Mehta
- Department of Applied and Computational Mathematics and Statistics, University of Notre Dame, Notre Dame, Indiana 46565, USA
- Department of Chemistry, The University of Cambridge, Cambridge CB2 1EW, United Kingdom
- Jonathan D Hauenstein
- Department of Applied and Computational Mathematics and Statistics, University of Notre Dame, Notre Dame, Indiana 46565, USA
- Simons Institute for the Theory of Computing, University of California, Berkeley, California 94720-2190, USA
- Matthew Niemerg
- Simons Institute for the Theory of Computing, University of California, Berkeley, California 94720-2190, USA
- National Institute of Mathematical Sciences, Daejeon, Korea
- Nicholas J Simm
- School of Mathematical Sciences, Queen Mary, University of London, Mile End Road, London E1 4NS, United Kingdom
- Daniel A Stariolo
- Instituto de Física, Universidade Federal do Rio Grande do Sul and National Institute of Science and Technology for Complex Systems, CP 15051, 91501-970 Porto Alegre, RS, Brazil
40
Luçon E, Stannat W. Mean field limit for disordered diffusions with singular interactions. ANN APPL PROBAB 2014. [DOI: 10.1214/13-aap968] [Citation(s) in RCA: 31] [Impact Index Per Article: 2.8] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/19/2022]
41
Stability, complexity and robustness in population dynamics. Acta Biotheor 2014; 62:243-84. [PMID: 25107273 DOI: 10.1007/s10441-014-9229-5] [Citation(s) in RCA: 5] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/15/2013] [Accepted: 06/17/2014] [Indexed: 12/21/2022]
Abstract
The problem of stability in population dynamics concerns many domains of application in demography, biology, mechanics and mathematics. The problem is highly generic and independent of the population considered (humans, animals, molecules, …). In this paper we give examples of population dynamics concerning nucleic acids that interact through direct nucleic binding, with small or cyclic RNAs acting on mRNAs or tRNAs as translation factors, or through protein complexes expressed by genes and linked to DNA as transcription factors. The networks made of these interactions between nucleic acids (considered respectively as the edges and nodes of their interaction graph) are complex, but exhibit simple emergent asymptotic behaviours as time tends to infinity, called attractors. We show that a quantity called the attractor entropy plays a crucial role in the study of the stability and robustness of such genetic networks.
42
Jalan S, Dwivedi SK. Extreme-value statistics of brain networks: importance of balanced condition. PHYSICAL REVIEW. E, STATISTICAL, NONLINEAR, AND SOFT MATTER PHYSICS 2014; 89:062718. [PMID: 25019825 DOI: 10.1103/physreve.89.062718] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.2] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 01/17/2014] [Indexed: 06/03/2023]
Abstract
Despite the key role played by inhibitory-excitatory couplings in the functioning of brain networks, the impact of a balanced condition on the stability properties of the underlying networks remains largely unknown. We investigate properties of the largest eigenvalues of networks having such couplings, and find that they follow completely different statistics in the balanced situation. Based on numerical simulations, we demonstrate that the transition from Weibull to Fréchet via the Gumbel distribution can be controlled by the variance of the column sums of the adjacency matrix, which depends monotonically on the denseness of the underlying network. When a balanced condition is imposed, the largest real part of the eigenvalues undergoes a transition to generalized extreme-value statistics, independent of the inhibitory connection probability. Furthermore, the transition to Weibull statistics and the small-world transition occur at the same rewiring probability, reflecting a more stable system.
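A minimal numerical sketch of why the balance condition matters (the construction and parameter values are illustrative assumptions, not the authors' exact ensemble): give each column of a random matrix an excitatory or inhibitory sign, then impose balance by forcing every column sum to zero. Balancing suppresses the large outlier in the largest real eigenvalue that the unbalanced ensemble exhibits.

```python
import numpy as np

def largest_real_eig(N=200, p_inh=0.3, balanced=False, seed=0):
    """Largest real part of the spectrum of a random E-I matrix.

    Columns share a sign (excitatory or inhibitory); `balanced` forces
    every column sum to zero, mimicking a balanced condition.
    """
    rng = np.random.default_rng(seed)
    signs = np.where(rng.random(N) < p_inh, -1.0, 1.0)  # column type: E or I
    A = np.abs(rng.standard_normal((N, N))) * signs     # columns share a sign
    if balanced:
        A -= A.mean(axis=0, keepdims=True)              # zero column sums
    return np.linalg.eigvals(A).real.max()

lam_u = largest_real_eig(balanced=False)
lam_b = largest_real_eig(balanced=True)
print(lam_u, lam_b)  # the balanced matrix loses the large outlier
```

Repeating this over many seeds gives samples of the largest eigenvalue on which extreme-value statistics, as studied in the paper, can be examined.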
Affiliation(s)
- Sarika Jalan
- Complex Systems Lab, Indian Institute of Technology Indore, IET-DAVV Campus Khandwa Road, Indore-452017, India
- Sanjiv K Dwivedi
- Complex Systems Lab, Indian Institute of Technology Indore, IET-DAVV Campus Khandwa Road, Indore-452017, India
43
Abstract
Modeling gene regulatory networks (GRNs) is an important topic in systems biology. Although there has been much work focusing on various specific systems, the generic behavior of GRNs with continuous variables is still elusive. In particular, it is not clear how attractors typically partition among the three types of orbits (steady state, periodic and chaotic), nor how the dynamical properties change with the network's topological characteristics. In this work, we first investigated these questions in random GRNs with different network sizes, connectivity, fractions of inhibitory links and transcription regulation rules. We then searched for the core motifs that govern the dynamic behavior of large GRNs. We show that the stability of a random GRN is typically governed by a few embedded motifs of small size, and therefore can in general be understood in the context of these short motifs. Our results provide insights for the study and design of genetic networks.
44
Ostojic S. Two types of asynchronous activity in networks of excitatory and inhibitory spiking neurons. Nat Neurosci 2014; 17:594-600. [PMID: 24561997 DOI: 10.1038/nn.3658] [Citation(s) in RCA: 174] [Impact Index Per Article: 15.8] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/27/2013] [Accepted: 01/23/2014] [Indexed: 12/14/2022]
Abstract
Asynchronous activity in balanced networks of excitatory and inhibitory neurons is believed to constitute the primary medium for the propagation and transformation of information in the neocortex. Here we show that an unstructured, sparsely connected network of model spiking neurons can display two fundamentally different types of asynchronous activity that imply vastly different computational properties. For weak synaptic couplings, the network at rest is in the well-studied asynchronous state, in which individual neurons fire irregularly at constant rates. In this state, an external input leads to a highly redundant response of different neurons that favors information transmission but hinders more complex computations. For strong couplings, we find that the network at rest displays rich internal dynamics, in which the firing rates of individual neurons fluctuate strongly in time and across neurons. In this regime, the internal dynamics interact with incoming stimuli to provide a substrate for complex information processing and learning.
Affiliation(s)
- Srdjan Ostojic
- Group for Neural Theory, Laboratoire de Neurosciences Cognitives, INSERM U960, École Normale Supérieure, Paris, France
45
Wainrib G, García del Molino LC. Optimal system size for complex dynamics in random neural networks near criticality. CHAOS (WOODBURY, N.Y.) 2013; 23:043134. [PMID: 24387573 DOI: 10.1063/1.4841396] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.1] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 05/09/2023]
Abstract
In this article, we consider a model of dynamical agents coupled through a random connectivity matrix, as introduced by Sompolinsky et al. [Phys. Rev. Lett. 61(3), 259-262 (1988)] in the context of random neural networks. When system size is infinite, it is known that increasing the disorder parameter induces a phase transition leading to chaotic dynamics. We observe and investigate here a novel phenomenon in the sub-critical regime for finite size systems: the probability of observing complex dynamics is maximal for an intermediate system size when the disorder is close enough to criticality. We give a more general explanation of this type of system size resonance in the framework of extreme values theory for eigenvalues of random matrices.
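The finite-size effect described above can be probed with a short Monte Carlo sketch (parameters are illustrative assumptions): draw coupling matrices with i.i.d. N(0, g²/N) entries, for which the infinite-size stability boundary sits at g = 1, and estimate how often the rest state is already unstable at finite N for subcritical g.

```python
import numpy as np

def max_real_eig(N, g, rng):
    """Largest real part of the spectrum of J with i.i.d. N(0, g^2/N) entries."""
    J = g * rng.standard_normal((N, N)) / np.sqrt(N)
    return np.linalg.eigvals(J).real.max()

def p_unstable(N, g, trials=100, seed=0):
    """Monte Carlo estimate of P(max Re eigenvalue > 1): the finite-size
    probability that the rest state is unstable below the infinite-size
    threshold g = 1."""
    rng = np.random.default_rng(seed)
    return float(np.mean([max_real_eig(N, g, rng) > 1.0 for _ in range(trials)]))

# Far from criticality the infinite-size picture already holds:
print(max_real_eig(300, 0.5, np.random.default_rng(1)))  # well below 1
print(max_real_eig(300, 2.0, np.random.default_rng(1)))  # well above 1
```

Scanning `p_unstable` over N at fixed subcritical g close to 1 is one way to look for the intermediate-system-size maximum reported in the paper; the scan itself is omitted here as it is computationally heavier.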
Affiliation(s)
- Gilles Wainrib
- Laboratoire Analyse Géométrie et Applications, Université Paris XIII, Villetaneuse, France
46
García del Molino LC, Pakdaman K, Touboul J, Wainrib G. Synchronization in random balanced networks. PHYSICAL REVIEW. E, STATISTICAL, NONLINEAR, AND SOFT MATTER PHYSICS 2013; 88:042824. [PMID: 24229242 DOI: 10.1103/physreve.88.042824] [Citation(s) in RCA: 10] [Impact Index Per Article: 0.8] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Received: 06/18/2013] [Indexed: 06/02/2023]
Abstract
Characterizing the influence of network properties on the global emerging behavior of interacting elements constitutes a central question in many areas, from physical to social sciences. In this article we study a primary model of disordered neuronal networks with excitatory-inhibitory structure and balance constraints. We show how the interplay between structure and disorder in the connectivity leads to a universal transition from trivial to synchronized stationary or periodic states. This transition cannot be explained only through the analysis of the spectral density of the connectivity matrix. We provide a low-dimensional approximation that shows the role of both the structure and disorder in the dynamics.