1. Liang J, Yang Z, Zhou C. Excitation-Inhibition Balance, Neural Criticality, and Activities in Neuronal Circuits. Neuroscientist 2025;31:31-46. PMID: 38291889. DOI: 10.1177/10738584231221766.
Abstract
Neural activities in local circuits exhibit complex and multilevel dynamic features. Individual neurons spike irregularly, which is believed to originate from receiving balanced amounts of excitatory and inhibitory inputs, known as the excitation-inhibition balance. The spatiotemporal cascades of clustered neuronal spikes occur in variable sizes and durations, manifested as neural avalanches with scale-free features. These may be explained by the neural criticality hypothesis, which posits that neural systems operate around the transition between distinct dynamic states. Here, we summarize the experimental evidence for and the underlying theory of excitation-inhibition balance and neural criticality. Furthermore, we review recent studies of excitatory-inhibitory networks with synaptic kinetics as a simple solution to reconcile these two apparently distinct theories in a single circuit model. This provides a more unified understanding of multilevel neural activities in local circuits, from spontaneous to stimulus-response dynamics.
Affiliation(s)
- Junhao Liang
- Eberhard Karls University of Tübingen and Max Planck Institute for Biological Cybernetics, Tübingen, Germany
- Zhuda Yang
- Department of Physics, Centre for Nonlinear Studies and Beijing-Hong Kong-Singapore Joint Centre for Nonlinear and Complex Systems (Hong Kong), Institute of Computational and Theoretical Studies, Hong Kong Baptist University, Kowloon Tong, Hong Kong
- Changsong Zhou
- Department of Physics, Centre for Nonlinear Studies and Beijing-Hong Kong-Singapore Joint Centre for Nonlinear and Complex Systems (Hong Kong), Institute of Computational and Theoretical Studies, Hong Kong Baptist University, Kowloon Tong, Hong Kong
- Life Science Imaging Centre, Hong Kong Baptist University, Kowloon Tong, Hong Kong
- Research Centre, Hong Kong Baptist University Institute of Research and Continuing Education, Shenzhen, China
2. Liu C, Dong JQ, Yu LC, Huang L. Continuum percolation of two-dimensional adaptive dynamics systems. Phys Rev E 2024;110:024111. PMID: 39295008. DOI: 10.1103/physreve.110.024111.
Abstract
The percolation phase transition of a continuum adaptive neuron system with homeostasis is investigated. To maintain its average activity at a particular level, each neuron (represented by a disk) varies its connection radius until the sum of its overlapping areas with neighboring neurons (representing the overall connection strength in the network) reaches a fixed target area. Tuning the two key parameters of the model, namely the density, defined as the number of neurons (disks) per unit area, and the target sum of each disk's overlapping area with its adjacent disks, can drive the system into the critical percolating state. At the critical state these two parameters are inversely proportional to each other, and the critical filling factor is fixed at about 0.7157, which is much smaller than that of continuum percolation with uniform disks. It is also confirmed that the critical exponents of this model are the same as those of standard two-dimensional lattice percolation. Although the critical state is relatively sensitive and exhibits long-range spatial correlations, local fluctuations do not propagate over long ranges through the system under the adaptive dynamics, which renders the system robust overall against perturbations.
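The homeostatic ingredient described in this abstract can be sketched numerically. The following is not the authors' implementation, only a minimal toy under assumed parameters (the disk count, learning rate `eta`, and target overlap are arbitrary illustrative choices): each disk multiplicatively adjusts its connection radius until its summed lens-shaped overlap with neighboring disks approaches the target area.

```python
import numpy as np

def lens_area(r1, r2, d):
    """Overlap area of two disks with radii r1, r2 and center distance d."""
    if d >= r1 + r2:
        return 0.0
    if d <= abs(r1 - r2):
        return np.pi * min(r1, r2) ** 2  # smaller disk fully contained
    a1 = r1**2 * np.arccos(np.clip((d**2 + r1**2 - r2**2) / (2 * d * r1), -1.0, 1.0))
    a2 = r2**2 * np.arccos(np.clip((d**2 + r2**2 - r1**2) / (2 * d * r2), -1.0, 1.0))
    tri = 0.5 * np.sqrt(max((-d + r1 + r2) * (d + r1 - r2)
                            * (d - r1 + r2) * (d + r1 + r2), 0.0))
    return a1 + a2 - tri

def total_overlaps(r, d):
    """Summed overlap of every disk with all other disks."""
    n = len(r)
    return np.array([sum(lens_area(r[i], r[j], d[i, j])
                         for j in range(n) if j != i) for i in range(n)])

def homeostatic_radii(pos, target, r0=0.08, eta=0.2, steps=300):
    """Each disk multiplicatively adapts its radius toward a fixed total overlap."""
    r = np.full(len(pos), r0)
    d = np.linalg.norm(pos[:, None] - pos[None, :], axis=-1)  # pairwise distances
    for _ in range(steps):
        overlap = total_overlaps(r, d)
        # grow if below target, shrink if above; factor clipped for stability
        r *= np.clip(1.0 + eta * (target - overlap) / target, 0.5, 1.5)
        r = np.clip(r, 1e-3, 1.0)
    return r, total_overlaps(r, d)

rng = np.random.default_rng(0)
pos = rng.random((40, 2))  # neuron (disk) centers in the unit square
r, overlap = homeostatic_radii(pos, target=0.02)
print(np.mean(np.abs(overlap - 0.02) / 0.02))  # mean relative miss of the target
```

The paper's analysis additionally tracks the percolation transition as density and target area are tuned; this sketch only shows the per-disk homeostasis converging toward the target overlap.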
Affiliation(s)
- Chang Liu
- Lanzhou Center for Theoretical Physics, Key Laboratory of Quantum Theory and Applications of MoE and Key Laboratory of Theoretical Physics of Gansu Province, Lanzhou University, Lanzhou, Gansu 730000, China
- Jia-Qi Dong
- Lanzhou Center for Theoretical Physics, Key Laboratory of Quantum Theory and Applications of MoE and Key Laboratory of Theoretical Physics of Gansu Province, Lanzhou University, Lanzhou, Gansu 730000, China
- Lian-Chun Yu
- Lanzhou Center for Theoretical Physics, Key Laboratory of Quantum Theory and Applications of MoE and Key Laboratory of Theoretical Physics of Gansu Province, Lanzhou University, Lanzhou, Gansu 730000, China
- Liang Huang
- Lanzhou Center for Theoretical Physics, Key Laboratory of Quantum Theory and Applications of MoE and Key Laboratory of Theoretical Physics of Gansu Province, Lanzhou University, Lanzhou, Gansu 730000, China
3. Fontenele AJ, Sooter JS, Norman VK, Gautam SH, Shew WL. Low-dimensional criticality embedded in high-dimensional awake brain dynamics. Sci Adv 2024;10:eadj9303. PMID: 38669340. PMCID: PMC11051676. DOI: 10.1126/sciadv.adj9303.
Abstract
Whether cortical neurons operate in a strongly or weakly correlated dynamical regime determines fundamental information processing capabilities and has fueled decades of debate. We offer a resolution of this debate; we show that two important dynamical regimes, typically considered incompatible, can coexist in the same local cortical circuit by separating them into two different subspaces. In awake mouse motor cortex, we find a low-dimensional subspace with large fluctuations consistent with criticality, a dynamical regime with moderate correlations and multi-scale information capacity and transmission. Orthogonal to this critical subspace, we find a high-dimensional subspace containing a desynchronized dynamical regime, which may optimize input discrimination. The critical subspace is apparent only at long timescales, which explains discrepancies among some previous studies. Using a computational model, we show that the emergence of a low-dimensional critical subspace at large timescales agrees with established theory of critical dynamics. Our results suggest that the cortex leverages its high dimensionality to multiplex dynamical regimes across different subspaces.
4. Fontenele AJ, Sooter JS, Norman VK, Gautam SH, Shew WL. Low dimensional criticality embedded in high dimensional awake brain dynamics. bioRxiv [Preprint] 2023:2023.01.05.522896. PMID: 37546833. PMCID: PMC10401950. DOI: 10.1101/2023.01.05.522896.
Abstract
Whether cortical neurons operate in a strongly or weakly correlated dynamical regime determines fundamental information processing capabilities and has fueled decades of debate. Here we offer a resolution of this debate; we show that two important dynamical regimes, typically considered incompatible, can coexist in the same local cortical circuit by separating them into two different subspaces. In awake mouse motor cortex, we find a low-dimensional subspace with large fluctuations consistent with criticality, a dynamical regime with moderate correlations and multi-scale information capacity and transmission. Orthogonal to this critical subspace, we find a high-dimensional subspace containing a desynchronized dynamical regime, which may optimize input discrimination. The critical subspace is apparent only at long timescales, which explains discrepancies among some previous studies. Using a computational model, we show that the emergence of a low-dimensional critical subspace at large timescales agrees with established theory of critical dynamics. Our results suggest that the cortex leverages its high dimensionality to multiplex dynamical regimes across different subspaces.
Affiliation(s)
- Antonio J. Fontenele
- UA Integrative Systems Neuroscience Group, Department of Physics, University of Arkansas, Fayetteville, AR, USA, 72701
- J. Samuel Sooter
- UA Integrative Systems Neuroscience Group, Department of Physics, University of Arkansas, Fayetteville, AR, USA, 72701
- V. Kindler Norman
- UA Integrative Systems Neuroscience Group, Department of Physics, University of Arkansas, Fayetteville, AR, USA, 72701
- Shree Hari Gautam
- UA Integrative Systems Neuroscience Group, Department of Physics, University of Arkansas, Fayetteville, AR, USA, 72701
- Woodrow L. Shew
- UA Integrative Systems Neuroscience Group, Department of Physics, University of Arkansas, Fayetteville, AR, USA, 72701
5. Sormunen S, Gross T, Saramäki J. Critical Drift in a Neuro-Inspired Adaptive Network. Phys Rev Lett 2023;130:188401. PMID: 37204886. DOI: 10.1103/physrevlett.130.188401.
Abstract
It has been postulated that the brain operates in a self-organized critical state that brings multiple benefits, such as optimal sensitivity to input. Thus far, self-organized criticality has typically been depicted as a one-dimensional process, where one parameter is tuned to a critical value. However, the number of adjustable parameters in the brain is vast, and hence critical states can be expected to occupy a high-dimensional manifold inside a high-dimensional parameter space. Here, we show that adaptation rules inspired by homeostatic plasticity drive a neuro-inspired network to drift on a critical manifold, where the system is poised between inactivity and persistent activity. During the drift, global network parameters continue to change while the system remains at criticality.
Affiliation(s)
- Silja Sormunen
- Department of Computer Science, Aalto University, 00076 Espoo, Finland
- Thilo Gross
- Helmholtz Institute for Functional Marine Biodiversity at the University of Oldenburg (HIFMB), Oldenburg 26129, Germany
- Alfred-Wegener Institute, Helmholtz Centre for Marine and Polar Research, Bremerhaven 27570, Germany
- Institute for Chemistry and Biology of the Marine Environment (ICBM), Carl-von-Ossietzky University, Oldenburg 26129, Germany
- Jari Saramäki
- Department of Computer Science, Aalto University, 00076 Espoo, Finland
6. Yadav AC, Quadir A, Jafri HH. Finite-size scaling of critical avalanches. Phys Rev E 2022;106:014148. PMID: 35974645. DOI: 10.1103/physreve.106.014148.
Abstract
We examine the probability distribution of avalanche sizes observed in self-organized critical systems. While a power-law distribution with a cutoff due to finite system size is the typical behavior, a systematic investigation reveals that, at a fixed avalanche size, the distribution may also decrease with increasing system size. We implement the scaling method and identify scaling functions. The data collapse ensures a correct estimation of the critical exponents and distinguishes two exponents related to avalanche size and system size. Our simple analysis provides striking implications. While the exact value of the avalanche size exponent remains elusive for the prototype sandpile on a square lattice, we suggest the exponent should be 1. The simulation results show that the distribution has a logarithmic system-size dependence, consistent with the normalization condition. We also argue that for the train or Oslo sandpile model with bulk drive, the avalanche size exponent is slightly less than 1, which differs significantly from the previous estimate of 1.11.
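The data-collapse procedure this abstract relies on can be illustrated on synthetic data generated directly from the finite-size scaling ansatz P(s, L) = s^(-tau) f(s / L^D): rescaling s^tau P(s, L) against s / L^D should map curves for all system sizes L onto one master curve. The exponent values below are illustrative placeholders, not the paper's fitted estimates.

```python
import numpy as np

# Assumed finite-size scaling ansatz: P(s, L) = s^(-tau) * f(s / L^D).
tau, D = 1.0, 2.0  # illustrative choices (the abstract argues tau should be 1)

def p_avalanche(s, L):
    """Synthetic avalanche-size distribution obeying the scaling ansatz."""
    return s ** (-tau) * np.exp(-s / L ** D)

def collapse(s, L):
    """Rescaled coordinates: s^tau * P(s, L) versus s / L^D."""
    return s / L ** D, s ** tau * p_avalanche(s, L)

# Evaluate the rescaled curves on a common axis for several system sizes.
x_grid = np.logspace(-2, 1, 50)
curves = []
for L in (16, 32, 64):
    s = x_grid * L ** D  # choose s so each size lands on the same rescaled x
    _, y = collapse(s, L)
    curves.append(y)
spread = np.max(np.abs(curves[0] - curves[2]))
print(spread)  # ~0: the curves collapse onto the master curve f(x) = exp(-x)
```

With real simulation data the collapse is imperfect, and tau and D are tuned until the spread between rescaled curves is minimized; that tuning is what distinguishes the two exponents mentioned in the abstract.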
Affiliation(s)
- Avinash Chand Yadav
- Department of Physics, Institute of Science, Banaras Hindu University, Varanasi 221 005, India
- Abdul Quadir
- Department of Physics, Aligarh Muslim University, Aligarh 202 002, India
- Haider Hasan Jafri
- Department of Physics, Aligarh Muslim University, Aligarh 202 002, India
7. Khoshkhou M, Montakhab A. Optimal reinforcement learning near the edge of a synchronization transition. Phys Rev E 2022;105:044312. PMID: 35590577. DOI: 10.1103/physreve.105.044312.
Abstract
Recent experimental and theoretical studies have indicated that the putative criticality of cortical dynamics may correspond to a synchronization phase transition. The critical dynamics near such a critical point needs further investigation specifically when compared to the critical behavior near the standard absorbing state phase transition. Since the phenomena of learning and self-organized criticality (SOC) at the edge of synchronization transition can emerge jointly in spiking neural networks due to the presence of spike-timing dependent plasticity (STDP), it is tempting to ask the following: what is the relationship between synchronization and learning in neural networks? Further, does learning benefit from SOC at the edge of synchronization transition? In this paper, we intend to address these important issues. Accordingly, we construct a biologically inspired model of a cognitive system which learns to perform stimulus-response tasks. We train this system using a reinforcement learning rule implemented through dopamine-modulated STDP. We find that the system exhibits a continuous transition from synchronous to asynchronous neural oscillations upon increasing the average axonal time delay. We characterize the learning performance of the system and observe that it is optimized near the synchronization transition. We also study neuronal avalanches in the system and provide evidence that optimized learning is achieved in a slightly supercritical state.
Affiliation(s)
- Mahsa Khoshkhou
- Department of Physics, College of Sciences, Shiraz University, Shiraz 71946-84795, Iran
- Afshin Montakhab
- Department of Physics, College of Sciences, Shiraz University, Shiraz 71946-84795, Iran
8. Drifting assemblies for persistent memory: Neuron transitions and unsupervised compensation. Proc Natl Acad Sci U S A 2021;118:2023832118. PMID: 34772802. DOI: 10.1073/pnas.2023832118.
Abstract
Change is ubiquitous in living beings. In particular, the connectome and neural representations can change. Nevertheless, behaviors and memories often persist over long times. In a standard model, associative memories are represented by assemblies of strongly interconnected neurons. For faithful storage these assemblies are assumed to consist of the same neurons over time. Here we propose a contrasting memory model with complete temporal remodeling of assemblies, based on experimentally observed changes of synapses and neural representations. The assemblies drift freely as noisy autonomous network activity and spontaneous synaptic turnover induce neuron exchange. The gradual exchange allows activity-dependent and homeostatic plasticity to conserve the representational structure and keep inputs, outputs, and assemblies consistent. This leads to persistent memory. Our findings explain recent experimental results on temporal evolution of fear memory representations and suggest that memory systems need to be understood in their completeness as individual parts may constantly change.
9. Gu L, Wu R. Robust cortical criticality and diverse dynamics resulting from functional specification. Phys Rev E 2021;103:042407. PMID: 34005915. DOI: 10.1103/physreve.103.042407.
Abstract
Despite the recognition of the layered structure and evident criticality in the cortex, how the specification of input, output, and computational layers affects self-organized criticality has not been much explored. By constructing heterogeneous structures with a well-accepted model of leaky neurons, we find that this specification can lead to robust criticality that is rather insensitive to the strength of external stimuli. This naturally unifies adaptation to strong inputs without extra synaptic plasticity mechanisms. A low degree of recurrence constitutes an alternative explanation for subcriticality, other than high-frequency inputs. Unlike fully recurrent networks, where external stimuli always render subcriticality, the dynamics of networks with sufficient feedforward connections can be driven to criticality and supercriticality. These findings indicate that functional and structural specification, and their interplay with external stimuli, are of crucial importance for the network dynamics. The robust criticality puts forward networks of leaky neurons as promising platforms for realizing artificial neural networks that work in the vicinity of critical points.
Affiliation(s)
- Lei Gu
- Department of Physics and Astronomy, University of California, Irvine, California 92697, USA
- Ruqian Wu
- Department of Physics and Astronomy, University of California, Irvine, California 92697, USA
10. Antonov NV, Gulitskiy NM, Kakin PI, Serov VD. Effects of turbulent environment and random noise on self-organized critical behavior: Universality versus nonuniversality. Phys Rev E 2021;103:042106. PMID: 34005875. DOI: 10.1103/physreve.103.042106.
Abstract
Self-organized criticality in the Hwa-Kardar model of a "running sandpile" [Phys. Rev. Lett. 62, 1813 (1989); Phys. Rev. A 45, 7002 (1992)] with a turbulent motion of the environment taken into account is studied with the field theoretic renormalization group (RG). The turbulent flow is modeled by the synthetic d-dimensional generalization of the anisotropic Gaussian velocity ensemble with finite correlation time, introduced by Avellaneda and Majda [Commun. Math. Phys. 131, 381 (1990); Commun. Math. Phys. 146, 139 (1992)]. The Hwa-Kardar model with time-independent (spatially quenched) random noise is considered alongside the original model with white noise. The aim of the present paper is to explore fixed points of the RG equations, which determine the possible types of universality classes (regimes of critical behavior of the system) and critical dimensions of the measurable quantities. Our calculations demonstrate that the influence of the type of random noise is extremely large: in contrast to the case of white noise, where the system possesses three fixed points, the case of spatially quenched noise involves four fixed points with overlapping stability regions. This means that in the latter case the critical behavior of the system depends not only on the global parameters of the system, which is the usual case, but also on the initial values of the charges (coupling constants) of the system. These initial conditions determine the specific fixed point which will be reached by the RG flow. Since the critical properties of the system are then not defined strictly by its parameters, the situation may be interpreted as a universality violation. Such systems are not forbidden, but they are rather rare. It is especially interesting that the same model without turbulent motion of the environment does not predict this nonuniversal behavior and instead demonstrates the usual behavior with prescribed universality classes [J. Stat. Phys. 178, 392 (2020)].
Affiliation(s)
- N V Antonov
- Department of Physics, Saint Petersburg State University, 7/9 Universitetskaya nab., Saint Petersburg 199034, Russian Federation
- N M Gulitskiy
- Department of Physics, Saint Petersburg State University, 7/9 Universitetskaya nab., Saint Petersburg 199034, Russian Federation
- P I Kakin
- Department of Physics, Saint Petersburg State University, 7/9 Universitetskaya nab., Saint Petersburg 199034, Russian Federation
- V D Serov
- Department of Physics, Saint Petersburg State University, 7/9 Universitetskaya nab., Saint Petersburg 199034, Russian Federation
- Department of Theoretical Physics, Peter the Great Saint Petersburg Polytechnic University, 29 Polytechnicheskaya st., Saint Petersburg 195251, Russian Federation
11. Notarmuzi D, Castellano C, Flammini A, Mazzilli D, Radicchi F. Percolation theory of self-exciting temporal processes. Phys Rev E 2021;103:L020302. PMID: 33736024. DOI: 10.1103/physreve.103.l020302.
Abstract
We investigate how the properties of inhomogeneous patterns of activity, appearing in many natural and social phenomena, depend on the temporal resolution used to define individual bursts of activity. To this end, we consider time series of microscopic events produced by a self-exciting Hawkes process, and leverage a percolation framework to study the formation of macroscopic bursts of activity as a function of the resolution parameter. We find that the very same process may result in different distributions of avalanche size and duration, which are understood in terms of the competition between the 1D percolation and the branching process universality classes. Pure regimes for the individual classes are observed at specific values of the resolution parameter corresponding to the critical points of the percolation diagram. A crossover regime characterized by a mixture of the two universal behaviors is observed in a wide region of the diagram. The hybrid scaling appears to be a likely outcome for an analysis of the time series based on a reasonably chosen, but not precisely adjusted, value of the resolution parameter.
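The role of the resolution parameter can be sketched with a toy Poisson-cluster (Hawkes-like) event train: events are grouped into bursts whenever the inter-event gap stays below the resolution parameter, so coarser resolutions merge bursts into fewer, larger ones. The branching ratio, decay rate, and resolution values below are illustrative assumptions, not the paper's settings.

```python
import numpy as np

rng = np.random.default_rng(1)

def hawkes_like_events(n_roots=200, mu=0.9, decay=1.0, horizon=1000.0):
    """Poisson-cluster (Hawkes-like) event times: each event spawns
    Poisson(mu) children at exponentially distributed delays (mu < 1: subcritical)."""
    times = list(rng.uniform(0, horizon, n_roots))
    stack = list(times)
    while stack:
        t = stack.pop()
        for _ in range(rng.poisson(mu)):
            c = t + rng.exponential(1.0 / decay)
            if c < horizon:
                times.append(c)
                stack.append(c)
    return np.sort(np.array(times))

def burst_sizes(times, delta):
    """Group events into bursts: a gap longer than the resolution
    parameter delta closes the current burst."""
    sizes, size = [], 1
    for g in np.diff(times):
        if g <= delta:
            size += 1
        else:
            sizes.append(size)
            size = 1
    sizes.append(size)
    return np.array(sizes)

times = hawkes_like_events()
for delta in (0.01, 0.1, 1.0):
    sizes = burst_sizes(times, delta)
    print(delta, len(sizes), sizes.max())  # coarser resolution: fewer, larger bursts
```

Because bursts at a larger delta are unions of bursts at a smaller delta, the size distribution shifts systematically with the resolution parameter, which is the dependence the paper analyzes within a percolation framework.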
Affiliation(s)
- Daniele Notarmuzi
- Center for Complex Networks and Systems Research, Luddy School of Informatics, Computing, and Engineering, Indiana University, Bloomington, Indiana 47408, USA
- Claudio Castellano
- Istituto dei Sistemi Complessi (ISC-CNR), Via dei Taurini 19, I-00185 Rome, Italy
- Alessandro Flammini
- Center for Complex Networks and Systems Research, Luddy School of Informatics, Computing, and Engineering, Indiana University, Bloomington, Indiana 47408, USA
- Dario Mazzilli
- Center for Complex Networks and Systems Research, Luddy School of Informatics, Computing, and Engineering, Indiana University, Bloomington, Indiana 47408, USA
- Filippo Radicchi
- Center for Complex Networks and Systems Research, Luddy School of Informatics, Computing, and Engineering, Indiana University, Bloomington, Indiana 47408, USA
12. Gross T. Not One, but Many Critical States: A Dynamical Systems Perspective. Front Neural Circuits 2021;15:614268. PMID: 33737868. PMCID: PMC7960911. DOI: 10.3389/fncir.2021.614268.
Abstract
The past decade has seen growing support for the critical brain hypothesis, i.e., the possibility that the brain could operate at or very near a critical state between two different dynamical regimes. Such critical states are well studied in different disciplines, so there is potential for a continued transfer of knowledge. Here, I revisit the foundations of bifurcation theory, the mathematical theory of transitions. While the mathematics is well known, its transfer to neural dynamics leads to new insights and hypotheses.
Affiliation(s)
- Thilo Gross
- Helmholtz Institute for Functional Marine Biodiversity (HIFMB), Oldenburg, Germany
- Institute for Chemistry and Biology of the Marine Environment (ICBM), Carl-von-Ossietzky Universität Oldenburg, Oldenburg, Germany
- Helmholtz Center for Marine and Polar Research, Alfred-Wegener-Institute, Bremerhaven, Germany
13. Landmann S, Baumgarten L, Bornholdt S. Self-organized criticality in neural networks from activity-based rewiring. Phys Rev E 2021;103:032304. PMID: 33862737. DOI: 10.1103/physreve.103.032304.
Abstract
Neural systems process information in a dynamical regime between silence and chaotic dynamics. This has led to the criticality hypothesis, which suggests that neural systems reach such a state by self-organizing toward the critical point of a dynamical phase transition. Here, we study a minimal neural network model that exhibits self-organized criticality in the presence of stochastic noise using a rewiring rule that utilizes only local information. For network evolution, incoming links are added to a node or deleted, depending on the node's average activity. Based on this rewiring rule alone, the network evolves toward a critical state, showing typical power-law-distributed avalanche statistics. The observed exponents are in accord with criticality as predicted by dynamical scaling theory, as well as with the observed exponents of neural avalanches. The critical state of the model is reached autonomously without the need for parameter tuning, is independent of initial conditions, is robust under stochastic noise, and, as different variants of the model indicate, is independent of implementation details. We argue that this supports the hypothesis that real neural systems may utilize such a mechanism to self-organize toward criticality, especially during early developmental stages.
Affiliation(s)
- Stefan Landmann
- Institut für Theoretische Physik, Universität Bremen, Germany
14. Heiney K, Huse Ramstad O, Fiskum V, Christiansen N, Sandvig A, Nichele S, Sandvig I. Criticality, Connectivity, and Neural Disorder: A Multifaceted Approach to Neural Computation. Front Comput Neurosci 2021;15:611183. PMID: 33643017. PMCID: PMC7902700. DOI: 10.3389/fncom.2021.611183.
Abstract
It has been hypothesized that the brain optimizes its capacity for computation by self-organizing to a critical point. The dynamical state of criticality is achieved by striking a balance such that activity can effectively spread through the network without overwhelming it and is commonly identified in neuronal networks by observing the behavior of cascades of network activity termed "neuronal avalanches." The dynamic activity that occurs in neuronal networks is closely intertwined with how the elements of the network are connected and how they influence each other's functional activity. In this review, we highlight how studying criticality with a broad perspective that integrates concepts from physics, experimental and theoretical neuroscience, and computer science can provide a greater understanding of the mechanisms that drive networks to criticality and how their disruption may manifest in different disorders. First, integrating graph theory into experimental studies on criticality, as is becoming more common in theoretical and modeling studies, would provide insight into the kinds of network structures that support criticality in networks of biological neurons. Furthermore, plasticity mechanisms play a crucial role in shaping these neural structures, both in terms of homeostatic maintenance and learning. Both network structures and plasticity have been studied fairly extensively in theoretical models, but much work remains to bridge the gap between theoretical and experimental findings. Finally, information theoretical approaches can tie in more concrete evidence of a network's computational capabilities. Approaching neural dynamics with all these facets in mind has the potential to provide a greater understanding of what goes wrong in neural disorders. Criticality analysis therefore holds potential to identify disruptions to healthy dynamics, granted that robust methods and approaches are considered.
Affiliation(s)
- Kristine Heiney
- Department of Computer Science, Oslo Metropolitan University, Oslo, Norway
- Department of Computer Science, Norwegian University of Science and Technology (NTNU), Trondheim, Norway
- Ola Huse Ramstad
- Department of Neuromedicine and Movement Science, Norwegian University of Science and Technology (NTNU), Trondheim, Norway
- Vegard Fiskum
- Department of Neuromedicine and Movement Science, Norwegian University of Science and Technology (NTNU), Trondheim, Norway
- Nicholas Christiansen
- Department of Neuromedicine and Movement Science, Norwegian University of Science and Technology (NTNU), Trondheim, Norway
- Axel Sandvig
- Department of Neuromedicine and Movement Science, Norwegian University of Science and Technology (NTNU), Trondheim, Norway
- Department of Clinical Neuroscience, Umeå University Hospital, Umeå, Sweden
- Department of Neurology, St. Olav's Hospital, Trondheim, Norway
- Stefano Nichele
- Department of Computer Science, Oslo Metropolitan University, Oslo, Norway
- Department of Holistic Systems, Simula Metropolitan, Oslo, Norway
- Ioanna Sandvig
- Department of Neuromedicine and Movement Science, Norwegian University of Science and Technology (NTNU), Trondheim, Norway
15. Whitelam S. Improving the Accuracy of Nearest-Neighbor Classification Using Principled Construction and Stochastic Sampling of Training-Set Centroids. Entropy (Basel) 2021;23:149. PMID: 33530507. PMCID: PMC7911166. DOI: 10.3390/e23020149.
Abstract
A conceptually simple way to classify images is to directly compare test-set data and training-set data. The accuracy of this approach is limited by the method of comparison used, and by the extent to which the training-set data cover configuration space. Here we show that this coverage can be substantially increased using coarse-graining (replacing groups of images by their centroids) and stochastic sampling (using distinct sets of centroids in combination). We use the MNIST and Fashion-MNIST data sets to show that a principled coarse-graining algorithm can convert training images into fewer image centroids without loss of accuracy of classification of test-set images by nearest-neighbor classification. Distinct batches of centroids can be used in combination as a means of stochastically sampling configuration space, and can classify test-set data more accurately than can the unaltered training set. On the MNIST and Fashion-MNIST data sets this approach converts nearest-neighbor classification from a mid-ranking- to an upper-ranking member of the set of classical machine-learning techniques.
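The coarse-graining idea in this abstract is easy to sketch. The following is not the paper's MNIST pipeline, only a toy numpy illustration under assumed conditions (synthetic 2D Gaussian classes stand in for images, and the centroid count `k` is arbitrary): each class is coarse-grained into a few centroids via plain Lloyd iterations, and test points are then classified by their nearest centroid.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_blobs(n_per_class=200):
    """Synthetic stand-in for image data: two well-separated Gaussian classes."""
    x0 = rng.normal(loc=(0.0, 0.0), scale=0.5, size=(n_per_class, 2))
    x1 = rng.normal(loc=(2.0, 2.0), scale=0.5, size=(n_per_class, 2))
    X = np.vstack([x0, x1])
    y = np.array([0] * n_per_class + [1] * n_per_class)
    return X, y

def class_centroids(X, y, k=5, iters=20):
    """Coarse-grain each class into k centroids (plain Lloyd iterations)."""
    cents, labels = [], []
    for c in np.unique(y):
        pts = X[y == c]
        m = pts[rng.choice(len(pts), k, replace=False)]  # init from data points
        for _ in range(iters):
            assign = np.argmin(((pts[:, None] - m[None]) ** 2).sum(-1), axis=1)
            for j in range(k):
                if np.any(assign == j):
                    m[j] = pts[assign == j].mean(axis=0)
        cents.append(m)
        labels += [c] * k
    return np.vstack(cents), np.array(labels)

def predict_nn(train_X, train_y, test_X):
    """1-nearest-neighbor classification against the (coarse-grained) training set."""
    d = ((test_X[:, None] - train_X[None]) ** 2).sum(-1)
    return train_y[np.argmin(d, axis=1)]

X, y = make_blobs()
Xt, yt = make_blobs()  # fresh draw as a test set
cents, clabels = class_centroids(X, y)
acc = np.mean(predict_nn(cents, clabels, Xt) == yt)
print(acc)  # high accuracy from far fewer training points (centroids)
```

The paper's stochastic-sampling step, combining distinct batches of centroids, is omitted here; this sketch only shows the coarse-graining half: classification against a handful of centroids instead of the full training set.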
Collapse
Affiliation(s)
- Stephen Whitelam
- Molecular Foundry, Lawrence Berkeley National Laboratory, 1 Cyclotron Road, Berkeley, CA 94720, USA
| |
Collapse
|
16
|
Wilting J, Priesemann V. Between Perfectly Critical and Fully Irregular: A Reverberating Model Captures and Predicts Cortical Spike Propagation. Cereb Cortex 2020; 29:2759-2770. [PMID: 31008508 PMCID: PMC6519697 DOI: 10.1093/cercor/bhz049] [Citation(s) in RCA: 35] [Impact Index Per Article: 7.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/28/2018] [Revised: 01/20/2019] [Indexed: 12/11/2022] Open
Abstract
Knowledge about the collective dynamics of cortical spiking is very informative about the underlying coding principles. However, even the most basic properties are not known with certainty, because their assessment is hampered by spatial subsampling, i.e., the limitation that only a tiny fraction of all neurons can be recorded simultaneously with millisecond precision. Building on a novel, subsampling-invariant estimator, we fit and carefully validate a minimal model for cortical spike propagation. The model interpolates between two prominent states: asynchronous and critical. We find neither of them in cortical spike recordings across various species, but instead identify a narrow "reverberating" regime. This approach enables us to predict yet unknown properties from very short recordings and for every circuit individually, including responses to minimal perturbations, intrinsic network timescales, and the strength of external input compared to recurrent activation, thereby informing about the underlying coding principles for each circuit, area, state and task.
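The lag-k regression idea behind such a subsampling-invariant estimator can be sketched as follows. This is a simplified illustration of multistep regression on a driven branching process, not the authors' exact implementation; the function names and the clean Poisson process in the usage are assumptions. The key point is that regression slopes of activity at lag k decay as m**k even under subsampling, so an exponential fit recovers the branching parameter m.

```python
import numpy as np

def lagged_slopes(a, kmax):
    """Linear-regression slope of activity at time t+k on activity at t."""
    a = np.asarray(a, dtype=float)
    slopes = []
    for k in range(1, kmax + 1):
        slopes.append(np.polyfit(a[:-k], a[k:], 1)[0])
    return np.array(slopes)

def mr_estimate(a, kmax=20):
    """Estimate the branching parameter m from r_k ~ const * m**k.

    Fit on a log scale; assumes all estimated slopes are positive,
    which holds for the long, clean simulated series used here.
    """
    r = lagged_slopes(a, kmax)
    k = np.arange(1, kmax + 1)
    logm, _ = np.polyfit(k, np.log(r), 1)
    return float(np.exp(logm))
```

The crucial property: subsampling rescales all slopes by a common prefactor, which shifts the intercept of the log-linear fit but leaves its slope, and hence the estimate of m, unchanged.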
Collapse
Affiliation(s)
- J Wilting
- Max-Planck-Institute for Dynamics and Self-Organization, Am Faßberg 17, Göttingen, Germany
| | - V Priesemann
- Max-Planck-Institute for Dynamics and Self-Organization, Am Faßberg 17, Göttingen, Germany
- Bernstein-Center for Computational Neuroscience, Göttingen, Germany
| |
Collapse
|
17
|
Effects of Turbulent Environment on Self-Organized Critical Behavior: Isotropy vs. Anisotropy. UNIVERSE 2020. [DOI: 10.3390/universe6090145] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.6] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 02/07/2023]
Abstract
We study a self-organized critical system under the influence of turbulent motion of the environment. The system is described by the anisotropic continuous stochastic equation proposed by Hwa and Kardar [Phys. Rev. Lett. 62: 1813 (1989)]. The motion of the environment is modelled by the isotropic Kazantsev–Kraichnan “rapid-change” ensemble for an incompressible fluid: it is Gaussian with vanishing correlation time and the pair correlation function of the form ∝δ(t−t′)/kd+ξ, where k is the wave number and ξ is an arbitrary exponent with the most realistic values ξ=4/3 (Kolmogorov turbulence) and ξ→2 (Batchelor’s limit). Using the field-theoretic renormalization group, we find infrared attractive fixed points of the renormalization group equation associated with universality classes, i.e., with regimes of critical behavior. The most realistic values of the spatial dimension d=2 and the exponent ξ=4/3 correspond to the universality class of pure turbulent advection where the nonlinearity of the Hwa–Kardar (HK) equation is irrelevant. Nevertheless, the universality class where both the (anisotropic) nonlinearity of the HK equation and the (isotropic) advecting velocity field are relevant also exists for some values of the parameters ε=4−d and ξ. Depending on which terms (anisotropic, isotropic, or both) are relevant in a specific universality class, different types of scaling behavior (ordinary or generalized) are established.
Collapse
|
18
|
Virkar YS, Restrepo JG, Shew WL, Ott E. Dynamic regulation of resource transport induces criticality in interdependent networks of excitable units. Phys Rev E 2020; 101:022303. [PMID: 32168577 DOI: 10.1103/physreve.101.022303] [Citation(s) in RCA: 4] [Impact Index Per Article: 0.8] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/03/2018] [Accepted: 12/24/2019] [Indexed: 11/06/2022]
Abstract
Various functions of a network of excitable units can be enhanced if the network is in the "critical regime," where excitations are, on average, neither damped nor amplified. An important question is how such networks can self-organize to operate in the critical regime. Previously, it was shown that regulation via resource transport on a secondary network can robustly maintain the primary network dynamics in a balanced state where activity does not grow or decay. Here we show that this internetwork regulation process robustly produces a power-law distribution of activity avalanches, as observed in experiments, over ranges of model parameters spanning orders of magnitude. We also show that the resource transport over the secondary network protects the system against the destabilizing effect of local variations in parameters and heterogeneity in network structure. For homogeneous networks, we derive a reduced three-dimensional map which reproduces the behavior of the full system.
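A drastically simplified caricature of such activity-dependent regulation shows the basic self-organization mechanism. This is not the paper's two-network resource-transport model: the set-point feedback rule, all constants, and the function name below are illustrative assumptions. With weak external drive h, the effective branching parameter settles at m* = 1 − h/a_target, just below criticality.

```python
import numpy as np

def self_organize(T=50000, h=1.0, eps=1e-4, a_target=100.0, seed=0):
    """Branching process whose branching parameter m is slowly nudged
    by a set-point feedback, a crude stand-in for resource regulation.
    Returns the trace of m over time."""
    rng = np.random.default_rng(seed)
    a, m = a_target, 0.5
    trace = np.empty(T)
    for t in range(T):
        a = rng.poisson(m * a + h)              # one generation of activity
        m += eps * (a_target - a) / a_target    # slow homeostatic feedback
        m = min(max(m, 0.0), 2.0)               # keep m in a sane range
        trace[t] = m
    return trace
```

Starting far from criticality (m = 0.5), the feedback drives m up toward 1 and holds it there, without any external tuning of m itself.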
Collapse
Affiliation(s)
- Yogesh S Virkar
- Department of Computer Science, University of Colorado at Boulder, Boulder, Colorado 80309, USA
| | - Juan G Restrepo
- Department of Applied Mathematics, University of Colorado at Boulder, Boulder, Colorado 80309, USA
| | - Woodrow L Shew
- Department of Physics, University of Arkansas, Fayetteville, Arkansas 72701, USA
| | - Edward Ott
- Departments of Electrical and Computer Engineering and of Physics, University of Maryland, College Park, Maryland 20742, USA
| |
Collapse
|
19
|
Zierenberg J, Wilting J, Priesemann V, Levina A. Description of spreading dynamics by microscopic network models and macroscopic branching processes can differ due to coalescence. Phys Rev E 2020; 101:022301. [PMID: 32168601 DOI: 10.1103/physreve.101.022301] [Citation(s) in RCA: 4] [Impact Index Per Article: 0.8] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/12/2019] [Accepted: 12/17/2019] [Indexed: 06/10/2023]
Abstract
Spreading processes are conventionally monitored on a macroscopic level by counting the number of incidences over time. The spreading process can then be modeled either on the microscopic level, assuming an underlying interaction network, or directly on the macroscopic level, assuming that microscopic contributions are negligible. The macroscopic characteristics of both descriptions are commonly assumed to be identical. In this work we show that these characteristics of microscopic and macroscopic descriptions can be different due to coalescence, i.e., a node being activated at the same time by multiple sources. In particular, we consider a (microscopic) branching network (probabilistic cellular automaton) with annealed connectivity disorder, record the macroscopic activity, and then approximate this activity by a (macroscopic) branching process. In this framework we analytically calculate the effect of coalescence on the collective dynamics. We show that coalescence leads to a universal nonlinear scaling function for the conditional expectation value of successive network activity. This allows us to quantify the difference between the microscopic model parameter and established estimates of the macroscopic branching parameter. To overcome this difference, we propose a nonlinear estimator that correctly infers the microscopic model parameter for all system sizes.
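The coalescence effect is easy to reproduce in a toy annealed branching network; the network size and parameters below are illustrative, and the unique-target rule is the whole point: simultaneous activations of one node merge into a single activation.

```python
import numpy as np

def step(active, N, k, p, rng):
    """One update of an annealed branching network of N nodes.

    Each of `active` nodes targets k random nodes, each target being
    activated with probability p; simultaneous hits on the same node
    coalesce (count once). Returns next-step activity.
    """
    targets = rng.integers(0, N, size=(active, k))
    hits = rng.random((active, k)) < p
    return len(np.unique(targets[hits]))  # duplicates coalesce
```

At high activity the expected number of distinct activated nodes falls below the branching-process prediction m × (activity), which is exactly the discrepancy between microscopic and macroscopic descriptions that the paper quantifies; at low activity the two agree.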
Collapse
Affiliation(s)
- Johannes Zierenberg
- Max Planck Institute for Dynamics and Self-Organization, Am Fassberg 17, 37077 Göttingen, Germany
- Bernstein Center for Computational Neuroscience, Am Fassberg 17, 37077 Göttingen, Germany
| | - Jens Wilting
- Max Planck Institute for Dynamics and Self-Organization, Am Fassberg 17, 37077 Göttingen, Germany
| | - Viola Priesemann
- Max Planck Institute for Dynamics and Self-Organization, Am Fassberg 17, 37077 Göttingen, Germany
- Bernstein Center for Computational Neuroscience, Am Fassberg 17, 37077 Göttingen, Germany
| | - Anna Levina
- University of Tübingen, Max Planck Ring 8, 72076 Tübingen, Germany
- Max Planck Institute for Biological Cybernetics, Max Planck Ring 8, 72076 Tübingen, Germany
| |
Collapse
|
20
|
Khoshkhou M, Montakhab A. Spike-Timing-Dependent Plasticity With Axonal Delay Tunes Networks of Izhikevich Neurons to the Edge of Synchronization Transition With Scale-Free Avalanches. Front Syst Neurosci 2019; 13:73. [PMID: 31866836 PMCID: PMC6904334 DOI: 10.3389/fnsys.2019.00073] [Citation(s) in RCA: 11] [Impact Index Per Article: 1.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/21/2019] [Accepted: 11/19/2019] [Indexed: 11/13/2022] Open
Abstract
The critical brain hypothesis has been intensively studied in both experimental and theoretical neuroscience over the past two decades. However, some important questions still remain: (i) What is the critical point the brain operates at? (ii) What is the regulatory mechanism that brings about and maintains such a critical state? (iii) How can the scale-invariant behavior that characterizes the critical state be reconciled with well-defined brain oscillations? In this work we consider a biologically motivated model of an Izhikevich neuronal network with chemical synapses interacting via spike-timing-dependent plasticity (STDP) as well as axonal time delay. Under generic and physiologically relevant conditions we show that the system is organized and maintained around a synchronization transition point as opposed to an activity transition point associated with an absorbing state phase transition. However, such a state exhibits experimentally relevant signs of critical dynamics including scale-free avalanches with finite-size scaling as well as critical branching ratios. While the system displays stochastic oscillations with highly correlated fluctuations, it also displays dominant frequency modes seen as sharp peaks in the power spectrum. The role of STDP as well as time delay is crucial in achieving and maintaining such critical dynamics, while the role of inhibition is not as crucial. In this way we provide possible answers to all three questions posed above. We also show that one can achieve supercritical or subcritical dynamics if one changes the average time delay associated with axonal conduction.
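For reference, the single-neuron Izhikevich dynamics at the heart of such a model can be sketched as below, using the standard regular-spiking parameters from Izhikevich's 2003 model. The network coupling, STDP, and axonal delays studied in the paper are omitted, and the coarse 1 ms Euler step is an assumption made for brevity.

```python
def izhikevich(I, T=1000, a=0.02, b=0.2, c=-65.0, d=8.0, dt=1.0):
    """Single Izhikevich neuron (regular-spiking parameters).

    Integrates v' = 0.04*v**2 + 5*v + 140 - u + I and u' = a*(b*v - u)
    with a crude Euler step; returns spike times under constant input I.
    """
    v, u = c, b * c
    spikes = []
    for t in range(T):
        if v >= 30.0:            # spike: reset membrane, bump recovery
            spikes.append(t)
            v, u = c, u + d
        v += dt * (0.04 * v * v + 5.0 * v + 140.0 - u + I)
        u += dt * a * (b * v - u)
    return spikes
```

With a sufficiently strong constant drive the quadratic v-nullcline loses its fixed points and the neuron fires tonically; with no input it settles near rest (about −70 mV for these parameters) and stays silent.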
Collapse
Affiliation(s)
- Mahsa Khoshkhou
- Department of Physics, College of Sciences, Shiraz University, Shiraz, Iran
| | - Afshin Montakhab
- Department of Physics, College of Sciences, Shiraz University, Shiraz, Iran
| |
Collapse
|
21
|
Skilling QM, Ognjanovski N, Aton SJ, Zochowski M. Critical Dynamics Mediate Learning of New Distributed Memory Representations in Neuronal Networks. ENTROPY 2019; 21:1043. [PMCID: PMC7514347 DOI: 10.3390/e21111043] [Citation(s) in RCA: 5] [Impact Index Per Article: 0.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Subscribe] [Scholar Register] [Received: 08/29/2019] [Accepted: 10/23/2019] [Indexed: 02/01/2025]
Abstract
We explore the possible role of network dynamics near a critical point in the storage of new information in silico and in vivo, and show that learning and memory may rely on neuronal network features mediated by the vicinity of criticality. Using a mean-field, attractor-based model, we show that new information can be consolidated into attractors through state-based learning in a dynamical regime associated with maximal susceptibility at the critical point. Then, we predict that the subsequent consolidation process results in a shift from critical to sub-critical dynamics to fully encapsulate the new information. We go on to corroborate these findings using analysis of rodent hippocampal CA1 activity during contextual fear memory (CFM) consolidation. We show that the dynamical state of the CA1 network is inherently poised near criticality, but the network also undergoes a shift towards sub-critical dynamics due to successful consolidation of the CFM. Based on these findings, we propose that dynamical features associated with criticality may be universally necessary for storing new memories.
Collapse
Affiliation(s)
- Quinton M. Skilling
- Biophysics Program, University of Michigan, 930 N University Ave., Ann Arbor, MI 48109, USA
| | - Nicolette Ognjanovski
- Department of Molecular, Cellular, and Developmental Biology, University of Michigan, 1105 N University Ave., Ann Arbor, MI 48109, USA
| | - Sara J. Aton
- Department of Molecular, Cellular, and Developmental Biology, University of Michigan, 1105 N University Ave., Ann Arbor, MI 48109, USA
| | - Michal Zochowski
- Biophysics Program, University of Michigan, 930 N University Ave., Ann Arbor, MI 48109, USA
- Department of Physics, University of Michigan, 450 Church St, Ann Arbor, MI 48109, USA
| |
Collapse
|
22
|
Zhang G, Zhang C, Zhang W. Evolutionary echo state network for long-term time series prediction: on the edge of chaos. APPL INTELL 2019. [DOI: 10.1007/s10489-019-01546-w] [Citation(s) in RCA: 7] [Impact Index Per Article: 1.2] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/24/2022]
|
23
|
Wilting J, Priesemann V. 25 years of criticality in neuroscience - established results, open controversies, novel concepts. Curr Opin Neurobiol 2019; 58:105-111. [PMID: 31546053 DOI: 10.1016/j.conb.2019.08.002] [Citation(s) in RCA: 66] [Impact Index Per Article: 11.0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/25/2019] [Accepted: 08/25/2019] [Indexed: 12/19/2022]
Abstract
Twenty-five years ago, Dunkelmann and Radons (1994) showed that neural networks can self-organize to a critical state. In models, the critical state offers a number of computational advantages. Thus this hypothesis, and in particular the experimental work by Beggs and Plenz (2003), has triggered an avalanche of research, with thousands of studies referring to it. Nonetheless, experimental results are still contradictory. How is it possible that a hypothesis has attracted active research for decades but nonetheless remains controversial? We discuss the experimental and conceptual controversy, and then present a parsimonious solution that (i) unifies the contradictory experimental results, (ii) avoids disadvantages of a critical state, and (iii) enables rapid, adaptive tuning of network properties to task requirements.
Collapse
Affiliation(s)
- J Wilting
- Max-Planck-Institute for Dynamics and Self-Organization, Göttingen, Germany
| | - V Priesemann
- Max-Planck-Institute for Dynamics and Self-Organization, Göttingen, Germany; Bernstein-Center for Computational Neuroscience, Göttingen, Germany
| |
Collapse
|
24
|
Okujeni S, Egert U. Self-organization of modular network architecture by activity-dependent neuronal migration and outgrowth. eLife 2019; 8:47996. [PMID: 31526478 PMCID: PMC6783273 DOI: 10.7554/elife.47996] [Citation(s) in RCA: 16] [Impact Index Per Article: 2.7] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/26/2019] [Accepted: 09/16/2019] [Indexed: 12/17/2022] Open
Abstract
The spatial distribution of neurons and activity-dependent neurite outgrowth shape long-range interactions, recurrent local connectivity and modularity in neuronal networks. We investigated how this mesoscale architecture develops through the interaction of neurite outgrowth, cell migration and activity in cultured networks of rat cortical neurons, and show that simple rules can explain variations of network modularity. In contrast to theoretical studies on activity-dependent outgrowth but consistent with predictions for modular networks, spontaneous activity and the rate of synchronized bursts increased with clustering, whereas peak firing rates in bursts increased in highly interconnected homogeneous networks. As Ca2+ influx increased exponentially with increasing network recruitment during bursts, its modulation was highly correlated to peak firing rates. During network maturation, long-term estimates of Ca2+ influx showed convergence, even for highly different mesoscale architectures, neurite extent, connectivity, modularity and average activity levels, indicating homeostatic regulation towards a common set-point of Ca2+ influx.
Collapse
Affiliation(s)
- Samora Okujeni
- Laboratory for Biomicrotechnology, Department of Microsystems Engineering-IMTEK, University of Freiburg, Freiburg, Germany
- Bernstein Center Freiburg, University of Freiburg, Freiburg, Germany
| | - Ulrich Egert
- Laboratory for Biomicrotechnology, Department of Microsystems Engineering-IMTEK, University of Freiburg, Freiburg, Germany
- Bernstein Center Freiburg, University of Freiburg, Freiburg, Germany
| |
Collapse
|
25
|
Wilting J, Dehning J, Pinheiro Neto J, Rudelt L, Wibral M, Zierenberg J, Priesemann V. Operating in a Reverberating Regime Enables Rapid Tuning of Network States to Task Requirements. Front Syst Neurosci 2018; 12:55. [PMID: 30459567 PMCID: PMC6232511 DOI: 10.3389/fnsys.2018.00055] [Citation(s) in RCA: 24] [Impact Index Per Article: 3.4] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/04/2018] [Accepted: 10/09/2018] [Indexed: 01/02/2023] Open
Abstract
Neural circuits are able to perform computations under very diverse conditions and requirements. The required computations impose clear constraints on their fine-tuning: a rapid and maximally informative response to stimuli in general requires decorrelated baseline neural activity. Such network dynamics is known as asynchronous-irregular. In contrast, spatio-temporal integration of information requires maintenance and transfer of stimulus information over extended time periods. This can be realized at criticality, a phase transition where correlations, sensitivity and integration time diverge. Being able to flexibly switch, or even combine the above properties in a task-dependent manner would present a clear functional advantage. We propose that cortex operates in a “reverberating regime” because it is particularly favorable for ready adaptation of computational properties to context and task. This reverberating regime enables cortical networks to interpolate between the asynchronous-irregular and the critical state by small changes in effective synaptic strength or excitation-inhibition ratio. These changes directly adapt computational properties, including sensitivity, amplification, integration time and correlation length within the local network. We review recent converging evidence that cortex in vivo operates in the reverberating regime, and that various cortical areas have adapted their integration times to processing requirements. In addition, we propose that neuromodulation enables a fine-tuning of the network, so that local circuits can either decorrelate or integrate, and quench or maintain their input depending on task. We argue that this task-dependent tuning, which we call “dynamic adaptive computation,” presents a central organization principle of cortical networks and discuss first experimental evidence.
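The claimed link between distance to criticality and integration time follows directly from branching-process autocorrelations, which decay as m**k, a standard result sketched here; the "reverberating" regime of the paper corresponds to m just below 1, and the helper name is an assumption.

```python
import math

def integration_time(m, dt=1.0):
    """Intrinsic timescale of a branching process with branching
    parameter 0 < m < 1: the autocorrelation decays as m**k, giving
    tau = -dt / ln(m), which diverges as m -> 1 (criticality)."""
    if not 0.0 < m < 1.0:
        raise ValueError("m must lie in (0, 1)")
    return -dt / math.log(m)
```

A small change in effective synaptic strength, e.g. from m = 0.9 to m = 0.99, thus lengthens the local integration window by an order of magnitude, which is the kind of rapid, task-dependent tuning the authors propose.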
Collapse
Affiliation(s)
- Jens Wilting
- Max-Planck-Institute for Dynamics and Self-Organization, Göttingen, Germany
| | - Jonas Dehning
- Max-Planck-Institute for Dynamics and Self-Organization, Göttingen, Germany
| | - Joao Pinheiro Neto
- Max-Planck-Institute for Dynamics and Self-Organization, Göttingen, Germany
| | - Lucas Rudelt
- Max-Planck-Institute for Dynamics and Self-Organization, Göttingen, Germany
| | - Michael Wibral
- Magnetoencephalography Unit, Brain Imaging Center, Johann-Wolfgang-Goethe University, Frankfurt, Germany
| | - Johannes Zierenberg
- Max-Planck-Institute for Dynamics and Self-Organization, Göttingen, Germany
- Bernstein-Center for Computational Neuroscience, Göttingen, Germany
| | - Viola Priesemann
- Max-Planck-Institute for Dynamics and Self-Organization, Göttingen, Germany
- Bernstein-Center for Computational Neuroscience, Göttingen, Germany
| |
Collapse
|