1. Liang J, Yang Z, Zhou C. Excitation-Inhibition Balance, Neural Criticality, and Activities in Neuronal Circuits. Neuroscientist 2025; 31:31-46. PMID: 38291889. DOI: 10.1177/10738584231221766.
Abstract
Neural activities in local circuits exhibit complex and multilevel dynamic features. Individual neurons spike irregularly, which is believed to originate from receiving balanced amounts of excitatory and inhibitory inputs, known as the excitation-inhibition balance. The spatial-temporal cascades of clustered neuronal spikes occur in variable sizes and durations, manifested as neural avalanches with scale-free features. These may be explained by the neural criticality hypothesis, which posits that neural systems operate around the transition between distinct dynamic states. Here, we summarize the experimental evidence for and the underlying theory of excitation-inhibition balance and neural criticality. Furthermore, we review recent studies of excitatory-inhibitory networks with synaptic kinetics as a simple solution to reconcile these two apparently distinct theories in a single circuit model. This provides a more unified understanding of multilevel neural activities in local circuits, from spontaneous to stimulus-response dynamics.
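As a rough illustration of the first idea summarized above, the following Python sketch simulates a single leaky integrate-and-fire neuron whose mean excitatory and inhibitory drives approximately cancel, so that spiking is driven by input fluctuations and is irregular. All pool sizes, rates, and PSP amplitudes are illustrative assumptions, not values taken from the review.

```python
# Hypothetical illustration: fluctuation-driven (irregular) firing under E-I balance.
import numpy as np

rng = np.random.default_rng(0)
dt, T = 0.1, 20000.0                                      # time step and duration (ms)
tau_m, v_rest, v_th, v_reset = 20.0, -65.0, -55.0, -65.0  # membrane parameters (ms, mV)
n_exc, r_exc, w_exc = 800, 0.010, +0.5   # 800 excitatory inputs at 10 Hz, +0.5 mV each
n_inh, r_inh, w_inh = 200, 0.020, -1.0   # 200 inhibitory inputs at 20 Hz, -1.0 mV each
# mean drive per ms: 800*0.010*0.5 + 200*0.020*(-1.0) = +4 - 4 = 0  -> balanced

v, last_spike, isis = v_rest, None, []
for step in range(int(T / dt)):
    ne = rng.poisson(n_exc * r_exc * dt)          # excitatory input spikes this step
    ni = rng.poisson(n_inh * r_inh * dt)          # inhibitory input spikes this step
    v += dt * (v_rest - v) / tau_m + ne * w_exc + ni * w_inh
    if v >= v_th:                                 # a fluctuation crosses threshold -> spike
        t = step * dt
        if last_spike is not None:
            isis.append(t - last_spike)
        last_spike, v = t, v_reset

isis = np.array(isis)
print(f"{len(isis) + 1} spikes; ISI CV = {isis.std() / isis.mean():.2f} "
      "(a CV near 1 indicates Poisson-like irregularity)")
```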
Affiliation(s)
- Junhao Liang
- Eberhard Karls University of Tübingen and Max Planck Institute for Biological Cybernetics, Tübingen, Germany
- Zhuda Yang
- Department of Physics, Centre for Nonlinear Studies and Beijing-Hong Kong-Singapore Joint Centre for Nonlinear and Complex Systems (Hong Kong), Institute of Computational and Theoretical Studies, Hong Kong Baptist University, Kowloon Tong, Hong Kong
- Changsong Zhou
- Department of Physics, Centre for Nonlinear Studies and Beijing-Hong Kong-Singapore Joint Centre for Nonlinear and Complex Systems (Hong Kong), Institute of Computational and Theoretical Studies, Hong Kong Baptist University, Kowloon Tong, Hong Kong
- Life Science Imaging Centre, Hong Kong Baptist University, Kowloon Tong, Hong Kong
- Research Centre, Hong Kong Baptist University Institute of Research and Continuing Education, Shenzhen, China

2. Xie H, Liu K, Li D, Zhang CS, Hilgetag CC, Guan JS. Rectified activity-dependent population plasticity implicates cortical adaptation for memory and cognitive functions. Commun Biol 2024; 7:1487. PMID: 39528683. PMCID: PMC11555404. DOI: 10.1038/s42003-024-07186-2.
Abstract
Cortical networks undergo rewiring every day as a result of learning and memory events. To investigate the trends of population adaptation in the neocortex over time, we record the cellular activity of large-scale cortical populations in response to neutral environments and conditioned contexts and identify a general intrinsic cortical adaptation mechanism, which we name rectified activity-dependent population plasticity (RAPP). Comparing adjacent days, the previously activated neurons reduce their activity but retain a residual potentiation, and they increase population variability in proportion to their activity during the previous recall trials. RAPP predicts both the decay of context-induced activity patterns and the emergence of sparse memory traces. Simulation analyses reveal that local inhibitory connections might account for the residual potentiation in RAPP. Intriguingly, introducing the RAPP phenomenon into artificial neural networks shows promising improvements in small-sample pattern recognition tasks. Thus, RAPP represents a phenomenon of cortical adaptation that contributes to the emergence of long-lasting memory and higher cognitive functions.
Affiliation(s)
- Hong Xie
- School of Artificial Intelligence Science and Technology, University of Shanghai for Science and Technology, Shanghai, China.
- Institute of Photonic Chips, University of Shanghai for Science and Technology, Shanghai, China.
- Kaiyuan Liu
- School of Life Science and Technology, Shanghai Tech University, Shanghai, China
- School of Life Sciences, Tsinghua University, Beijing, China
- Dong Li
- Institut für Computational Neuroscience, Universitätsklinikum Hamburg-Eppendorf, Martinistr. 52, Hamburg, Germany
- Chang-Shui Zhang
- Department of Automation, Tsinghua University, Beijing, China
- State Key Lab of Intelligent Technologies and Systems, Tsinghua National Laboratory for Information Science and Technology (TNList), Beijing, P.R. China
- Claus C Hilgetag
- Institut für Computational Neuroscience, Universitätsklinikum Hamburg-Eppendorf, Martinistr. 52, Hamburg, Germany
- Ji-Song Guan
- School of Life Science and Technology, Shanghai Tech University, Shanghai, China.
- State Key Laboratory of Advanced Medical Materials and Devices, ShanghaiTech University, Shanghai, China.

3. Hutt A, Trotter D, Pariz A, Valiante TA, Lefebvre J. Diversity-induced trivialization and resilience of neural dynamics. Chaos 2024; 34:013147. PMID: 38285722. DOI: 10.1063/5.0165773.
Abstract
Heterogeneity is omnipresent across all living systems. Diversity enriches the dynamical repertoire of these systems but remains challenging to reconcile with their manifest robustness and dynamical persistence over time, a fundamental feature called resilience. To better understand the mechanism underlying resilience in neural circuits, we considered a nonlinear network model and extracted the relationship between excitability heterogeneity and resilience. To measure resilience, we quantified the number of stationary states of this network and how they are affected by various control parameters. We analyzed, both analytically and numerically, gradient and non-gradient systems modeled as nonlinear sparse neural networks evolving over long time scales. Our analysis shows that neuronal heterogeneity quenches the number of stationary states while decreasing the susceptibility to bifurcations: a phenomenon known as trivialization. Heterogeneity was found to implement a homeostatic control mechanism enhancing network resilience to changes in network size and connection probability by quenching the system's dynamic volatility.
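A toy numerical experiment in the spirit of the abstract (not the authors' model): count how many distinct stable states a random symmetric rate network reaches from many initial conditions, and how that count shrinks as excitability heterogeneity grows. The network size, gain, relaxation scheme, and heterogeneity values are arbitrary assumptions chosen for illustration.

```python
# Hypothetical sketch: heterogeneity "trivializes" the attractor landscape.
import numpy as np

def count_stationary_states(sigma_h, N=60, gain=2.5, trials=100, seed=1):
    rng = np.random.default_rng(seed)
    J = rng.normal(0.0, gain / np.sqrt(N), (N, N))
    J = 0.5 * (J + J.T)                       # symmetric coupling -> gradient-like dynamics
    np.fill_diagonal(J, 0.0)
    h = rng.normal(0.0, sigma_h, N)           # heterogeneous excitability offsets
    found = set()
    for _ in range(trials):
        x = rng.uniform(-1.0, 1.0, N)
        for _ in range(1000):                 # Euler relaxation of dx/dt = -x + tanh(Jx + h)
            x += 0.1 * (-x + np.tanh(J @ x + h))
        found.add(tuple(np.sign(np.round(x, 1))))   # coarse fingerprint of the reached state
    return len(found)

for sigma in (0.0, 0.5, 1.0, 2.0):
    n = count_stationary_states(sigma)
    print(f"excitability heterogeneity sigma = {sigma:.1f} -> ~{n} distinct stationary states")
```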
Affiliation(s)
- Axel Hutt
- MLMS, MIMESIS, Université de Strasbourg, CNRS, Inria, ICube, 67000 Strasbourg, France
- Daniel Trotter
- Department of Physics, University of Ottawa, Ottawa, Ontario K1N 6N5, Canada
- Krembil Brain Institute, University Health Network, Toronto, Ontario M5T 0S8, Canada
- Aref Pariz
- Krembil Brain Institute, University Health Network, Toronto, Ontario M5T 0S8, Canada
- Department of Biology, University of Ottawa, Ottawa, Ontario K1N 6N5, Canada
- Taufik A Valiante
- Krembil Brain Institute, University Health Network, Toronto, Ontario M5T 0S8, Canada
- Department of Electrical and Computer Engineering, Institute of Medical Science, Institute of Biomedical Engineering, Division of Neurosurgery, Department of Surgery, CRANIA (Center for Advancing Neurotechnological Innovation to Application), Max Planck-University of Toronto Center for Neural Science and Technology, University of Toronto, Toronto, Ontario M5S 3G8, Canada
- Jérémie Lefebvre
- Department of Physics, University of Ottawa, Ottawa, Ontario K1N 6N5, Canada
- Krembil Brain Institute, University Health Network, Toronto, Ontario M5T 0S8, Canada
- Department of Biology, University of Ottawa, Ottawa, Ontario K1N 6N5, Canada
- Department of Mathematics, University of Toronto, Toronto, Ontario M5S 2E4, Canada

4. Changeux JP, Goulas A, Hilgetag CC. A Connectomic Hypothesis for the Hominization of the Brain. Cereb Cortex 2021; 31:2425-2449. PMID: 33367521. PMCID: PMC8023825. DOI: 10.1093/cercor/bhaa365.
Abstract
Cognitive abilities of the human brain, including language, have expanded dramatically in the course of our recent evolution from nonhuman primates, despite only minor apparent changes at the gene level. The hypothesis we propose for this paradox relies upon fundamental features of human brain connectivity, which contribute to a characteristic anatomical, functional, and computational neural phenotype, offering a parsimonious framework for connectomic changes taking place upon the human-specific evolution of the genome. Many human connectomic features might be accounted for by substantially increased brain size within the global neural architecture of the primate brain, resulting in a larger number of neurons and areas and the sparsification, increased modularity, and laminar differentiation of cortical connections. The combination of these features with the developmental expansion of upper cortical layers, prolonged postnatal brain development, and multiplied nongenetic interactions with the physical, social, and cultural environment gives rise to categorically human-specific cognitive abilities including the recursivity of language. Thus, a small set of genetic regulatory events affecting quantitative gene expression may plausibly account for the origins of human brain connectivity and cognition.
Affiliation(s)
- Jean-Pierre Changeux
- CNRS UMR 3571, Institut Pasteur, 75724 Paris, France
- Communications Cellulaires, Collège de France, 75005 Paris, France
- Alexandros Goulas
- Institute of Computational Neuroscience, University Medical Center Eppendorf, Hamburg University, 20246 Hamburg, Germany
- Claus C Hilgetag
- Institute of Computational Neuroscience, University Medical Center Eppendorf, Hamburg University, 20246 Hamburg, Germany
- Department of Health Sciences, Boston University, Boston, MA 02115, USA

5. Rentzeperis I, van Leeuwen C. Adaptive Rewiring in Weighted Networks Shows Specificity, Robustness, and Flexibility. Front Syst Neurosci 2021; 15:580569. PMID: 33737871. PMCID: PMC7960922. DOI: 10.3389/fnsys.2021.580569.
Abstract
Brain network connections rewire adaptively in response to neural activity. Adaptive rewiring may be understood as a process which, at its every step, is aimed at optimizing the efficiency of signal diffusion. In evolving model networks, this amounts to creating shortcut connections in regions with high diffusion and pruning where diffusion is low. Adaptive rewiring leads over time to topologies akin to brain anatomy: small worlds with rich club and modular or centralized structures. We continue our investigation of adaptive rewiring by focusing on three desiderata: specificity of evolving model network architectures, robustness of dynamically maintained architectures, and flexibility of network evolution to stochastically deviate from specificity and robustness. Our adaptive rewiring model simulations show that specificity and robustness characterize alternative modes of network operation, controlled by a single parameter, the rewiring interval. Small control parameter shifts across a critical transition zone allow switching between the two modes. Adaptive rewiring exhibits greater flexibility for skewed, lognormal connection weight distributions than for normally distributed ones. The results qualify adaptive rewiring as a key principle of self-organized complexity in network architectures, in particular of those that characterize the variety of functional architectures in the brain.
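The rewiring step described above can be sketched as follows: repeatedly pick a node, add a shortcut to the non-neighbor with which diffusion (heat-kernel value) is strongest, and prune the link to the neighbor with which it is weakest. This is an unweighted caricature of diffusion-based adaptive rewiring with arbitrary parameter choices, not the authors' weighted-network model.

```python
# Hypothetical sketch of heat-kernel adaptive rewiring on a random graph.
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(0)
N, p, tau, steps = 60, 0.1, 2.0, 400
A = np.triu((rng.random((N, N)) < p).astype(float), 1)
A = A + A.T                                          # undirected adjacency, no self-loops

for _ in range(steps):
    L = np.diag(A.sum(1)) - A                        # graph Laplacian
    H = expm(-tau * L)                               # heat kernel = diffusion efficiency
    i = rng.integers(N)
    nbrs = np.flatnonzero(A[i])
    non = np.flatnonzero((A[i] == 0) & (np.arange(N) != i))
    if len(nbrs) < 2 or len(non) == 0:
        continue
    j_add = non[np.argmax(H[i, non])]                # shortcut where diffusion is high
    j_cut = nbrs[np.argmin(H[i, nbrs])]              # prune where diffusion is low
    A[i, j_add] = A[j_add, i] = 1.0
    A[i, j_cut] = A[j_cut, i] = 0.0

# Clustering rises as clustered/modular structure self-organizes.
k = A.sum(1)
triangles = np.diag(np.linalg.matrix_power(A, 3))
denom = k * (k - 1)
C = np.divide(triangles, denom, out=np.zeros(N), where=denom > 0)
print(f"edges: {int(A.sum() / 2)}, mean clustering coefficient: {C.mean():.3f}")
```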
Affiliation(s)
- Cees van Leeuwen
- Brain and Cognition Research Unit, KU Leuven, Leuven, Belgium
- Department of Cognitive and Developmental Psychology, University of Technology Kaiserslautern, Kaiserslautern, Germany

6. Tumulty JS, Royster M, Cruz L. Columnar grouping preserves synchronization in neuronal networks with distance-dependent time delays. Phys Rev E 2021; 101:022408. PMID: 32168702. DOI: 10.1103/physreve.101.022408.
Abstract
Neuronal connectivity at the cellular level in the cerebral cortex is far from random, with characteristics that point to a hierarchical design with intricately connected neuronal clusters. Here we investigate computationally the effects of varying neuronal cluster connectivity on network synchronization for two different spatial distributions of clusters: one where clusters are arranged in columns in a grid and the other where neurons from different clusters are spatially intermixed. We characterize each case by measuring the degree of neuronal spiking synchrony as a function of the number of connections per neuron and the degree of intercluster connectivity. We find that in both cases as the number of connections per neuron increases, there is an asynchronous to synchronous transition dependent only on intrinsic parameters of the biophysical model. We also observe in both cases that with very low intercluster connectivity clusters have independent firing dynamics yielding a low degree of synchrony. More importantly, we find that for a high number of connections per neuron but intermediate intercluster connectivity, the two spatial distributions of clusters differ in their response where the clusters in a grid have a higher degree of synchrony than the clusters that are intermixed.
Affiliation(s)
- Joseph S Tumulty
- Department of Physics, Drexel University, 3141 Chestnut Street, Philadelphia, Pennsylvania 19104, United States
- Michael Royster
- Department of Physics, Drexel University, 3141 Chestnut Street, Philadelphia, Pennsylvania 19104, United States
- Luis Cruz
- Department of Physics, Drexel University, 3141 Chestnut Street, Philadelphia, Pennsylvania 19104, United States

7. Liang J, Zhou T, Zhou C. Hopf Bifurcation in Mean Field Explains Critical Avalanches in Excitation-Inhibition Balanced Neuronal Networks: A Mechanism for Multiscale Variability. Front Syst Neurosci 2020; 14:580011. PMID: 33324179. PMCID: PMC7725680. DOI: 10.3389/fnsys.2020.580011.
Abstract
Cortical neural circuits display highly irregular spiking in individual neurons but variably sized collective firing, oscillations, and critical avalanches at the population level, all of which have functional importance for information processing. Theoretically, the balance of excitatory and inhibitory inputs is thought to account for spiking irregularity, while critical avalanches may originate from an underlying phase transition. However, the theoretical reconciliation of these multilevel dynamic aspects in neural circuits remains an open question. Herein, we study an excitation-inhibition (E-I) balanced neuronal network with biologically realistic synaptic kinetics. It can maintain irregular spiking dynamics with different levels of synchrony, and critical avalanches emerge near the synchronous transition point. We propose a novel semi-analytical mean-field theory to derive the field equations governing the network's macroscopic dynamics. It reveals that the E-I balanced state of the network manifesting irregular individual spiking is characterized by a macroscopic stable state, which can be either a fixed point or a periodic motion, and the transition is predicted by a Hopf bifurcation in the macroscopic field. Furthermore, by analyzing public data, we find the coexistence of irregular spiking and critical avalanches in the spontaneous spiking activity of mouse cortical slices in vitro, indicating the universality of the observed phenomena. Our theory unveils the mechanism that permits complex neural activities on different spatiotemporal scales to coexist and elucidates a possible origin of the criticality of neural systems. It also provides a novel tool for analyzing the macroscopic dynamics of E-I balanced networks and their relationship to the microscopic counterparts, which can be useful for large-scale modeling and computation of cortical dynamics.
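The Hopf-bifurcation picture invoked above can be illustrated with a generic Wilson-Cowan E-I rate model: as the excitatory drive is increased, the macroscopic state passes from a stable fixed point to a limit cycle and back. This caricature is not the paper's semi-analytical mean-field theory; the parameters follow the standard Wilson-Cowan 1972 set, and the drive values swept are arbitrary.

```python
# Hypothetical sketch: a Wilson-Cowan E-I mean field crossing a Hopf bifurcation.
import numpy as np

def sigmoid(x, a, theta):
    return 1.0 / (1.0 + np.exp(-a * (x - theta))) - 1.0 / (1.0 + np.exp(a * theta))

c1, c2, c3, c4 = 16.0, 12.0, 15.0, 3.0          # E->E, I->E, E->I, I->I couplings
a_e, th_e, a_i, th_i = 1.3, 4.0, 2.0, 3.7       # gain and threshold of each population
Q, dt, t_transient, t_measure = 0.0, 0.01, 300.0, 300.0

for P in (0.0, 0.5, 1.0, 1.25, 1.5, 2.0, 3.0, 5.0):   # sweep the excitatory drive
    E, I, trace = 0.1, 0.05, []
    for step in range(int((t_transient + t_measure) / dt)):
        dE = -E + (1.0 - E) * sigmoid(c1 * E - c2 * I + P, a_e, th_e)
        dI = -I + (1.0 - I) * sigmoid(c3 * E - c4 * I + Q, a_i, th_i)
        E, I = E + dt * dE, I + dt * dI
        if step * dt >= t_transient:
            trace.append(E)
    amplitude = max(trace) - min(trace)          # ~0 at a fixed point, finite on a limit cycle
    print(f"P = {P:4.2f}: steady-state oscillation amplitude of E = {amplitude:.3f}")
```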
Affiliation(s)
- Junhao Liang
- Department of Physics, Centre for Nonlinear Studies, Beijing-Hong Kong-Singapore Joint Centre for Nonlinear and Complex Systems, Institute of Computational and Theoretical Studies, Hong Kong Baptist University, Kowloon Tong, Hong Kong
- Key Laboratory of Computational Mathematics, Guangdong Province, and School of Mathematics, Sun Yat-sen University, Guangzhou, China
- Tianshou Zhou
- Key Laboratory of Computational Mathematics, Guangdong Province, and School of Mathematics, Sun Yat-sen University, Guangzhou, China
- Changsong Zhou
- Department of Physics, Centre for Nonlinear Studies, Beijing-Hong Kong-Singapore Joint Centre for Nonlinear and Complex Systems, Institute of Computational and Theoretical Studies, Hong Kong Baptist University, Kowloon Tong, Hong Kong
- Department of Physics, Zhejiang University, Hangzhou, China

8. Moretti P, Hütt MT. Link-usage asymmetry and collective patterns emerging from rich-club organization of complex networks. Proc Natl Acad Sci U S A 2020; 117:18332-18340. PMID: 32690716. PMCID: PMC7414146. DOI: 10.1073/pnas.1919785117.
Abstract
In models of excitable dynamics on graphs, excitations can travel in both directions of an undirected link. However, as a striking interplay of dynamics and network topology, excitations often establish a directional preference. Some of these cases of "link-usage asymmetry" are local in nature and can be mechanistically understood, for instance, from the degree gradient of a link (i.e., the difference in node degrees at both ends of the link). Other contributions to the link-usage asymmetry are instead, as we show, self-organized in nature, and strictly nonlocal. This is the case for excitation waves, where the preferential propagation of excitations along a link depends on its orientation with respect to a hub acting as a source, even if the link in question is several steps away from the hub itself. Here, we identify and quantify the contribution of such self-organized patterns to link-usage asymmetry and show that they extend to ranges significantly longer than those ascribed to local patterns. We introduce a topological characterization, the hub-set-orientation prevalence of a link, which indicates its average orientation with respect to the hubs of a graph. Our numerical results show that the hub-set-orientation prevalence of a link strongly correlates with the preferential usage of the link in the direction of propagation away from the hub core of the graph. Our methodology is embedding-agnostic and allows for the measurement of wave signals and the sizes of the cores from which they originate.
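A rough sketch of how link-usage asymmetry can be measured: run a susceptible-excited-refractory (SER) excitable model on a graph with hubs, credit a link direction every time an excited node recruits a susceptible neighbor, and compare the two directions for each link. The graph generator, update rules, and rates below are simplifying assumptions, not the paper's exact setup.

```python
# Hypothetical sketch: directional usage of undirected links under excitable dynamics.
import numpy as np

rng = np.random.default_rng(0)

# Small preferential-attachment graph so that hubs exist.
N, m = 200, 2
edges = [(0, 1), (0, 2), (1, 2)]
stubs = [0, 1, 2]                                    # node repeated once per degree
for new in range(3, N):
    chosen = set()
    while len(chosen) < m:
        chosen.add(stubs[rng.integers(len(stubs))])  # pick targets proportionally to degree
    for t in chosen:
        edges.append((new, t))
        stubs += [new, t]
A = np.zeros((N, N), dtype=int)
for i, j in edges:
    A[i, j] = A[j, i] = 1

# SER dynamics: E -> R, R -> S with prob 0.2, S -> E if any excited neighbor.
S, E, R = 0, 1, 2
state = np.full(N, S)
state[rng.integers(N)] = E
usage = np.zeros((N, N))
for _ in range(20000):
    excited = state == E
    nxt = state.copy()
    nxt[excited] = R
    nxt[(state == R) & (rng.random(N) < 0.2)] = S
    recruited = (state == S) & (A @ excited > 0)
    nxt[recruited] = E
    usage[np.ix_(excited, recruited)] += A[np.ix_(excited, recruited)]  # credit i -> j
    spont = (nxt == S) & (rng.random(N) < 1e-3)      # rare spontaneous excitations keep activity alive
    nxt[spont] = E
    state = nxt

asym = [abs(usage[i, j] - usage[j, i]) / (usage[i, j] + usage[j, i])
        for i, j in edges if usage[i, j] + usage[j, i] > 0]
print(f"mean link-usage asymmetry over {len(asym)} active links: {np.mean(asym):.3f}")
```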
Affiliation(s)
- Paolo Moretti
- Institute of Materials Simulation, Department of Materials Science, Friedrich-Alexander-University Erlangen-Nürnberg, D-90762 Fürth, Germany;
- Marc-Thorsten Hütt
- Department of Life Sciences and Chemistry, Jacobs University Bremen, D-28759 Bremen, Germany

9. Safron A. An Integrated World Modeling Theory (IWMT) of Consciousness: Combining Integrated Information and Global Neuronal Workspace Theories With the Free Energy Principle and Active Inference Framework; Toward Solving the Hard Problem and Characterizing Agentic Causation. Front Artif Intell 2020; 3:30. PMID: 33733149. PMCID: PMC7861340. DOI: 10.3389/frai.2020.00030.
Abstract
The Free Energy Principle and Active Inference Framework (FEP-AI) begins with the understanding that persisting systems must regulate environmental exchanges and prevent entropic accumulation. In FEP-AI, minds and brains are predictive controllers for autonomous systems, where action-driven perception is realized as probabilistic inference. Integrated Information Theory (IIT) begins with considering the preconditions for a system to intrinsically exist, as well as axioms regarding the nature of consciousness. IIT has produced controversy because of its surprising entailments: quasi-panpsychism; subjectivity without referents or dynamics; and the possibility of fully-intelligent-yet-unconscious brain simulations. Here, I describe how these controversies might be resolved by integrating IIT with FEP-AI, where integrated information only entails consciousness for systems with perspectival reference frames capable of generating models with spatial, temporal, and causal coherence for self and world. Without that connection with external reality, systems could have arbitrarily high amounts of integrated information, but nonetheless would not entail subjective experience. I further describe how an integration of these frameworks may contribute to their evolution as unified systems theories and models of emergent causation. Then, inspired by both Global Neuronal Workspace Theory (GNWT) and the Harmonic Brain Modes framework, I describe how streams of consciousness may emerge as an evolving generation of sensorimotor predictions, with the precise composition of experiences depending on the integration abilities of synchronous complexes as self-organizing harmonic modes (SOHMs). These integrating dynamics may be particularly likely to occur via richly connected subnetworks affording body-centric sources of phenomenal binding and executive control. Along these connectivity backbones, SOHMs are proposed to implement turbo coding via loopy message-passing over predictive (autoencoding) networks, thus generating maximum a posteriori estimates as coherent vectors governing neural evolution, with alpha frequencies generating basic awareness, and cross-frequency phase-coupling within theta frequencies for access consciousness and volitional control. These dynamic cores of integrated information also function as global workspaces, centered on posterior cortices, but capable of being entrained with frontal cortices and interoceptive hierarchies, thus affording agentic causation. Integrated World Modeling Theory (IWMT) represents a synthetic approach to understanding minds that reveals compatibility between leading theories of consciousness, thus enabling inferential synergy.
Affiliation(s)
- Adam Safron
- Indiana University, Bloomington, IN, United States

10. Optimal Interplay between Synaptic Strengths and Network Structure Enhances Activity Fluctuations and Information Propagation in Hierarchical Modular Networks. Brain Sci 2020; 10:228. PMID: 32290351. PMCID: PMC7226268. DOI: 10.3390/brainsci10040228.
Abstract
In network models of spiking neurons, the joint impact of network structure and synaptic parameters on activity propagation is still an open problem. Here, we use an information-theoretical approach to investigate activity propagation in spiking networks with a hierarchical modular topology. We observe that optimized pairwise information propagation emerges due to the increase of either (i) the global synaptic strength parameter or (ii) the number of modules in the network, while the network size remains constant. At the population level, information propagation of activity among adjacent modules is enhanced as the number of modules increases until a maximum value is reached and then decreases, showing that there is an optimal interplay between synaptic strength and modularity for population information flow. This is in contrast to information propagation evaluated among pairs of neurons, which attains maximum value at the maximum values of these two parameter ranges. By examining the network behavior under the increase of synaptic strength and the number of modules, we find that these increases are associated with two different effects: (i) the increase of autocorrelations among individual neurons and (ii) the increase of cross-correlations among pairs of neurons. The second effect is associated with better information propagation in the network. Our results suggest roles that link topological features and synaptic strength levels to the transmission of information in cortical networks.
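A sketch of the kind of hierarchical modular topology studied above: the connection probability between two neurons decays geometrically with the number of hierarchy levels separating them. Module size, number of levels, and probabilities are illustrative assumptions; measuring information propagation on the resulting network is beyond this sketch.

```python
# Hypothetical sketch: building a hierarchical modular adjacency matrix.
import numpy as np

def hierarchical_modular(n_levels=3, module_size=16, p_intra=0.6, ratio=0.25, seed=0):
    rng = np.random.default_rng(seed)
    N = module_size * 2 ** n_levels
    A = np.zeros((N, N), dtype=int)
    for i in range(N):
        for j in range(i + 1, N):
            bi, bj, level = i // module_size, j // module_size, 0
            while bi != bj:                      # count hierarchy levels separating i and j
                bi //= 2
                bj //= 2
                level += 1
            if rng.random() < p_intra * ratio ** level:
                A[i, j] = A[j, i] = 1
    return A

A = hierarchical_modular()
N, s = len(A), 16
print(f"N = {N} neurons")
print(f"density inside one bottom-level module: {A[:s, :s].mean():.3f}")
print(f"density across the top-level split:     {A[:s, N // 2:N // 2 + s].mean():.4f}")
```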

11. Marhl U, Gosak M. Proper spatial heterogeneities expand the regime of scale-free behavior in a lattice of excitable elements. Phys Rev E 2019; 100:062203. PMID: 31962506. DOI: 10.1103/physreve.100.062203.
Abstract
Signatures of criticality, such as power law scaling of observables, have been empirically found in a plethora of real-life settings, including biological systems. The presence of critical states is believed to have many functional advantages and is associated with optimal operational abilities. Typically, critical dynamics arises in the proximity of phase transition points between absorbing disordered states (subcriticality) and ordered active regimes (supercriticality) and requires a high degree of fine tuning to emerge, which is unlikely to occur in real biological systems. In the present study we propose a rather simple, and biologically relevant mechanism that profoundly expands the critical-like region. In particular, by means of numerical simulation we show that incorporating spatial heterogeneities into the square lattice of map-based excitable oscillators broadens the parameter space in which the distribution of excitation wave sizes follows closely a power law. Most importantly, this behavior is only observed if the spatial profile exhibits intermediate-sized patches with similar excitability levels, whereas for large and small spatial clusters only marginal widening of the critical state is detected. Furthermore, it turned out that the presence of spatial disorder in general amplifies the size of excitation waves, whereby the relatively highest contributions are observed in the proximity of the critical point. We argue that the reported mechanism is of particular importance for excitable systems with local interactions between individual elements.
Affiliation(s)
- Urban Marhl
- Faculty of Natural Sciences and Mathematics, University of Maribor, Koroška cesta 160, SI-2000 Maribor, Slovenia
- Institute of Mathematics, Physics and Mechanics, Jadranska ulica 19, SI-1000 Ljubljana, Slovenia
- Marko Gosak
- Faculty of Natural Sciences and Mathematics, University of Maribor, Koroška cesta 160, SI-2000 Maribor, Slovenia
- Institute of Physiology, Faculty of Medicine, University of Maribor, Taborska ulica 8, SI-2000 Maribor, Slovenia
| |
Collapse
|
12
|
The scale-invariant, temporal profile of neuronal avalanches in relation to cortical γ-oscillations. Sci Rep 2019; 9:16403. [PMID: 31712632 PMCID: PMC6848117 DOI: 10.1038/s41598-019-52326-y] [Citation(s) in RCA: 29] [Impact Index Per Article: 4.8] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/04/2019] [Accepted: 10/14/2019] [Indexed: 11/08/2022] Open
Abstract
Activity cascades are found in many complex systems. In the cortex, they arise in the form of neuronal avalanches that capture ongoing and evoked neuronal activities at many spatial and temporal scales. The scale-invariant nature of avalanches suggests that the brain is in a critical state, yet predictions from critical theory on the temporal unfolding of avalanches have yet to be confirmed in vivo. Here we show in awake nonhuman primates that the temporal profile of avalanches follows a symmetrical, inverted parabola spanning up to hundreds of milliseconds. This parabola constrains how avalanches initiate locally, extend spatially and shrink as they evolve in time. Importantly, parabolas of different durations can be collapsed with a scaling exponent close to 2 supporting critical generational models of neuronal avalanches. Spontaneously emerging, transient γ-oscillations coexist with and modulate these avalanche parabolas thereby providing a temporal segmentation to inherently scale-invariant, critical dynamics. Our results identify avalanches and oscillations as dual principles in the temporal organization of brain activity.
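The shape collapse referred to above can be sketched as follows: for each avalanche duration, average the temporal profile, rescale time to the unit interval and height by duration to the power (gamma - 1), and pick the gamma that makes the rescaled curves overlap; near criticality the best exponent is close to 2, consistent with the abstract. Here synthetic avalanches from a critical branching process stand in for recorded data; all sizes and thresholds are illustrative assumptions.

```python
# Hypothetical sketch: avalanche mean temporal profiles and their shape collapse.
import numpy as np

rng = np.random.default_rng(1)

# Synthetic avalanches (activity per time bin) from a critical branching process.
avalanches = []
for _ in range(50000):
    prof, n = [], 1
    while n > 0 and len(prof) < 200:
        prof.append(float(n))
        n = rng.poisson(n)                       # branching ratio 1 (critical)
    avalanches.append(np.array(prof))

# Mean profile for each duration T that has enough samples.
profiles = {}
for T in range(4, 31):
    group = [a for a in avalanches if len(a) == T]
    if len(group) >= 30:
        profiles[T] = np.mean(group, axis=0)

# Shape collapse: rescale time to t/T and height by T^(gamma-1); the gamma minimizing
# the residual spread across durations is the collapse exponent (~2 at criticality).
grid = np.linspace(0, 1, 50)
def collapse_error(gamma):
    curves = np.array([np.interp(grid, np.linspace(0, 1, T), prof) / T ** (gamma - 1)
                       for T, prof in profiles.items()])
    return np.mean(np.var(curves, axis=0)) / np.mean(curves) ** 2

gammas = np.linspace(1.2, 2.8, 33)
best = gammas[int(np.argmin([collapse_error(g) for g in gammas]))]
print(f"durations used: {sorted(profiles)}; estimated collapse exponent ~ {best:.2f}")
```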

13. Barranca VJ, Zhou D. Compressive Sensing Inference of Neuronal Network Connectivity in Balanced Neuronal Dynamics. Front Neurosci 2019; 13:1101. PMID: 31680835. PMCID: PMC6811502. DOI: 10.3389/fnins.2019.01101.
Abstract
Determining the structure of a network is of central importance to understanding its function in both neuroscience and applied mathematics. However, recovering the structural connectivity of neuronal networks remains a fundamental challenge both theoretically and experimentally. While neuronal networks operate in certain dynamical regimes, which may influence their connectivity reconstruction, there is widespread experimental evidence of a balanced neuronal operating state in which strong excitatory and inhibitory inputs are dynamically adjusted such that neuronal voltages primarily remain near resting potential. Utilizing the dynamics of model neurons in such a balanced regime in conjunction with the ubiquitous sparse connectivity structure of neuronal networks, we develop a compressive sensing theoretical framework for efficiently reconstructing network connections by measuring individual neuronal activity in response to a relatively small ensemble of random stimuli injected over a short time scale. By tuning the network dynamical regime, we determine that the highest fidelity reconstructions are achievable in the balanced state. We hypothesize the balanced dynamics observed in vivo may therefore be a result of evolutionary selection for optimal information encoding and expect the methodology developed to be generalizable for alternative model networks as well as experimental paradigms.
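The core compressive-sensing idea can be illustrated with a generic sparse-recovery sketch: a sparse incoming-weight vector is recovered from fewer random stimulus-response measurements than unknowns by L1-regularized regression (here solved with plain ISTA). This linearized toy problem is a stand-in for the paper's network reconstruction pipeline; the matrix sizes, sparsity, noise level, and regularization strength are assumptions.

```python
# Hypothetical sketch: sparse connectivity recovery via ISTA (LASSO).
import numpy as np

rng = np.random.default_rng(0)
N, M, k = 200, 60, 10                         # N presynaptic inputs, M stimuli (< N), k true links
w_true = np.zeros(N)
w_true[rng.choice(N, k, replace=False)] = rng.uniform(0.5, 1.5, k)

X = rng.normal(size=(M, N))                   # random stimulus ensemble (measurement matrix)
y = X @ w_true + 0.01 * rng.normal(size=M)    # measured responses (linearized toy model)

# ISTA: iterative soft-thresholding for min ||y - Xw||^2 / 2 + lam * ||w||_1
lam = 0.1
step = 1.0 / np.linalg.norm(X, 2) ** 2        # 1 / Lipschitz constant of the gradient
w = np.zeros(N)
for _ in range(2000):
    g = w + step * X.T @ (y - X @ w)          # gradient step on the quadratic term
    w = np.sign(g) * np.maximum(np.abs(g) - step * lam, 0.0)   # soft-threshold (L1 prox)

support_true = set(np.flatnonzero(w_true))
support_hat = set(np.flatnonzero(np.abs(w) > 0.05))
rel_err = np.linalg.norm(w - w_true) / np.linalg.norm(w_true)
print(f"recovered {len(support_true & support_hat)} of {k} true connections; relative error {rel_err:.3f}")
```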
Affiliation(s)
- Victor J Barranca
- Department of Mathematics and Statistics, Swarthmore College, Swarthmore, PA, United States
- Douglas Zhou
- School of Mathematical Sciences, Shanghai Jiao Tong University, Shanghai, China
- Ministry of Education Key Laboratory of Scientific and Engineering Computing, Shanghai Jiao Tong University, Shanghai, China
- Institute of Natural Sciences, Shanghai Jiao Tong University, Shanghai, China

14. Wang R, Lin P, Liu M, Wu Y, Zhou T, Zhou C. Hierarchical Connectome Modes and Critical State Jointly Maximize Human Brain Functional Diversity. Phys Rev Lett 2019; 123:038301. PMID: 31386449. DOI: 10.1103/physrevlett.123.038301.
Abstract
The brain requires diverse segregated and integrated processing to perform normal functions in terms of anatomical structure and self-organized dynamics with critical features, but the fundamental relationships between the complex structural connectome, critical state, and functional diversity remain unknown. Herein, we extend the eigenmode analysis to investigate the joint contribution of hierarchical modular structural organization and critical state to brain functional diversity. We show that the structural modes inherent to the hierarchical modular structural connectome allow a nested functional segregation and integration across multiple spatiotemporal scales. The real brain hierarchical modular organization provides large structural capacity for diverse functional interactions, which are generated by sequentially activating and recruiting the hierarchical connectome modes, and the critical state can best explore the capacity to maximize the functional diversity. Our results reveal structural and dynamical mechanisms that jointly support a balanced segregated and integrated brain processing with diverse functional interactions, and they also shed light on dysfunctional segregation and integration in neurodegenerative diseases and neuropsychiatric disorders.
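A generic illustration of the eigenmode idea referenced above: compute the graph-Laplacian eigenmodes of a modular connectivity matrix and note that the low-eigenvalue modes vary between modules (segregated, module-level patterns) while high modes capture fine-grained structure. The synthetic modular network and its parameters are assumptions; this is not the paper's connectome data or its specific mode analysis.

```python
# Hypothetical sketch: Laplacian eigenmodes of a modular structural network.
import numpy as np

rng = np.random.default_rng(0)
n_mod, size, p_in, p_out = 4, 50, 0.3, 0.01
N = n_mod * size
labels = np.repeat(np.arange(n_mod), size)
P = np.where(labels[:, None] == labels[None, :], p_in, p_out)
A = (rng.random((N, N)) < P).astype(float)
A = np.triu(A, 1)
A = A + A.T

L = np.diag(A.sum(1)) - A
evals, evecs = np.linalg.eigh(L)          # structural eigenmodes

# Low non-trivial modes separate modules; the highest mode is fine-grained.
for k in (1, 2, 3, N - 1):
    mode = evecs[:, k]
    module_means = [mode[labels == m].mean() for m in range(n_mod)]
    print(f"mode {k:3d}: eigenvalue {evals[k]:6.2f}, module means "
          + ", ".join(f"{v:+.3f}" for v in module_means))
```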
Affiliation(s)
- Rong Wang
- State Key Laboratory for Strength and Vibration of Mechanical Structures, Shaanxi Engineering Laboratory for Vibration Control of Aerospace Structures, School of Aerospace Engineering, Xi'an Jiaotong University, Xi'an 710049, China
- Department of Physics, Centre for Nonlinear Studies and Beijing-Hong Kong-Singapore Joint Centre for Nonlinear and Complex Systems (Hong Kong), Institute of Computational and Theoretical Studies, Hong Kong Baptist University, Kowloon Tong, Hong Kong
- College of Science, Xi'an University of Science and Technology, Xi'an 710054, China
- Pan Lin
- Key Laboratory of Cognitive Science, College of Biomedical Engineering, South-Central University for Nationalities, Wuhan 430074, China
- Mianxin Liu
- Department of Physics, Centre for Nonlinear Studies and Beijing-Hong Kong-Singapore Joint Centre for Nonlinear and Complex Systems (Hong Kong), Institute of Computational and Theoretical Studies, Hong Kong Baptist University, Kowloon Tong, Hong Kong
- Ying Wu
- State Key Laboratory for Strength and Vibration of Mechanical Structures, Shaanxi Engineering Laboratory for Vibration Control of Aerospace Structures, School of Aerospace Engineering, Xi'an Jiaotong University, Xi'an 710049, China
- Tao Zhou
- Complex Lab, University of Electronic Science and Technology of China, Chengdu 611731, People's Republic of China
- Changsong Zhou
- Department of Physics, Centre for Nonlinear Studies and Beijing-Hong Kong-Singapore Joint Centre for Nonlinear and Complex Systems (Hong Kong), Institute of Computational and Theoretical Studies, Hong Kong Baptist University, Kowloon Tong, Hong Kong
- Research Centre, HKBU Institute of Research and Continuing Education, Shenzhen 518057, China
- Beijing Computational Science Research Center, Beijing 100084, China
- Department of Physics, Zhejiang University, Hangzhou 310058, China

15. Qian Y, Zhang G, Wang Y, Yao C, Zheng Z. Winfree loop sustained oscillation in two-dimensional excitable lattices: Prediction and realization. Chaos 2019; 29:073106. PMID: 31370411. DOI: 10.1063/1.5085644.
Abstract
Self-sustained oscillation in excitable complex networks is a central issue under investigation, and the prediction and realization of self-sustained oscillations in different kinds of excitable networks are challenging tasks. In this paper, we extensively investigate the prediction and realization of a Winfree loop sustained oscillation (WLSO) in two-dimensional (2D) excitable lattices. By analyzing the network structure, the fundamental oscillation source structure (FOSS) of the WLSO in a 2D excitable lattice is exposed explicitly. For suitable combinations of system parameters, the Winfree loop can self-organize on the FOSS to form an oscillation source sustaining the oscillation; these suitable parameter combinations are predicted by calculating the minimum Winfree loop length and are further confirmed in numerical simulations. However, the FOSS cannot spontaneously produce the WLSO in 2D excitable lattices in the usual case, owing to the bidirectionality of the coupling and the symmetry properties of the lattice. A targeted protection scheme for the oscillation source is proposed to overcome these two drawbacks. Finally, the WLSO is successfully realized in the 2D excitable lattice.
Affiliation(s)
- Yu Qian
- Nonlinear Research Institute, Baoji University of Arts and Sciences, Baoji 721007, China
- Gang Zhang
- Nonlinear Research Institute, Baoji University of Arts and Sciences, Baoji 721007, China
- Yafeng Wang
- Nonlinear Research Institute, Baoji University of Arts and Sciences, Baoji 721007, China
- Chenggui Yao
- Department of Mathematics, Shaoxing University, Shaoxing 312000, China
- Zhigang Zheng
- Institute of Systems Science, Huaqiao University, Xiamen 361021, China

16. Rodriguez N, Izquierdo E, Ahn YY. Optimal modularity and memory capacity of neural reservoirs. Netw Neurosci 2019; 3:551-566. PMID: 31089484. PMCID: PMC6497001. DOI: 10.1162/netn_a_00082.
Abstract
The neural network is a powerful computing framework that has been exploited by biological evolution and by humans for solving diverse problems. Although the computational capabilities of neural networks are determined by their structure, the current understanding of the relationships between a neural network's architecture and function is still primitive. Here we reveal that a neural network's modular architecture plays a vital role in determining the neural dynamics and memory performance of the network of threshold neurons. In particular, we demonstrate that there exists an optimal modularity for memory performance, where a balance between local cohesion and global connectivity is established, allowing optimally modular networks to remember longer. Our results suggest that insights from dynamical analysis of neural networks and information-spreading processes can be leveraged to better design neural networks and may shed light on the brain's modular organization.
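The memory-performance measure discussed above can be sketched with a standard memory-capacity computation: drive a reservoir with random input, train a linear readout to reproduce delayed copies of the input, and sum the squared correlations over delays. An echo-state (tanh) reservoir stands in here for the paper's threshold-neuron networks, and the sizes, spectral radius, and delay range are assumptions.

```python
# Hypothetical sketch: memory capacity of a random recurrent reservoir.
import numpy as np

rng = np.random.default_rng(0)
N, T, washout, max_delay = 200, 5000, 200, 40
W = rng.normal(0, 1, (N, N)) * (rng.random((N, N)) < 0.1)
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))      # spectral radius 0.9 (echo-state regime)
w_in = rng.uniform(-0.5, 0.5, N)

u = rng.uniform(-1, 1, T)                            # random input stream
x = np.zeros(N)
states = np.zeros((T, N))
for t in range(T):
    x = np.tanh(W @ x + w_in * u[t])
    states[t] = x

# Memory capacity: for each delay d, fit a linear readout for u[t-d] and
# accumulate the squared correlation between target and reconstruction.
S = states[washout:]
mc = 0.0
for d in range(1, max_delay + 1):
    target = u[washout - d: T - d]
    coef, *_ = np.linalg.lstsq(S, target, rcond=None)
    pred = S @ coef
    mc += np.corrcoef(pred, target)[0, 1] ** 2
print(f"memory capacity over {max_delay} delays: {mc:.1f} (upper bound is N = {N})")
```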
Affiliation(s)
- Nathaniel Rodriguez
- School of Informatics, Computing, and Engineering, Indiana University, Bloomington, IN, USA
- Eduardo Izquierdo
- School of Informatics, Computing, and Engineering, Indiana University, Bloomington, IN, USA
- Cognitive Science Program, Indiana University, Bloomington, IN, USA
- Yong-Yeol Ahn
- School of Informatics, Computing, and Engineering, Indiana University, Bloomington, IN, USA
- Indiana University Network Science Institute, Bloomington, IN, USA

17. Agrawal V, Chakraborty S, Knöpfel T, Shew WL. Scale-Change Symmetry in the Rules Governing Neural Systems. iScience 2019; 12:121-131. PMID: 30682624. PMCID: PMC6352707. DOI: 10.1016/j.isci.2019.01.009.
Abstract
Similar universal phenomena can emerge in different complex systems when those systems share a common symmetry in their governing laws. In physical systems operating near a critical phase transition, the governing physical laws obey a fractal symmetry; they are the same whether considered at fine or coarse scales. This scale-change symmetry is responsible for universal critical phenomena found across diverse systems. Experiments suggest that the cerebral cortex can also operate near a critical phase transition. Thus we hypothesize that the laws governing cortical dynamics may obey scale-change symmetry. Here we develop a practical approach to test this hypothesis. We confirm, using two different computational models, that neural dynamical laws exhibit scale-change symmetry near a dynamical phase transition. Moreover, we show that as a mouse awakens from anesthesia, scale-change symmetry emerges. Scale-change symmetry of the rules governing cortical dynamics may explain observations of similar critical phenomena across diverse neural systems.
Affiliation(s)
- Vidit Agrawal
- Department of Physics, University of Arkansas, Fayetteville, AR 72701, USA
- Srimoy Chakraborty
- Department of Physics, University of Arkansas, Fayetteville, AR 72701, USA
- Thomas Knöpfel
- Laboratory for Neuronal Circuit Dynamics, Faculty of Medicine Imperial College London, London W12 0NN, UK; Centre for Neurotechnology, Institute of Biomedical Engineering, Imperial College London, London SW7 2AZ, UK
- Woodrow L Shew
- Department of Physics, University of Arkansas, Fayetteville, AR 72701, USA.

18. Pena RFO, Zaks MA, Roque AC. Dynamics of spontaneous activity in random networks with multiple neuron subtypes and synaptic noise. J Comput Neurosci 2018; 45:1-28. PMID: 29923159. PMCID: PMC6061197. DOI: 10.1007/s10827-018-0688-6.
Abstract
Spontaneous cortical population activity exhibits a multitude of oscillatory patterns, which often display synchrony during slow-wave sleep or under certain anesthetics and stay asynchronous during quiet wakefulness. The mechanisms behind these cortical states and the transitions among them are not completely understood. Here we study spontaneous population activity patterns in random networks of spiking neurons of mixed types modeled by Izhikevich equations. Neurons are coupled by conductance-based synapses subject to synaptic noise. We localize the population activity patterns on the parameter diagram spanned by the relative inhibitory synaptic strength and the magnitude of synaptic noise. In the absence of noise, networks display transient activity patterns, either oscillatory or at a constant level. The effect of noise is to turn transient patterns into persistent ones: for weak noise, all activity patterns are asynchronous and non-oscillatory, independently of the synaptic strengths; for stronger noise, patterns have oscillatory and synchrony characteristics that depend on the relative inhibitory synaptic strength. In the region of parameter space where the inhibitory synaptic strength exceeds the excitatory synaptic strength, and for moderate noise magnitudes, networks feature intermittent switches between oscillatory and quiescent states with characteristics similar to those of synchronous and asynchronous cortical states, respectively. We explain these oscillatory and quiescent patterns by combining a phenomenological global description of the network state with local descriptions of individual neurons in their partial phase spaces. Our results point to a bridge from events at the molecular scale of synapses to the cellular scale of individual neurons to the collective scale of neuronal populations.
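For readers unfamiliar with the neuron model named above, a minimal sketch of Izhikevich dynamics for two standard subtypes under a crudely noisy drive is given below. The simple additive current noise used here is only an illustrative stand-in for the paper's conductance-based synaptic noise, and all drive values are assumptions.

```python
# Hypothetical sketch: Izhikevich neurons of two subtypes driven by noisy input current.
import numpy as np

rng = np.random.default_rng(0)
subtypes = {"regular spiking": (0.02, 0.2, -65.0, 8.0),
            "fast spiking":    (0.10, 0.2, -65.0, 2.0)}
dt, T = 0.25, 5000.0                       # ms

for name, (a, b, c, d) in subtypes.items():
    v, u, n_spikes = -65.0, b * -65.0, 0
    for step in range(int(T / dt)):
        I = 5.0 + 4.0 * rng.normal()       # mean drive plus crude "synaptic noise" (illustrative)
        v += dt * (0.04 * v * v + 5.0 * v + 140.0 - u + I)
        u += dt * a * (b * v - u)
        if v >= 30.0:                      # spike cut-off and reset
            v, u = c, u + d
            n_spikes += 1
    print(f"{name}: {1000.0 * n_spikes / T:.1f} Hz under identical noisy drive")
```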
Affiliation(s)
- Rodrigo F. O. Pena
- Department of Physics, Faculty of Philosophy, Sciences and Letters of Ribeirão Preto, University of São Paulo, Ribeirão Preto, SP Brazil
- Michael A. Zaks
- Department of Physics, Faculty of Mathematics and Natural Sciences, Humboldt University of Berlin, Berlin, Germany
- Antonio C. Roque
- Department of Physics, Faculty of Philosophy, Sciences and Letters of Ribeirão Preto, University of São Paulo, Ribeirão Preto, SP Brazil

19. Rich S, Zochowski M, Booth V. Dichotomous Dynamics in E-I Networks with Strongly and Weakly Intra-connected Inhibitory Neurons. Front Neural Circuits 2017; 11:104. PMID: 29326558. PMCID: PMC5733501. DOI: 10.3389/fncir.2017.00104.
Abstract
The interconnectivity between excitatory and inhibitory neural networks informs mechanisms by which rhythmic bursts of excitatory activity can be produced in the brain. One such mechanism, Pyramidal Interneuron Network Gamma (PING), relies primarily upon reciprocal connectivity between the excitatory and inhibitory networks, while also including intra-connectivity of inhibitory cells. The causal relationship between excitatory activity and the subsequent burst of inhibitory activity is of paramount importance to the mechanism and has been well studied. However, the role of the intra-connectivity of the inhibitory network, while important for PING, has not been studied in detail, as most analyses of PING simply assume that inhibitory intra-connectivity is strong enough to suppress subsequent firing following the initial inhibitory burst. In this paper we investigate the role that the strength of inhibitory intra-connectivity plays in determining the dynamics of PING-style networks. We show that networks with weak inhibitory intra-connectivity exhibit variations in burst dynamics of both the excitatory and inhibitory cells that are not obtained with strong inhibitory intra-connectivity. Networks with weak inhibitory intra-connectivity exhibit excitatory rhythmic bursts with weak excitatory-to-inhibitory synapses for which classical PING networks would show no rhythmic activity. Additionally, variations in dynamics of these networks as the excitatory-to-inhibitory synaptic weight increases illustrates the important role that consistent pattern formation in the inhibitory cells serves in maintaining organized and periodic excitatory bursts. Finally, motivated by these results and the known diversity of interneurons, we show that a PING-style network with two inhibitory subnetworks, one strongly intra-connected and one weakly intra-connected, exhibits organized and periodic excitatory activity over a larger parameter regime than networks with a homogeneous inhibitory population. Taken together, these results serve to better articulate the role of inhibitory intra-connectivity in generating PING-like rhythms, while also revealing how heterogeneity amongst inhibitory synapses might make such rhythms more robust to a variety of network parameters.
Affiliation(s)
- Scott Rich
- Applied and Interdisciplinary Mathematics, University of Michigan, Ann Arbor, MI, United States
- Michal Zochowski
- Department of Physics and Biophysics, University of Michigan, Ann Arbor, MI, United States
- Victoria Booth
- Department of Mathematics and Anesthesiology, University of Michigan, Ann Arbor, MI, United States

20. Qian Y, Liu F, Yang K, Zhang G, Yao C, Ma J. Spatiotemporal dynamics in excitable homogeneous random networks composed of periodically self-sustained oscillation. Sci Rep 2017; 7:11885. PMID: 28928389. PMCID: PMC5605731. DOI: 10.1038/s41598-017-12333-3.
Abstract
The collective behaviors of networks often depend on the network connections and bifurcation parameters, and the local kinetics also plays an important role in shaping the consensus of coupled oscillators. In this paper, we systematically investigate the influence of network structures and system parameters on the spatiotemporal dynamics of excitable homogeneous random networks (EHRNs) composed of periodically self-sustained oscillation (PSO). By using the dominant phase-advanced driving (DPAD) method, the one-dimensional (1D) Winfree loop is exposed as the oscillation source supporting the PSO, and the accurate wave propagation pathways from the oscillation source to the whole network are uncovered. Then, an order parameter is introduced to quantitatively study the influence of network structures and system parameters on the spatiotemporal dynamics of PSO in EHRNs. Distinct results induced by the network structures and the system parameters are observed, and the corresponding mechanisms are revealed. The influence of network structure on PSO arises not only from the change of the average path length (APL) of the network, but also from the invasion of the 1D Winfree loop by the outside linking nodes. Moreover, the influence of the system parameters on PSO is determined by the excitation threshold and the minimum 1D Winfree loop. Finally, we confirm that the PSO determined by the excitation threshold and the minimum 1D Winfree loop will degenerate as the system size is expanded.
Affiliation(s)
- Yu Qian
- Nonlinear Research Institute, Baoji University of Arts and Sciences, Baoji, 721007, China.
- Fei Liu
- Nonlinear Research Institute, Baoji University of Arts and Sciences, Baoji, 721007, China
- Keli Yang
- Nonlinear Research Institute, Baoji University of Arts and Sciences, Baoji, 721007, China
- Ge Zhang
- Department of Physics, Lanzhou University of Technology, Lanzhou, 730050, China
- Chenggui Yao
- Department of Mathematics, Shaoxing University, Shaoxing, 312000, China
- Jun Ma
- Department of Physics, Lanzhou University of Technology, Lanzhou, 730050, China
- King Abdulaziz University, Faculty of Science, Department of Mathematics, NAAM Research Group, Jeddah, 21589, Saudi Arabia

21. Li X, Chen Q, Xue F. Biological modelling of a computational spiking neural network with neuronal avalanches. Philos Trans A Math Phys Eng Sci 2017; 375:20160286. PMID: 28507231. PMCID: PMC5434077. DOI: 10.1098/rsta.2016.0286.
Abstract
In recent years, an increasing number of studies have demonstrated that networks in the brain can self-organize into a critical state where dynamics exhibit a mixture of ordered and disordered patterns. This critical branching phenomenon is termed neuronal avalanches. It has been hypothesized that the homeostatic level balanced between stability and plasticity of this critical state may be the optimal state for performing diverse neural computational tasks. However, the critical region for high performance is narrow and sensitive for spiking neural networks (SNNs). In this paper, we investigated the role of the critical state in neural computations based on liquid-state machines, a biologically plausible computational neural network model for real-time computing. The computational performance of an SNN when operating at the critical state and, in particular, with spike-timing-dependent plasticity for updating synaptic weights is investigated. The network is found to show the best computational performance when it is subjected to critical dynamic states. Moreover, the active-neuron-dominant structure refined from synaptic learning can remarkably enhance the robustness of the critical state and further improve computational accuracy. These results may have important implications for the modelling of spiking neural networks with optimal computational performance. This article is part of the themed issue 'Mathematical methods in medicine: neuroscience, cardiology and pathology'.
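The synaptic learning rule named above, spike-timing-dependent plasticity, can be illustrated with the standard pair-based form: potentiation when the presynaptic spike precedes the postsynaptic spike, depression otherwise, with exponentially decaying dependence on the lag. The amplitudes and time constants below are common textbook choices, not the paper's exact values.

```python
# Hypothetical sketch: pair-based STDP (all-to-all spike pairing).
import numpy as np

A_plus, A_minus = 0.01, 0.012          # LTP / LTD amplitudes
tau_plus, tau_minus = 20.0, 20.0       # decay time constants (ms)

def stdp_dw(pre_times, post_times):
    """Total weight change for one pre/post spike-train pair."""
    dw = 0.0
    for t_pre in pre_times:
        for t_post in post_times:
            lag = t_post - t_pre
            if lag > 0:
                dw += A_plus * np.exp(-lag / tau_plus)     # pre before post -> LTP
            elif lag < 0:
                dw -= A_minus * np.exp(lag / tau_minus)    # post before pre -> LTD
    return dw

pre = np.array([10.0, 60.0, 110.0])
post_causal = pre + 5.0                # post consistently follows pre by 5 ms
post_anti = pre - 5.0                  # post consistently leads pre by 5 ms
print("causal pairing:      dw =", round(stdp_dw(pre, post_causal), 4))
print("anti-causal pairing: dw =", round(stdp_dw(pre, post_anti), 4))
```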
Affiliation(s)
- Xiumin Li
- Key Laboratory of Dependable Service Computing in Cyber Physical Society of Ministry of Education, Chongqing University, Chongqing 400044, People's Republic of China
- College of Automation, Chongqing University, Chongqing 400044, People's Republic of China
- Qing Chen
- Key Laboratory of Dependable Service Computing in Cyber Physical Society of Ministry of Education, Chongqing University, Chongqing 400044, People's Republic of China
- College of Automation, Chongqing University, Chongqing 400044, People's Republic of China
- Fangzheng Xue
- Key Laboratory of Dependable Service Computing in Cyber Physical Society of Ministry of Education, Chongqing University, Chongqing 400044, People's Republic of China
- College of Automation, Chongqing University, Chongqing 400044, People's Republic of China

22. Yang DP, Zhou HJ, Zhou C. Co-emergence of multi-scale cortical activities of irregular firing, oscillations and avalanches achieves cost-efficient information capacity. PLoS Comput Biol 2017; 13:e1005384. PMID: 28192429. PMCID: PMC5330539. DOI: 10.1371/journal.pcbi.1005384.
Abstract
The brain is highly energy consuming, therefore is under strong selective pressure to achieve cost-efficiency in both cortical connectivities and activities. However, cost-efficiency as a design principle for cortical activities has been rarely studied. Especially it is not clear how cost-efficiency is related to ubiquitously observed multi-scale properties: irregular firing, oscillations and neuronal avalanches. Here we demonstrate that these prominent properties can be simultaneously observed in a generic, biologically plausible neural circuit model that captures excitation-inhibition balance and realistic dynamics of synaptic conductance. Their co-emergence achieves minimal energy cost as well as maximal energy efficiency on information capacity, when neuronal firing are coordinated and shaped by moderate synchrony to reduce otherwise redundant spikes, and the dynamical clusterings are maintained in the form of neuronal avalanches. Such cost-efficient neural dynamics can be employed as a foundation for further efficient information processing under energy constraint. The adult human brain consumes more than 20% of the resting metabolism, despite constituting only 2% of the body’s mass. Most energy is consumed by the cerebral cortex with billions of neurons, mainly to restore ion gradients across membranes for generating and propagating action potentials and synaptic transmission. Even small increases in the average spike rate of cortical neurons could cause the cortex to exceed the energy budget for the whole brain. Consequently, the cortex is likely to be under considerable selective pressure to reduce spike rates but to maintain efficient information processing. Experimentally, cortical activities are ubiquitously observed at multiple scales with prominent features: irregular individual firing, synchronized oscillations and neuronal avalanches. Do these features of cortical activities reflect cost-efficiency on the aspect of information capacity? We employ a generic but biologically plausible local neural circuit to compare various dynamical modes with different degrees of synchrony. Our simulations show that these features of cortical activities can be observed simultaneously and their co-emergence indeed robustly achieves maximal energy efficiency and minimal energy cost. Our work thus suggests that basic neurobiological and dynamical mechanisms can support the foundation for efficient neural information processing under the energy constraint.
Collapse
Affiliation(s)
- Dong-Ping Yang
- Department of Physics, Hong Kong Baptist University, Kowloon Tong, Hong Kong
- Centre for Nonlinear Studies and Beijing-Hong Kong-Singapore Joint Centre for Nonlinear and Complex Systems (Hong Kong), Institute of Computational and Theoretical Studies, Hong Kong Baptist University, Kowloon Tong, Hong Kong
- School of Physics, University of Sydney, Sydney, New South Wales, Australia
- * E-mail: (DPY); (CZ)
| | - Hai-Jun Zhou
- Institute of Theoretical Physics, Chinese Academy of Sciences, Beijing, China
| | - Changsong Zhou
- Department of Physics, Hong Kong Baptist University, Kowloon Tong, Hong Kong
- Centre for Nonlinear Studies and Beijing-Hong Kong-Singapore Joint Centre for Nonlinear and Complex Systems (Hong Kong), Institute of Computational and Theoretical Studies, Hong Kong Baptist University, Kowloon Tong, Hong Kong
- Beijing Computational Science Research Center, Beijing, China
- Research Center, HKBU Institute of Research and Continuing Education, Virtual University Park Building, South Area Hi-tech Industrial Park, Shenzhen, China
- * E-mail: (DPY); (CZ)
| |
Collapse
|
23
|
Fagerholm ED, Scott G, Shew WL, Song C, Leech R, Knöpfel T, Sharp DJ. Cortical Entropy, Mutual Information and Scale-Free Dynamics in Waking Mice. Cereb Cortex 2016; 26:3945-52. [PMID: 27384059 PMCID: PMC5028006 DOI: 10.1093/cercor/bhw200] [Citation(s) in RCA: 57] [Impact Index Per Article: 6.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/06/2016] [Accepted: 02/06/2016] [Indexed: 12/11/2022] Open
Abstract
Some neural circuits operate with simple dynamics characterized by one or a few well-defined spatiotemporal scales (e.g. central pattern generators). In contrast, cortical neuronal networks often exhibit richer activity patterns in which all spatiotemporal scales are represented. Such “scale-free” cortical dynamics manifest as cascades of activity with cascade sizes that are distributed according to a power-law. Theory and in vitro experiments suggest that information transmission among cortical circuits is optimized by scale-free dynamics. In vivo tests of this hypothesis have been limited by experimental techniques with insufficient spatial coverage and resolution, i.e., restricted access to a wide range of scales. We overcame these limitations by using genetically encoded voltage imaging to track neural activity in layer 2/3 pyramidal cells across the cortex in mice. As mice recovered from anesthesia, we observed three changes: (a) cortical information capacity increased, (b) information transmission among cortical regions increased and (c) neural activity became scale-free. Our results demonstrate that both information capacity and information transmission are maximized in the awake state in cortical regions with scale-free network dynamics.
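To make the information-theoretic quantities concrete, here is a small sketch computing Shannon entropy and mutual information between two binarized signals. The Gaussian surrogate traces stand in for regional voltage-imaging signals and are not part of the study; the formula I(X;Y) = H(X) + H(Y) - H(X,Y) is standard.

```python
import numpy as np

def entropy(p):
    """Shannon entropy (bits) of a probability vector, ignoring zero entries."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

rng = np.random.default_rng(1)

# Two surrogate regional signals with partial dependence, binarized at the median.
x = rng.normal(size=50_000)
y = 0.6 * x + 0.8 * rng.normal(size=x.size)
bx = (x > np.median(x)).astype(int)
by = (y > np.median(y)).astype(int)

# Joint and marginal distributions over the four binary states.
joint = np.histogram2d(bx, by, bins=2)[0] / bx.size
px, py = joint.sum(axis=1), joint.sum(axis=0)

h_x, h_y, h_xy = entropy(px), entropy(py), entropy(joint.ravel())
mi = h_x + h_y - h_xy   # mutual information I(X;Y)
print(f"H(X) = {h_x:.3f} bits, H(Y) = {h_y:.3f} bits, I(X;Y) = {mi:.3f} bits")
```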
Collapse
Affiliation(s)
- Erik D Fagerholm
- The Computational, Cognitive and Clinical Neuroimaging Laboratory, The Centre for Neuroscience, The Division of Brain Sciences, Imperial College London, Hammersmith Hospital Campus, Du Cane Road, London, W12 0NN, UK
| | - Gregory Scott
- The Computational, Cognitive and Clinical Neuroimaging Laboratory, The Centre for Neuroscience, The Division of Brain Sciences, Imperial College London, Hammersmith Hospital Campus, Du Cane Road, London, W12 0NN, UK
| | - Woodrow L Shew
- University of Arkansas, Department of Physics, Fayetteville, AR 72701, USA
| | - Chenchen Song
- Division of Brain Sciences, Department of Medicine, Imperial College London, Hammersmith Hospital Campus, Du Cane Road, London, W12 0NN, UK
| | - Robert Leech
- The Computational, Cognitive and Clinical Neuroimaging Laboratory, The Centre for Neuroscience, The Division of Brain Sciences, Imperial College London, Hammersmith Hospital Campus, Du Cane Road, London, W12 0NN, UK
| | - Thomas Knöpfel
- Division of Brain Sciences, Department of Medicine, Imperial College London, Hammersmith Hospital Campus, Du Cane Road, London, W12 0NN, UK; Centre for Neurotechnology, Institute of Biomedical Engineering, Imperial College London, South Kensington, London SW7 2AZ, UK
| | - David J Sharp
- The Computational, Cognitive and Clinical Neuroimaging Laboratory, The Centre for Neuroscience, The Division of Brain Sciences, Imperial College London, Hammersmith Hospital Campus, Du Cane Road, London, W12 0NN, UK
| |
Collapse
|
24
|
Abstract
Biology is the study of dynamical systems. Yet most of us working in biology have limited pedagogical training in the theory of dynamical systems, an unfortunate historical fact that can be remedied for future generations of life scientists. In my particular field of systems neuroscience, neural circuits are rife with nonlinearities at all levels of description, rendering simple methodologies and our own intuition unreliable. Therefore, our ideas are likely to be wrong unless informed by good models. These models should be based on the mathematical theories of dynamical systems since functioning neurons are dynamic—they change their membrane potential and firing rates with time. Thus, selecting the appropriate type of dynamical system upon which to base a model is an important first step in the modeling process. This step all too easily goes awry, in part because there are many frameworks to choose from, in part because the sparsely sampled data can be consistent with a variety of dynamical processes, and in part because each modeler has a preferred modeling approach that is difficult to move away from. This brief review summarizes some of the main dynamical paradigms that can arise in neural circuits, with comments on what they can achieve computationally and what signatures might reveal their presence within empirical data. I provide examples of different dynamical systems using simple circuits of two or three cells, emphasizing that any one connectivity pattern is compatible with multiple, diverse functions.
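As a worked toy example of the two- or three-cell circuits the review mentions, the sketch below integrates two mutually inhibitory threshold-linear rate units. The equations and parameter values are illustrative assumptions, not taken from the review; with this choice the same wiring is bistable, so different initial conditions settle into different attractors, illustrating how one connectivity pattern can support multiple functions.

```python
import numpy as np

def f(x):
    """Threshold-linear firing-rate nonlinearity."""
    return np.maximum(x, 0.0)

# Two rate units with mutual inhibition:
#   tau * dr_i/dt = -r_i + f(I_i - w * r_j)
tau, w, dt, T = 20.0, 1.5, 0.1, 2000.0
I = np.array([10.0, 10.0])

def run(r0):
    r = np.array(r0, dtype=float)
    for _ in range(int(T / dt)):
        drive = I - w * r[::-1]          # each unit is inhibited by the other
        r += dt / tau * (-r + f(drive))
    return r

# Two initial conditions converge to different attractors, i.e. bistability.
print("start [8, 2] ->", np.round(run([8.0, 2.0]), 2))
print("start [2, 8] ->", np.round(run([2.0, 8.0]), 2))
```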
Collapse
Affiliation(s)
- Paul Miller
- Volen National Center for Complex Systems, Brandeis University, Waltham, Massachusetts, 02454-9110, USA
| |
Collapse
|
25
|
Knight JC, Tully PJ, Kaplan BA, Lansner A, Furber SB. Large-Scale Simulations of Plastic Neural Networks on Neuromorphic Hardware. Front Neuroanat 2016; 10:37. [PMID: 27092061 PMCID: PMC4823276 DOI: 10.3389/fnana.2016.00037] [Citation(s) in RCA: 14] [Impact Index Per Article: 1.6] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/30/2015] [Accepted: 03/18/2016] [Indexed: 11/17/2022] Open
Abstract
SpiNNaker is a digital, neuromorphic architecture designed for simulating large-scale spiking neural networks at speeds close to biological real-time. Rather than using bespoke analog or digital hardware, the basic computational unit of a SpiNNaker system is a general-purpose ARM processor, allowing it to be programmed to simulate a wide variety of neuron and synapse models. This flexibility is particularly valuable in the study of biological plasticity phenomena. A recently proposed learning rule based on the Bayesian Confidence Propagation Neural Network (BCPNN) paradigm offers a generic framework for modeling the interaction of different plasticity mechanisms using spiking neurons. However, it can be computationally expensive to simulate large networks with BCPNN learning since it requires multiple state variables for each synapse, each of which needs to be updated every simulation time-step. We discuss the trade-offs in efficiency and accuracy involved in developing an event-based BCPNN implementation for SpiNNaker based on an analytical solution to the BCPNN equations, and detail the steps taken to fit this within the limited computational and memory resources of the SpiNNaker architecture. We demonstrate this learning rule by learning temporal sequences of neural activity within a recurrent attractor network which we simulate at scales of up to 2.0 × 10^4 neurons and 5.1 × 10^7 plastic synapses: the largest plastic neural network ever to be simulated on neuromorphic hardware. We also run a comparable simulation on a Cray XC-30 supercomputer system and find that, if it is to match the run-time of our SpiNNaker simulation, the supercomputer system uses approximately 45× more power. This suggests that cheaper, more power-efficient neuromorphic systems are becoming useful discovery tools in the study of plasticity in large-scale brain models.
Collapse
Affiliation(s)
- James C Knight
- Advanced Processor Technologies Group, School of Computer Science, University of Manchester, Manchester, UK
| | - Philip J Tully
- Department of Computational Biology, Royal Institute of Technology, Stockholm, Sweden; Stockholm Brain Institute, Karolinska Institute, Stockholm, Sweden; Institute for Adaptive and Neural Computation, School of Informatics, University of Edinburgh, Edinburgh, UK
| | - Bernhard A Kaplan
- Department of Visualization and Data Analysis, Zuse Institute Berlin, Berlin, Germany
| | - Anders Lansner
- Department of Computational Biology, Royal Institute of Technology, Stockholm, Sweden; Stockholm Brain Institute, Karolinska Institute, Stockholm, Sweden; Department of Numerical Analysis and Computer Science, Stockholm University, Stockholm, Sweden
| | - Steve B Furber
- Advanced Processor Technologies Group, School of Computer Science, University of Manchester, Manchester, UK
| |
Collapse
|
26
|
Tomov P, Pena RFO, Roque AC, Zaks MA. Mechanisms of Self-Sustained Oscillatory States in Hierarchical Modular Networks with Mixtures of Electrophysiological Cell Types. Front Comput Neurosci 2016; 10:23. [PMID: 27047367 PMCID: PMC4803744 DOI: 10.3389/fncom.2016.00023] [Citation(s) in RCA: 13] [Impact Index Per Article: 1.4] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/10/2015] [Accepted: 03/04/2016] [Indexed: 11/18/2022] Open
Abstract
In a network with a mixture of different electrophysiological types of neurons linked by excitatory and inhibitory connections, the temporal evolution proceeds through repeated epochs of intensive global activity separated by intervals of low activity. This behavior mimics the “up” and “down” states experimentally observed in cortical tissue in the absence of external stimuli. We interpret the global dynamical features in terms of the individual dynamics of the neurons. In particular, we observe that the crucial role both in the interruption and in the resumption of global activity is played by the distributions of the membrane recovery variable within the network. We also demonstrate that the behavior of neurons is more influenced by their presynaptic environment in the network than by their formal types, assigned in accordance with their response to constant current.
Collapse
Affiliation(s)
- Petar Tomov
- Institute of Mathematics, Humboldt University of Berlin, Berlin, Germany
| | - Rodrigo F O Pena
- Laboratório de Sistemas Neurais, Department of Physics, School of Philosophy, Sciences and Letters of Ribeirão Preto, University of São Paulo, São Paulo, Brazil; Institute of Physics, Humboldt University of Berlin, Berlin, Germany
| | - Antonio C Roque
- Laboratório de Sistemas Neurais, Department of Physics, School of Philosophy, Sciences and Letters of Ribeirão Preto, University of São Paulo, São Paulo, Brazil
| | - Michael A Zaks
- Institute of Physics and Astronomy, University of Potsdam, Potsdam, Germany
| |
Collapse
|
27
|
Qu J, Wang SJ, Jusup M, Wang Z. Effects of random rewiring on the degree correlation of scale-free networks. Sci Rep 2015; 5:15450. [PMID: 26482005 PMCID: PMC4611853 DOI: 10.1038/srep15450] [Citation(s) in RCA: 7] [Impact Index Per Article: 0.7] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/30/2015] [Accepted: 09/08/2015] [Indexed: 01/01/2023] Open
Abstract
Random rewiring is used to generate null networks for the purpose of analyzing the topological properties of scale-free networks, yet the effects of random rewiring on the degree correlation are subject to contradicting interpretations in the literature. We comprehensively analyze the degree correlation of randomly rewired scale-free networks and show that random rewiring increases disassortativity by reducing the average degree of the nearest neighbors of high-degree nodes. The effect can be captured by the measures of the degree correlation that consider all links in the network, but not by analogous measures that consider only links between degree peers, hence the potential for contradicting interpretations. We furthermore find that random and directional rewiring affect the topology of a scale-free network differently, even if the degree correlation of the rewired networks is the same. Consequently, the network dynamics is changed, which is proven here by means of the biased random walk.
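A rough illustration of this kind of analysis with networkx (not the authors' pipeline): build a Barabási-Albert scale-free graph, apply degree-preserving double-edge swaps as one common implementation of random rewiring, and compare the degree assortativity and the mean nearest-neighbour degree of the hubs. Graph size, swap count, and the "top 1% by degree" definition of hubs are assumptions for illustration.

```python
import numpy as np
import networkx as nx

# Scale-free network, plus a null model produced by degree-preserving
# random double-edge swaps.
G = nx.barabasi_albert_graph(n=2000, m=3, seed=2)
R = G.copy()
nx.double_edge_swap(R, nswap=10 * R.number_of_edges(), max_tries=10**7, seed=2)

for name, H in [("original", G), ("rewired", R)]:
    assort = nx.degree_assortativity_coefficient(H)
    # Average degree of each node's nearest neighbours, then averaged over the
    # hubs (top 1% by degree), the quantity the paper argues is reduced by rewiring.
    knn = nx.average_neighbor_degree(H)
    deg = dict(H.degree())
    hubs = sorted(deg, key=deg.get, reverse=True)[: len(deg) // 100]
    print(f"{name}: assortativity {assort:+.3f}, "
          f"mean k_nn of hubs {np.mean([knn[v] for v in hubs]):.1f}")
```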
Collapse
Affiliation(s)
- Jing Qu
- School of Physics and Information Technology, Shaanxi Normal University, Xi’an 710119, China
| | - Sheng-Jun Wang
- School of Physics and Information Technology, Shaanxi Normal University, Xi’an 710119, China
| | - Marko Jusup
- Faculty of Sciences, Kyushu University, Fukuoka 819-0395, Japan
| | - Zhen Wang
- Interdisciplinary Graduate School of Engineering Sciences, Kyushu University, Fukuoka 816-8580, Japan
- School of Automation, Northwestern Polytechnical University, Xi’an 710072, China
| |
Collapse
|
28
|
Stratton P, Wiles J. Global segregation of cortical activity and metastable dynamics. Front Syst Neurosci 2015; 9:119. [PMID: 26379514 PMCID: PMC4548222 DOI: 10.3389/fnsys.2015.00119] [Citation(s) in RCA: 6] [Impact Index Per Article: 0.6] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/20/2015] [Accepted: 08/07/2015] [Indexed: 11/23/2022] Open
Abstract
Cortical activity exhibits persistent metastable dynamics. Assemblies of neurons transiently couple (integrate) and decouple (segregate) at multiple spatiotemporal scales; both integration and segregation are required to support metastability. Integration of distant brain regions can be achieved through long range excitatory projections, but the mechanism supporting long range segregation is not clear. We argue that the thalamocortical matrix connections, which project diffusely from the thalamus to the cortex and have long been thought to support cortical gain control, play an equally-important role in cortical segregation. We present a computational model of the diffuse thalamocortical loop, called the competitive cross-coupling (CXC) spiking network. Simulations of the model show how different levels of tonic input from the brainstem to the thalamus could control dynamical complexity in the cortex, directing transitions between sleep, wakefulness and high attention or vigilance. The model also explains how mutually-exclusive activity could arise across large portions of the cortex, such as between the default-mode and task-positive networks. It is robust to noise but does not require noise to autonomously generate metastability. We conclude that the long range segregation observed in brain activity and required for global metastable dynamics could be provided by the thalamocortical matrix, and is strongly modulated by brainstem input to the thalamus.
Collapse
Affiliation(s)
- Peter Stratton
- Queensland Brain Institute, The University of Queensland, Brisbane, QLD, Australia; Centre for Clinical Research, The University of Queensland, Brisbane, QLD, Australia
| | - Janet Wiles
- School of Information Technology and Electrical Engineering, The University of Queensland, Brisbane, QLD, Australia
| |
Collapse
|
29
|
Bellay T, Klaus A, Seshadri S, Plenz D. Irregular spiking of pyramidal neurons organizes as scale-invariant neuronal avalanches in the awake state. eLife 2015; 4:e07224. [PMID: 26151674 PMCID: PMC4492006 DOI: 10.7554/elife.07224] [Citation(s) in RCA: 84] [Impact Index Per Article: 8.4] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/05/2015] [Accepted: 06/10/2015] [Indexed: 12/22/2022] Open
Abstract
Spontaneous fluctuations in neuronal activity emerge at many spatial and temporal scales in cortex. Population measures found these fluctuations to organize as scale-invariant neuronal avalanches, suggesting cortical dynamics to be critical. Macroscopic dynamics, though, depend on physiological states and are ambiguous as to their cellular composition, spatiotemporal origin, and contributions from synaptic input or action potential (AP) output. Here, we study spontaneous firing in pyramidal neurons (PNs) from rat superficial cortical layers in vivo and in vitro using 2-photon imaging. As the animal transitions from the anesthetized to awake state, spontaneous single neuron firing increases in irregularity and assembles into scale-invariant avalanches at the group level. In vitro spike avalanches emerged naturally yet required balanced excitation and inhibition. This demonstrates that neuronal avalanches are linked to the global physiological state of wakefulness and that cortical resting activity organizes as avalanches from firing of local PN groups to global population activity. DOI:http://dx.doi.org/10.7554/eLife.07224.001 Even when we are not engaged in any specific task, the brain shows coordinated patterns of spontaneous activity that can be monitored using electrodes placed on the scalp. This resting activity shapes the way that the brain responds to subsequent stimuli. Changes in resting activity patterns are seen in various neurological and psychiatric disorders, as well as in healthy individuals following sleep deprivation. The brain's outer layer is known as the cortex. On a large scale, when monitoring many thousands of neurons, resting activity in the cortex demonstrates propagation in the brain in an organized manner. Specifically, resting activity was found to organize as so-called neuronal avalanches, in which large bursts of neuronal activity are grouped with medium-sized and smaller bursts in a very characteristic order. In fact, the sizes of these bursts—that is, the number of neurons that fire—are found to be scale-invariant, that is, the ratio of large bursts to medium-sized bursts is the same as that of medium-sized to small bursts. Such scale-invariance suggests that neuronal bursts are not independent of one another. However, it is largely unclear how neuronal avalanches arise from individual neurons, which fire simply in a noisy, irregular manner. Bellay, Klaus et al. have now provided insights into this process by examining patterns of firing of a particular type of neuron—known as a pyramidal cell—in the cortex of rats as they recover from anesthesia. As the animals awaken, the firing of individual pyramidal cells in the cortex becomes even more irregular than under anesthesia. However, by considering the activity of a group of these neurons, Bellay, Klaus et al. realized that it is this more irregular firing that gives rise to neuronal avalanches, and that this occurs only when the animals are awake. Further experiments on individual pyramidal cells grown in the laboratory confirmed that neuronal avalanches emerge spontaneously from the irregular firing of individual neurons. These avalanches depend on there being a balance between two types of activity among the cells: ‘excitatory’ activity that causes other neurons to fire, and ‘inhibitory’ activity that prevents neuronal firing. 
Given that resting activity influences the brain's responses to the outside world, the origins of neuronal avalanches are likely to provide clues about the way the brain processes information. Future experiments should also examine the possibility that the emergence of neuronal avalanches marks the transition from unconsciousness to wakefulness within the brain. DOI:http://dx.doi.org/10.7554/eLife.07224.002
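The irregularity referred to here is usually quantified by the coefficient of variation (CV) of inter-spike intervals; a small sketch follows. The gamma-distributed surrogate spike trains are placeholders, not data from the study, and the shape parameters are chosen only to contrast a regular-like with an irregular-like regime.

```python
import numpy as np

rng = np.random.default_rng(3)

def cv_isi(spike_times):
    """Coefficient of variation of inter-spike intervals (1 for a Poisson process)."""
    isi = np.diff(np.sort(spike_times))
    return isi.std() / isi.mean()

# Surrogate spike trains: gamma-distributed ISIs with shape > 1 are more regular
# than Poisson, shape < 1 more irregular (CV of a gamma ISI is 1/sqrt(shape)).
t_regular = np.cumsum(rng.gamma(shape=4.0, scale=0.25, size=2000))
t_irregular = np.cumsum(rng.gamma(shape=0.5, scale=2.0, size=2000))

print(f"CV (regular-like)   = {cv_isi(t_regular):.2f}")    # expected ~0.5
print(f"CV (irregular-like) = {cv_isi(t_irregular):.2f}")  # expected ~1.4
```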
Collapse
Affiliation(s)
- Timothy Bellay
- Section on Critical Brain Dynamics, National Institute of Mental Health, Bethesda, United States
| | - Andreas Klaus
- Section on Critical Brain Dynamics, National Institute of Mental Health, Bethesda, United States
| | - Saurav Seshadri
- Section on Critical Brain Dynamics, National Institute of Mental Health, Bethesda, United States
| | - Dietmar Plenz
- Section on Critical Brain Dynamics, National Institute of Mental Health, Bethesda, United States
| |
Collapse
|
30
|
Valverde S, Ohse S, Turalska M, West BJ, Garcia-Ojalvo J. Structural determinants of criticality in biological networks. Front Physiol 2015; 6:127. [PMID: 26005422 PMCID: PMC4424853 DOI: 10.3389/fphys.2015.00127] [Citation(s) in RCA: 24] [Impact Index Per Article: 2.4] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/04/2015] [Accepted: 04/10/2015] [Indexed: 01/09/2023] Open
Abstract
Many adaptive evolutionary systems display spatial and temporal features, such as long-range correlations, typically associated with the critical point of a phase transition in statistical physics. Empirical and theoretical studies suggest that operating near criticality enhances the functionality of biological networks, such as brain and gene networks, in terms of, for instance, information processing, robustness, and evolvability. While previous studies have explained criticality with specific system features, we still lack a general theory of critical behavior in biological systems. Here we look at this problem from the complex systems perspective, since in principle all critical biological circuits have in common the fact that their internal organization can be described as a complex network. An important question is how self-similar structure influences self-similar dynamics. Modularity and heterogeneity, for instance, affect the location of critical points and can be used to tune the system toward criticality. We review and discuss recent studies on the criticality of neuronal and genetic networks, and discuss the implications of network theory when assessing the evolutionary features of criticality.
Collapse
Affiliation(s)
- Sergi Valverde
- ICREA-Complex Systems Lab, Universitat Pompeu Fabra, Barcelona, Spain
- Institute of Evolutionary Biology (CSIC-UPF), Universitat Pompeu Fabra, Barcelona, Spain
| | - Sebastian Ohse
- Institute of Molecular Medicine and Cell Research, Albert-Ludwigs-Universität Freiburg, Freiburg, Germany
| | | | - Bruce J. West
- Department of Physics, Duke University, Durham, NC, USA
- Mathematical and Information Sciences Directorate, U.S. Army Research Office, Research Triangle Park, NC, USA
| | - Jordi Garcia-Ojalvo
- Department of Experimental and Health Sciences, Universitat Pompeu Fabra, Barcelona, Spain
| |
Collapse
|
31
|
|
32
|
Ros T, Baars BJ, Lanius RA, Vuilleumier P. Tuning pathological brain oscillations with neurofeedback: a systems neuroscience framework. Front Hum Neurosci 2014; 8:1008. [PMID: 25566028 PMCID: PMC4270171 DOI: 10.3389/fnhum.2014.01008] [Citation(s) in RCA: 109] [Impact Index Per Article: 9.9] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/09/2014] [Accepted: 11/26/2014] [Indexed: 12/03/2022] Open
Abstract
Neurofeedback (NFB) is emerging as a promising technique that enables self-regulation of ongoing brain oscillations. However, despite a rise in empirical evidence attesting to its clinical benefits, a solid theoretical basis is still lacking on the manner in which NFB is able to achieve these outcomes. The present work attempts to bring together various concepts from neurobiology, engineering, and dynamical systems so as to propose a contemporary theoretical framework for the mechanistic effects of NFB. The objective is to provide a firmly neurophysiological account of NFB, which goes beyond traditional behaviorist interpretations that attempt to explain psychological processes solely from a descriptive standpoint whilst treating the brain as a “black box”. To this end, we interlink evidence from experimental findings that encompass a broad range of intrinsic brain phenomena: starting from “bottom-up” mechanisms of neural synchronization, followed by “top-down” regulation of internal brain states, moving to dynamical systems plus control-theoretic principles, and concluding with activity-dependent as well as homeostatic forms of brain plasticity. In support of our framework, we examine the effects of NFB in several brain disorders, including attention-deficit/hyperactivity disorder (ADHD) and post-traumatic stress disorder (PTSD). In sum, it is argued that pathological oscillations emerge from an abnormal formation of brain-state attractor landscape(s). The central thesis put forward is that NFB tunes brain oscillations toward a homeostatic set-point which affords an optimal balance between network flexibility and stability (i.e., self-organised criticality (SOC)).
Collapse
Affiliation(s)
- Tomas Ros
- Laboratory for Neurology and Imaging of Cognition, Department of Neurosciences, University of Geneva, Geneva, Switzerland
| | - Bernard J Baars
- Theoretical Neurobiology, The Neurosciences Institute, La Jolla, CA, USA
| | - Ruth A Lanius
- Department of Psychiatry, University of Western Ontario, London, ON, Canada
| | - Patrik Vuilleumier
- Laboratory for Neurology and Imaging of Cognition, Department of Neurosciences, University of Geneva, Geneva, Switzerland
| |
Collapse
|
33
|
Hütt MT, Kaiser M, Hilgetag CC. Perspective: network-guided pattern formation of neural dynamics. Philos Trans R Soc Lond B Biol Sci 2014; 369:20130522. [PMID: 25180302 PMCID: PMC4150299 DOI: 10.1098/rstb.2013.0522] [Citation(s) in RCA: 33] [Impact Index Per Article: 3.0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/12/2022] Open
Abstract
The understanding of neural activity patterns is fundamentally linked to an understanding of how the brain's network architecture shapes dynamical processes. Established approaches rely mostly on deviations of a given network from certain classes of random graphs. Hypotheses about the supposed role of prominent topological features (for instance, the roles of modularity, network motifs or hierarchical network organization) are derived from these deviations. An alternative strategy could be to study deviations of network architectures from regular graphs (rings and lattices) and consider the implications of such deviations for self-organized dynamic patterns on the network. Following this strategy, we draw on the theory of spatio-temporal pattern formation and propose a novel perspective for analysing dynamics on networks, by evaluating how the self-organized dynamics are confined by network architecture to a small set of permissible collective states. In particular, we discuss the role of prominent topological features of brain connectivity, such as hubs, modules and hierarchy, in shaping activity patterns. We illustrate the notion of network-guided pattern formation with numerical simulations and outline how it can facilitate the understanding of neural dynamics.
Collapse
Affiliation(s)
- Marc-Thorsten Hütt
- School of Engineering and Science, Jacobs University Bremen, Bremen, Germany
| | - Marcus Kaiser
- School of Computing Science, Newcastle University, Claremont Tower, Newcastle upon Tyne NE1 7RU, UK; Institute of Neuroscience, Newcastle University, Framlington Place, Newcastle upon Tyne NE2 4HH, UK
| | - Claus C Hilgetag
- Department of Computational Neuroscience, University Medical Center Eppendorf, Hamburg, Germany; Department of Health Sciences, Boston University, Boston, MA, USA
| |
Collapse
|
34
|
Lohse C, Bassett DS, Lim KO, Carlson JM. Resolving anatomical and functional structure in human brain organization: identifying mesoscale organization in weighted network representations. PLoS Comput Biol 2014; 10:e1003712. [PMID: 25275860 PMCID: PMC4183375 DOI: 10.1371/journal.pcbi.1003712] [Citation(s) in RCA: 52] [Impact Index Per Article: 4.7] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/08/2014] [Accepted: 05/28/2014] [Indexed: 11/18/2022] Open
Abstract
Human brain anatomy and function display a combination of modular and hierarchical organization, suggesting the importance of both cohesive structures and variable resolutions in the facilitation of healthy cognitive processes. However, tools to simultaneously probe these features of brain architecture require further development. We propose and apply a set of methods to extract cohesive structures in network representations of brain connectivity using multi-resolution techniques. We employ a combination of soft thresholding, windowed thresholding, and resolution in community detection that enables us to identify and isolate structures associated with different weights. One such mesoscale structure is bipartivity, which quantifies the extent to which the brain is divided into two partitions with high connectivity between partitions and low connectivity within partitions. A second, complementary mesoscale structure is modularity, which quantifies the extent to which the brain is divided into multiple communities with strong connectivity within each community and weak connectivity between communities. Our methods lead to multi-resolution curves of these network diagnostics over a range of spatial, geometric, and structural scales. For statistical comparison, we contrast our results with those obtained for several benchmark null models. Our work demonstrates that multi-resolution diagnostic curves capture complex organizational profiles in weighted graphs. We apply these methods to the identification of resolution-specific characteristics of healthy weighted graph architecture and altered connectivity profiles in psychiatric disease.
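A minimal sketch of a multi-resolution modularity curve, using networkx community detection on a toy planted-partition graph; the resolution keyword requires a recent networkx release, and the paper's actual pipeline additionally sweeps soft and windowed thresholds on weighted imaging-derived networks, so this is only an analogy to the diagnostic curves described above.

```python
import networkx as nx
from networkx.algorithms import community

# Toy graph with four planted modules of 25 nodes each.
G = nx.planted_partition_graph(l=4, k=25, p_in=0.3, p_out=0.02, seed=4)

# Sweep the resolution parameter and record the number of communities and
# the (resolution-adjusted) modularity, tracing one multi-resolution curve.
for gamma in (0.5, 1.0, 2.0, 4.0):
    parts = community.greedy_modularity_communities(G, resolution=gamma)
    q = community.modularity(G, parts, resolution=gamma)
    print(f"resolution {gamma:>3}: {len(parts):2d} communities, Q = {q:.3f}")
```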
Collapse
Affiliation(s)
- Christian Lohse
- Kirchhoff Institute for Physics, University of Heidelberg, Heidelberg, Germany
| | - Danielle S. Bassett
- Department of Physics, University of California, Santa Barbara, California, United States of America
- Sage Center for the Study of the Mind, University of California, Santa Barbara, California, United States of America
- Department of Bioengineering, University of Pennsylvania, Philadelphia, Pennsylvania, United States of America
- * E-mail:
| | - Kelvin O. Lim
- Department of Psychiatry, University of Minnesota, Minneapolis, Minnesota, United States of America
| | - Jean M. Carlson
- Department of Physics, University of California, Santa Barbara, California, United States of America
| |
Collapse
|
35
|
Tomov P, Pena RFO, Zaks MA, Roque AC. Sustained oscillations, irregular firing, and chaotic dynamics in hierarchical modular networks with mixtures of electrophysiological cell types. Front Comput Neurosci 2014; 8:103. [PMID: 25228879 PMCID: PMC4151042 DOI: 10.3389/fncom.2014.00103] [Citation(s) in RCA: 13] [Impact Index Per Article: 1.2] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/16/2014] [Accepted: 08/13/2014] [Indexed: 11/13/2022] Open
Abstract
The cerebral cortex exhibits neural activity even in the absence of external stimuli. This self-sustained activity is characterized by irregular firing of individual neurons and population oscillations with a broad frequency range. Questions that arise in this context are: What are the mechanisms responsible for the existence of neuronal spiking activity in the cortex without external input? Do these mechanisms depend on the structural organization of the cortical connections? Do they depend on intrinsic characteristics of the cortical neurons? To approach the answers to these questions, we have used computer simulations of cortical network models. Our networks have a hierarchical modular architecture and are composed of combinations of neuron models that reproduce the firing behavior of the five main cortical electrophysiological cell classes: regular spiking (RS), chattering (CH), intrinsically bursting (IB), low threshold spiking (LTS), and fast spiking (FS). The population of excitatory neurons is built of RS cells (always present) and either CH or IB cells. Inhibitory neurons belong to the same class, either LTS or FS. Long-lived self-sustained activity states in our network simulations display irregular single neuron firing and oscillatory activity similar to experimentally measured ones. The duration of self-sustained activity strongly depends on the initial conditions, suggesting a transient chaotic regime. Extensive analysis of the self-sustained activity states showed that their lifetime expectancy increases with the number of network modules and is favored when the network is composed of excitatory neurons of the RS and CH classes combined with inhibitory neurons of the LTS class. These results indicate that the existence and properties of the self-sustained cortical activity states depend on both the topology of the network and the neuronal mixture that comprises the network.
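The five electrophysiological classes named here are commonly reproduced with the Izhikevich model using the standard parameter sets below; this single-neuron sketch is only illustrative and may not match the exact neuron model or parameters used in the paper.

```python
import numpy as np

# Standard Izhikevich-model parameter sets for the cortical firing classes.
CLASSES = {
    "RS":  dict(a=0.02, b=0.2,  c=-65.0, d=8.0),
    "IB":  dict(a=0.02, b=0.2,  c=-55.0, d=4.0),
    "CH":  dict(a=0.02, b=0.2,  c=-50.0, d=2.0),
    "FS":  dict(a=0.1,  b=0.2,  c=-65.0, d=2.0),
    "LTS": dict(a=0.02, b=0.25, c=-65.0, d=2.0),
}

def simulate(a, b, c, d, I=10.0, T=500.0, dt=0.25):
    """Return spike times (ms) of one Izhikevich neuron under constant drive I."""
    v, u, spikes = -65.0, b * -65.0, []
    for step in range(int(T / dt)):
        v += dt * (0.04 * v * v + 5.0 * v + 140.0 - u + I)
        u += dt * a * (b * v - u)
        if v >= 30.0:                     # spike cut-off and reset
            spikes.append(step * dt)
            v, u = c, u + d
    return spikes

for name, params in CLASSES.items():
    st = simulate(**params)
    print(f"{name:>3}: {len(st):3d} spikes in 500 ms, first at {st[0]:.1f} ms")
```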
Collapse
Affiliation(s)
- Petar Tomov
- Institute of Mathematics, Humboldt University of Berlin, Berlin, Germany
| | - Rodrigo F O Pena
- Laboratory of Neural Systems, Department of Physics, School of Philosophy, Sciences and Letters of Ribeirão Preto, University of São Paulo, Ribeirão Preto, Brazil
| | - Michael A Zaks
- Institute of Mathematics, Humboldt University of Berlin, Berlin, Germany
| | - Antonio C Roque
- Laboratory of Neural Systems, Department of Physics, School of Philosophy, Sciences and Letters of Ribeirão Preto, University of São Paulo, Ribeirão Preto, Brazil
| |
Collapse
|
36
|
Qian Y. Emergence of self-sustained oscillations in excitable Erdös-Rényi random networks. PHYSICAL REVIEW. E, STATISTICAL, NONLINEAR, AND SOFT MATTER PHYSICS 2014; 90:032807. [PMID: 25314482 DOI: 10.1103/physreve.90.032807] [Citation(s) in RCA: 6] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 02/21/2014] [Indexed: 06/04/2023]
Abstract
We investigate the emergence of self-sustained oscillations in excitable Erdös-Rényi random networks (EERRNs). Interestingly, periodic self-sustained oscillations have been found at a moderate connection probability P. For smaller or larger P, the system evolves into a homogeneous rest state through distinct mechanisms. One-dimensional Winfree loops are identified as the sources that maintain the oscillations. Moreover, by analyzing these oscillation sources, we propose two criteria to explain the spatiotemporal dynamics obtained in EERRNs. Finally, the two critical connection probabilities between which self-sustained oscillations can emerge are approximately predicted based on these two criteria.
Collapse
Affiliation(s)
- Yu Qian
- Nonlinear Research Institute, Baoji University of Arts and Sciences, Baoji 721007, China
| |
Collapse
|
37
|
Villegas P, Moretti P, Muñoz MA. Frustrated hierarchical synchronization and emergent complexity in the human connectome network. Sci Rep 2014; 4:5990. [PMID: 25103684 PMCID: PMC4126002 DOI: 10.1038/srep05990] [Citation(s) in RCA: 50] [Impact Index Per Article: 4.5] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/17/2014] [Accepted: 06/18/2014] [Indexed: 11/15/2022] Open
Abstract
The spontaneous emergence of coherent behavior through synchronization plays a key role in neural function, and its anomalies often lie at the basis of pathologies. Here we employ a parsimonious (mesoscopic) approach to study analytically and computationally the synchronization (Kuramoto) dynamics on the actual human-brain connectome network. We elucidate the existence of a so-far-uncovered intermediate phase, placed between the standard synchronous and asynchronous phases, i.e. between order and disorder. This novel phase stems from the hierarchical modular organization of the connectome. Where one would expect a hierarchical synchronization process, we show that the interplay between structural bottlenecks and quenched intrinsic frequency heterogeneities at many different scales gives rise to frustrated synchronization, metastability, and chimera-like states, resulting in a very rich and complex phenomenology. We uncover the origin of the dynamic freezing behind these features by using spectral graph theory and discuss how the emerging complex synchronization patterns relate to the need for the brain to access, in a robust though flexible way, a large variety of functional attractors and dynamical repertoires without ad hoc fine-tuning to a critical point.
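A bare-bones sketch of the Kuramoto dynamics and the global order parameter on an arbitrary coupling matrix; the sparse random surrogate matrix, coupling strength, and frequency spread below are assumptions standing in for the human connectome used in the study.

```python
import numpy as np

rng = np.random.default_rng(5)

# Surrogate undirected coupling matrix (placeholder for the connectome).
N = 90
A = (rng.random((N, N)) < 0.05).astype(float)
A = np.maximum(A, A.T)
np.fill_diagonal(A, 0.0)

omega = rng.normal(0.0, 0.5, N)            # heterogeneous intrinsic frequencies
theta = rng.uniform(0.0, 2.0 * np.pi, N)
K, dt = 0.4, 0.05

def order_parameter(phases):
    """Global synchrony r = |<exp(i*theta)>|, from 0 (asynchronous) to 1 (synchronous)."""
    return np.abs(np.exp(1j * phases).mean())

# Kuramoto dynamics: dtheta_i/dt = omega_i + K * sum_j A_ij * sin(theta_j - theta_i).
for _ in range(4000):
    coupling = (A * np.sin(theta[None, :] - theta[:, None])).sum(axis=1)
    theta += dt * (omega + K * coupling)

print(f"order parameter after transient: r = {order_parameter(theta):.2f}")
```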
Collapse
Affiliation(s)
- Pablo Villegas
- Departamento de Electromagnetismo y Física de la Materia e Instituto Carlos I de Física Teórica y Computacional. Universidad de Granada, E-18071 Granada, Spain
| | - Paolo Moretti
- Departamento de Electromagnetismo y Física de la Materia e Instituto Carlos I de Física Teórica y Computacional. Universidad de Granada, E-18071 Granada, Spain
| | - Miguel A Muñoz
- Departamento de Electromagnetismo y Física de la Materia e Instituto Carlos I de Física Teórica y Computacional. Universidad de Granada, E-18071 Granada, Spain
| |
Collapse
|
38
|
Resolving structural variability in network models and the brain. PLoS Comput Biol 2014; 10:e1003491. [PMID: 24675546 PMCID: PMC3967917 DOI: 10.1371/journal.pcbi.1003491] [Citation(s) in RCA: 51] [Impact Index Per Article: 4.6] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/13/2013] [Accepted: 01/15/2014] [Indexed: 01/09/2023] Open
Abstract
Large-scale white matter pathways crisscrossing the cortex create a complex pattern of connectivity that underlies human cognitive function. Generative mechanisms for this architecture have been difficult to identify in part because little is known in general about mechanistic drivers of structured networks. Here we contrast network properties derived from diffusion spectrum imaging data of the human brain with 13 synthetic network models chosen to probe the roles of physical network embedding and temporal network growth. We characterize both the empirical and synthetic networks using familiar graph metrics, but presented here in a more complete statistical form, as scatter plots and distributions, to reveal the full range of variability of each measure across scales in the network. We focus specifically on the degree distribution, degree assortativity, hierarchy, topological Rentian scaling, and topological fractal scaling—in addition to several summary statistics, including the mean clustering coefficient, the shortest path-length, and the network diameter. The models are investigated in a progressive, branching sequence, aimed at capturing different elements thought to be important in the brain, and range from simple random and regular networks, to models that incorporate specific growth rules and constraints. We find that synthetic models that constrain the network nodes to be physically embedded in anatomical brain regions tend to produce distributions that are most similar to the corresponding measurements for the brain. We also find that network models hardcoded to display one network property (e.g., assortativity) do not in general simultaneously display a second (e.g., hierarchy). This relative independence of network properties suggests that multiple neurobiological mechanisms might be at play in the development of human brain network architecture. Together, the network models that we develop and employ provide a potentially useful starting point for the statistical inference of brain network structure from neuroimaging data. White matter tracts crisscrossing the human cortex are linked in a complex pattern that constrains human thought and behavior. Why the human brain displays the complex pattern that it does is a fascinating open question. Progress in uncovering generative mechanisms for this architecture requires greater knowledge about mechanistic drivers of anatomical networks. Here we contrast network properties derived from images of the human brain with 13 synthetic network models investigated in a progressive, branching sequence, chosen to probe the roles of physical embedding and temporal growth. We characterize both the empirical and synthetic networks using network diagnostics presented here in statistical form, as scatter plots and distributions, to reveal the full range of variability of each measure. We find that synthetic models that constrain the network nodes to be physically embedded in anatomical brain regions tend to produce distributions that are most similar to the corresponding measurements for the brain. We also find that network models hardcoded to display one network property do not in general simultaneously display a second, suggesting that multiple neurobiological mechanisms drive human brain network development. The network models that we develop and employ enable statistical inference of brain network structure from neuroimaging data.
Collapse
|
39
|
Russo R, Herrmann HJ, de Arcangelis L. Brain modularity controls the critical behavior of spontaneous activity. Sci Rep 2014; 4:4312. [PMID: 24621482 PMCID: PMC3952147 DOI: 10.1038/srep04312] [Citation(s) in RCA: 15] [Impact Index Per Article: 1.4] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/24/2013] [Accepted: 02/19/2014] [Indexed: 11/29/2022] Open
Abstract
The human brain exhibits a complex structure made of scale-free, highly connected modules loosely interconnected by weaker links to form a small-world network. These features appear in healthy subjects, whereas neurological diseases often modify this structure. An important open question concerns the role of brain modularity in sustaining the critical behaviour of spontaneous activity. Here we analyse the neuronal activity of a model, successful in reproducing on non-modular networks the scaling behaviour observed in experimental data, on a modular network implementing the main statistical features measured in the human brain. We show that on a modular network, regardless of the strength of the synaptic connections or the module size and number, activity is never fully scale-free. Neuronal avalanches can invade different modules, which results in an activity depression that hinders further avalanche propagation. Critical behaviour is recovered only if inter-module connections are added, modifying the modular structure into a more random one.
Collapse
Affiliation(s)
- R. Russo
- Physics Department, University of Naples Federico II, Napoli, Italy
| | - H. J. Herrmann
- Institute of Computational Physics for Engineering Materials, ETH Zürich, Zürich, Switzerland
- Departamento de Física, Universidade Federal do Ceará, 60451-970 Fortaleza, Ceará, Brazil
| | - L. de Arcangelis
- Department of Industrial and Information Engineering, Second University of Naples and INFN Gr. Coll. Salerno, Aversa (CE), Italy
| |
Collapse
|
40
|
Garcia GC, Lesne A, Hütt MT, Hilgetag CC. Building blocks of self-sustained activity in a simple deterministic model of excitable neural networks. Front Comput Neurosci 2012; 6:50. [PMID: 22888317 PMCID: PMC3412572 DOI: 10.3389/fncom.2012.00050] [Citation(s) in RCA: 28] [Impact Index Per Article: 2.2] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/04/2012] [Accepted: 07/01/2012] [Indexed: 12/04/2022] Open
Abstract
Understanding the interplay of topology and dynamics of excitable neural networks is one of the major challenges in computational neuroscience. Here we employ a simple deterministic excitable model to explore how network-wide activation patterns are shaped by network architecture. Our observables are co-activation patterns, together with the average activity of the network and the periodicities in the excitation density. Our main results are: (1) the dependence of the correlation between the adjacency matrix and the instantaneous (zero time delay) co-activation matrix on global network features (clustering, modularity, scale-free degree distribution), (2) a correlation between the average activity and the amount of small cycles in the graph, and (3) a microscopic understanding of the contributions by 3-node and 4-node cycles to sustained activity.
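One common minimal deterministic excitable scheme on a graph is the three-state susceptible-excited-refractory (SER) update, which is in the spirit of the model studied here; the exact update rules, graph, and seeding below are assumptions made only for illustration.

```python
import numpy as np
import networkx as nx

# Three-state excitable dynamics: S -> E if at least one neighbour is excited,
# E -> R, R -> S, one time step per stage. Whether activity is sustained
# depends on the topology (small cycles) and on the initial seeding.
S, E, R = 0, 1, 2
G = nx.watts_strogatz_graph(n=200, k=6, p=0.1, seed=6)
A = nx.to_numpy_array(G)

rng = np.random.default_rng(6)
state = np.full(G.number_of_nodes(), S)
seeds = rng.choice(G.number_of_nodes(), size=10, replace=False)
state[seeds[:5]] = E          # a few excited seeds ...
state[seeds[5:]] = R          # ... plus refractory nodes to break symmetry

density = []
for _ in range(200):
    excited_nbrs = A @ (state == E)
    state = np.where(state == E, R,
            np.where(state == R, S,
            np.where((state == S) & (excited_nbrs > 0), E, S)))
    density.append(float(np.mean(state == E)))

print(f"mean excitation density over 200 steps: {np.mean(density):.3f}")
print("activity still alive at the last step:", density[-1] > 0)
```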
Collapse
Affiliation(s)
- Guadalupe C Garcia
- School of Engineering and Science, Jacobs University Bremen, Bremen, Germany
| | | | | | | |
Collapse
|
41
|
Batista CAS, Lameu EL, Batista AM, Lopes SR, Pereira T, Zamora-López G, Kurths J, Viana RL. Phase synchronization of bursting neurons in clustered small-world networks. PHYSICAL REVIEW. E, STATISTICAL, NONLINEAR, AND SOFT MATTER PHYSICS 2012; 86:016211. [PMID: 23005511 DOI: 10.1103/physreve.86.016211] [Citation(s) in RCA: 19] [Impact Index Per Article: 1.5] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 03/30/2012] [Indexed: 06/01/2023]
Abstract
We investigate the collective dynamics of bursting neurons on clustered networks. The clustered network model is composed of subnetworks, each presenting the so-called small-world property. This model can also be regarded as a network of networks. In each subnetwork, a neuron is connected to others through regular as well as random connections, the latter with a given intracluster probability. Moreover, in a given subnetwork each neuron has an intercluster probability to be connected to the other subnetworks. The local neuron dynamics has two time scales (fast and slow) and is modeled by a two-dimensional map. In such a small-world network, the neuron parameters are chosen to be slightly different such that, if the coupling strength is large enough, there may be synchronization of the bursting (slow) activity. We give bounds for the critical coupling strength to obtain global burst synchronization in terms of the network structure, that is, the probabilities of intracluster and intercluster connections. We find that, as the heterogeneity in the network is reduced, the global synchronizability of the network improves. We show that the transitions to global synchrony may be abrupt or smooth depending on the intercluster probability.
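Two-dimensional maps of this fast-slow type are often taken to be the chaotic Rulkov map; the single-neuron sketch below uses parameter values that are common in the map-based bursting-neuron literature, not values taken from the paper, and is meant only to illustrate the two-time-scale bursting mechanism.

```python
import numpy as np

# Chaotic Rulkov map: fast variable x (spiking) coupled to slow variable y,
#   x_{n+1} = alpha / (1 + x_n^2) + y_n,   y_{n+1} = y_n - sigma * x_n - beta.
alpha, sigma, beta = 4.1, 0.001, 0.001
n_steps = 50_000
x, y = -1.0, -3.0

xs = np.empty(n_steps)
for n in range(n_steps):
    x, y = alpha / (1.0 + x * x) + y, y - sigma * x - beta
    xs[n] = x

# Spikes = upward crossings of x through 0; a burst onset is taken to be a
# spike preceded by a long silent gap in the fast variable (heuristic cutoff).
spike_idx = np.where((xs[1:] > 0.0) & (xs[:-1] <= 0.0))[0] + 1
gaps = np.diff(spike_idx)
burst_onsets = (1 + int(np.sum(gaps > 200))) if spike_idx.size else 0

print(f"{spike_idx.size} spikes, roughly {burst_onsets} bursts in {n_steps} steps")
print("fast-variable range:", round(float(xs.min()), 2), "to", round(float(xs.max()), 2))
```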
Collapse
Affiliation(s)
- C A S Batista
- Graduate Program in Physics, State University of Ponta Grossa, Ponta Grossa, Paraná, Brazil
| | | | | | | | | | | | | | | |
Collapse
|
42
|
Wu JJS, Shih HC, Yen CT, Shyu BC. Network dynamics in nociceptive pathways assessed by the neuronal avalanche model. Mol Pain 2012; 8:33. [PMID: 22537828 PMCID: PMC3478175 DOI: 10.1186/1744-8069-8-33] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.2] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/01/2012] [Accepted: 04/26/2012] [Indexed: 01/04/2023] Open
Abstract
Background Traditional electroencephalography provides a critical assessment of pain responses. The perception of pain, however, may involve a series of signal transmission pathways in higher cortical function. Recent studies have shown that a mathematical method, the neuronal avalanche model, may be applied to evaluate higher-order network dynamics. The neuronal avalanche is a cascade of neuronal activity, the size distribution of which can be approximated by a power law relationship manifested by the slope of a straight line (i.e., the α value). We investigated whether the neuronal avalanche could be a useful index for nociceptive assessment. Findings Neuronal activity was recorded with a 4 × 8 multichannel electrode array in the primary somatosensory cortex (S1) and anterior cingulate cortex (ACC). Under light anesthesia, peripheral pinch stimulation increased the slope of the α value in both the ACC and S1, whereas brush stimulation increased the α value only in the S1. The increase in α values was blocked in both regions under deep anesthesia. The increase in α values in the ACC induced by peripheral pinch stimulation was blocked by medial thalamic lesion, but the increase in α values in the S1 induced by brush and pinch stimulation was not affected. Conclusions The neuronal avalanche model shows a critical state in the cortical network for noxious-related signal processing. The α value may provide an index of brain network activity that distinguishes the responses to somatic stimuli from the control state. These network dynamics may be valuable for the evaluation of acute nociceptive processes and may be applied to chronic pathological pain conditions.
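For readers who want to see how a power-law exponent such as the α value can be estimated, here is a short sketch using the discrete maximum-likelihood approximation of Clauset and colleagues; the surrogate avalanche sizes are generated, not recorded data, and the binning and fitting choices in the paper may differ.

```python
import numpy as np

def powerlaw_alpha_mle(sizes, s_min=1):
    """Discrete MLE approximation for a power-law exponent:
    alpha ~= 1 + n / sum(ln(s / (s_min - 0.5)))  for s >= s_min."""
    s = np.asarray(sizes, dtype=float)
    s = s[s >= s_min]
    return 1.0 + s.size / np.sum(np.log(s / (s_min - 0.5)))

rng = np.random.default_rng(7)

# Surrogate avalanche sizes from an approximate power law with exponent 1.5
# (the canonical value for critical neuronal avalanche size distributions).
u = rng.random(5000)
sizes = np.floor((1.0 - u) ** (-1.0 / 0.5)).astype(int)   # Pareto tail, alpha ~ 1.5

print(f"estimated exponent alpha = {powerlaw_alpha_mle(sizes):.2f}")
```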
Collapse
Affiliation(s)
- José Jiun-Shian Wu
- Institute of Biomedical Science, Academia Sinica, Taipei, Republic of China
| | | | | | | |
Collapse
|
43
|
Li D, Zhou C. Organization of Anti-Phase Synchronization Pattern in Neural Networks: What are the Key Factors? Front Syst Neurosci 2011; 5:100. [PMID: 22232576 PMCID: PMC3233683 DOI: 10.3389/fnsys.2011.00100] [Citation(s) in RCA: 20] [Impact Index Per Article: 1.4] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/14/2011] [Accepted: 11/19/2011] [Indexed: 11/13/2022] Open
Abstract
Anti-phase oscillation has been widely observed in cortical neural networks. Elucidating the mechanism underlying the organization of the anti-phase pattern is of significance for better understanding more complicated pattern formation in brain networks. In dynamical systems theory, the organization of the anti-phase oscillation pattern has usually been considered to relate to time delay in coupling. This is consistent with conduction delays in real neural networks in the brain due to the finite propagation velocity of action potentials. However, other structural factors in cortical neural networks, such as modular organization (connection density) and the coupling types (excitatory or inhibitory), could also play an important role. In this work, we investigate the anti-phase oscillation pattern organized on a two-module network of either neuronal cell models or neural mass models, and analyze the impact of the conduction delay times, the connection densities, and the coupling types. Our results show that delay times and coupling types can play key roles in this organization. The connection densities may influence the stability of an anti-phase pattern that exists due to the other factors. Furthermore, we show that anti-phase synchronization of slow oscillations can be achieved with small delay times if there is interaction between slow and fast oscillations. These results are significant for further understanding more realistic spatiotemporal dynamics of cortico-cortical communication.
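A minimal caricature of the delay mechanism discussed here, assuming two phase oscillators with mutually delayed coupling (the paper itself uses neuronal cell models and neural mass models): when the product of frequency and delay makes the in-phase state unstable, the pair settles into a phase difference near π. Parameter values below are illustrative assumptions.

```python
import numpy as np

# Two delay-coupled phase oscillators:
#   dtheta_i/dt = omega + K * sin(theta_j(t - tau) - theta_i(t)).
omega, K, tau, dt, T = 1.0, 0.3, 3.0, 0.01, 400.0
steps, delay = int(T / dt), int(tau / dt)

rng = np.random.default_rng(8)
theta = np.zeros((steps + 1, 2))
# History: free-running phases plus a small random offset per oscillator.
theta[: delay + 1] = omega * dt * np.arange(delay + 1)[:, None] + rng.uniform(0, 0.5, 2)

for n in range(delay, steps):
    delayed = theta[n - delay, ::-1]              # partner's phase tau ago
    theta[n + 1] = theta[n] + dt * (omega + K * np.sin(delayed - theta[n]))

phase_diff = (theta[-1, 0] - theta[-1, 1]) % (2.0 * np.pi)
print(f"final phase difference = {phase_diff:.2f} rad (pi = {np.pi:.2f})")
```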
Collapse
Affiliation(s)
- Dong Li
- Department of Physics, Centre for Nonlinear Studies and The Beijing-Hong Kong-Singapore Joint Centre for Non-linear and Complex Systems (Hong Kong), Hong Kong Baptist University, Hong Kong, China
| | - Changsong Zhou
- Department of Physics, Centre for Nonlinear Studies and The Beijing-Hong Kong-Singapore Joint Centre for Non-linear and Complex Systems (Hong Kong), Hong Kong Baptist University, Hong Kong, China
| |
Collapse
|