1
Sasi S, Sen Bhattacharya B, Sreeraj VS, Venkatasubramanian G. Neural mass modelling of brain stimulation to Alleviate Schizophrenia biomarkers in brain rhythms. Comput Biol Med 2025; 192:110190. [PMID: 40258319 DOI: 10.1016/j.compbiomed.2025.110190]
Abstract
We present a neural mass model (NMM) of the brain thalamo-cortico-thalamic (TCT) network to understand the effectiveness of non-invasive treatment with transcranial Direct Current Stimulation (tDCS) in reversing the anomalous electroencephalogram (EEG) oscillations in Schizophrenia. Our TCT NMM consists of twelve neural populations representing the thalamus and cortex modules of the visual pathway connected in a closed loop; the synaptic pathways are modelled with a 3-state kinetic framework allowing the inclusion of the slow excitatory N-methyl-D-aspartate-receptors (NMDAR). Indeed, a popular hypothesis in Schizophrenia is the hypofunction of the Glutamatergic neurotransmitter receptors, NMDAR, associated with the inhibitory Gamma-amino-butyric-acid (GABA-)ergic populations in the cortex, leading to anomalous brain oscillations. Experimental studies simulate the EEG conditions in Schizophrenia by administering sub-anesthetic dosage of Ketamine, which blocks NMDAR channels at the Magnesium binding sites. We could simulate the Ketamine-induced NMDAR channel blocking by varying the Magnesium concentration in the 3-state synaptic models of appropriate pathways. Our results show Ketamine-induced increased excitatory behaviour in the model output; the changes in the γ and σ band oscillations conform to experimental studies. A model to factor in the neuroplasticity effects of applying tDCS (after (Riedinger and Hutt, 2022)) is interfaced with the TCT NMM. Informed by experimental literature, the simulated extrinsic current induced by tDCS is set to affect the plasticity in selected pathways. With appropriate parameterisation, we could simulate the reversal of the Ketamine-induced altered EEG oscillations. Overall, our in silico study emphasises the potential of NMM in predicting protocols for tDCS towards effective personalised treatment of Schizophrenia.
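A standard way to make the magnesium-dependent NMDAR block concrete is the classic Jahr-Stevens voltage-dependent blocking factor; this is offered here only as an illustrative sketch of how a magnesium concentration parameter can enter a kinetic synapse model, not as the paper's exact parameterisation:

\[
B(V) = \frac{1}{1 + \dfrac{[\mathrm{Mg}^{2+}]}{3.57\ \mathrm{mM}}\, e^{-0.062\, V/\mathrm{mV}}},
\]

so raising the effective [Mg2+] (mimicking ketamine's open-channel block) multiplicatively suppresses the NMDAR conductance at a given membrane potential V, which is the kind of manipulation used above to reproduce ketamine-like EEG changes.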
2
Bastiaens SP, Momi D, Griffiths JD. A comprehensive investigation of intracortical and corticothalamic models of the alpha rhythm. PLoS Comput Biol 2025; 21:e1012926. [PMID: 40209165 DOI: 10.1371/journal.pcbi.1012926]
Abstract
The electroencephalographic alpha rhythm is one of the most robustly observed and widely studied empirical phenomena in all of neuroscience. However, despite its extensive implication in a wide range of cognitive processes and clinical pathologies, the mechanisms underlying alpha generation in neural circuits remain poorly understood. In this paper we offer a renewed foundation for research on this question, by undertaking a systematic comparison and synthesis of the most prominent theoretical models of alpha rhythmogenesis in the published literature. We focus on four models, each studied intensively by multiple authors over the past three decades: (i) Jansen-Rit, (ii) Moran-David-Friston, (iii) Robinson-Rennie-Wright, and (iv) Liley-Wright. Several common elements are identified, such as the use of second-order differential equations and sigmoidal potential-to-rate operators to represent population-level neural activity. Major differences are seen in other features such as wiring topologies and conduction delays. Through a series of mathematical analyses and numerical simulations, we nevertheless demonstrate that the selected models can be meaningfully compared, by associating parameters and circuit motifs of analogous biological significance. With this established, we conduct explorations of rate constant and synaptic connectivity parameter spaces, with the aim of identifying common patterns in key behaviours, such as the role of excitatory-inhibitory interactions in the generation of oscillations. Finally, using linear stability analysis we identify two qualitatively different alpha-generating dynamical regimes across the models: (i) noise-driven fluctuations and (ii) self-sustained limit-cycle oscillations, emerging due to an Andronov-Hopf bifurcation. The comprehensive survey and synthesis developed here can, we suggest, be used to help guide future theoretical and experimental work aimed at disambiguating these and other candidate theories of alpha rhythmogenesis.
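As a point of reference for the shared ingredients listed above, a Jansen-Rit-type population converts a presynaptic firing rate m(t) into a postsynaptic potential y(t) through a second-order linear synaptic operator and back into a rate through a sigmoid; this is a minimal generic sketch rather than the specific parameterisations compared in the paper:

\[
\ddot{y}(t) = A\,a\,m(t) - 2a\,\dot{y}(t) - a^{2}y(t), \qquad
S(v) = \frac{2e_{0}}{1 + e^{\,r(v_{0} - v)}},
\]

where A and a set the synaptic gain and rate constant and e0, r, v0 parameterise the potential-to-rate operator; alpha-band activity then emerges from loops of such populations, either noise-driven or as a limit cycle.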
Affiliation(s)
- Sorenza P Bastiaens
- Institute of Medical Sciences, University of Toronto, Toronto, Ontario, Canada
- Krembil Centre for Neuroinformatics, Centre for Addiction and Mental Health, Toronto, Ontario, Canada
- Davide Momi
- Krembil Centre for Neuroinformatics, Centre for Addiction and Mental Health, Toronto, Ontario, Canada
- Department of Psychiatry and Behavioral Sciences, Stanford University Medical Center, Stanford, California, United States of America
- John D Griffiths
- Institute of Medical Sciences, University of Toronto, Toronto, Ontario, Canada
- Krembil Centre for Neuroinformatics, Centre for Addiction and Mental Health, Toronto, Ontario, Canada
- Department of Psychiatry, University of Toronto, Toronto, Ontario, Canada
3
Marino R, Buffoni L, Chicchi L, Patti FD, Febbe D, Giambagli L, Fanelli D. Learning in Wilson-Cowan Model for Metapopulation. Neural Comput 2025; 37:701-741. [PMID: 40030137 DOI: 10.1162/neco_a_01744]
Abstract
The Wilson-Cowan model for metapopulation, a neural mass network model, treats different subcortical regions of the brain as connected nodes, with connections representing various types of structural, functional, or effective neuronal connectivity between these regions. Each region comprises interacting populations of excitatory and inhibitory cells, consistent with the standard Wilson-Cowan model. In this article, we show how to incorporate stable attractors into such a metapopulation model's dynamics. By doing so, we transform the neural mass network model into a biologically inspired learning algorithm capable of solving different classification tasks. We test it on MNIST and Fashion MNIST in combination with convolutional neural networks, as well as on CIFAR-10 and TF-FLOWERS, and in combination with a transformer architecture (BERT) on IMDB, consistently achieving high classification accuracy.
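For orientation, the node dynamics referred to above follow the standard Wilson-Cowan form; a minimal sketch for one region, omitting the between-node coupling and the learning modifications introduced in the paper, is

\[
\tau_{E}\frac{dE}{dt} = -E + S\big(w_{EE}E - w_{EI}I + P_{E}\big), \qquad
\tau_{I}\frac{dI}{dt} = -I + S\big(w_{IE}E - w_{II}I + P_{I}\big),
\]

with E and I the excitatory and inhibitory population activities, w the within-region weights, P external inputs, and S a sigmoidal firing-rate function.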
Affiliation(s)
- Raffaele Marino
- Department of Physics and Astronomy, University of Florence, 50019 Sesto Fiorentino, Florence, Italy
- Lorenzo Buffoni
- Department of Physics and Astronomy, University of Florence, 50019 Sesto Fiorentino, Florence, Italy
- Lorenzo Chicchi
- Department of Physics and Astronomy, University of Florence, 50019 Sesto Fiorentino, Florence, Italy
- Francesca Di Patti
- Department of Mathematics and Computer Science, University of Florence, 50134 Florence, Italy
- Diego Febbe
- Department of Physics and Astronomy, University of Florence, 50019 Sesto Fiorentino, Florence, Italy
- Lorenzo Giambagli
- Department of Physics and Astronomy, University of Florence, 50019 Sesto Fiorentino, Florence, Italy
- Duccio Fanelli
- Department of Physics and Astronomy, University of Florence, 50019 Sesto Fiorentino, Florence, Italy
4
Fennelly N, Neff A, Lambiotte R, Keane A, Byrne Á. Mean-field approximation for networks with synchrony-driven adaptive coupling. Chaos 2025; 35:013152. [PMID: 39869927 DOI: 10.1063/5.0231457]
Abstract
Synaptic plasticity plays a fundamental role in neuronal dynamics, governing how connections between neurons evolve in response to experience. In this study, we extend a network model of θ-neuron oscillators to include a realistic form of adaptive plasticity. In place of the less tractable spike-timing-dependent plasticity, we employ recently validated phase-difference-dependent plasticity rules, which adjust coupling strengths based on the relative phases of θ-neuron oscillators. We explore two distinct implementations of this plasticity: pairwise updates to individual coupling strengths and global updates applied to the mean coupling strength. We derive a mean-field approximation and assess its accuracy by comparing it to θ-neuron simulations across various stability regimes. The synchrony of the system is quantified using the Kuramoto order parameter. Through bifurcation analysis and the calculation of maximal Lyapunov exponents, we uncover interesting phenomena such as bistability and chaotic dynamics via period-doubling and boundary crisis bifurcations. These behaviors emerge as a direct result of adaptive coupling and are absent in systems without such plasticity.
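For readers unfamiliar with the ingredients named above, an illustrative sketch (the paper's coupling and plasticity functions are more specific) is the θ-neuron together with the Kuramoto order parameter used to quantify synchrony:

\[
\dot{\theta}_{j} = (1 - \cos\theta_{j}) + (1 + \cos\theta_{j})\big(\eta_{j} + I_{j}(t)\big), \qquad
Z(t) = \frac{1}{N}\sum_{j=1}^{N} e^{\,i\theta_{j}(t)},
\]

with a spike emitted when θ_j crosses π; |Z| near 1 indicates strong synchrony and |Z| near 0 an asynchronous state, while the adaptive coupling strengths are updated as functions of the phase differences θ_j − θ_k.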
Affiliation(s)
- N Fennelly
- School of Mathematics and Statistics, University College Dublin, Dublin 4 D04 V1W8, Ireland
- A Neff
- School of Mathematics, University of Edinburgh, Edinburgh EH9 3FD, United Kingdom
- R Lambiotte
- Mathematical Institute, University of Oxford, Oxford OX2 6GG, United Kingdom
- A Keane
- School of Mathematical Sciences, University College Cork, Cork T12 XF62, Ireland
- Á Byrne
- School of Mathematics and Statistics, University College Dublin, Dublin 4 D04 V1W8, Ireland
5
Forrester M, Petros S, Cattell O, Lai YM, O'Dea RD, Sotiropoulos S, Coombes S. Whole brain functional connectivity: Insights from next generation neural mass modelling incorporating electrical synapses. PLoS Comput Biol 2024; 20:e1012647. [PMID: 39637233 DOI: 10.1371/journal.pcbi.1012647]
Abstract
The ready availability of brain connectome data has both inspired and facilitated the modelling of whole brain activity using networks of phenomenological neural mass models that can incorporate both interaction strength and tract length between brain regions. Recently, a new class of neural mass model has been developed from an exact mean field reduction of a network of spiking cortical cell models with a biophysically realistic model of the chemical synapse. Moreover, this new population dynamics model can naturally incorporate electrical synapses. Here we demonstrate the ability of this new modelling framework, when combined with data from the Human Connectome Project, to generate patterns of functional connectivity (FC) of the type observed in both magnetoencephalography and functional magnetic resonance neuroimaging. Some limited explanatory power is obtained via an eigenmode description of frequency-specific FC patterns, obtained via a linear stability analysis of the network steady state in the neighbourhood of a Hopf bifurcation. However, direct numerical simulations show that empirical data are more faithfully recapitulated in the nonlinear regime and expose a key role of gap junction coupling strength in generating empirically observed neural activity and the associated FC patterns and their evolution. Thereby, we emphasise the importance of maintaining known links with biological reality when developing multi-scale models of brain dynamics. As a tool for the study of dynamic whole brain models of the type presented here, we further provide a suite of C++ codes for the efficient and user-friendly simulation of neural mass networks with multiple delayed interactions.
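The "next generation" neural mass referred to above is the exact mean-field reduction of quadratic integrate-and-fire networks; as a minimal sketch for a single population with chemical coupling only (the gap-junction terms, delays, and whole-brain connectivity used in the paper add further terms), the population firing rate r and mean voltage v obey

\[
\tau\dot{r} = \frac{\Delta}{\pi\tau} + 2rv, \qquad
\tau\dot{v} = v^{2} + \eta_{0} + J\tau r - (\pi\tau r)^{2} + I(t),
\]

where η0 and Δ are the centre and width of the distributed excitabilities and J the synaptic coupling; the within-population synchrony is encoded in a complex order parameter that is an explicit function of (r, v).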
Affiliation(s)
- Michael Forrester
- Centre for Mathematical Medicine and Biology, School of Mathematical Sciences, University of Nottingham, Nottingham, United Kingdom
- Sammy Petros
- Centre for Mathematical Medicine and Biology, School of Mathematical Sciences, University of Nottingham, Nottingham, United Kingdom
- Oliver Cattell
- Centre for Mathematical Medicine and Biology, School of Mathematical Sciences, University of Nottingham, Nottingham, United Kingdom
- Yi Ming Lai
- Faculty of Medicine & Health Sciences, University of Nottingham, Nottingham, United Kingdom
- Reuben D O'Dea
- Centre for Mathematical Medicine and Biology, School of Mathematical Sciences, University of Nottingham, Nottingham, United Kingdom
- Stamatios Sotiropoulos
- Faculty of Medicine & Health Sciences, University of Nottingham, Nottingham, United Kingdom
- Stephen Coombes
- Centre for Mathematical Medicine and Biology, School of Mathematical Sciences, University of Nottingham, Nottingham, United Kingdom
6
Nandi MK, Valla M, di Volo M. Bursting gamma oscillations in neural mass models. Front Comput Neurosci 2024; 18:1422159. [PMID: 39281982 PMCID: PMC11392745 DOI: 10.3389/fncom.2024.1422159]
Abstract
Gamma oscillations (30-120 Hz) in the brain are not periodic cycles; rather, they typically appear in short time windows, often called oscillatory bursts. While the origin of this bursting phenomenon is still unclear, some recent studies hypothesize its origin in the external or endogenous noise of neural networks. We demonstrate that an exact neural mass model of excitatory and inhibitory quadratic integrate-and-fire (QIF) spiking neurons theoretically predicts the emergence of a distinct regime of intrinsic bursting gamma (IBG) oscillations without any noise source, a phenomenon due to collective chaos. This regime is indeed observed in direct simulations of spiking neurons and is characterized by highly irregular spiking activity. IBG oscillations are distinguished from noise-induced bursting oscillations by higher phase-amplitude coupling to slower theta oscillations, indicating an increased capacity for information transfer between brain regions. We demonstrate that this phenomenon is present in both globally coupled and sparse networks of spiking neurons. These results propose a new mechanism for gamma oscillatory activity, suggesting deterministic collective chaos as a good candidate for the origin of gamma bursts.
Affiliation(s)
- Manoj Kumar Nandi
- Université Claude Bernard Lyon 1, Lyon, Rhône-Alpes, France
- INSERM U1208 Institut Cellule Souche et Cerveau, Bron, France
- Michele Valla
- Université Claude Bernard Lyon 1, Lyon, Rhône-Alpes, France
- INSERM U1208 Institut Cellule Souche et Cerveau, Bron, France
- Matteo di Volo
- Université Claude Bernard Lyon 1, Lyon, Rhône-Alpes, France
- INSERM U1208 Institut Cellule Souche et Cerveau, Bron, France
7
Qi Y. Moment neural network and an efficient numerical method for modeling irregular spiking activity. Phys Rev E 2024; 110:024310. [PMID: 39295055 DOI: 10.1103/physreve.110.024310]
Abstract
Continuous rate-based neural networks have been widely applied to modeling the dynamics of cortical circuits. However, cortical neurons in the brain exhibit irregular spiking activity with complex correlation structures that cannot be captured by mean firing rate alone. To close this gap, we consider a framework for modeling irregular spiking activity, called the moment neural network, which naturally generalizes rate models to second-order moments and can accurately capture the firing statistics of spiking neural networks. We propose an efficient numerical method that allows for rapid evaluation of moment mappings for neuronal activations without solving the underlying Fokker-Planck equation. This allows simulation of coupled interactions of mean firing rate and firing variability of large-scale neural circuits while retaining the advantage of analytical tractability of continuous rate models. We demonstrate how the moment neural network can explain a range of phenomena including diverse Fano factor in networks with quenched disorder and the emergence of irregular oscillatory dynamics in excitation-inhibition networks with delay.
Affiliation(s)
- Yang Qi
- Institute of Science and Technology for Brain-Inspired Intelligence, Fudan University, Shanghai 200433, China; Key Laboratory of Computational Neuroscience and Brain-Inspired Intelligence (Fudan University), Ministry of Education, Shanghai 200433, China; and MOE Frontiers Center for Brain Science, Fudan University, Shanghai 200433, China
8
Pietras B. Pulse Shape and Voltage-Dependent Synchronization in Spiking Neuron Networks. Neural Comput 2024; 36:1476-1540. [PMID: 39028958 DOI: 10.1162/neco_a_01680]
Abstract
Pulse-coupled spiking neural networks are a powerful tool to gain mechanistic insights into how neurons self-organize to produce coherent collective behavior. These networks use simple spiking neuron models, such as the θ-neuron or the quadratic integrate-and-fire (QIF) neuron, that replicate the essential features of real neural dynamics. Interactions between neurons are modeled with infinitely narrow pulses, or spikes, rather than the more complex dynamics of real synapses. To make these networks biologically more plausible, it has been proposed that they must also account for the finite width of the pulses, which can have a significant impact on the network dynamics. However, the derivation and interpretation of these pulses are contradictory, and the impact of the pulse shape on the network dynamics is largely unexplored. Here, I take a comprehensive approach to pulse coupling in networks of QIF and θ-neurons. I argue that narrow pulses activate voltage-dependent synaptic conductances and show how to implement them in QIF neurons such that their effect can last through the phase after the spike. Using an exact low-dimensional description for networks of globally coupled spiking neurons, I prove for instantaneous interactions that collective oscillations emerge due to an effective coupling through the mean voltage. I analyze the impact of the pulse shape by means of a family of smooth pulse functions with arbitrary finite width and symmetric or asymmetric shapes. For symmetric pulses, the resulting voltage coupling is not very effective in synchronizing neurons, but pulses that are slightly skewed to the phase after the spike readily generate collective oscillations. The results unveil a voltage-dependent spike synchronization mechanism at the heart of emergent collective behavior, which is facilitated by pulses of finite width and complementary to traditional synaptic transmission in spiking neuron networks.
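For reference, the two model neurons named above are related by the change of variables V = tan(θ/2); a minimal sketch (omitting the pulse-coupling and conductance terms analysed in the paper) is

\[
\dot{V} = V^{2} + \eta + I(t), \quad V \to V_{\mathrm{reset}} \ \text{when } V \geq V_{\mathrm{peak}}, \qquad
\dot{\theta} = (1 - \cos\theta) + (1 + \cos\theta)\big(\eta + I(t)\big),
\]

so a spike corresponds to V escaping towards large values, or equivalently to θ passing through π, and a pulse of finite width can continue to act during the phase just after the spike.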
Affiliation(s)
- Bastian Pietras
- Department of Information and Communication Technologies, Universitat Pompeu Fabra, 08018, Barcelona, Spain
9
Kazemi S, Farokhniaee A, Jamali Y. Criticality and partial synchronization analysis in Wilson-Cowan and Jansen-Rit neural mass models. PLoS One 2024; 19:e0292910. [PMID: 38959236 PMCID: PMC11221676 DOI: 10.1371/journal.pone.0292910]
Abstract
Synchronization is a phenomenon observed in neuronal networks involved in diverse brain activities. Neural mass models such as Wilson-Cowan (WC) and Jansen-Rit (JR) manifest synchronized states. Despite extensive research on these models over the past several decades, their potential to manifest second-order phase transitions (SOPT) and criticality has not been sufficiently acknowledged. In this study, two networks of coupled WC and JR nodes with small-world topologies were constructed, and the Kuramoto order parameter (KOP) was used to quantify the amount of synchronization. In addition, we investigated the presence of SOPT using the coefficient of variation of the synchronization. Both networks reached high synchrony by changing the coupling weight between their nodes. Moreover, they exhibited abrupt changes in synchronization at certain values of the control parameter that were not necessarily related to a phase transition. While SOPT was observed only in the JR model, neither the WC nor the JR model showed power-law behavior. Our study further investigated the global synchronization phenomenon that is known to exist in pathological brain states such as seizures. The JR model showed global synchronization, while the WC model seemed more suitable for producing partially synchronized patterns.
Affiliation(s)
- Sheida Kazemi
- Biomathematics Laboratory, Department of Applied Mathematics, School of Mathematical Sciences, Tarbiat Modares University, Tehran, Iran
- AmirAli Farokhniaee
- School of Electrical and Electronic Engineering, University College Dublin, Dublin, Ireland
- Yousef Jamali
- Biomathematics Laboratory, Department of Applied Mathematics, School of Mathematical Sciences, Tarbiat Modares University, Tehran, Iran
10
Coombes S, O'Dea R, Nicks R. Brain anatomy and dynamics: A commentary on "Does the brain behave like a (complex) network? I. Dynamics" by Papo and Buldú (2024). Phys Life Rev 2024; 49:38-39. [PMID: 38513521 DOI: 10.1016/j.plrev.2024.03.004]
Abstract
Papo and Buldú [1] ask whether the brain truly acts as a network, or whether it is a convenient coincidence that it can be described with the tools of complex network theory, and the emerging field of network neuroscience. After a broad ranging discussion of networkness they explore some of the ways in which the combination of brain structure and dynamics can indeed better be understood as realising a complex network that subserves brain function. To complement and bolster this perspective, which is informed largely from a physics viewpoint, we direct the reader to additional tools, approaches and insights available from applied mathematics that may further help address some of the many remaining open challenges in this field.
Affiliation(s)
- Stephen Coombes
- School of Mathematical Sciences, University of Nottingham, Nottingham, NG7 2RD, UK.
- Reuben O'Dea
- School of Mathematical Sciences, University of Nottingham, Nottingham, NG7 2RD, UK.
- Rachel Nicks
- School of Mathematical Sciences, University of Nottingham, Nottingham, NG7 2RD, UK.
11
van Nieuwenhuizen H, Chesebro AG, Polizu C, Clarke K, Strey HH, Weistuch C, Mujica-Parodi LR. Ketosis regulates K+ ion channels, strengthening brain-wide signaling disrupted by age. Imaging Neurosci (Camb) 2024; 2:10.1162/imag_a_00163. [PMID: 39664914 PMCID: PMC11633768 DOI: 10.1162/imag_a_00163]
Abstract
Aging is associated with impaired signaling between brain regions when measured using resting-state fMRI. This age-related destabilization and desynchronization of brain networks reverses itself when the brain switches from metabolizing glucose to ketones. Here, we probe the mechanistic basis for these effects. First, we confirmed their robustness across measurement modalities using two datasets acquired from resting-state EEG (Lifespan: standard diet, 20-80 years, N = 201; Metabolic: individually weight-dosed and calorically-matched glucose and ketone ester challenge, μ_age = 26.9 ± 11.2 years, N = 36). Then, using a multiscale conductance-based neural mass model, we identified the unique set of mechanistic parameters consistent with our clinical data. Together, our results implicate potassium (K+) gradient dysregulation as a mechanism for age-related neural desynchronization and its reversal with ketosis, the latter finding of which is consistent with direct measurement of ion channels. As such, the approach facilitates the connection between macroscopic brain activity and cellular-level mechanisms.
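The potassium-gradient mechanism implicated above can be made concrete with the textbook Nernst relation (shown here as background, not as the paper's full conductance-based model): the reversal potential of a K+ channel is

\[
E_{K} = \frac{RT}{zF}\,\ln\frac{[\mathrm{K}^{+}]_{o}}{[\mathrm{K}^{+}]_{i}} \approx 26.7\ \mathrm{mV}\times\ln\frac{[\mathrm{K}^{+}]_{o}}{[\mathrm{K}^{+}]_{i}} \quad (z=1,\ T\approx 310\ \mathrm{K}),
\]

so a degraded transmembrane K+ gradient depolarises E_K, weakens repolarising currents, and plausibly destabilises the synchronous dynamics captured by the neural mass model, while ketosis would act by restoring the gradient.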
Affiliation(s)
- Helena van Nieuwenhuizen
- Department of Physics and Astronomy, Stony Brook University, Stony Brook, NY, 11790, USA
- Athinoula A. Martinos Center for Biomedical Imaging, Massachusetts General Hospital and Harvard Medical School, Charlestown, MA, 02129, USA
- Anthony G. Chesebro
- Athinoula A. Martinos Center for Biomedical Imaging, Massachusetts General Hospital and Harvard Medical School, Charlestown, MA, 02129, USA
- Department of Biomedical Engineering, Stony Brook University, Stony Brook, NY, 11790, USA
- Renaissance School of Medicine, Stony Brook University, Stony Brook, NY, 11790, USA
- Claire Polizu
- Renaissance School of Medicine, Stony Brook University, Stony Brook, NY, 11790, USA
- Kieran Clarke
- Department of Physiology, Oxford University, Oxford OX1 3PT, UK
- Helmut H. Strey
- Department of Biomedical Engineering, Stony Brook University, Stony Brook, NY, 11790, USA
- Laufer Center for Physical and Quantitative Biology, Stony Brook University, Stony Brook, NY, 11790, USA
- Corey Weistuch
- Department of Medical Physics, Memorial Sloan Kettering Cancer Center, New York, NY, 10065, USA
- Lilianne R. Mujica-Parodi
- Department of Physics and Astronomy, Stony Brook University, Stony Brook, NY, 11790, USA
- Athinoula A. Martinos Center for Biomedical Imaging, Massachusetts General Hospital and Harvard Medical School, Charlestown, MA, 02129, USA
- Department of Biomedical Engineering, Stony Brook University, Stony Brook, NY, 11790, USA
- Laufer Center for Physical and Quantitative Biology, Stony Brook University, Stony Brook, NY, 11790, USA
12
Liu Q, Wei C, Qu Y, Liang Z. Modelling and Controlling System Dynamics of the Brain: An Intersection of Machine Learning and Control Theory. Adv Neurobiol 2024; 41:63-87. [PMID: 39589710 DOI: 10.1007/978-3-031-69188-1_3]
Abstract
The human brain, as a complex system, has long captivated multidisciplinary researchers aiming to decode its intricate structure and function. This intricate network has driven scientific pursuits to advance our understanding of cognition, behavior, and neurological disorders by delving into the complex mechanisms underlying brain function and dysfunction. Modelling brain dynamics using machine learning techniques deepens our comprehension of brain dynamics from a computational perspective. These computational models allow researchers to simulate and analyze neural interactions, facilitating the identification of dysfunctions in connectivity or activity patterns. Additionally, the trained dynamical system, serving as a surrogate model, optimizes neurostimulation strategies under the guidelines of control theory. In this chapter, we discuss the recent studies on modelling and controlling brain dynamics at the intersection of machine learning and control theory, providing a framework to understand and improve cognitive function, and treat neurological and psychiatric disorders.
Affiliation(s)
- Quanying Liu
- Department of Biomedical Engineering, Southern University of Science and Technology, Shenzhen, GD, P.R. China.
- Chen Wei
- Department of Biomedical Engineering, Southern University of Science and Technology, Shenzhen, GD, P.R. China
- Youzhi Qu
- Department of Biomedical Engineering, Southern University of Science and Technology, Shenzhen, GD, P.R. China
- Zhichao Liang
- Department of Biomedical Engineering, Southern University of Science and Technology, Shenzhen, GD, P.R. China
13
O'Donnell C. Nonlinear slow-timescale mechanisms in synaptic plasticity. Curr Opin Neurobiol 2023; 82:102778. [PMID: 37657186 DOI: 10.1016/j.conb.2023.102778]
Abstract
Learning and memory rely on synapses changing their strengths in response to neural activity. However, there is a substantial gap between the timescales of neural electrical dynamics (1-100 ms) and organism behaviour during learning (seconds-minutes). What mechanisms bridge this timescale gap? What are the implications for theories of brain learning? Here I first cover experimental evidence for slow-timescale factors in plasticity induction. Then I review possible underlying cellular and synaptic mechanisms, and insights from recent computational models that incorporate such slow-timescale variables. I conclude that future progress in understanding brain learning across timescales will require both experimental and computational modelling studies that map out the nonlinearities implemented by both fast and slow plasticity mechanisms at synapses, and crucially, their joint interactions.
Affiliation(s)
- Cian O'Donnell
- School of Computing, Engineering, and Intelligent Systems, Magee Campus, Ulster University, Derry/Londonderry, UK; School of Computer Science, Electrical and Electronic Engineering, and Engineering Maths, University of Bristol, Bristol, UK.
14
Yamamoto H, Spitzner FP, Takemuro T, Buendía V, Murota H, Morante C, Konno T, Sato S, Hirano-Iwata A, Levina A, Priesemann V, Muñoz MA, Zierenberg J, Soriano J. Modular architecture facilitates noise-driven control of synchrony in neuronal networks. Sci Adv 2023; 9:eade1755. [PMID: 37624893 PMCID: PMC10456864 DOI: 10.1126/sciadv.ade1755]
Abstract
High-level information processing in the mammalian cortex requires both segregated processing in specialized circuits and integration across multiple circuits. One possible way to implement these seemingly opposing demands is by flexibly switching between states with different levels of synchrony. However, the mechanisms behind the control of complex synchronization patterns in neuronal networks remain elusive. Here, we use precision neuroengineering to manipulate and stimulate networks of cortical neurons in vitro, in combination with an in silico model of spiking neurons and a mesoscopic model of stochastically coupled modules to show that (i) a modular architecture enhances the sensitivity of the network to noise delivered as external asynchronous stimulation and that (ii) the persistent depletion of synaptic resources in stimulated neurons is the underlying mechanism for this effect. Together, our results demonstrate that the inherent dynamical state in structured networks of excitable units is determined by both its modular architecture and the properties of the external inputs.
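The resource-depletion mechanism identified above is commonly formalised with Tsodyks-Markram-style short-term synaptic dynamics; the following is an illustrative sketch with generic parameter names, not the paper's specific mesoscopic model. Each synapse carries a fraction x of available resources that is consumed by presynaptic spikes and recovers slowly:

\[
\frac{dx}{dt} = \frac{1 - x}{\tau_{D}} - U\,x \sum_{k}\delta(t - t_{k}),
\]

with t_k the presynaptic spike times, U the fraction of resources used per spike, and τ_D the recovery time constant; persistent asynchronous stimulation keeps x low in the stimulated neurons, lowering their effective output coupling and thereby reducing network-wide synchrony.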
Affiliation(s)
- Hideaki Yamamoto
- Research Institute of Electrical Communication (RIEC), Tohoku University, Sendai, Japan
- Graduate School of Engineering, Tohoku University, Sendai, Japan
- F. Paul Spitzner
- Max Planck Institute for Dynamics and Self-Organization, Göttingen, Germany
- Taiki Takemuro
- Research Institute of Electrical Communication (RIEC), Tohoku University, Sendai, Japan
- Graduate School of Biomedical Engineering, Tohoku University, Sendai, Japan
- Victor Buendía
- Max Planck Institute for Biological Cybernetics, Tübingen, Germany
- Department of Computer Science, University of Tübingen, Tübingen, Germany
- Departamento de Electromagnetismo y Física de la Materia, Universidad de Granada, Granada, Spain
- Hakuba Murota
- Research Institute of Electrical Communication (RIEC), Tohoku University, Sendai, Japan
- Graduate School of Engineering, Tohoku University, Sendai, Japan
- Carla Morante
- Departament de Física de la Matèria Condensada, Universitat de Barcelona, Barcelona, Spain
- Universitat de Barcelona Institute of Complex Systems (UBICS), Barcelona, Spain
- Tomohiro Konno
- Graduate School of Pharmaceutical Sciences, Tohoku University, Sendai, Japan
- Shigeo Sato
- Research Institute of Electrical Communication (RIEC), Tohoku University, Sendai, Japan
- Graduate School of Engineering, Tohoku University, Sendai, Japan
- Ayumi Hirano-Iwata
- Research Institute of Electrical Communication (RIEC), Tohoku University, Sendai, Japan
- Graduate School of Engineering, Tohoku University, Sendai, Japan
- Graduate School of Biomedical Engineering, Tohoku University, Sendai, Japan
- Advanced Institute for Materials Research (WPI-AIMR), Tohoku University, Sendai, Japan
- Anna Levina
- Max Planck Institute for Biological Cybernetics, Tübingen, Germany
- Department of Computer Science, University of Tübingen, Tübingen, Germany
- Viola Priesemann
- Max Planck Institute for Dynamics and Self-Organization, Göttingen, Germany
- Institute for the Dynamics of Complex Systems, University of Göttingen, Göttingen, Germany
- Miguel A. Muñoz
- Departamento de Electromagnetismo y Física de la Materia, Universidad de Granada, Granada, Spain
- Instituto Carlos I de Física Teórica y Computacional, Universidad de Granada, Granada, Spain
- Jordi Soriano
- Departament de Física de la Matèria Condensada, Universitat de Barcelona, Barcelona, Spain
- Universitat de Barcelona Institute of Complex Systems (UBICS), Barcelona, Spain
15
Duchet B, Bick C, Byrne Á. Mean-Field Approximations With Adaptive Coupling for Networks With Spike-Timing-Dependent Plasticity. Neural Comput 2023; 35:1481-1528. [PMID: 37437202 PMCID: PMC10422128 DOI: 10.1162/neco_a_01601]
Abstract
Understanding the effect of spike-timing-dependent plasticity (STDP) is key to elucidating how neural networks change over long timescales and to design interventions aimed at modulating such networks in neurological disorders. However, progress is restricted by the significant computational cost associated with simulating neural network models with STDP and by the lack of low-dimensional description that could provide analytical insights. Phase-difference-dependent plasticity (PDDP) rules approximate STDP in phase oscillator networks, which prescribe synaptic changes based on phase differences of neuron pairs rather than differences in spike timing. Here we construct mean-field approximations for phase oscillator networks with STDP to describe part of the phase space for this very high-dimensional system. We first show that single-harmonic PDDP rules can approximate a simple form of symmetric STDP, while multiharmonic rules are required to accurately approximate causal STDP. We then derive exact expressions for the evolution of the average PDDP coupling weight in terms of network synchrony. For adaptive networks of Kuramoto oscillators that form clusters, we formulate a family of low-dimensional descriptions based on the mean-field dynamics of each cluster and average coupling weights between and within clusters. Finally, we show that such a two-cluster mean-field model can be fitted to synthetic data to provide a low-dimensional approximation of a full adaptive network with symmetric STDP. Our framework represents a step toward a low-dimensional description of adaptive networks with STDP, and could for example inform the development of new therapies aimed at maximizing the long-lasting effects of brain stimulation.
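To fix ideas for the adaptive networks discussed above, a minimal single-harmonic sketch (the paper's multiharmonic PDDP rules and cluster-based mean-field reductions are more general) couples Kuramoto phases θ_i with slowly evolving weights κ_ij driven by phase differences:

\[
\dot{\theta}_{i} = \omega_{i} + \frac{1}{N}\sum_{j=1}^{N}\kappa_{ij}\sin(\theta_{j} - \theta_{i}), \qquad
\dot{\kappa}_{ij} = \epsilon\big[F(\theta_{i} - \theta_{j}) - \kappa_{ij}\big], \quad \epsilon \ll 1,
\]

where F is a 2π-periodic function (a single harmonic such as A cos(θ_i − θ_j − β) in the simplest case) chosen so that the slow weight change mimics the STDP window; the mean-field description then tracks each cluster's order parameter together with the average within- and between-cluster weights.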
Affiliation(s)
- Benoit Duchet
- Nuffield Department of Clinical Neuroscience, University of Oxford, Oxford X3 9DU, U.K
- MRC Brain Network Dynamics Unit, University of Oxford, Oxford X1 3TH, U.K.
- Christian Bick
- Department of Mathematics, Vrije Universiteit Amsterdam, Amsterdam 1081 HV, the Netherlands
- Amsterdam Neuroscience-Systems and Network Neuroscience, Amsterdam 1081 HV, the Netherlands
- Mathematical Institute, University of Oxford, Oxford X2 6GG, U.K.
- Áine Byrne
- School of Mathematics and Statistics, University College Dublin, Dublin D04 V1W8, Ireland
16
Sawicki J, Berner R, Loos SAM, Anvari M, Bader R, Barfuss W, Botta N, Brede N, Franović I, Gauthier DJ, Goldt S, Hajizadeh A, Hövel P, Karin O, Lorenz-Spreen P, Miehl C, Mölter J, Olmi S, Schöll E, Seif A, Tass PA, Volpe G, Yanchuk S, Kurths J. Perspectives on adaptive dynamical systems. Chaos 2023; 33:071501. [PMID: 37486668 DOI: 10.1063/5.0147231]
Abstract
Adaptivity is a dynamical feature that is omnipresent in nature, socio-economics, and technology. For example, adaptive couplings appear in various real-world systems, such as the power grid, social, and neural networks, and they form the backbone of closed-loop control strategies and machine learning algorithms. In this article, we provide an interdisciplinary perspective on adaptive systems. We reflect on the notion and terminology of adaptivity in different disciplines and discuss which role adaptivity plays for various fields. We highlight common open challenges and give perspectives on future research directions, looking to inspire interdisciplinary approaches.
Affiliation(s)
- Jakub Sawicki
- Potsdam Institute for Climate Impact Research, Telegrafenberg, 14473 Potsdam, Germany
- Akademie Basel, Fachhochschule Nordwestschweiz FHNW, Leonhardsstrasse 6, 4009 Basel, Switzerland
- Rico Berner
- Department of Physics, Humboldt-Universität zu Berlin, Newtonstraße 15, 12489 Berlin, Germany
- Sarah A M Loos
- DAMTP, University of Cambridge, Wilberforce Road, Cambridge CB3 0WA, United Kingdom
- Mehrnaz Anvari
- Potsdam Institute for Climate Impact Research, Telegrafenberg, 14473 Potsdam, Germany
- Fraunhofer Institute for Algorithms and Scientific Computing, Schloss Birlinghoven, 53757 Sankt-Augustin, Germany
- Rolf Bader
- Institute of Systematic Musicology, University of Hamburg, Hamburg, Germany
- Wolfram Barfuss
- Transdisciplinary Research Area: Sustainable Futures, University of Bonn, 53113 Bonn, Germany
- Center for Development Research (ZEF), University of Bonn, 53113 Bonn, Germany
- Nicola Botta
- Potsdam Institute for Climate Impact Research, Telegrafenberg, 14473 Potsdam, Germany
- Department of Computer Science and Engineering, Chalmers University of Technology, 412 96 Göteborg, Sweden
- Nuria Brede
- Potsdam Institute for Climate Impact Research, Telegrafenberg, 14473 Potsdam, Germany
- Department of Computer Science, University of Potsdam, An der Bahn 2, 14476 Potsdam, Germany
- Igor Franović
- Scientific Computing Laboratory, Center for the Study of Complex Systems, Institute of Physics Belgrade, University of Belgrade, Pregrevica 118, 11080 Belgrade, Serbia
- Daniel J Gauthier
- Potsdam Institute for Climate Impact Research, Telegrafenberg, 14473 Potsdam, Germany
- Sebastian Goldt
- Department of Physics, International School of Advanced Studies (SISSA), Trieste, Italy
- Aida Hajizadeh
- Research Group Comparative Neuroscience, Leibniz Institute for Neurobiology, Magdeburg, Germany
- Philipp Hövel
- Potsdam Institute for Climate Impact Research, Telegrafenberg, 14473 Potsdam, Germany
- Omer Karin
- Department of Mathematics, Imperial College London, London SW7 2AZ, United Kingdom
- Philipp Lorenz-Spreen
- Center for Adaptive Rationality, Max Planck Institute for Human Development, Lentzeallee 94, 14195 Berlin, Germany
- Christoph Miehl
- Akademie Basel, Fachhochschule Nordwestschweiz FHNW, Leonhardsstrasse 6, 4009 Basel, Switzerland
- Jan Mölter
- Department of Mathematics, School of Computation, Information and Technology, Technical University of Munich, Boltzmannstraße 3, 85748 Garching bei München, Germany
- Simona Olmi
- Akademie Basel, Fachhochschule Nordwestschweiz FHNW, Leonhardsstrasse 6, 4009 Basel, Switzerland
- Eckehard Schöll
- Potsdam Institute for Climate Impact Research, Telegrafenberg, 14473 Potsdam, Germany
- Akademie Basel, Fachhochschule Nordwestschweiz FHNW, Leonhardsstrasse 6, 4009 Basel, Switzerland
- Alireza Seif
- Pritzker School of Molecular Engineering, The University of Chicago, Chicago, Illinois 60637, USA
- Peter A Tass
- Department of Neurosurgery, Stanford University School of Medicine, Stanford, California 94304, USA
- Giovanni Volpe
- Department of Physics, University of Gothenburg, Gothenburg, Sweden
- Serhiy Yanchuk
- Potsdam Institute for Climate Impact Research, Telegrafenberg, 14473 Potsdam, Germany
- Department of Physics, Humboldt-Universität zu Berlin, Newtonstraße 15, 12489 Berlin, Germany
- Jürgen Kurths
- Potsdam Institute for Climate Impact Research, Telegrafenberg, 14473 Potsdam, Germany
- Department of Physics, Humboldt-Universität zu Berlin, Newtonstraße 15, 12489 Berlin, Germany
17
Klinshov VV, Smelov PS, Kirillov SY. Constructive role of shot noise in the collective dynamics of neural networks. Chaos 2023; 33:2894498. [PMID: 37276575 DOI: 10.1063/5.0147409]
Abstract
Finite-size effects may significantly influence the collective dynamics of large populations of neurons. Recently, we have shown that in globally coupled networks these effects can be interpreted as an additional common noise term, the so-called shot noise, added to the macroscopic dynamics that unfold in the thermodynamic limit. Here, we continue to explore the role of the shot noise in the collective dynamics of globally coupled neural networks. Namely, we study the noise-induced switching between different macroscopic regimes. We show that the shot noise can turn attractors of the infinitely large network into metastable states whose lifetimes smoothly depend on the system parameters. A surprising effect is that the shot noise modifies the region where a certain macroscopic regime exists compared to the thermodynamic limit. This may be interpreted as a constructive role of the shot noise, since a certain macroscopic state appears in a parameter region where it does not exist in an infinite network.
Affiliation(s)
- V V Klinshov
- Institute of Applied Physics of the Russian Academy of Sciences, Ulyanova Street 46, 603950 Nizhny Novgorod, Russia
- National Research University Higher School of Economics, 25/12 Bol'shaya Pecherskaya Street, Nizhny Novgorod 603155, Russia
- P S Smelov
- Institute of Applied Physics of the Russian Academy of Sciences, Ulyanova Street 46, 603950 Nizhny Novgorod, Russia
- S Yu Kirillov
- Institute of Applied Physics of the Russian Academy of Sciences, Ulyanova Street 46, 603950 Nizhny Novgorod, Russia
18
Mitjans AG, Linares DP, Naranjo CL, Gonzalez AA, Li M, Wang Y, Reyes RG, Bringas-Vega ML, Minati L, Evans AC, Valdés-Sosa PA. Accurate and Efficient Simulation of Very High-Dimensional Neural Mass Models with Distributed-Delay Connectome Tensors. Neuroimage 2023; 274:120137. [PMID: 37116767 DOI: 10.1016/j.neuroimage.2023.120137]
Abstract
This paper introduces methods and a novel toolbox that efficiently integrates high-dimensional Neural Mass Models (NMMs) specified by two essential components. The first is the set of nonlinear Random Differential Equations (RDEs) of the dynamics of each neural mass. The second is the highly sparse three-dimensional Connectome Tensor (CT) that encodes the strength of the connections and the delays of information transfer along the axons of each connection. To date, simplistic assumptions prevail about delays in the CT, which are often assumed to be Dirac-delta functions. In reality, delays are distributed due to heterogeneous conduction velocities of the axons connecting neural masses. These distributed-delay CTs are challenging to model. Our approach implements these models by leveraging several innovations. Semi-analytical integration of the RDEs is done with the Local Linearization (LL) scheme for each neural mass model, ensuring dynamical fidelity to the original continuous-time nonlinear dynamics. This semi-analytic LL integration is highly computationally efficient. In addition, a tensor representation of the CT facilitates parallel computation and seamlessly allows modeling distributed-delay CTs with any level of complexity or realism. Consequently, our algorithm scales linearly with the number of neural masses and the number of equations they are represented with, contrasting with more traditional methods that scale quadratically at best. To illustrate the toolbox's usefulness, we simulate a single Zetterberg-Jansen-Rit (ZJR) cortical column, a single thalamo-cortical unit, and a toy example comprising 1000 interconnected ZJR columns. These simulations demonstrate the consequences of modifying the CT, especially by introducing distributed delays. The examples illustrate the complexity of explaining EEG oscillations, e.g., split alpha peaks, since these only appear for distinct neural masses. We provide an open-source script for the toolbox.
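The Local Linearization step mentioned above can be summarised schematically (a generic sketch of the LL/exponential-integrator idea, omitting the stochastic terms of the RDEs and not the toolbox's exact implementation): over a step h, the state x of one neural mass with drift ẋ = f(x) is advanced by linearising about the current state,

\[
x(t+h) \approx x(t) + J^{-1}\big(e^{Jh} - I\big)\,f\big(x(t)\big), \qquad J = \frac{\partial f}{\partial x}\bigg|_{x(t)},
\]

which preserves the local exponential behaviour of the continuous-time system; the distributed-delay connectome tensor then enters the input of mass i as a sum over source masses and delay bins, u_i(t) = Σ_j Σ_k C_ijk s_j(t − τ_k).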
Affiliation(s)
- Anisleidy González Mitjans
- University of Electronic Science and Technology of China, Chengdu, Sichuan, China; Department of Mathematics, University of Havana, Havana, Cuba.
- Deirel Paz Linares
- University of Electronic Science and Technology of China, Chengdu, Sichuan, China; Department of Neuroinformatics, Cuban Neuroscience Center, Havana, Cuba.
- Carlos López Naranjo
- University of Electronic Science and Technology of China, Chengdu, Sichuan, China.
- Ariosky Areces Gonzalez
- University of Electronic Science and Technology of China, Chengdu, Sichuan, China; Department of Informatics, University of Pinar del Rio, Pinar del Rio, Cuba.
- Min Li
- University of Electronic Science and Technology of China, Chengdu, Sichuan, China.
- Ying Wang
- University of Electronic Science and Technology of China, Chengdu, Sichuan, China.
- María L Bringas-Vega
- University of Electronic Science and Technology of China, Chengdu, Sichuan, China; Department of Neuroinformatics, Cuban Neuroscience Center, Havana, Cuba.
- Ludovico Minati
- University of Electronic Science and Technology of China, Chengdu, Sichuan, China; Center for Mind/Brain Sciences (CIMeC), University of Trento, 38100 Trento, Italy.
- Alan C Evans
- McGill Centre for Integrative Neuroscience, Ludmer Centre for Neuroinformatics and Mental Health, Montreal Neurological Institute, Canada.
- Pedro A Valdés-Sosa
- University of Electronic Science and Technology of China, Chengdu, Sichuan, China; Department of Neuroinformatics, Cuban Neuroscience Center, Havana, Cuba.
19
Safavi S, Panagiotaropoulos TI, Kapoor V, Ramirez-Villegas JF, Logothetis NK, Besserve M. Uncovering the organization of neural circuits with Generalized Phase Locking Analysis. PLoS Comput Biol 2023; 19:e1010983. [PMID: 37011110 PMCID: PMC10109521 DOI: 10.1371/journal.pcbi.1010983] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/22/2022] [Revised: 04/17/2023] [Accepted: 02/27/2023] [Indexed: 04/05/2023] Open
Abstract
Despite the considerable progress of in vivo neural recording techniques, inferring the biophysical mechanisms underlying large scale coordination of brain activity from neural data remains challenging. One obstacle is the difficulty to link high dimensional functional connectivity measures to mechanistic models of network activity. We address this issue by investigating spike-field coupling (SFC) measurements, which quantify the synchronization between, on the one hand, the action potentials produced by neurons, and on the other hand mesoscopic "field" signals, reflecting subthreshold activities at possibly multiple recording sites. As the number of recording sites gets large, the amount of pairwise SFC measurements becomes overwhelmingly challenging to interpret. We develop Generalized Phase Locking Analysis (GPLA) as an interpretable dimensionality reduction of this multivariate SFC. GPLA describes the dominant coupling between field activity and neural ensembles across space and frequencies. We show that GPLA features are biophysically interpretable when used in conjunction with appropriate network models, such that we can identify the influence of underlying circuit properties on these features. We demonstrate the statistical benefits and interpretability of this approach in various computational models and Utah array recordings. The results suggest that GPLA, used jointly with biophysical modeling, can help uncover the contribution of recurrent microcircuits to the spatio-temporal dynamics observed in multi-channel experimental recordings.
Affiliation(s)
- Shervin Safavi
- Department of Physiology of Cognitive Processes, Max Planck Institute for Biological Cybernetics, Tübingen, Germany
- IMPRS for Cognitive and Systems Neuroscience, University of Tübingen, Tübingen, Germany
- Theofanis I. Panagiotaropoulos
- Department of Physiology of Cognitive Processes, Max Planck Institute for Biological Cybernetics, Tübingen, Germany
- Cognitive Neuroimaging Unit, INSERM, CEA, CNRS, Université Paris-Saclay, NeuroSpin center, 91191 Gif/Yvette, France
- Vishal Kapoor
- Department of Physiology of Cognitive Processes, Max Planck Institute for Biological Cybernetics, Tübingen, Germany
- International Center for Primate Brain Research (ICPBR), Center for Excellence in Brain Science and Intelligence Technology (CEBSIT), Chinese Academy of Sciences (CAS), Shanghai 201602, China
- Juan F. Ramirez-Villegas
- Department of Physiology of Cognitive Processes, Max Planck Institute for Biological Cybernetics, Tübingen, Germany
- Institute of Science and Technology Austria (IST Austria), Klosterneuburg, Austria
- Nikos K. Logothetis
- Department of Physiology of Cognitive Processes, Max Planck Institute for Biological Cybernetics, Tübingen, Germany
- International Center for Primate Brain Research (ICPBR), Center for Excellence in Brain Science and Intelligence Technology (CEBSIT), Chinese Academy of Sciences (CAS), Shanghai 201602, China
- Centre for Imaging Sciences, Biomedical Imaging Institute, The University of Manchester, Manchester, United Kingdom
- Michel Besserve
- Department of Physiology of Cognitive Processes, Max Planck Institute for Biological Cybernetics, Tübingen, Germany
- Department of Empirical Inference, Max Planck Institute for Intelligent Systems and MPI-ETH Center for Learning Systems, Tübingen, Germany
20
Clusella P, Köksal-Ersöz E, Garcia-Ojalvo J, Ruffini G. Comparison between an exact and a heuristic neural mass model with second-order synapses. Biol Cybern 2023; 117:5-19. [PMID: 36454267 PMCID: PMC10160168 DOI: 10.1007/s00422-022-00952-7]
Abstract
Neural mass models (NMMs) are designed to reproduce the collective dynamics of neuronal populations. A common framework for NMMs assumes heuristically that the output firing rate of a neural population can be described by a static nonlinear transfer function (NMM1). However, a recent exact mean-field theory for quadratic integrate-and-fire (QIF) neurons challenges this view by showing that the mean firing rate is not a static function of the neuronal state but follows two coupled nonlinear differential equations (NMM2). Here we analyze and compare these two descriptions in the presence of second-order synaptic dynamics. First, we derive the mathematical equivalence between the two models in the infinitely slow synapse limit, i.e., we show that NMM1 is an approximation of NMM2 in this regime. Next, we evaluate the applicability of this limit in the context of realistic physiological parameter values by analyzing the dynamics of models with inhibitory or excitatory synapses. We show that NMM1 fails to reproduce important dynamical features of the exact model, such as the self-sustained oscillations of an inhibitory interneuron QIF network. Furthermore, in the exact model but not in the limit one, stimulation of a pyramidal cell population induces resonant oscillatory activity whose peak frequency and amplitude increase with the self-coupling gain and the external excitatory input. This may play a role in the enhanced response of densely connected networks to weak uniform inputs, such as the electric fields produced by noninvasive brain stimulation.
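The contrast drawn above can be stated compactly (schematic forms only; the paper adds second-order synaptic filters to both descriptions): the heuristic model (NMM1) assumes a static potential-to-rate transfer function, whereas the exact QIF mean field (NMM2) evolves the firing rate r and mean voltage v jointly,

\[
\text{NMM1:}\ \ \tau\dot{r} = -r + \phi(I); \qquad
\text{NMM2:}\ \ \tau\dot{r} = \frac{\Delta}{\pi\tau} + 2rv, \quad \tau\dot{v} = v^{2} + \eta_{0} - (\pi\tau r)^{2} + I,
\]

so NMM1 corresponds to the fixed-point (infinitely slow synapse) limit of NMM2, which is why it misses the self-sustained and resonant oscillations described in the abstract.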
Affiliation(s)
- Pau Clusella
- Department of Medicine and Life Sciences, Universitat Pompeu Fabra, Barcelona Biomedical Research Park, 08003, Barcelona, Spain.
- Elif Köksal-Ersöz
- LTSI - UMR 1099, INSERM, Univ Rennes, Campus Beaulieu, 35000, Rennes, France
- Jordi Garcia-Ojalvo
- Department of Medicine and Life Sciences, Universitat Pompeu Fabra, Barcelona Biomedical Research Park, 08003, Barcelona, Spain
- Giulio Ruffini
- Brain Modeling Department, Neuroelectrics, Av. Tibidabo, 47b, 08035, Barcelona, Spain.
21
Ferrara A, Angulo-Garcia D, Torcini A, Olmi S. Population spiking and bursting in next-generation neural masses with spike-frequency adaptation. Phys Rev E 2023; 107:024311. [PMID: 36932567 DOI: 10.1103/physreve.107.024311]
Abstract
Spike-frequency adaptation (SFA) is a fundamental neuronal mechanism taking into account the fatigue due to spike emissions and the consequent reduction of the firing activity. We have studied the effect of this adaptation mechanism on the macroscopic dynamics of excitatory and inhibitory networks of quadratic integrate-and-fire (QIF) neurons coupled via exponentially decaying post-synaptic potentials. In particular, we have studied the population activities by employing an exact mean-field reduction, which gives rise to next-generation neural mass models. This low-dimensional reduction allows for the derivation of bifurcation diagrams and the identification of the possible macroscopic regimes emerging both in a single and in two identically coupled neural masses. In single populations SFA favors the emergence of population bursts in excitatory networks, while it hinders tonic population spiking for inhibitory ones. The symmetric coupling of two neural masses, in absence of adaptation, leads to the emergence of macroscopic solutions with broken symmetry, namely, chimera-like solutions in the inhibitory case and antiphase population spikes in the excitatory one. The addition of SFA leads to new collective dynamical regimes exhibiting cross-frequency coupling (CFC) among the fast synaptic timescale and the slow adaptation one, ranging from antiphase slow-fast nested oscillations to symmetric and asymmetric bursting phenomena. The analysis of these CFC rhythms in the θ-γ range has revealed that a reduction of SFA leads to an increase of the θ frequency joined to a decrease of the γ one. This is analogous to what has been reported experimentally for the hippocampus and the olfactory cortex of rodents under cholinergic modulation, which is known to reduce SFA.
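A common way to add the adaptation mechanism described above to a next-generation neural mass (a hedged sketch; the paper's derivation fixes the precise mesoscopic form) is to subtract a slow adaptation variable a from the population input:

\[
\tau\dot{r} = \frac{\Delta}{\pi\tau} + 2rv, \qquad
\tau\dot{v} = v^{2} + \eta_{0} - a + J\tau s - (\pi\tau r)^{2}, \qquad
\tau_{a}\dot{a} = -a + \beta r,
\]

with s the exponentially filtered synaptic activity, β the adaptation strength, and τ_a much larger than τ; the slow build-up and decay of a is what converts tonic population spiking into the bursting and slow-fast nested oscillations discussed above.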
Affiliation(s)
- Alberto Ferrara
- Sorbonne Université, INSERM, CNRS, Institut de la Vision, 75012 Paris, France
- David Angulo-Garcia
- Departamento de Matemáticas y Estadística, Universidad Nacional de Colombia (UNAL), Cra 27 No. 64-60, 170003, Manizales, Colombia
- Alessandro Torcini
- Laboratoire de Physique Théorique et Modélisation, UMR 8089, CY Cergy Paris Université, CNRS, 95302 Cergy-Pontoise, France
- CNR, Consiglio Nazionale delle Ricerche, Istituto dei Sistemi Complessi, via Madonna del Piano 10, 50019 Sesto Fiorentino, Italy
- INFN, Sezione di Firenze, via Sansone 1, 50019 Sesto Fiorentino, Italy
- Simona Olmi
- CNR, Consiglio Nazionale delle Ricerche, Istituto dei Sistemi Complessi, via Madonna del Piano 10, 50019 Sesto Fiorentino, Italy
- INFN, Sezione di Firenze, via Sansone 1, 50019 Sesto Fiorentino, Italy
|
22
|
Alexandersen CG, de Haan W, Bick C, Goriely A. A multi-scale model explains oscillatory slowing and neuronal hyperactivity in Alzheimer's disease. J R Soc Interface 2023; 20:20220607. [PMID: 36596460 PMCID: PMC9810432 DOI: 10.1098/rsif.2022.0607] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 01/05/2023] Open
Abstract
Alzheimer's disease is the most common cause of dementia and is linked to the spreading of pathological amyloid-β and tau proteins throughout the brain. Recent studies have highlighted stark differences in how amyloid-β and tau affect neurons at the cellular scale. On a larger scale, Alzheimer's patients are observed to undergo a period of early-stage neuronal hyperactivation followed by neurodegeneration and frequency slowing of neuronal oscillations. Herein, we model the spreading of both amyloid-β and tau across a human connectome and investigate how the neuronal dynamics are affected by disease progression. By including the effects of both amyloid-β and tau pathology, we find that our model explains AD-related frequency slowing, early-stage hyperactivation and late-stage hypoactivation. By testing different hypotheses, we show that hyperactivation and frequency slowing are not due to the topological interactions between different regions but are mostly the result of local neurotoxicity induced by amyloid-β and tau protein.
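The spreading part of such multi-scale models can be written as a reaction-diffusion process on the structural connectome. The sketch below propagates a single pathological species with Fisher-KPP kinetics on a graph Laplacian; it is a toy on a random surrogate network, not the coupled amyloid-beta/tau dynamics or the empirical connectome used in the study.

    import numpy as np

    # Fisher-KPP-type spread of a pathological protein concentration p on a graph.
    # Single-species toy on a random surrogate network (assumption), not the
    # paper's coupled amyloid-beta/tau model on a human connectome.
    rng = np.random.default_rng(0)
    N = 40
    W = rng.random((N, N))
    W = 0.5 * (W + W.T)
    np.fill_diagonal(W, 0.0)
    L = np.diag(W.sum(axis=1)) - W              # graph Laplacian

    rho, alpha = 0.05, 1.0                      # transport and growth rates (assumed)
    p = np.zeros(N)
    p[0] = 0.1                                  # seed pathology in one region
    dt = 0.01
    for _ in range(5000):
        p = p + dt * (-rho * (L @ p) + alpha * p * (1.0 - p))
    print("regions above 50% pathology:", int(np.sum(p > 0.5)), "of", N)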
Affiliation(s)
- Willem de Haan
- Alzheimer Center Amsterdam, Department of Neurology, Amsterdam Neuroscience, Vrije Universiteit Amsterdam, Amsterdam UMC, Amsterdam, The Netherlands
- Christian Bick
- Mathematical Institute, University of Oxford, Oxford, UK
- Department of Mathematics, Vrije Universiteit Amsterdam, Amsterdam, The Netherlands
- Amsterdam Neuroscience—Systems and Network Neuroscience, Amsterdam, The Netherlands
- Alain Goriely
- Mathematical Institute, University of Oxford, Oxford, UK
|
23
|
Klinshov VV, Kirillov SY. Shot noise in next-generation neural mass models for finite-size networks. Phys Rev E 2022; 106:L062302. [PMID: 36671128 DOI: 10.1103/physreve.106.l062302] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/26/2022] [Accepted: 12/12/2022] [Indexed: 06/17/2023]
Abstract
Neural mass models are a family of models describing the collective dynamics of large neural populations in terms of averaged macroscopic variables. Recently, the so-called next-generation neural mass models have attracted a lot of attention due to their ability to account for the degree of synchrony. Being exact in the limit of an infinitely large number of neurons, these models provide only an approximate description of finite-size networks. In the present Letter we study finite-size effects in the collective behavior of neural networks and prove that these effects can be captured by appropriately modified neural mass models. Namely, we show that the finite size of the network leads to the emergence of so-called shot noise, which appears as a stochastic term in the neural mass model. The power spectrum of this shot noise contains pronounced peaks; therefore, its impact on the collective dynamics might be crucial due to resonance effects.
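The finite-size effect formalized in this Letter can be previewed with elementary statistics: the population rate estimated over a finite bin fluctuates around the mean-field rate with a standard deviation that shrinks as one over the square root of N. The snippet below is only this generic illustration (all numbers assumed); it is not the modified neural mass model derived in the paper.

    import numpy as np

    # Shot-noise-like fluctuations of an empirical population-rate estimate:
    # with N neurons firing at rate r (Hz), the spike count in a bin of width dt
    # is roughly Poisson, so the rate estimate has std ~ sqrt(r / (N * dt)).
    rng = np.random.default_rng(1)
    r, dt = 10.0, 1e-3
    for N in (100, 1000, 10000):
        counts = rng.poisson(N * r * dt, size=100000)
        r_hat = counts / (N * dt)
        print(N, round(float(r_hat.std()), 3), round(float(np.sqrt(r / (N * dt))), 3))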
Affiliation(s)
- Vladimir V Klinshov
- Institute of Applied Physics of the Russian Academy of Sciences, 46 Ul'yanov Street, Nizhny Novgorod 603950, Russia and Faculty of Informatics, Mathematics, and Computer Science, National Research University Higher School of Economics, 25/12 Bol'shaya Pecherskaya Street, Nizhny Novgorod 603155, Russia
- Sergey Yu Kirillov
- Institute of Applied Physics of the Russian Academy of Sciences, 46 Ul'yanov Street, Nizhny Novgorod 603950, Russia
|
24
|
Reaction-diffusion models in weighted and directed connectomes. PLoS Comput Biol 2022; 18:e1010507. [DOI: 10.1371/journal.pcbi.1010507] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/03/2022] [Revised: 11/23/2022] [Accepted: 08/22/2022] [Indexed: 11/07/2022] Open
Abstract
Connectomes represent comprehensive descriptions of neural connections in a nervous system to better understand and model central brain function and peripheral processing of afferent and efferent neural signals. Connectomes can be considered as a distinctive and necessary structural component alongside glial, vascular, neurochemical, and metabolic networks of the nervous systems of higher organisms that are required for the control of body functions and interaction with the environment. They are carriers of functional epiphenomena such as planning behavior and cognition, which are based on the processing of highly dynamic neural signaling patterns. In this study, we examine more detailed connectomes with edge weighting and orientation properties, in which reciprocal neuronal connections are also considered. Diffusion processes are a further necessary condition for generating dynamic bioelectric patterns in connectomes. Based on our high-precision connectome data, we investigate different diffusion-reaction models to study the propagation of dynamic concentration patterns in control and lesioned connectomes. Therefore, differential equations for modeling diffusion were combined with well-known reaction terms to allow the use of connection weights, connectivity orientation and spatial distances.
Three reaction-diffusion systems (Gray-Scott, Gierer-Meinhardt and Mimura-Murray) were investigated. For this purpose, implicit solvers were implemented in a numerically stable reaction-diffusion framework within neuroVIISAS. The implemented reaction-diffusion systems were applied to a subconnectome that shapes the mechanosensitive pathway, which is strongly affected by demyelination in multiple sclerosis. It was found that modeling demyelination through modulation of connectivity weights changes the oscillations of the target region of the mechanosensitive pathway, i.e., the primary somatosensory cortex.
In conclusion, a new application of reaction-diffusion systems to weighted and directed connectomes has been realized. Because the implementation was carried out in the neuroVIISAS framework, many possibilities are now available for studying dynamic reaction-diffusion processes in empirical connectomes as well as in specific randomized network models.
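To make the graph-based formulation concrete, the sketch below runs the Gray-Scott system on a weighted, undirected surrogate graph, replacing the continuum Laplacian with the graph Laplacian built from the connection weights. The study above additionally handles directed weights, spatial distances and implicit solvers, none of which are reproduced here; the parameter values are the classic Gray-Scott constants, not values from the paper.

    import numpy as np

    # Gray-Scott reaction-diffusion on a weighted graph (explicit Euler sketch).
    # W is a random surrogate for a weighted connectome.
    rng = np.random.default_rng(2)
    N = 60
    W = rng.random((N, N)) * (rng.random((N, N)) < 0.1)
    W = 0.5 * (W + W.T)
    np.fill_diagonal(W, 0.0)
    L = np.diag(W.sum(axis=1)) - W

    Du, Dv, F, k = 0.16, 0.08, 0.035, 0.065     # classic Gray-Scott constants
    u = np.ones(N)
    v = np.zeros(N)
    u[:5], v[:5] = 0.5, 0.25                    # local perturbation of the homogeneous state
    dt = 0.5
    for _ in range(20000):
        uvv = u * v * v
        u += dt * (-Du * (L @ u) - uvv + F * (1.0 - u))
        v += dt * (-Dv * (L @ v) + uvv - (F + k) * v)
    print("spread of v across nodes:", round(float(v.max() - v.min()), 4))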
|
25
|
Crofts JJ, Forrester M, Coombes S, O'Dea RD. Structure-function clustering in weighted brain networks. Sci Rep 2022; 12:16793. [PMID: 36202837 PMCID: PMC9537289 DOI: 10.1038/s41598-022-19994-9] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/12/2022] [Accepted: 09/07/2022] [Indexed: 11/09/2022] Open
Abstract
Functional networks, which typically describe patterns of activity taking place across the cerebral cortex, are widely studied in neuroscience. The dynamical features of these networks, and in particular their deviation from the relatively static structural network, are thought to be key to higher brain function. The interactions between such structural networks and emergent function, and the multimodal neuroimaging approaches and common analysis according to frequency band motivate a multilayer network approach. However, many such investigations rely on arbitrary threshold choices that convert dense, weighted networks to sparse, binary structures. Here, we generalise a measure of multiplex clustering to describe weighted multiplexes with arbitrarily-many layers. Moreover, we extend a recently-developed measure of structure-function clustering (that describes the disparity between anatomical connectivity and functional networks) to the weighted case. To demonstrate its utility we combine human connectome data with simulated neural activity and bifurcation analysis. Our results indicate that this new measure can extract neurologically relevant features not readily apparent in analogous single-layer analyses. In particular, we are able to deduce dynamical regimes under which multistable patterns of neural activity emerge. Importantly, these findings suggest a role for brain operation just beyond criticality to promote cognitive flexibility.
Affiliation(s)
- Jonathan J Crofts
- Department of Physics and Mathematics, Nottingham Trent University, Nottingham, NG11 8NS, UK
- Michael Forrester
- School of Mathematical Sciences, University of Nottingham, Nottingham, NG7 2RD, UK.
- Stephen Coombes
- School of Mathematical Sciences, University of Nottingham, Nottingham, NG7 2RD, UK
- Reuben D O'Dea
- School of Mathematical Sciences, University of Nottingham, Nottingham, NG7 2RD, UK
|
26
|
John YJ, Sawyer KS, Srinivasan K, Müller EJ, Munn BR, Shine JM. It's about time: Linking dynamical systems with human neuroimaging to understand the brain. Netw Neurosci 2022; 6:960-979. [PMID: 36875012 PMCID: PMC9976648 DOI: 10.1162/netn_a_00230] [Citation(s) in RCA: 7] [Impact Index Per Article: 2.3] [Reference Citation Analysis] [Abstract] [Key Words] [Grants] [Track Full Text] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/30/2021] [Accepted: 01/04/2022] [Indexed: 11/04/2022] Open
Abstract
Most human neuroscience research to date has focused on statistical approaches that describe stationary patterns of localized neural activity or blood flow. While these patterns are often interpreted in light of dynamic, information-processing concepts, the static, local, and inferential nature of the statistical approach makes it challenging to directly link neuroimaging results to plausible underlying neural mechanisms. Here, we argue that dynamical systems theory provides the crucial mechanistic framework for characterizing both the brain's time-varying quality and its partial stability in the face of perturbations, and hence, that this perspective can have a profound impact on the interpretation of human neuroimaging results and their relationship with behavior. After briefly reviewing some key terminology, we identify three key ways in which neuroimaging analyses can embrace a dynamical systems perspective: by shifting from a local to a more global perspective, by focusing on dynamics instead of static snapshots of neural activity, and by embracing modeling approaches that map neural dynamics using "forward" models. Through this approach, we envisage ample opportunities for neuroimaging researchers to enrich their understanding of the dynamic neural mechanisms that support a wide array of brain functions, both in health and in the setting of psychopathology.
Affiliation(s)
- Yohan J. John
- Neural Systems Laboratory, Department of Health Sciences, Boston University, Boston, MA, USA
- Kayle S. Sawyer
- Departments of Anatomy and Neurobiology, Boston University, Boston, MA, USA
- Department of Radiology, Massachusetts General Hospital, Boston, MA, USA
- Boston VA Healthcare System, Boston, MA, USA
- Sawyer Scientific, LLC, Boston, MA, USA
- Karthik Srinivasan
- McGovern Institute for Brain Research, Massachusetts Institute of Technology, Cambridge, MA, USA
- Eli J. Müller
- Brain and Mind Center, University of Sydney, Sydney, NSW, Australia
- Brandon R. Munn
- Brain and Mind Center, University of Sydney, Sydney, NSW, Australia
- James M. Shine
- Brain and Mind Center, University of Sydney, Sydney, NSW, Australia
|
27
|
Raj A, Verma P, Nagarajan S. Structure-function models of temporal, spatial, and spectral characteristics of non-invasive whole brain functional imaging. Front Neurosci 2022; 16:959557. [PMID: 36110093 PMCID: PMC9468900 DOI: 10.3389/fnins.2022.959557] [Citation(s) in RCA: 9] [Impact Index Per Article: 3.0] [Reference Citation Analysis] [Abstract] [Key Words] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/01/2022] [Accepted: 07/27/2022] [Indexed: 11/29/2022] Open
Abstract
We review recent advances in using mathematical models of the relationship between the brain structure and function that capture features of brain dynamics. We argue the need for models that can jointly capture temporal, spatial, and spectral features of brain functional activity. We present recent work on spectral graph theory based models that can accurately capture spectral as well as spatial patterns across multiple frequencies in MEG reconstructions.
Affiliation(s)
- Ashish Raj
- Department of Radiology and Biomedical Imaging, University of California, San Francisco, San Francisco, CA, United States
|
28
|
Laing CR. Chimeras on annuli. CHAOS (WOODBURY, N.Y.) 2022; 32:083105. [PMID: 36049938 DOI: 10.1063/5.0103669] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Received: 06/16/2022] [Accepted: 07/08/2022] [Indexed: 06/15/2023]
Abstract
Chimeras occur in networks of coupled oscillators and are characterized by the coexistence of synchronous and asynchronous groups of oscillators in different parts of the network. We consider a network of nonlocally coupled phase oscillators on an annular domain. The Ott/Antonsen ansatz is used to derive a continuum level description of the oscillators' expected dynamics in terms of a complex-valued order parameter. The equations for this order parameter are numerically analyzed in order to investigate solutions with the same symmetry as the domain and chimeras which are analogous to the "multi-headed" chimeras observed on one-dimensional domains. Such solutions are stable only for domains with widths that are neither too large nor too small. We also study rotating waves with different winding numbers, which are similar to spiral wave chimeras seen in two-dimensional domains. We determine ranges of parameters, such as the size of the domain for which such solutions exist and are stable, and the bifurcations by which they lose stability. All of these bifurcations appear subcritical.
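For reference, the Ott/Antonsen ansatz used here closes the dynamics of a nonlocally coupled phase-oscillator field in terms of a local complex order parameter z(x,t). In a commonly used form (conventions for the phase lag \alpha and the heterogeneity \Delta vary between papers, so this is a schematic statement rather than the paper's exact equations):

    \[
    \partial_t z = (i\omega - \Delta)\, z + \tfrac{1}{2}\left( e^{-i\alpha} Z - e^{i\alpha} \bar Z\, z^2 \right),
    \qquad Z(x,t) = \int G(x - y)\, z(y,t)\, \mathrm{d}y ,
    \]

where G is the nonlocal coupling kernel, here defined on the annular domain. Chimera states correspond to regions where |z| is close to 1 (coherent) coexisting with regions where |z| < 1 (incoherent).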
Affiliation(s)
- Carlo R Laing
- School of Mathematical and Computational Sciences, Massey University, Private Bag 102-904, North Shore Mail Centre, Auckland, New Zealand
|
29
|
Exact mean-field models for spiking neural networks with adaptation. J Comput Neurosci 2022; 50:445-469. [PMID: 35834100 DOI: 10.1007/s10827-022-00825-9] [Citation(s) in RCA: 10] [Impact Index Per Article: 3.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/22/2022] [Accepted: 06/15/2022] [Indexed: 10/17/2022]
Abstract
Networks of spiking neurons with adaptation have been shown to be able to reproduce a wide range of neural activities, including the emergent population bursting and spike synchrony that underpin brain disorders and normal function. Exact mean-field models derived from spiking neural networks are extremely valuable, as such models can be used to determine how individual neurons and the network they reside within interact to produce macroscopic network behaviours. In this paper, we derive and analyze a set of exact mean-field equations for a neural network with spike-frequency adaptation. Specifically, our model is a network of Izhikevich neurons, where each neuron is modeled by a two-dimensional system consisting of a quadratic integrate-and-fire equation plus an equation that implements spike-frequency adaptation. Previous work deriving a mean-field model for this type of network relied on the assumption of sufficiently slow dynamics of the adaptation variable. However, this approximation did not succeed in establishing an exact correspondence between the macroscopic description and the realistic neural network, especially when the adaptation time constant was not large. The challenge lies in how to achieve a closed set of mean-field equations that includes the mean-field dynamics of the adaptation variable. We address this problem by using a Lorentzian ansatz combined with a moment-closure approach to arrive at a mean-field system in the thermodynamic limit. The resulting macroscopic description is capable of qualitatively and quantitatively describing the collective dynamics of the neural network, including the transition between states where the individual neurons exhibit asynchronous tonic firing and synchronous bursting. We extend the approach to a network of two populations of neurons and discuss the accuracy and efficacy of our mean-field approximations by examining all assumptions that are imposed during the derivation. Numerical bifurcation analysis of our mean-field models reveals bifurcations not previously observed in the models, including a novel mechanism for the emergence of bursting in the network. We anticipate our results will provide a tractable and reliable tool to investigate the underlying mechanisms of brain function and dysfunction from the perspective of computational neuroscience.
|
30
|
Sasi S, Sen Bhattacharya B. In silico Effects of Synaptic Connections in the Visual Thalamocortical Pathway. FRONTIERS IN MEDICAL TECHNOLOGY 2022; 4:856412. [PMID: 35450154 PMCID: PMC9016146 DOI: 10.3389/fmedt.2022.856412] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/17/2022] [Accepted: 03/08/2022] [Indexed: 12/23/2022] Open
Abstract
We have studied brain connectivity using a biologically inspired in silico model of the visual pathway consisting of the lateral geniculate nucleus (LGN) of the thalamus, and layers 4 and 6 of the primary visual cortex. The connectivity parameters in the model are informed by the existing anatomical parameters from mammals and rodents. In the base state, the LGN and layer 6 populations in the model oscillate with dominant alpha frequency, while the layer 4 oscillates in the theta band. By changing intra-cortical hyperparameters, specifically inhibition from layer 6 to layer 4, we demonstrate a transition to alpha mode for all the populations. Furthermore, by increasing the feedforward connectivities in the thalamo-cortico-thalamic loop, we could transition into the beta band for all the populations. On looking closely, we observed that the origin of this beta band is in the layer 6 (infragranular layers); lesioning the thalamic feedback from layer 6 removed the beta from the LGN and the layer 4. This agrees with existing physiological studies where it is shown that beta rhythm is generated in the infragranular layers. Lastly, we present a case study to demonstrate a neurological condition in the model. By changing connectivities in the network, we could simulate the condition of significant (P < 0.001) decrease in beta band power and a simultaneous increase in the theta band power, similar to that observed in Schizophrenia patients. Overall, we have shown that the connectivity changes in a simple visual thalamocortical in silico model can simulate state changes in the brain corresponding to both health and disease conditions.
|
31
|
Powanwe AS, Longtin A. Mutual information resonances in delay-coupled limit cycle and quasi-cycle brain rhythms. BIOLOGICAL CYBERNETICS 2022; 116:129-146. [PMID: 35486195 DOI: 10.1007/s00422-022-00932-x] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 06/14/2023]
Abstract
We elucidate how coupling delays and noise impact phase and mutual information relationships between two stochastic brain rhythms. This impact depends on the dynamical regime of each PING-based rhythm, as well as on network heterogeneity and coupling asymmetry. The number of peaks at positive and negative time lags in the delayed mutual information between the two bi-directionally communicating rhythms defines our measure of flexibility of information sharing and reflects the number of ways in which the two networks can alternately lead one another. We identify two distinct mechanisms for the appearance of qualitatively similar flexible information sharing. The flexibility in the quasi-cycle regime arises from the coupling delay-induced bimodality of the phase difference distribution, and the related bimodal mutual information. It persists in the presence of asymmetric coupling and heterogeneity but is limited to two routes of information sharing. The second mechanism in noisy limit cycle regime is not induced by the delay. However, delay-coupling and heterogeneity enable communication routes at multiple time lags. Noise disrupts the shared compromise frequency, allowing the expression of individual network frequencies which leads to a slow beating pattern. Simulations of an envelope-phase description for delay-coupled quasi-cycles yield qualitatively similar properties as for the full system. Near the bifurcation from in-phase to out-of-phase behaviour, a single preferred phase difference can coexist with two information sharing routes; further, the phase laggard can be the mutual information leader, or vice versa. Overall, the coupling delay endows a two-rhythm system with an array of lead-lag relationships and mutual information resonances that exist in spite of the noise and across the Hopf bifurcation. These beg to be mapped out experimentally with the help of our predictions.
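The flexibility measure used above (the number of local maxima in the delayed mutual information curve) can be prototyped with a simple binned estimator. The functions below are a hedged sketch: the histogram estimator, bin count and synthetic signals are assumptions, not the authors' pipeline, and in practice the MI curve would be smoothed before counting maxima.

    import numpy as np

    # Delayed mutual information between two phase signals via 2-D histograms.
    def mutual_info(x, y, bins=16):
        pxy, _, _ = np.histogram2d(x, y, bins=bins)
        pxy = pxy / pxy.sum()
        px, py = pxy.sum(axis=1), pxy.sum(axis=0)
        nz = pxy > 0
        return float(np.sum(pxy[nz] * np.log(pxy[nz] / np.outer(px, py)[nz])))

    def delayed_mi(phi1, phi2, max_lag):
        lags = np.arange(-max_lag, max_lag + 1)
        n = len(phi1)
        mi = [mutual_info(phi1[max_lag + l:n - max_lag + l], phi2[max_lag:n - max_lag])
              for l in lags]
        return lags, np.array(mi)

    # Synthetic example: two noisy 40 Hz rhythms with a fixed mutual delay.
    rng = np.random.default_rng(4)
    t = np.arange(0.0, 20.0, 1e-3)
    phase = 2 * np.pi * 40 * t + 0.3 * np.cumsum(rng.standard_normal(t.size)) * np.sqrt(1e-3)
    phi1 = np.mod(phase, 2 * np.pi)
    phi2 = np.mod(np.roll(phase, 8) + 0.1 * rng.standard_normal(t.size), 2 * np.pi)
    lags, mi = delayed_mi(phi1, phi2, max_lag=50)
    peaks = int(np.sum((mi[1:-1] > mi[:-2]) & (mi[1:-1] > mi[2:])))
    print("local maxima in the delayed MI curve:", peaks)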
Affiliation(s)
- Arthur S Powanwe
- Department of Physics, University of Ottawa, 150 Louis Pasteur, Ottawa, ON, K1N6N5, Canada.
- Centre for Neural Dynamics, University of Ottawa, Ottawa, Canada.
- André Longtin
- Department of Physics, University of Ottawa, 150 Louis Pasteur, Ottawa, ON, K1N6N5, Canada
- Department of Cellular and Molecular Medicine, 451 Smyth Road, Ottawa, ON, K1H8M5, Canada
- Centre for Neural Dynamics, University of Ottawa, Ottawa, Canada
|
32
|
Ursino M, Ricci G, Astolfi L, Pichiorri F, Petti M, Magosso E. A Novel Method to Assess Motor Cortex Connectivity and Event Related Desynchronization Based on Mass Models. Brain Sci 2021; 11:brainsci11111479. [PMID: 34827478 PMCID: PMC8615480 DOI: 10.3390/brainsci11111479] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/07/2021] [Revised: 10/28/2021] [Accepted: 11/03/2021] [Indexed: 11/16/2022] Open
Abstract
Knowledge of motor cortex connectivity is of great value in cognitive neuroscience, in order to provide a better understanding of motor organization and its alterations in pathological conditions. Traditional methods provide connectivity estimations which may vary depending on the task. This work aims to propose a new method for motor connectivity assessment based on the hypothesis of a task-independent connectivity network, assuming nonlinear behavior. The model considers six cortical regions of interest (ROIs) involved in hand movement. The dynamics of each region is simulated using a neural mass model, which reproduces the oscillatory activity through the interaction among four neural populations. Parameters of the model have been assigned to simulate both power spectral densities and coherences of a patient with left-hemisphere stroke during resting condition, movement of the affected, and movement of the unaffected hand. The presented model can simulate the three conditions using a single set of connectivity parameters, assuming that only inputs to the ROIs change from one condition to the other. The proposed procedure represents an innovative method to assess a brain circuit, which does not rely on a task-dependent connectivity network and allows brain rhythms and desynchronization to be assessed on a quantitative basis.
Affiliation(s)
- Mauro Ursino
- Department of Electrical, Electronic and Information Engineering Guglielmo Marconi, Campus of Cesena, University of Bologna, Via Dell’Università 50, 47521 Cesena, Italy; (G.R.); (E.M.)
- Correspondence:
- Giulia Ricci
- Department of Electrical, Electronic and Information Engineering Guglielmo Marconi, Campus of Cesena, University of Bologna, Via Dell’Università 50, 47521 Cesena, Italy; (G.R.); (E.M.)
- Laura Astolfi
- Department of Computer, Control and Management Engineering, Sapienza University of Rome, Via Ariosto, 25, 00185 Roma, Italy; (L.A.); (M.P.)
- Fondazione Santa Lucia, IRCCS Via Ardeatina 306/354, 00179 Roma, Italy;
- Manuela Petti
- Department of Computer, Control and Management Engineering, Sapienza University of Rome, Via Ariosto, 25, 00185 Roma, Italy; (L.A.); (M.P.)
- Fondazione Santa Lucia, IRCCS Via Ardeatina 306/354, 00179 Roma, Italy;
- Elisa Magosso
- Department of Electrical, Electronic and Information Engineering Guglielmo Marconi, Campus of Cesena, University of Bologna, Via Dell’Università 50, 47521 Cesena, Italy; (G.R.); (E.M.)
|
33
|
Gerster M, Taher H, Škoch A, Hlinka J, Guye M, Bartolomei F, Jirsa V, Zakharova A, Olmi S. Patient-Specific Network Connectivity Combined With a Next Generation Neural Mass Model to Test Clinical Hypothesis of Seizure Propagation. Front Syst Neurosci 2021; 15:675272. [PMID: 34539355 PMCID: PMC8440880 DOI: 10.3389/fnsys.2021.675272] [Citation(s) in RCA: 13] [Impact Index Per Article: 3.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/02/2021] [Accepted: 07/07/2021] [Indexed: 11/13/2022] Open
Abstract
Dynamics underlying epileptic seizures span multiple scales in space and time, therefore, understanding seizure mechanisms requires identifying the relations between seizure components within and across these scales, together with the analysis of their dynamical repertoire. In this view, mathematical models have been developed, ranging from single neuron to neural population. In this study, we consider a neural mass model able to exactly reproduce the dynamics of heterogeneous spiking neural networks. We combine mathematical modeling with structural information from non invasive brain imaging, thus building large-scale brain network models to explore emergent dynamics and test the clinical hypothesis. We provide a comprehensive study on the effect of external drives on neuronal networks exhibiting multistability, in order to investigate the role played by the neuroanatomical connectivity matrices in shaping the emergent dynamics. In particular, we systematically investigate the conditions under which the network displays a transition from a low activity regime to a high activity state, which we identify with a seizure-like event. This approach allows us to study the biophysical parameters and variables leading to multiple recruitment events at the network level. We further exploit topological network measures in order to explain the differences and the analogies among the subjects and their brain regions, in showing recruitment events at different parameter values. We demonstrate, along with the example of diffusion-weighted magnetic resonance imaging (dMRI) connectomes of 20 healthy subjects and 15 epileptic patients, that individual variations in structural connectivity, when linked with mathematical dynamic models, have the capacity to explain changes in spatiotemporal organization of brain dynamics, as observed in network-based brain disorders. In particular, for epileptic patients, by means of the integration of the clinical hypotheses on the epileptogenic zone (EZ), i.e., the local network where highly synchronous seizures originate, we have identified the sequence of recruitment events and discussed their links with the topological properties of the specific connectomes. The predictions made on the basis of the implemented set of exact mean-field equations turn out to be in line with the clinical pre-surgical evaluation on recruited secondary networks.
Affiliation(s)
- Moritz Gerster
- Institut für Theoretische Physik, Technische Universität Berlin, Berlin, Germany
- Halgurd Taher
- Inria Sophia Antipolis Méditerranée Research Centre, MathNeuro Team, Valbonne, France
- Antonín Škoch
- National Institute of Mental Health, Klecany, Czechia
- MR Unit, Department of Diagnostic and Interventional Radiology, Institute for Clinical and Experimental Medicine, Prague, Czechia
- Jaroslav Hlinka
- National Institute of Mental Health, Klecany, Czechia
- Institute of Computer Science of the Czech Academy of Sciences, Prague, Czechia
- Maxime Guye
- Faculté de Médecine de la Timone, Centre de Résonance Magnétique et Biologique et Médicale (CRMBM, UMR CNRS-AMU 7339), Medical School of Marseille, Aix-Marseille Université, Marseille, France
- Assistance Publique -Hôpitaux de Marseille, Hôpital de la Timone, Pôle d'Imagerie, Marseille, France
- Fabrice Bartolomei
- Assistance Publique - Hôpitaux de Marseille, Hôpital de la Timone, Service de Neurophysiologie Clinique, Marseille, France
- Viktor Jirsa
- Aix Marseille Université, Inserm, Institut de Neurosciences des Systèmes, UMRS 1106, Marseille, France
- Anna Zakharova
- Institut für Theoretische Physik, Technische Universität Berlin, Berlin, Germany
- Simona Olmi
- Inria Sophia Antipolis Méditerranée Research Centre, MathNeuro Team, Valbonne, France
- Consiglio Nazionale delle Ricerche, Istituto dei Sistemi Complessi, Sesto Fiorentino, Italy
|
34
|
Mechanisms of Flexible Information Sharing through Noisy Oscillations. BIOLOGY 2021; 10:biology10080764. [PMID: 34439996 PMCID: PMC8389573 DOI: 10.3390/biology10080764] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Subscribe] [Scholar Register] [Received: 06/18/2021] [Revised: 07/11/2021] [Accepted: 07/31/2021] [Indexed: 01/08/2023]
Abstract
Simple Summary
To properly interact with our environment, the brain must be able to identify external stimuli, process them, and make the right decisions all in a short time. This may involve several brain regions interacting together by sharing information bidirectionally via rhythmic activity. Such flexibility requires the functional connectivity between the areas to be dynamic, and a key question concerns the relevant parameters and operating regimes that make this possible in spite of fixed structural connectivity. Working towards this goal, we consider two coupled brain regions, each of which exhibits a noisy rhythm, a commonly observed type of neural activity. Such rhythms can be induced by stochasticity in the neural circuitry, or be autonomously generated through nonlinearities without requiring noise. For these two types of rhythms, we computed the amount of information shared between the brain areas and the preferred direction(s) of sharing. We found that without the coupling delay, the flexibility needed by the brain to perform cognitive tasks requires the rhythms to be autogenerated rather than noise-induced. This is the case even with asymmetry or heterogeneity. This suggests that the importance of the dynamical regime has to be taken into account when modeling interacting neural rhythms from an information-theoretical point of view.
Abstract
Brain areas must be able to interact and share information in a time-varying, dynamic manner on a fast timescale. Such flexibility in information sharing has been linked to the synchronization of rhythm phases between areas. One definition of flexibility is the number of local maxima in the delayed mutual information curve between two connected areas. However, the precise relationship between phase synchronization and information sharing is not clear, nor is the flexibility in the face of fixed structural connectivity and noise. Here, we consider two coupled oscillatory excitatory-inhibitory networks connected through zero-delay excitatory connections, each of which mimics a rhythmic brain area. We numerically compute phase-locking and the delayed mutual information between the phases of the excitatory local field potentials (LFPs) of the two networks, which measures the shared information and its direction. The flexibility in information sharing is shown to depend on the dynamical origin of oscillations, and its properties in different regimes are found to persist in the presence of asymmetry in the connectivity as well as system heterogeneity. For coupled noise-induced rhythms (quasi-cycles), phase synchronization is robust even in the presence of asymmetry and heterogeneity. However, they do not show flexibility, in contrast to noise-perturbed rhythms (noisy limit cycles), which are shown here to exhibit two local information maxima, i.e., flexibility. For quasi-cycles, phase difference and information measures for the envelope-phase dynamics obtained from previous analytical work using the Stochastic Averaging Method (SAM) are found to be in good qualitative agreement with those obtained from the original dynamics. The relation between phase synchronization and communication patterns is not trivial, particularly in the noisy limit-cycle regime. There, complex patterns of information sharing can be observed for a single value of the phase difference. The mechanisms reported here can be extended to I-I networks since their phase synchronizations are similar.
Our results set the stage for investigating information sharing between several connected noisy rhythms in neural and other complex biological networks.
|
35
|
Huang CH, Lin CCK. A novel density-based neural mass model for simulating neuronal network dynamics with conductance-based synapses and membrane current adaptation. Neural Netw 2021; 143:183-197. [PMID: 34157643 DOI: 10.1016/j.neunet.2021.06.009] [Citation(s) in RCA: 4] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/04/2020] [Revised: 04/01/2021] [Accepted: 06/06/2021] [Indexed: 10/21/2022]
Abstract
Despite its success in understanding brain rhythms, the neural mass model, as a low-dimensional mean-field network model, is phenomenological in nature, so that it cannot replicate some of the rich repertoire of responses seen in real neuronal tissues. Here, using a colored-synapse population density method, we derived a novel neural mass model, termed the density-based neural mass model (dNMM), as the mean-field description of the network dynamics of adaptive exponential integrate-and-fire (aEIF) neurons, in which two critical neuronal features, i.e., voltage-dependent conductance-based synaptic interactions and adaptation of firing rate responses, were included. Our results showed that the dNMM was capable of correctly estimating the firing rate responses of a population of aEIF neurons receiving stationary or time-varying excitatory and inhibitory inputs. Finally, it was also able to quantitatively describe the effect of spike-frequency adaptation in the generation of asynchronous irregular activity in excitatory-inhibitory cortical networks. We conclude that, in terms of its biological realism and computational efficiency, the dNMM is a suitable candidate for building large-scale network models involving multiple brain areas, in which the neuronal population is the smallest dynamic unit.
Affiliation(s)
- Chih-Hsu Huang
- Department of Neurology, National Cheng Kung University Hospital, College of Medicine, National Cheng Kung University, Tainan, Taiwan
- Chou-Ching K Lin
- Department of Neurology, National Cheng Kung University Hospital, College of Medicine, National Cheng Kung University, Tainan, Taiwan.
|
36
|
Byrne Á, Ross J, Nicks R, Coombes S. Mean-Field Models for EEG/MEG: From Oscillations to Waves. Brain Topogr 2021; 35:36-53. [PMID: 33993357 PMCID: PMC8813727 DOI: 10.1007/s10548-021-00842-4] [Citation(s) in RCA: 13] [Impact Index Per Article: 3.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/10/2020] [Accepted: 04/21/2021] [Indexed: 11/24/2022]
Abstract
Neural mass models have been used since the 1970s to model the coarse-grained activity of large populations of neurons. They have proven especially fruitful for understanding brain rhythms. However, although motivated by neurobiological considerations they are phenomenological in nature, and cannot hope to recreate some of the rich repertoire of responses seen in real neuronal tissue. Here we consider a simple spiking neuron network model that has recently been shown to admit an exact mean-field description for both synaptic and gap-junction interactions. The mean-field model takes a similar form to a standard neural mass model, with an additional dynamical equation to describe the evolution of within-population synchrony. As well as reviewing the origins of this next generation mass model we discuss its extension to describe an idealised spatially extended planar cortex. To emphasise the usefulness of this model for EEG/MEG modelling we show how it can be used to uncover the role of local gap-junction coupling in shaping large scale synaptic waves.
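The within-population synchrony variable mentioned above is the complex Kuramoto order parameter Z, which in these exact mean-field reductions is linked to the population firing rate r and mean voltage v by a conformal map. In dimensionless form, and up to conventions that differ between papers (treat this as schematic rather than as a quotation of the paper):

    \[
    Z = \frac{1 - \bar W}{1 + \bar W}, \qquad W = \pi r + i v ,
    \]

so that |Z| provides the measure of within-population synchrony that a classical neural mass model lacks.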
Affiliation(s)
- Áine Byrne
- School of Mathematics and Statistics, Science Centre, University College Dublin, South Belfield, Dublin 4, Ireland.
- James Ross
- School of Mathematical Sciences, Centre for Mathematical Medicine and Biology, University of Nottingham, Nottingham, NG7 2RD, UK
- Rachel Nicks
- School of Mathematical Sciences, Centre for Mathematical Medicine and Biology, University of Nottingham, Nottingham, NG7 2RD, UK
- Stephen Coombes
- School of Mathematical Sciences, Centre for Mathematical Medicine and Biology, University of Nottingham, Nottingham, NG7 2RD, UK
|
37
|
Duchet B, Weerasinghe G, Bick C, Bogacz R. Optimizing deep brain stimulation based on isostable amplitude in essential tremor patient models. J Neural Eng 2021; 18:046023. [PMID: 33821809 PMCID: PMC7610712 DOI: 10.1088/1741-2552/abd90d] [Citation(s) in RCA: 6] [Impact Index Per Article: 1.5] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 01/11/2023]
Abstract
OBJECTIVE Deep brain stimulation is a treatment for medically refractory essential tremor. To improve the therapy, closed-loop approaches are designed to deliver stimulation according to the system's state, which is constantly monitored by recording a pathological signal associated with symptoms (e.g. brain signal or limb tremor). Since the space of possible closed-loop stimulation strategies is vast and cannot be fully explored experimentally, how to stimulate according to the state should be informed by modeling. A typical modeling goal is to design a stimulation strategy that aims to maximally reduce the Hilbert amplitude of the pathological signal in order to minimize symptoms. Isostables provide a notion of amplitude related to convergence time to the attractor, which can be beneficial in model-based control problems. However, how isostable and Hilbert amplitudes compare when optimizing the amplitude response to stimulation in models constrained by data is unknown. APPROACH We formulate a simple closed-loop stimulation strategy based on models previously fitted to phase-locked deep brain stimulation data from essential tremor patients. We compare the performance of this strategy in suppressing oscillatory power when based on Hilbert amplitude and when based on isostable amplitude. We also compare performance to phase-locked stimulation and open-loop high-frequency stimulation. MAIN RESULTS For our closed-loop phase space stimulation strategy, stimulation based on isostable amplitude is significantly more effective than stimulation based on Hilbert amplitude when amplitude field computation time is limited to minutes. Performance is similar when there are no constraints, however constraints on computation time are expected in clinical applications. Even when computation time is limited to minutes, closed-loop phase space stimulation based on isostable amplitude is advantageous compared to phase-locked stimulation, and is more efficient than high-frequency stimulation. SIGNIFICANCE Our results suggest a potential benefit to using isostable amplitude more broadly for model-based optimization of stimulation in neurological disorders.
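Of the two amplitude notions compared above, the Hilbert amplitude is the one that can be computed without a fitted model. The snippet below extracts it for a synthetic tremor-band signal and applies a naive amplitude-triggered stimulation rule; the band limits, threshold and signal are assumptions, and the isostable amplitude (which requires the patient-fitted model) is not computed here.

    import numpy as np
    from scipy.signal import butter, filtfilt, hilbert

    # Hilbert amplitude envelope of a tremor-band signal and a naive
    # amplitude-triggered stimulation rule (illustrative values only).
    fs = 1000.0
    t = np.arange(0, 10, 1 / fs)
    rng = np.random.default_rng(5)
    x = np.sin(2 * np.pi * 5 * t) * (1 + 0.5 * np.sin(2 * np.pi * 0.2 * t)) \
        + 0.3 * rng.standard_normal(t.size)

    b, a = butter(2, [3 / (fs / 2), 8 / (fs / 2)], btype="band")
    xf = filtfilt(b, a, x)                          # tremor-band component
    amplitude = np.abs(hilbert(xf))                 # Hilbert amplitude
    stim_on = amplitude > np.percentile(amplitude, 75)
    print("fraction of time stimulation would be on:", round(float(stim_on.mean()), 3))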
Affiliation(s)
- Benoit Duchet
- Nuffield Department of Clinical Neurosciences, University of Oxford, Oxford, United Kingdom. MRC Brain Network Dynamics Unit, University of Oxford, Oxford, United Kingdom
|
38
|
Glomb K, Cabral J, Cattani A, Mazzoni A, Raj A, Franceschiello B. Computational Models in Electroencephalography. Brain Topogr 2021; 35:142-161. [PMID: 33779888 PMCID: PMC8813814 DOI: 10.1007/s10548-021-00828-2] [Citation(s) in RCA: 15] [Impact Index Per Article: 3.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/16/2020] [Accepted: 02/19/2021] [Indexed: 12/17/2022]
Abstract
Computational models lie at the intersection of basic neuroscience and healthcare applications because they allow researchers to test hypotheses in silico and predict the outcome of experiments and interactions that are very hard to test in reality. Yet, what is meant by “computational model” is understood in many different ways by researchers in different fields of neuroscience and psychology, hindering communication and collaboration. In this review, we point out the state of the art of computational modeling in Electroencephalography (EEG) and outline how these models can be used to integrate findings from electrophysiology, network-level models, and behavior. On the one hand, computational models serve to investigate the mechanisms that generate brain activity, for example measured with EEG, such as the transient emergence of oscillations at different frequency bands and/or with different spatial topographies. On the other hand, computational models serve to design experiments and test hypotheses in silico. The final purpose of computational models of EEG is to obtain a comprehensive understanding of the mechanisms that underlie the EEG signal. This is crucial for an accurate interpretation of EEG measurements that may ultimately serve in the development of novel clinical applications.
Affiliation(s)
- Katharina Glomb
- Connectomics Lab, Department of Radiology, Lausanne University Hospital and University of Lausanne (CHUV-UNIL), Lausanne, Switzerland.
- Joana Cabral
- Life and Health Sciences Research Institute (ICVS), University of Minho, Braga, Portugal
- Anna Cattani
- Department of Psychiatry, University of Wisconsin-Madison, Madison, USA
- Department of Biomedical and Clinical Sciences 'Luigi Sacco', University of Milan, Milan, Italy
- Alberto Mazzoni
- The BioRobotics Institute, Scuola Superiore Sant'Anna, Pisa, Italy
- Ashish Raj
- School of Medicine, UCSF, San Francisco, USA
- Benedetta Franceschiello
- Department of Ophthalmology, Hopital Ophthalmic Jules Gonin, FAA, Lausanne, Switzerland
- CIBM Centre for Biomedical Imaging, EEG Section CHUV-UNIL, Lausanne, Switzerland
- Laboratory for Investigative Neurophysiology, Department of Radiology, Lausanne University Hospital and University of Lausanne (CHUV-UNIL), Lausanne, Switzerland
|
39
|
Morrison CL, Greenwood PE, Ward LM. Plastic systemic inhibition controls amplitude while allowing phase pattern in a stochastic neural field model. Phys Rev E 2021; 103:032311. [PMID: 33862754 DOI: 10.1103/physreve.103.032311] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/09/2020] [Accepted: 02/19/2021] [Indexed: 11/07/2022]
Abstract
We investigate oscillatory phase pattern formation and amplitude control for a linearized stochastic neuron field model by simulating Mexican-hat-coupled stochastic processes. We find, for several choices of parameters, that spatial pattern formation in the temporal phases of the coupled processes occurs if and only if their amplitudes are allowed to grow unrealistically large. Stimulated by recent work on homeostatic inhibitory plasticity, we introduce static and plastic (adaptive) systemic inhibitory mechanisms to keep the amplitudes stochastically bounded. We find that systems with static inhibition exhibited bounded amplitudes but no sustained phase patterns. With plastic systemic inhibition, on the other hand, the resulting systems exhibit both bounded amplitudes and sustained phase patterns. These results demonstrate that plastic inhibitory mechanisms in neural field models can dynamically control amplitudes while allowing patterns of phase synchronization to develop. Similar mechanisms of plastic systemic inhibition could play a role in regulating oscillatory functioning in the brain.
Affiliation(s)
- Conor L Morrison
- Department of Statistics, University of British Columbia, Vancouver, British Columbia, Canada V6T 1Z4
- Priscilla E Greenwood
- Department of Mathematics, University of British Columbia, Vancouver, British Columbia, Canada V6T 1Z2
- Lawrence M Ward
- Department of Psychology and Djavad Mowafaghian Centre for Brain Health, 2136 West Mall, University of British Columbia, Vancouver, British Columbia, Canada V6T 1Z4
|
40
|
Aqil M, Atasoy S, Kringelbach ML, Hindriks R. Graph neural fields: A framework for spatiotemporal dynamical models on the human connectome. PLoS Comput Biol 2021; 17:e1008310. [PMID: 33507899 PMCID: PMC7872285 DOI: 10.1371/journal.pcbi.1008310] [Citation(s) in RCA: 11] [Impact Index Per Article: 2.8] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/02/2020] [Revised: 02/09/2021] [Accepted: 12/11/2020] [Indexed: 12/22/2022] Open
Abstract
Tools from the field of graph signal processing, in particular the graph Laplacian operator, have recently been successfully applied to the investigation of structure-function relationships in the human brain. The eigenvectors of the human connectome graph Laplacian, dubbed "connectome harmonics", have been shown to relate to the functionally relevant resting-state networks. Whole-brain modelling of brain activity combines structural connectivity with local dynamical models to provide insight into the large-scale functional organization of the human brain. In this study, we employ the graph Laplacian and its properties to define and implement a large class of neural activity models directly on the human connectome. These models, consisting of systems of stochastic integrodifferential equations on graphs, are dubbed graph neural fields, in analogy with the well-established continuous neural fields. We obtain analytic predictions for harmonic and temporal power spectra, as well as functional connectivity and coherence matrices, of graph neural fields, with a technique dubbed CHAOSS (shorthand for Connectome-Harmonic Analysis Of Spatiotemporal Spectra). Combining graph neural fields with appropriate observation models allows for estimating model parameters from experimental data as obtained from electroencephalography (EEG), magnetoencephalography (MEG), or functional magnetic resonance imaging (fMRI). As an example application, we study a stochastic Wilson-Cowan graph neural field model on a high-resolution connectome graph constructed from diffusion tensor imaging (DTI) and structural MRI data. We show that the model equilibrium fluctuations can reproduce the empirically observed harmonic power spectrum of resting-state fMRI data, and predict its functional connectivity, with a high level of detail. Graph neural fields natively allow the inclusion of important features of cortical anatomy and fast computations of observable quantities for comparison with multimodal empirical data. They thus appear particularly suitable for modelling whole-brain activity at mesoscopic scales, and opening new potential avenues for connectome-graph-based investigations of structure-function relationships.
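A minimal entry point to the machinery described above is the eigendecomposition of a connectome graph Laplacian, whose eigenvectors are the connectome harmonics onto which activity is projected. The sketch uses a random surrogate connectivity matrix and a symmetric normalized Laplacian; both choices are assumptions, and the paper's CHAOSS predictions are not reproduced.

    import numpy as np

    # Connectome harmonics: eigenvectors of the normalized graph Laplacian of a
    # (here random surrogate) structural connectivity matrix, plus the harmonic
    # power spectrum of an arbitrary activity pattern.
    rng = np.random.default_rng(6)
    N = 100
    W = rng.random((N, N)) * (rng.random((N, N)) < 0.1)
    W = 0.5 * (W + W.T)
    np.fill_diagonal(W, 0.0)
    d = W.sum(axis=1)
    L = np.eye(N) - W / np.sqrt(np.outer(d, d))     # symmetric normalized Laplacian
    eigenvalues, harmonics = np.linalg.eigh(L)      # columns are connectome harmonics

    activity = rng.standard_normal(N)               # stand-in for an activity snapshot
    harmonic_power = (harmonics.T @ activity) ** 2
    print("first three Laplacian eigenvalues:", np.round(eigenvalues[:3], 3))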
Affiliation(s)
- Marco Aqil
- Department of Mathematics, Vrije Universiteit, Amsterdam, The Netherlands
- Selen Atasoy
- Centre for Eudaimonia and Human Flourishing, University of Oxford, Oxford, United Kingdom
- Center for Music in the Brain, University of Aarhus, Aarhus, Denmark
- Morten L. Kringelbach
- Centre for Eudaimonia and Human Flourishing, University of Oxford, Oxford, United Kingdom
- Center for Music in the Brain, University of Aarhus, Aarhus, Denmark
- Rikkert Hindriks
- Department of Mathematics, Vrije Universiteit, Amsterdam, The Netherlands
|
41
|
Bick C, Goodfellow M, Laing CR, Martens EA. Understanding the dynamics of biological and neural oscillator networks through exact mean-field reductions: a review. JOURNAL OF MATHEMATICAL NEUROSCIENCE 2020; 10:9. [PMID: 32462281 PMCID: PMC7253574 DOI: 10.1186/s13408-020-00086-9] [Citation(s) in RCA: 109] [Impact Index Per Article: 21.8] [Reference Citation Analysis] [Abstract] [Key Words] [Grants] [Track Full Text] [Subscribe] [Scholar Register] [Received: 07/17/2019] [Accepted: 05/07/2020] [Indexed: 05/03/2023]
Abstract
Many biological and neural systems can be seen as networks of interacting periodic processes. Importantly, their functionality, i.e., whether these networks can perform their function or not, depends on the emerging collective dynamics of the network. Synchrony of oscillations is one of the most prominent examples of such collective behavior and has been associated both with function and dysfunction. Understanding how network structure and interactions, as well as the microscopic properties of individual units, shape the emerging collective dynamics is critical to find factors that lead to malfunction. However, many biological systems such as the brain consist of a large number of dynamical units. Hence, their analysis has either relied on simplified heuristic models on a coarse scale, or the analysis comes at a huge computational cost. Here we review recently introduced approaches, known as the Ott-Antonsen and Watanabe-Strogatz reductions, allowing one to simplify the analysis by bridging small and large scales. Thus, reduced model equations are obtained that exactly describe the collective dynamics for each subpopulation in the oscillator network via few collective variables only. The resulting equations are next-generation models: Rather than being heuristic, they exactly link microscopic and macroscopic descriptions and therefore accurately capture microscopic properties of the underlying system. At the same time, they are sufficiently simple to analyze without great computational effort. In the last decade, these reduction methods have become instrumental in understanding how network structure and interactions shape the collective dynamics and the emergence of synchrony. We review this progress based on concrete examples and outline possible limitations. Finally, we discuss how linking the reduced models with experimental data can guide the way towards the development of new treatment approaches, for example, for neurological disease.
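The flavour of these exact reductions is captured by the textbook example: for the Kuramoto model with Lorentzian-distributed natural frequencies (centre \omega_0, half-width \Delta) and coupling K, the Ott-Antonsen manifold yields a closed equation for the complex order parameter z = r e^{i\psi},

    \[
    \dot z = (i\omega_0 - \Delta)\, z + \frac{K}{2}\left( z - \bar z\, z^2 \right)
    \quad\Longrightarrow\quad
    \dot r = -\Delta r + \frac{K}{2}\, r\, (1 - r^2), \qquad \dot\psi = \omega_0 ,
    \]

from which the classical synchronization transition at K_c = 2\Delta can be read off directly; the reductions reviewed above generalize this idea to networks of interacting oscillator populations.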
Affiliation(s)
- Christian Bick
- Centre for Systems, Dynamics, and Control, University of Exeter, Exeter, UK.
- Department of Mathematics, University of Exeter, Exeter, UK.
- EPSRC Centre for Predictive Modelling in Healthcare, University of Exeter, Exeter, UK.
- Mathematical Institute, University of Oxford, Oxford, UK.
- Institute for Advanced Study, Technische Universität München, Garching, Germany.
- Marc Goodfellow
- Department of Mathematics, University of Exeter, Exeter, UK
- EPSRC Centre for Predictive Modelling in Healthcare, University of Exeter, Exeter, UK
- Living Systems Institute, University of Exeter, Exeter, UK
- Wellcome Trust Centre for Biomedical Modelling and Analysis, University of Exeter, Exeter, UK
- Carlo R Laing
- School of Natural and Computational Sciences, Massey University, Auckland, New Zealand
- Erik A Martens
- Department of Applied Mathematics and Computer Science, Technical University of Denmark, Kgs. Lyngby, Denmark.
- Department of Biomedical Science, University of Copenhagen, Copenhagen N, Denmark.
- Centre for Translational Neuroscience, University of Copenhagen, Copenhagen N, Denmark.
|