1. Blackiston D, Dromiack H, Grasso C, Varley TF, Moore DG, Srinivasan KK, Sporns O, Bongard J, Levin M, Walker SI. Revealing non-trivial information structures in aneural biological tissues via functional connectivity. PLoS Comput Biol 2025;21:e1012149. PMID: 40228211; PMCID: PMC11996219; DOI: 10.1371/journal.pcbi.1012149.
Abstract
A central challenge in the progression of a variety of open questions in biology, such as morphogenesis, wound healing, and development, is learning from empirical data how information is integrated to support tissue-level function and behavior. Information-theoretic approaches provide a quantitative framework for extracting patterns from data, but so far have been applied predominantly to neuronal systems at the tissue level. Here, we demonstrate how time series of Ca2+ dynamics can be used to identify the structure and information dynamics of other biological tissues. To this end, we expressed the calcium reporter GCaMP6s in an organoid system of explanted amphibian epidermis derived from the African clawed frog Xenopus laevis, and imaged calcium activity before and after a puncture injury in six replicate organoids. We constructed functional connectivity networks by computing mutual information between cells from time series derived using medical imaging techniques to track intracellular Ca2+. We analyzed network properties including degree distribution, spatial embedding, and modular structure. We find that organoid networks exhibit evidence of greater connectivity than null models, displaying high-degree hubs and mesoscale community structure with spatial clustering. Our functional connectivity reconstruction suggests that the tissue retains non-random features after injury, displays long-range correlations and structure, and shows non-trivial clustering that is not necessarily spatially dependent. In the context of this reconstruction method, our results suggest increased integration after injury, possible cellular coordination in response to injury, and some type of generative structure underlying the anatomy. While we study Ca2+ in Xenopus epidermal cells, our computational approach and analyses highlight how methods developed to analyze functional connectivity in neuronal tissues can be generalized to any tissue and fluorescent signal type. We discuss expanded methods of analysis to improve models of non-neuronal information processing, highlighting the potential of our framework to provide a bridge between neuroscience and more basal modes of information processing.
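The core reconstruction step described in this abstract, pairwise mutual information between per-cell Ca2+ traces thresholded into a network, can be sketched in a few lines. This is a minimal illustration, not the authors' pipeline: the histogram binning, the threshold value, and the function names are assumptions made here for clarity.

```python
import numpy as np

def mutual_information(x, y, bins=8):
    """Estimate mutual information (bits) between two time series via binning."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0  # skip empty cells so log2 is well defined
    return float((pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])).sum())

def functional_connectivity(traces, threshold=0.1, bins=8):
    """Pairwise-MI adjacency matrix over cells (rows = cells, cols = time)."""
    n = traces.shape[0]
    adj = np.zeros((n, n))
    for i in range(n):
        for j in range(i + 1, n):
            mi = mutual_information(traces[i], traces[j], bins)
            if mi >= threshold:  # keep only supra-threshold edges
                adj[i, j] = adj[j, i] = mi
    return adj
```

In practice, the resulting adjacency matrix would be compared against null models (e.g. shuffled time series) before interpreting hubs or community structure, as the paper does.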
Affiliation(s)
- Douglas Blackiston
- Allen Discovery Center, Tufts University, Medford, Massachusetts, United States of America
- Wyss Institute for Biologically Inspired Engineering, Harvard University, Boston, Massachusetts, United States of America
- Institute for Computationally-Designed Organisms, UVM, Burlington, Vermont and Tufts, Medford, Massachusetts, United States of America
- Department of Biology, Tufts University, Medford, Massachusetts, United States of America
- Hannah Dromiack
- Department of Physics, Arizona State University, Tempe, Arizona, United States of America
- BEYOND Center for Fundamental Concepts in Science, Arizona State University, Tempe, Arizona, United States of America
- Caitlin Grasso
- Department of Computer Science, University of Vermont, Burlington, Vermont, United States of America
- Thomas F Varley
- Department of Computer Science, University of Vermont, Burlington, Vermont, United States of America
- Department of Complex Systems and Data Science, University of Vermont, Burlington, Vermont, United States of America
- School of Informatics, Computing, and Engineering, Indiana University, Bloomington, Indiana, United States of America
- Douglas G Moore
- BEYOND Center for Fundamental Concepts in Science, Arizona State University, Tempe, Arizona, United States of America
- Alpha 39 Research, Tempe, Arizona, United States of America
- Krishna Kannan Srinivasan
- Department of Computer Science, University of Vermont, Burlington, Vermont, United States of America
- Department of Complex Systems and Data Science, University of Vermont, Burlington, Vermont, United States of America
- Olaf Sporns
- Department of Psychological and Brain Sciences, Indiana University, Bloomington, Indiana, United States of America
- Joshua Bongard
- Institute for Computationally-Designed Organisms, UVM, Burlington, Vermont and Tufts, Medford, Massachusetts, United States of America
- Department of Computer Science, University of Vermont, Burlington, Vermont, United States of America
- Michael Levin
- Allen Discovery Center, Tufts University, Medford, Massachusetts, United States of America
- Wyss Institute for Biologically Inspired Engineering, Harvard University, Boston, Massachusetts, United States of America
- Institute for Computationally-Designed Organisms, UVM, Burlington, Vermont and Tufts, Medford, Massachusetts, United States of America
- Department of Biology, Tufts University, Medford, Massachusetts, United States of America
- Sara I Walker
- BEYOND Center for Fundamental Concepts in Science, Arizona State University, Tempe, Arizona, United States of America
- School of Earth and Space Exploration, Arizona State University, Tempe, Arizona, United States of America
- Santa Fe Institute, Santa Fe, New Mexico, United States of America
2. Kabdushev S, Gabrielyan O, Kopishev E, Suleimenov I. Neural network properties of hydrophilic polymers as a key for development of the general theory of evolution. R Soc Open Sci 2025;12:242149. PMID: 40271142; PMCID: PMC12014241; DOI: 10.1098/rsos.242149.
Abstract
The analysis of the existing literature demonstrates that, to address the fundamental challenges associated with the origin of life, it is essential to consider this problem from a comprehensive perspective, specifically from the vantage point of the general theory of evolution of complex systems. From this position, life should be regarded as a distinctive instance of an information storage and processing system that emerges naturally. Evolutionary processes should be examined in terms of the coevolution of material and informational components, which has not been sufficiently emphasized to date. A specific example in this respect is provided by analogues of neural networks that form spontaneously in solutions of some hydrophilic polymers. Such systems lead to the formation of non-trivial information objects. A wide range of other examples is considered, demonstrating that processes involving hydrophilic polymers should be interpreted, among other things, in terms of the formation of information objects, which, under certain conditions, influence processes occurring at the molecular and supramolecular level. It is argued that it is reasonable to use the tools of classical dialectics to address such fundamental problems as the origin of life.
Affiliation(s)
- Sherniyaz Kabdushev
- Department of Chemistry and Technology of Organic Materials, Polymers and Natural Compounds, Al-Farabi Kazakh National University, Almaty, Kazakhstan
- Oleg Gabrielyan
- VI Vernadsky Crimean Federal University, Simferopol, Ukraine
- Eldar Kopishev
- Department of Chemistry, L.N. Gumilyov Eurasian National University, Astana, Kazakhstan
- Bukhara State University, Bukhara, Uzbekistan
- Ibragim Suleimenov
- National Engineering Academy of the Republic of Kazakhstan, Almaty, Kazakhstan
3. Varley TF, Havert D, Fosque L, Alipour A, Weerawongphrom N, Naganobori H, O’Shea L, Pope M, Beggs J. The serotonergic psychedelic N,N-dipropyltryptamine alters information-processing dynamics in in vitro cortical neural circuits. Netw Neurosci 2024;8:1421-1438. PMID: 39735490; PMCID: PMC11674936; DOI: 10.1162/netn_a_00408.
Abstract
Most of the recent work in psychedelic neuroscience has been done using noninvasive neuroimaging, with data recorded from the brains of adult volunteers under the influence of a variety of drugs. While these data provide holistic insights into the effects of psychedelics on whole-brain dynamics, the effects of psychedelics on the mesoscale dynamics of neuronal circuits remain much less explored. Here, we report the effects of the serotonergic psychedelic N,N-dipropyltryptamine (DPT) on information-processing dynamics in a sample of in vitro organotypic cultures of cortical tissue from postnatal rats. Three hours of spontaneous activity were recorded: an hour of predrug control, an hour of exposure to 10-μM DPT solution, and a final hour of washout, once again under control conditions. We found that DPT reversibly alters information dynamics in multiple ways: First, the DPT condition was associated with a higher entropy of spontaneous firing activity and reduced the amount of time information was stored in individual neurons. Second, DPT also reduced the reversibility of neural activity, increasing the entropy produced and suggesting a drive away from equilibrium. Third, DPT altered the structure of neuronal circuits, decreasing the overall information flow coming into each neuron, but increasing the number of weak connections, creating a dynamic that combines elements of integration and disintegration. Finally, DPT decreased the higher order statistical synergy present in sets of three neurons. Collectively, these results paint a complex picture of how psychedelics regulate information processing in mesoscale neuronal networks in cortical tissue. Implications for existing hypotheses of psychedelic action, such as the entropic brain hypothesis, are discussed.
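One of the quantities described above, the amount of time information is stored in individual neurons, is commonly formalised as active information storage: the mutual information between a unit's recent past and its present state. The sketch below is a minimal plug-in estimator for a binary spike train, offered as an illustration of the idea rather than the study's actual estimator; the history length k and function names are assumptions.

```python
import math
from collections import Counter

def entropy(seq):
    """Plug-in Shannon entropy (bits) of a discrete sequence."""
    n, counts = len(seq), Counter(seq)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

def active_information_storage(spikes, k=3):
    """AIS = I(past_k ; present): how predictable a unit's current state
    is from its own k-step history (plug-in estimate, bits)."""
    past = [tuple(spikes[i - k:i]) for i in range(k, len(spikes))]
    present = list(spikes[k:])
    return entropy(past) + entropy(present) - entropy(list(zip(past, present)))
```

A perfectly periodic train stores a full bit per time step, while an i.i.d. random train stores essentially nothing, which is the intuition behind the reported DPT-induced reduction in storage.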
Affiliation(s)
- Thomas F. Varley
- School of Informatics, Computing, and Engineering, Indiana University, Bloomington, IN, USA
- Department of Psychological and Brain Sciences, Indiana University, Bloomington, IN, USA
- Vermont Complex Systems Center, University of Vermont, Burlington, VT, USA
- Daniel Havert
- Department of Physics, Indiana University, Bloomington, IN, USA
- Leandro Fosque
- Department of Physics, Indiana University, Bloomington, IN, USA
- Abolfazl Alipour
- Department of Psychological and Brain Sciences, Indiana University, Bloomington, IN, USA
- Program in Neuroscience, Indiana University, Bloomington, IN, USA
- Maria Pope
- School of Informatics, Computing, and Engineering, Indiana University, Bloomington, IN, USA
- Program in Neuroscience, Indiana University, Bloomington, IN, USA
- John Beggs
- Department of Physics, Indiana University, Bloomington, IN, USA
- Program in Neuroscience, Indiana University, Bloomington, IN, USA
4. Luppi AI, Sanz Perl Y, Vohryzek J, Mediano PAM, Rosas FE, Milisav F, Suarez LE, Gini S, Gutierrez-Barragan D, Gozzi A, Misic B, Deco G, Kringelbach ML. Competitive interactions shape brain dynamics and computation across species. bioRxiv 2024:2024.10.19.619194. PMID: 39484469; PMCID: PMC11526968; DOI: 10.1101/2024.10.19.619194.
Abstract
Adaptive cognition relies on cooperation across anatomically distributed brain circuits. However, specialised neural systems are also in constant competition for limited processing resources. How does the brain's network architecture enable it to balance these cooperative and competitive tendencies? Here we use computational whole-brain modelling to examine the dynamical and computational relevance of cooperative and competitive interactions in the mammalian connectome. Across human, macaque, and mouse, we show that the architecture of the models that most faithfully reproduce brain activity consistently combines modular cooperative interactions with diffuse, long-range competitive interactions. The model with competitive interactions consistently outperforms the cooperative-only model, providing an excellent fit to both spatial and dynamical properties of the living brain, which were not explicitly optimised but rather emerge spontaneously. Competitive interactions in the effective connectivity produce greater levels of synergistic information and local-global hierarchy, and lead to superior computational capacity when used for neuromorphic computing. Altogether, this work provides a mechanistic link between network architecture, dynamical properties, and computation in the mammalian brain.
Affiliation(s)
- Andrea I. Luppi
- University of Oxford, Oxford, UK
- St John’s College, Cambridge, UK
- Montreal Neurological Institute, Montreal, Canada
- Silvia Gini
- Italian Institute of Technology, Rovereto, Italy
- Centre for Mind/Brain Sciences, University of Trento, Italy
5. Varley TF. A Synergistic Perspective on Multivariate Computation and Causality in Complex Systems. Entropy (Basel) 2024;26:883. PMID: 39451959; PMCID: PMC11507062; DOI: 10.3390/e26100883.
Abstract
What does it mean for a complex system to "compute" or perform "computations"? Intuitively, we can understand complex "computation" as occurring when a system's state is a function of multiple inputs (potentially including its own past state). Here, we discuss how computational processes in complex systems can be generally studied using the concept of statistical synergy, which is information about an output that can only be learned when the joint state of all inputs is known. Building on prior work, we show that this approach naturally leads to a link between multivariate information theory and topics in causal inference, specifically, the phenomenon of causal colliders. We begin by showing how Berkson's paradox implies a higher-order, synergistic interaction between multidimensional inputs and outputs. We then discuss how causal structure learning can refine and orient analyses of synergies in empirical data, and when empirical synergies meaningfully reflect computation versus when they may be spurious. We end by proposing that this conceptual link between synergy, causal colliders, and computation can serve as a foundation on which to build a mathematically rich general theory of computation in complex systems.
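The collider intuition above has a classic minimal example: when an output is the XOR of two independent inputs, neither input alone carries any information about the output, yet jointly they determine it completely, so all of the mutual information is synergistic. A short numerical sketch of this (illustrative code, not from the paper):

```python
import itertools
import math
from collections import Counter

def mi(xs, ys):
    """Plug-in mutual information (bits) between two discrete sequences."""
    n = len(xs)
    pj, px, py = Counter(zip(xs, ys)), Counter(xs), Counter(ys)
    return sum(c / n * math.log2(c * n / (px[a] * py[b]))
               for (a, b), c in pj.items())

# A collider: Y = X1 XOR X2, over all four equally likely input pairs.
x1, x2 = zip(*itertools.product([0, 1], repeat=2))
y = tuple(a ^ b for a, b in zip(x1, x2))

single = mi(x1, y)                # 0.0 bits: each input alone says nothing
joint = mi(list(zip(x1, x2)), y)  # 1.0 bit: the joint state determines Y
```

This is exactly the situation where the output is "a function of multiple inputs" in the strong sense the abstract describes: the computation is visible only at the level of the joint input state.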
Affiliation(s)
- Thomas F Varley
- Vermont Complex Systems Center, University of Vermont, Burlington, VT 05405, USA
6. Luppi AI, Mediano PAM, Rosas FE, Allanson J, Pickard J, Carhart-Harris RL, Williams GB, Craig MM, Finoia P, Owen AM, Naci L, Menon DK, Bor D, Stamatakis EA. A synergistic workspace for human consciousness revealed by Integrated Information Decomposition. eLife 2024;12:RP88173. PMID: 39022924; PMCID: PMC11257694; DOI: 10.7554/elife.88173.
Abstract
How is the information-processing architecture of the human brain organised, and how does its organisation support consciousness? Here, we combine network science and a rigorous information-theoretic notion of synergy to delineate a 'synergistic global workspace', comprising gateway regions that gather synergistic information from specialised modules across the human brain. This information is then integrated within the workspace and widely distributed via broadcaster regions. Through functional MRI analysis, we show that gateway regions of the synergistic workspace correspond to the human brain's default mode network, whereas broadcasters coincide with the executive control network. We find that loss of consciousness due to general anaesthesia or disorders of consciousness corresponds to diminished ability of the synergistic workspace to integrate information, which is restored upon recovery. Thus, loss of consciousness coincides with a breakdown of information integration within the synergistic workspace of the human brain. This work contributes to conceptual and empirical reconciliation between two prominent scientific theories of consciousness, the Global Neuronal Workspace and Integrated Information Theory, while also advancing our understanding of how the human brain supports consciousness through the synergistic integration of information.
Affiliation(s)
- Andrea I Luppi
- Department of Clinical Neurosciences, University of Cambridge, Cambridge, United Kingdom
- University Division of Anaesthesia, School of Clinical Medicine, University of Cambridge, Cambridge, United Kingdom
- Pedro AM Mediano
- Department of Psychology, University of Cambridge, Cambridge, United Kingdom
- Fernando E Rosas
- Center for Psychedelic Research, Department of Brain Science, Imperial College London, London, United Kingdom
- Center for Complexity Science, Imperial College London, London, United Kingdom
- Data Science Institute, Imperial College London, London, United Kingdom
- Judith Allanson
- Department of Clinical Neurosciences, University of Cambridge, Cambridge, United Kingdom
- Department of Neurosciences, Cambridge University Hospitals NHS Foundation, Addenbrooke's Hospital, Cambridge, United Kingdom
- John Pickard
- Department of Clinical Neurosciences, University of Cambridge, Cambridge, United Kingdom
- Wolfson Brain Imaging Centre, University of Cambridge, Cambridge, United Kingdom
- Division of Neurosurgery, School of Clinical Medicine, University of Cambridge, Addenbrooke's Hospital, Cambridge, United Kingdom
- Robin L Carhart-Harris
- Center for Psychedelic Research, Department of Brain Science, Imperial College London, London, United Kingdom
- Psychedelics Division - Neuroscape, Department of Neurology, University of California, San Francisco, United States
- Guy B Williams
- Department of Clinical Neurosciences, University of Cambridge, Cambridge, United Kingdom
- Wolfson Brain Imaging Centre, University of Cambridge, Cambridge, United Kingdom
- Michael M Craig
- Department of Clinical Neurosciences, University of Cambridge, Cambridge, United Kingdom
- University Division of Anaesthesia, School of Clinical Medicine, University of Cambridge, Cambridge, United Kingdom
- Paola Finoia
- Department of Clinical Neurosciences, University of Cambridge, Cambridge, United Kingdom
- Adrian M Owen
- Department of Psychology and Department of Physiology and Pharmacology, The Brain and Mind Institute, University of Western Ontario, London, Canada
- Lorina Naci
- Trinity College Institute of Neuroscience, School of Psychology, Lloyd Building, Trinity College, Dublin, Ireland
- David K Menon
- University Division of Anaesthesia, School of Clinical Medicine, University of Cambridge, Cambridge, United Kingdom
- Wolfson Brain Imaging Centre, University of Cambridge, Cambridge, United Kingdom
- Daniel Bor
- Department of Psychology, University of Cambridge, Cambridge, United Kingdom
- Emmanuel A Stamatakis
- University Division of Anaesthesia, School of Clinical Medicine, University of Cambridge, Cambridge, United Kingdom
7. Bardella G, Giuffrida V, Giarrocco F, Brunamonti E, Pani P, Ferraina S. Response inhibition in premotor cortex corresponds to a complex reshuffle of the mesoscopic information network. Netw Neurosci 2024;8:597-622. PMID: 38952814; PMCID: PMC11168728; DOI: 10.1162/netn_a_00365.
Abstract
Recent studies have explored functional and effective neural networks in animal models; however, the dynamics of information propagation among functional modules under cognitive control remain largely unknown. Here, we addressed this issue using transfer entropy and graph theory methods on mesoscopic neural activities recorded in the dorsal premotor cortex of rhesus monkeys. We focused our study on the decision time of a Stop-signal task, looking for patterns in the network configuration that could influence motor plan maturation when the Stop signal is provided. When comparing trials with successful inhibition to those with generated movement, the nodes of the network were organized into four clusters, hierarchically arranged and distinctly involved in information transfer. Interestingly, the hierarchies and the strength of information transmission between clusters varied throughout the task, distinguishing generated movements from canceled ones and corresponding to measurable levels of network complexity. Our results suggest a putative mechanism for motor inhibition in premotor cortex: a topological reshuffle of the information exchanged among ensembles of neurons.
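Transfer entropy, the directed connectivity measure used above, quantifies how much a source's past reduces uncertainty about a target's next state beyond what the target's own past already explains. A minimal history-length-1 plug-in estimator for discrete signals might look like the following; this is a sketch of the general quantity, not the authors' implementation, and the function names are assumptions.

```python
import math
from collections import Counter

def h(seq):
    """Plug-in Shannon entropy (bits) of a discrete sequence."""
    n, counts = len(seq), Counter(seq)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

def transfer_entropy(src, tgt):
    """TE(src -> tgt) with history length 1:
    I(tgt_t ; src_{t-1} | tgt_{t-1}), plug-in estimate in bits."""
    yt, yp, xp = tgt[1:], tgt[:-1], src[:-1]
    # Conditional MI written as a sum/difference of joint entropies.
    return (h(list(zip(yt, yp))) - h(yp)) \
         - (h(list(zip(yt, yp, xp))) - h(list(zip(yp, xp))))
```

Because the measure is asymmetric, TE(A to B) and TE(B to A) can differ, which is what makes the clusterwise hierarchies of information transfer in the abstract possible.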
Affiliation(s)
- Giampiero Bardella
- Department of Physiology and Pharmacology, Sapienza University of Rome, Rome, Italy
- Valentina Giuffrida
- Department of Physiology and Pharmacology, Sapienza University of Rome, Rome, Italy
- Franco Giarrocco
- Department of Physiology and Pharmacology, Sapienza University of Rome, Rome, Italy
- Emiliano Brunamonti
- Department of Physiology and Pharmacology, Sapienza University of Rome, Rome, Italy
- Pierpaolo Pani
- Department of Physiology and Pharmacology, Sapienza University of Rome, Rome, Italy
- Stefano Ferraina
- Department of Physiology and Pharmacology, Sapienza University of Rome, Rome, Italy
8. Kay JW. A Partial Information Decomposition for Multivariate Gaussian Systems Based on Information Geometry. Entropy (Basel) 2024;26:542. PMID: 39056905; PMCID: PMC11276306; DOI: 10.3390/e26070542.
Abstract
There is much interest in the topic of partial information decomposition, both in developing new algorithms and in developing applications. An algorithm, based on standard results from information geometry, was recently proposed by Niu and Quinn (2019). They considered the case of three scalar random variables from an exponential family, including both discrete distributions and a trivariate Gaussian distribution. The purpose of this article is to extend their work to the general case of multivariate Gaussian systems having vector inputs and a vector output. By making use of standard results from information geometry, explicit expressions are derived for the components of the partial information decomposition for this system. These expressions depend on a real-valued parameter which is determined by performing a simple constrained convex optimisation. Furthermore, it is proved that the theoretical properties of non-negativity, self-redundancy, symmetry and monotonicity, which were proposed by Williams and Beer (2010), are valid for the decomposition Iig derived herein. Application of these results to real and simulated data shows that the Iig algorithm does produce the results expected when clear expectations are available, although in some scenarios, it can overestimate the level of the synergy and shared information components of the decomposition, and correspondingly underestimate the levels of unique information. Comparisons of the Iig and Idep (Kay and Ince, 2018) methods show that they can both produce very similar results, but interesting differences are also revealed. The same may be said about comparisons between the Iig and Immi (Barrett, 2015) methods.
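For jointly Gaussian variables, mutual information has a closed form in correlations, which is what makes Gaussian decompositions tractable in the first place. As a flavour of the simplest such scheme, here is a sketch of the Immi decomposition (Barrett, 2015) that the paper compares against, restricted to two scalar sources and a scalar target; the sampling-based estimation and the function names are assumptions made here for illustration.

```python
import numpy as np

def gaussian_mi(x, y):
    """I(X;Y) in bits for (approximately) jointly Gaussian scalars."""
    r = np.corrcoef(x, y)[0, 1]
    return -0.5 * np.log2(1 - r ** 2)

def gaussian_joint_mi(x1, x2, y):
    """I(X1,X2;Y) in bits via the squared multiple correlation R^2."""
    X = np.column_stack([x1, x2, np.ones_like(y)])
    resid = y - X @ np.linalg.lstsq(X, y, rcond=None)[0]
    r2 = 1 - resid.var() / y.var()
    return -0.5 * np.log2(1 - r2)

def immi_pid(x1, x2, y):
    """Barrett's minimal-MI (Immi) decomposition for Gaussian sources."""
    i1, i2 = gaussian_mi(x1, y), gaussian_mi(x2, y)
    joint = gaussian_joint_mi(x1, x2, y)
    red = min(i1, i2)  # redundancy = the smaller single-source information
    return {"redundant": red, "unique1": i1 - red,
            "unique2": i2 - red, "synergy": joint - i1 - i2 + red}
```

Iig and Idep replace the min-of-MIs redundancy with information-geometric and dependency-based constructions respectively, but they fill in the same four-component template.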
Affiliation(s)
- Jim W Kay
- School of Mathematics and Statistics, University of Glasgow, Glasgow G12 8QQ, UK
9. Varley TF, Bongard J. Evolving higher-order synergies reveals a trade-off between stability and information-integration capacity in complex systems. Chaos 2024;34:063127. PMID: 38865092; DOI: 10.1063/5.0200425.
Abstract
There has recently been an explosion of interest in how "higher-order" structures emerge in complex systems composed of many interacting elements (often called "synergistic" information). This "emergent" organization has been found in a variety of natural and artificial systems, although at present, the field lacks a unified understanding of what the consequences of higher-order synergies and redundancies are for systems under study. Typical research treats the presence (or absence) of synergistic information as a dependent variable and reports changes in the level of synergy in response to some change in the system. Here, we attempt to flip the script: rather than treating higher-order information as a dependent variable, we use evolutionary optimization to evolve Boolean networks with significant higher-order redundancies, synergies, or statistical complexity. We then analyze these evolved populations of networks using established tools for characterizing discrete dynamics: the number of attractors, the average transient length, and the Derrida coefficient. We also assess the capacity of the systems to integrate information. We find that high-synergy systems are unstable and chaotic, but with a high capacity to integrate information. In contrast, evolved redundant systems are extremely stable, but have negligible capacity to integrate information. Finally, the complex systems that balance integration and segregation (known as Tononi-Sporns-Edelman complexity) show features of both chaoticity and stability, with a greater capacity to integrate information than the redundant systems while being more stable than the random and synergistic systems. We conclude that there may be a fundamental trade-off between the robustness of a system's dynamics and its capacity to integrate information (which inherently requires flexibility and sensitivity) and that certain kinds of complexity naturally balance this trade-off.
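Of the dynamical diagnostics listed above, the Derrida coefficient is perhaps the least familiar: it measures how a one-bit perturbation spreads in a single update step, with values below 1 indicating ordered (stable) dynamics and values above 1 indicating chaos. A sketch with two illustrative update rules follows; the rules here are hypothetical stand-ins, not the evolved networks from the paper.

```python
import numpy as np

def derrida_coefficient(update, n_nodes, trials=2000, seed=0):
    """Average one-step Hamming divergence of random states differing
    in a single bit; < 1 suggests ordered dynamics, > 1 chaotic."""
    rng = np.random.default_rng(seed)
    total = 0
    for _ in range(trials):
        s = rng.integers(0, 2, n_nodes)
        t = s.copy()
        t[rng.integers(n_nodes)] ^= 1  # perturb one random node
        total += int((update(s) != update(t)).sum())
    return total / trials

# Illustrative Boolean update rules (hypothetical, for demonstration):
copy_rule = lambda s: s.copy()          # each node copies itself: frozen
xor_rule = lambda s: s ^ np.roll(s, 1)  # each node XORs its neighbour: chaotic
```

The copy rule is maximally stable in this sense, while the XOR ring doubles every perturbation, mirroring the redundant-stable versus synergistic-chaotic contrast the authors report.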
Affiliation(s)
- Thomas F Varley
- Department of Computer Science, University of Vermont, Burlington, Vermont 05405, USA
- Vermont Complex Systems Center, University of Vermont, Burlington, Vermont 05405, USA
- Josh Bongard
- Department of Computer Science, University of Vermont, Burlington, Vermont 05405, USA
- Vermont Complex Systems Center, University of Vermont, Burlington, Vermont 05405, USA
10. Luppi AI, Rosas FE, Mediano PAM, Menon DK, Stamatakis EA. Information decomposition and the informational architecture of the brain. Trends Cogn Sci 2024;28:352-368. PMID: 38199949; DOI: 10.1016/j.tics.2023.11.005.
Abstract
To explain how the brain orchestrates information-processing for cognition, we must understand information itself. Importantly, information is not a monolithic entity. Information decomposition techniques provide a way to split information into its constituent elements: unique, redundant, and synergistic information. We review how disentangling synergistic and redundant interactions is redefining our understanding of integrative brain function and its neural organisation. To explain how the brain navigates the trade-offs between redundancy and synergy, we review converging evidence integrating the structural, molecular, and functional underpinnings of synergy and redundancy; their roles in cognition and computation; and how they might arise over evolution and development. Overall, disentangling synergistic and redundant information provides a guiding principle for understanding the informational architecture of the brain and cognition.
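The unique/redundant/synergistic split reviewed here goes back to Williams and Beer's partial information decomposition. For two discrete sources, their original redundancy measure I_min can be written down in a few lines; this is a didactic sketch of the classic definition, not any of the newer measures the review surveys, and the function names are assumptions.

```python
import math
from collections import Counter

def pid_imin(x1, x2, y):
    """Williams-Beer PID with the I_min redundancy, two discrete sources."""
    n = len(y)
    py = Counter(y)

    def mi(xs):
        pj, px = Counter(zip(xs, y)), Counter(xs)
        return sum(c / n * math.log2(c * n / (px[a] * py[b]))
                   for (a, b), c in pj.items())

    def specific(xs, yv):
        # Specific information: sum_x p(x|yv) [log2 p(yv|x) - log2 p(yv)].
        pj, px = Counter(zip(xs, y)), Counter(xs)
        return sum(pj[(xv, yv)] / py[yv]
                   * math.log2((pj[(xv, yv)] / px[xv]) / (py[yv] / n))
                   for xv in px if pj[(xv, yv)] > 0)

    # Redundancy: the expected minimum specific information over sources.
    red = sum(py[yv] / n * min(specific(x1, yv), specific(x2, yv)) for yv in py)
    i1, i2, i12 = mi(x1), mi(x2), mi(list(zip(x1, x2)))
    return {"redundant": red, "unique1": i1 - red,
            "unique2": i2 - red, "synergy": i12 - i1 - i2 + red}
```

For y = x1 XOR x2 over uniform inputs the decomposition is purely synergistic, and for y = x1 = x2 it is purely redundant, the two poles between which the review argues the brain must navigate.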
Affiliation(s)
- Andrea I Luppi
- Division of Anaesthesia, University of Cambridge, Cambridge, UK; Department of Clinical Neurosciences, University of Cambridge, Cambridge, UK; Montreal Neurological Institute, McGill University, Montreal, QC, Canada
- Fernando E Rosas
- Department of Informatics, University of Sussex, Brighton, UK; Centre for Psychedelic Research, Department of Brain Sciences, Imperial College London, London, UK; Centre for Complexity Science, Imperial College London, London, UK; Centre for Eudaimonia and Human Flourishing, University of Oxford, Oxford, UK
- Pedro A M Mediano
- Department of Computing, Imperial College London, London, UK; Department of Psychology, University of Cambridge, Cambridge, UK
- David K Menon
- Department of Medicine, University of Cambridge, Cambridge, UK; Wolfson Brain Imaging Centre, University of Cambridge, Cambridge, UK
- Emmanuel A Stamatakis
- Division of Anaesthesia, University of Cambridge, Cambridge, UK; Department of Clinical Neurosciences, University of Cambridge, Cambridge, UK
11. Varley TF. Generalized decomposition of multivariate information. PLoS One 2024;19:e0297128. PMID: 38315691; PMCID: PMC10843128; DOI: 10.1371/journal.pone.0297128.
Abstract
Since its introduction, the partial information decomposition (PID) has emerged as a powerful, information-theoretic technique useful for studying the structure of (potentially higher-order) interactions in complex systems. Despite its utility, the applicability of the PID is restricted by the need to assign elements as either "sources" or "targets", as well as the specific structure of the mutual information itself. Here, I introduce a generalized information decomposition that relaxes the source/target distinction while still satisfying the basic intuitions about information. This approach is based on the decomposition of the Kullback-Leibler divergence, and consequently allows for the analysis of any information gained when updating from an arbitrary prior to an arbitrary posterior. As a result, any information-theoretic measure that can be written as a linear combination of Kullback-Leibler divergences admits a decomposition in the style of Williams and Beer, including the total correlation, the negentropy, and the mutual information as special cases. This paper explores how the generalized information decomposition can reveal novel insights into existing measures, as well as the nature of higher-order synergies. I show that synergistic information is intimately related to the well-known Tononi-Sporns-Edelman (TSE) complexity, and that synergistic information requires a similar integration/segregation balance as a high TSE complexity. Finally, I end with a discussion of how this approach fits into other attempts to generalize the PID and the possibilities for empirical applications.
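The pivot described above, from decomposing mutual information to decomposing Kullback-Leibler divergences, rests on the fact that mutual information is itself the KL divergence from the product of marginals (the "prior") to the joint distribution (the "posterior"). A short numerical check of that identity, with illustrative values:

```python
import numpy as np

def kl_divergence(p, q):
    """D_KL(p || q) in bits for discrete distributions on the same support."""
    p, q = np.asarray(p, float).ravel(), np.asarray(q, float).ravel()
    nz = p > 0  # 0 * log 0 contributes nothing
    return float((p[nz] * np.log2(p[nz] / q[nz])).sum())

def entropy(p):
    p = np.asarray(p, float).ravel()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

# An arbitrary 2x2 joint distribution (illustrative values):
pxy = np.array([[0.4, 0.1],
                [0.1, 0.4]])
prior = np.outer(pxy.sum(axis=1), pxy.sum(axis=0))  # product of marginals

mi_as_kl = kl_divergence(pxy, prior)
mi_direct = entropy(pxy.sum(1)) + entropy(pxy.sum(0)) - entropy(pxy)
```

Because the total correlation and negentropy are likewise linear combinations of KL divergences, the same decomposition machinery carries over to them, as the abstract notes.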
Affiliation(s)
- Thomas F. Varley
- Department of Computer Science, University of Vermont, Burlington, VT, United States of America
- Vermont Complex Systems Center, University of Vermont, Burlington, VT, United States of America
12
Koçillari L, Celotto M, Francis NA, Mukherjee S, Babadi B, Kanold PO, Panzeri S. Behavioural relevance of redundant and synergistic stimulus information between functionally connected neurons in mouse auditory cortex. Brain Inform 2023; 10:34. PMID: 38052917; PMCID: PMC10697912; DOI: 10.1186/s40708-023-00212-9.
Abstract
Measures of functional connectivity have played a central role in advancing our understanding of how information is transmitted and processed within the brain. Traditionally, these studies have focused on identifying redundant functional connectivity, which involves determining when activity is similar across different sites or neurons. However, recent research has highlighted the importance of also identifying synergistic connectivity, that is, connectivity that gives rise to information not contained in either site or neuron alone. Here, we measured redundant and synergistic functional connectivity between neurons in the mouse primary auditory cortex during a sound discrimination task. Specifically, we measured directed functional connectivity between neurons simultaneously recorded with calcium imaging. We used Granger causality as a functional connectivity measure. We then used Partial Information Decomposition to quantify the amount of redundant and synergistic information about the presented sound that is carried by functionally connected or functionally unconnected pairs of neurons. We found that functionally connected pairs carry proportionally more redundant information and proportionally less synergistic information about sound than unconnected pairs, suggesting that their functional connectivity is primarily redundant. Further, synergy and redundancy coexisted both when mice made correct and when they made incorrect perceptual discriminations. However, redundancy was much higher (both in absolute terms and in proportion to the total information available in neuron pairs) in correct behavioural choices than in incorrect ones, whereas synergy was higher in absolute terms but lower in relative terms in correct than in incorrect behavioural choices. Moreover, the proportion of redundancy reliably predicted perceptual discriminations, with the proportion of synergy adding no extra predictive power. These results suggest a crucial contribution of redundancy to correct perceptual discriminations, possibly due to the advantage it offers for information propagation, and also suggest a role for synergy in enhancing information level during correct discriminations.
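The redundancy/synergy split quantified here comes from the Partial Information Decomposition. As a minimal, hedged sketch, the following implements the original Williams-Beer I_min redundancy for two discrete sources about a target; this is a simplification for illustration, not the study's actual pipeline (which combines Granger-causal connectivity with a PID on calcium-imaging data and may use a different redundancy measure).

```python
from math import log2
from collections import defaultdict

def pid_two_sources(joint):
    """Williams-Beer partial information decomposition of I(T; S1, S2),
    for a discrete joint distribution given as {(s1, s2, t): prob}."""
    pt, ps1, ps2 = defaultdict(float), defaultdict(float), defaultdict(float)
    ps1t, ps2t, ps12 = defaultdict(float), defaultdict(float), defaultdict(float)
    for (s1, s2, t), p in joint.items():
        pt[t] += p; ps1[s1] += p; ps2[s2] += p
        ps1t[(s1, t)] += p; ps2t[(s2, t)] += p; ps12[(s1, s2)] += p

    def mi(ps, pst):
        # Mutual information I(T; S) from the source-target joint.
        return sum(p * log2(p / (ps[s] * pt[t]))
                   for (s, t), p in pst.items() if p > 0)

    def i_spec(t, ps, pst):
        # Specific information a source carries about the outcome T = t.
        total = 0.0
        for s in list(ps):
            p_st = pst.get((s, t), 0.0)
            if p_st > 0:
                total += (p_st / pt[t]) * log2(p_st / (ps[s] * pt[t]))
        return total

    i1, i2 = mi(ps1, ps1t), mi(ps2, ps2t)
    i_joint = sum(p * log2(p / (ps12[(s1, s2)] * pt[t]))
                  for (s1, s2, t), p in joint.items() if p > 0)
    # I_min: redundancy is the expected minimum specific information.
    red = sum(pt[t] * min(i_spec(t, ps1, ps1t), i_spec(t, ps2, ps2t))
              for t in list(pt))
    return {"redundancy": red, "unique1": i1 - red,
            "unique2": i2 - red, "synergy": i_joint - i1 - i2 + red}

# Logical XOR: neither source alone informs the target, so all 1 bit is synergy.
xor = {(a, b, a ^ b): 0.25 for a in (0, 1) for b in (0, 1)}
result = pid_two_sources(xor)  # synergy = 1 bit, everything else = 0
```

A duplicated source (both sources equal to the target) gives the opposite extreme: 1 bit of pure redundancy with zero synergy.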
Affiliation(s)
- Loren Koçillari
- Istituto Italiano Di Tecnologia, 38068, Rovereto, Italy
- Department of Excellence for Neural Information Processing, Center for Molecular Neurobiology (ZMNH), University Medical Center Hamburg-Eppendorf (UKE), Falkenried 94, 20251, Hamburg, Germany
- Department of Neurophysiology and Pathophysiology, University Medical Center Hamburg-Eppendorf (UKE), 20246, Hamburg, Germany
- Marco Celotto
- Istituto Italiano Di Tecnologia, 38068, Rovereto, Italy
- Department of Excellence for Neural Information Processing, Center for Molecular Neurobiology (ZMNH), University Medical Center Hamburg-Eppendorf (UKE), Falkenried 94, 20251, Hamburg, Germany
- Department of Pharmacy and Biotechnology, University of Bologna, 40126, Bologna, Italy
- Nikolas A Francis
- Department of Biology and Brain and Behavior Institute, University of Maryland, College Park, MD, 20742, USA
- Shoutik Mukherjee
- Department of Electrical and Computer Engineering and Institute for Systems Research, University of Maryland, College Park, MD, 20742, USA
- Behtash Babadi
- Department of Electrical and Computer Engineering and Institute for Systems Research, University of Maryland, College Park, MD, 20742, USA
- Patrick O Kanold
- Department of Biomedical Engineering and Kavli Neuroscience Discovery Institute, Johns Hopkins University, Baltimore, MD, 21205, USA
- Stefano Panzeri
- Department of Excellence for Neural Information Processing, Center for Molecular Neurobiology (ZMNH), University Medical Center Hamburg-Eppendorf (UKE), Falkenried 94, 20251, Hamburg, Germany
13
Varley TF, Pope M, Puxeddu MG, Faskowitz J, Sporns O. Partial entropy decomposition reveals higher-order information structures in human brain activity. Proc Natl Acad Sci U S A 2023; 120:e2300888120. PMID: 37467265; PMCID: PMC10372615; DOI: 10.1073/pnas.2300888120.
Abstract
The standard approach to modeling the human brain as a complex system is with a network, where the basic unit of interaction is a pairwise link between two brain regions. While powerful, this approach is limited by the inability to assess higher-order interactions involving three or more elements directly. In this work, we explore a method for capturing higher-order dependencies in multivariate data: the partial entropy decomposition (PED). Our approach decomposes the joint entropy of the whole system into a set of nonnegative atoms that describe the redundant, unique, and synergistic interactions that compose the system's structure. PED gives insight into the mathematics of functional connectivity and its limitations. When applied to resting-state fMRI data, we find robust evidence of higher-order synergies that are largely invisible to standard functional connectivity analyses. Our approach can also be localized in time, allowing a frame-by-frame analysis of how the distributions of redundancies and synergies change over the course of a recording. We find that different ensembles of regions can transiently change from being redundancy-dominated to synergy-dominated and that the temporal pattern is structured in time. These results provide strong evidence that there exists a large space of unexplored structures in human brain data that have been largely missed by a focus on bivariate network connectivity models. This synergistic structure is dynamic in time and will likely illuminate interesting links between brain and behavior. Beyond brain-specific application, the PED provides a very general approach for understanding higher-order structures in a variety of complex systems.
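The claim that PED illuminates the limitations of functional connectivity can be made explicit in the two-variable case. The following sketch is illustrative and follows the Finn-Lizier-style entropy lattice loosely; the notation is an assumption, not copied from the paper.

```latex
% Two-variable partial entropy decomposition (illustrative sketch).
% h_\partial denotes a partial-entropy atom on the redundancy lattice:
% \{1\}\{2\} = redundant, \{1\}, \{2\} = unique, \{12\} = synergistic entropy.
\begin{align}
H(X_1, X_2) &= h_\partial^{\{1\}\{2\}} + h_\partial^{\{1\}} + h_\partial^{\{2\}} + h_\partial^{\{12\}} \\
H(X_1)      &= h_\partial^{\{1\}\{2\}} + h_\partial^{\{1\}} \\
H(X_2)      &= h_\partial^{\{1\}\{2\}} + h_\partial^{\{2\}} \\
\intertext{so the bivariate mutual information underlying functional connectivity is}
I(X_1; X_2) &= H(X_1) + H(X_2) - H(X_1, X_2) = h_\partial^{\{1\}\{2\}} - h_\partial^{\{12\}}.
\end{align}
```

That is, a pairwise functional-connectivity edge measures redundancy minus synergy, so redundant and synergistic structure can cancel and become invisible, which is the limitation the decomposition exposes.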
Affiliation(s)
- Thomas F. Varley
- School of Informatics, Computing and Engineering, Indiana University, Bloomington, IN 47405
- Department of Psychological and Brain Sciences, Indiana University, Bloomington, IN 47405
- Maria Pope
- School of Informatics, Computing and Engineering, Indiana University, Bloomington, IN 47405
- Program in Neuroscience, Indiana University, Bloomington, IN 47405
- Maria Grazia Puxeddu
- Department of Psychological and Brain Sciences, Indiana University, Bloomington, IN 47405
- Joshua Faskowitz
- School of Informatics, Computing and Engineering, Indiana University, Bloomington, IN 47405
- Program in Neuroscience, Indiana University, Bloomington, IN 47405
- Olaf Sporns
- School of Informatics, Computing and Engineering, Indiana University, Bloomington, IN 47405
- Department of Psychological and Brain Sciences, Indiana University, Bloomington, IN 47405
- Program in Neuroscience, Indiana University, Bloomington, IN 47405
14
Varley TF, Pope M, Faskowitz J, Sporns O. Multivariate information theory uncovers synergistic subsystems of the human cerebral cortex. Commun Biol 2023; 6:451. PMID: 37095282; PMCID: PMC10125999; DOI: 10.1038/s42003-023-04843-w.
Abstract
One of the most well-established tools for modeling the brain is the functional connectivity network, which is constructed from pairs of interacting brain regions. While powerful, the network model is limited by the restriction that only pairwise dependencies are considered and potentially higher-order structures are missed. Here, we explore how multivariate information theory reveals higher-order dependencies in the human brain. We begin with a mathematical analysis of the O-information, showing analytically and numerically how it is related to previously established information theoretic measures of complexity. We then apply the O-information to brain data, showing that synergistic subsystems are widespread in the human brain. Highly synergistic subsystems typically sit between canonical functional networks, and may serve an integrative role. We then use simulated annealing to find maximally synergistic subsystems, finding that such systems typically comprise ≈10 brain regions, recruited from multiple canonical brain systems. Though ubiquitous, highly synergistic subsystems are invisible when considering pairwise functional connectivity, suggesting that higher-order dependencies form a kind of shadow structure that has been unrecognized by established network-based analyses. We assert that higher-order interactions in the brain represent an under-explored space that, accessible with tools of multivariate information theory, may offer novel scientific insights.
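The O-information referenced here has a closed form: Omega(X) = (n - 2) * H(X1,...,Xn) + sum_i [H(Xi) - H(X without Xi)], positive when redundancy dominates the system and negative when synergy dominates. A minimal empirical sketch for discrete data follows; it is illustrative only, since the paper works with parcellated fMRI time series and more careful estimators.

```python
from math import log2
from collections import Counter

def entropy(samples):
    """Shannon entropy (bits) of the empirical distribution of `samples`."""
    n = len(samples)
    return -sum((c / n) * log2(c / n) for c in Counter(samples).values())

def o_information(columns):
    """O-information of a list of discrete variables (equal-length sequences).

    Omega = (n - 2) * H(X) + sum_i [H(X_i) - H(X_{-i})].
    Omega > 0: redundancy-dominated; Omega < 0: synergy-dominated.
    """
    n = len(columns)
    rows = [tuple(r) for r in zip(*columns)]
    omega = (n - 2) * entropy(rows)
    for i in range(n):
        # H(X_i) minus the entropy of the system with variable i removed.
        rest = [tuple(v for j, v in enumerate(r) if j != i) for r in rows]
        omega += entropy(list(columns[i])) - entropy(rest)
    return omega

# A three-variable XOR triple is purely synergistic: Omega = -1 bit.
x1, x2 = [0, 0, 1, 1], [0, 1, 0, 1]
x3 = [a ^ b for a, b in zip(x1, x2)]
omega_xor = o_information([x1, x2, x3])  # -1.0
```

Three copies of the same variable give the redundancy-dominated extreme, Omega = +1 bit, which is a quick sanity check on the sign convention.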
Affiliation(s)
- Thomas F Varley
- School of Informatics, Computing & Engineering, Indiana University, Bloomington, IN, 47405, USA
- Department of Psychological & Brain Sciences, Indiana University, Bloomington, IN, 47405, USA
- Maria Pope
- School of Informatics, Computing & Engineering, Indiana University, Bloomington, IN, 47405, USA
- Program in Neuroscience, Indiana University, Bloomington, IN, 47405, USA
- Joshua Faskowitz
- Department of Psychological & Brain Sciences, Indiana University, Bloomington, IN, 47405, USA
- Program in Neuroscience, Indiana University, Bloomington, IN, 47405, USA
- Olaf Sporns
- School of Informatics, Computing & Engineering, Indiana University, Bloomington, IN, 47405, USA
- Department of Psychological & Brain Sciences, Indiana University, Bloomington, IN, 47405, USA
- Program in Neuroscience, Indiana University, Bloomington, IN, 47405, USA
15
Varley TF, Sporns O, Schaffelhofer S, Scherberger H, Dann B. Information-processing dynamics in neural networks of macaque cerebral cortex reflect cognitive state and behavior. Proc Natl Acad Sci U S A 2023; 120:e2207677120. PMID: 36603032; PMCID: PMC9926243; DOI: 10.1073/pnas.2207677120.
Abstract
One of the essential functions of biological neural networks is the processing of information. This ranges from processing sensory information in order to perceive the environment to processing motor information in order to interact with it. Due to methodological limitations, it has been historically unclear how information processing changes during different cognitive or behavioral states and to what extent information is processed within or between the network of neurons in different brain areas. In this study, we leverage recent advances in the calculation of information dynamics to explore neural-level processing within and between the frontoparietal areas AIP, F5, and M1 during a delayed grasping task performed by three macaque monkeys. While information processing was high within all areas during all cognitive and behavioral states of the task, interareal processing varied widely: During visuomotor transformation, AIP and F5 formed a reciprocally connected processing unit, while no processing was present between areas during the memory period. Movement execution was processed globally across all areas with a predominance of processing in the feedback direction. Furthermore, the fine-scale network structure reconfigured at the neuron level in response to different grasping conditions, despite no differences in the overall amount of information present. These results suggest that areas dynamically form higher-order processing units according to the cognitive or behavioral demand and that the information-processing network is hierarchically organized at the neuron level, with the coarse network structure determining the behavioral state and finer changes reflecting different conditions.
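The "information dynamics" toolkit referenced here includes directed measures of information processing, of which transfer entropy is the canonical example. The following is a hedged, minimal sketch for discrete series with history length 1; it is a generic illustration of the family of measures, not the paper's actual estimator or pipeline.

```python
from math import log2
from collections import Counter

def transfer_entropy(source, target):
    """Transfer entropy source -> target in bits, history length 1:
    TE = I(Y_{t+1}; X_t | Y_t), estimated from two discrete time series."""
    # One observation per step: (next target, current source, current target).
    triples = list(zip(target[1:], source[:-1], target[:-1]))
    n = len(triples)
    c_xyz = Counter(triples)
    c_yz = Counter((y1, y0) for y1, _, y0 in triples)
    c_xz = Counter((x0, y0) for _, x0, y0 in triples)
    c_z = Counter(y0 for _, _, y0 in triples)
    te = 0.0
    for (y1, x0, y0), c in c_xyz.items():
        # p(y1|x0,y0) / p(y1|y0), written in raw counts.
        te += (c / n) * log2((c * c_z[y0]) / (c_yz[(y1, y0)] * c_xz[(x0, y0)]))
    return te
```

For a target that simply copies a random binary source with one step of lag, the estimate approaches 1 bit, while the reverse direction approaches 0, matching the intuition of directed information transfer.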
Affiliation(s)
- Thomas F. Varley
- Department of Psychological & Brain Sciences, Indiana University, Bloomington, IN 47405-7007
- School of Informatics, Computing, and Engineering, Indiana University, Bloomington, IN 47405-7007
- Olaf Sporns
- Department of Psychological & Brain Sciences, Indiana University, Bloomington, IN 47405-7007
- Stefan Schaffelhofer
- Neurobiology Laboratory, German Primate Center, 37077 Goettingen, Germany
- Faculty of Biology and Psychology, University of Goettingen, 37073 Goettingen, Germany
- Hansjörg Scherberger
- Neurobiology Laboratory, German Primate Center, 37077 Goettingen, Germany
- Faculty of Biology and Psychology, University of Goettingen, 37073 Goettingen, Germany
- Benjamin Dann
- Neurobiology Laboratory, German Primate Center, 37077 Goettingen, Germany
16
Varley TF, Kaminski P. Untangling Synergistic Effects of Intersecting Social Identities with Partial Information Decomposition. Entropy (Basel) 2022; 24:1387. PMID: 37420406; PMCID: PMC9611752; DOI: 10.3390/e24101387.
Abstract
The theory of intersectionality proposes that an individual's experience of society has aspects that are irreducible to the sum of one's various identities considered individually, but are "greater than the sum of their parts". In recent years, this framework has become a frequent topic of discussion both in social sciences and among popular movements for social justice. In this work, we show that the effects of intersectional identities can be statistically observed in empirical data using information theory, particularly the partial information decomposition framework. We show that, when considering the predictive relationship between various identity categories such as race and sex, on outcomes such as income, health and wellness, robust statistical synergies appear. These synergies show that there are joint-effects of identities on outcomes that are irreducible to any identity considered individually and only appear when specific categories are considered together (for example, there is a large, synergistic effect of race and sex considered jointly on income irreducible to either race or sex). Furthermore, these synergies are robust over time, remaining largely constant year-to-year. We then show using synthetic data that the most widely used method of assessing intersectionalities in data (linear regression with multiplicative interaction coefficients) fails to disambiguate between truly synergistic, greater-than-the-sum-of-their-parts interactions, and redundant interactions. We explore the significance of these two distinct types of interactions in the context of making inferences about intersectional relationships in data and the importance of being able to reliably differentiate the two. Finally, we conclude that information theory, as a model-free framework sensitive to nonlinearities and synergies in data, is a natural method by which to explore the space of higher-order social dynamics.
Affiliation(s)
- Thomas F. Varley
- School of Informatics, Computing, and Engineering, Indiana University, Bloomington, IN 47405, USA
- Department of Psychology & Brain Sciences, Indiana University, Bloomington, IN 47405, USA
- Patrick Kaminski
- School of Informatics, Computing, and Engineering, Indiana University, Bloomington, IN 47405, USA
- Department of Sociology, Indiana University, Bloomington, IN 47405, USA