1
Varley TF, Havert D, Fosque L, Alipour A, Weerawongphrom N, Naganobori H, O'Shea L, Pope M, Beggs J. The serotonergic psychedelic N,N-dipropyltryptamine alters information-processing dynamics in in vitro cortical neural circuits. Netw Neurosci 2024; 8:1421-1438. PMID: 39735490; PMCID: PMC11674936; DOI: 10.1162/netn_a_00408.
Abstract
Most of the recent work in psychedelic neuroscience has been done using noninvasive neuroimaging, with data recorded from the brains of adult volunteers under the influence of a variety of drugs. While these data provide holistic insights into the effects of psychedelics on whole-brain dynamics, the effects of psychedelics on the mesoscale dynamics of neuronal circuits remain much less explored. Here, we report the effects of the serotonergic psychedelic N,N-dipropyltryptamine (DPT) on information-processing dynamics in a sample of in vitro organotypic cultures of cortical tissue from postnatal rats. Three hours of spontaneous activity were recorded: an hour of predrug control, an hour of exposure to a 10 μM DPT solution, and a final hour of washout, once again under control conditions. We found that DPT reversibly alters information dynamics in multiple ways: First, the DPT condition was associated with a higher entropy of spontaneous firing activity and reduced the amount of time information was stored in individual neurons. Second, DPT also reduced the reversibility of neural activity, increasing the entropy produced and suggesting a drive away from equilibrium. Third, DPT altered the structure of neuronal circuits, decreasing the overall information flow coming into each neuron, but increasing the number of weak connections, creating a dynamic that combines elements of integration and disintegration. Finally, DPT decreased the higher-order statistical synergy present in sets of three neurons. Collectively, these results paint a complex picture of how psychedelics regulate information processing in mesoscale neuronal networks in cortical tissue. Implications for existing hypotheses of psychedelic action, such as the entropic brain hypothesis, are discussed.
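The single-neuron measures this abstract references (entropy of spontaneous firing, and how long information is stored in a neuron, i.e., active information storage) can be sketched on binarized spike trains. This is a minimal plug-in estimator over hypothetical binary time bins; the function names are illustrative and this is not the authors' analysis pipeline:

```python
import numpy as np

def entropy(states):
    """Shannon entropy (bits) of a discrete sequence."""
    _, counts = np.unique(states, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def active_information_storage(spikes, k=1):
    """Mutual information (bits) between each time bin and the k bins before it,
    for a binary (0/1) spike train."""
    spikes = np.asarray(spikes)
    past = np.array([spikes[i:i + k] for i in range(len(spikes) - k)])
    past_sym = past.dot(2 ** np.arange(k))   # encode each length-k past as one symbol
    present = spikes[k:]
    joint = 2 * past_sym + present           # unique code for each (past, present) pair
    return entropy(past_sym) + entropy(present) - entropy(joint)
```

A perfectly periodic train stores one full bit in its own past, while a constant train stores none; the abstract's finding corresponds to this storage quantity decreasing under DPT.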
Affiliation(s)
- Thomas F. Varley
- School of Informatics, Computing, and Engineering, Indiana University, Bloomington, IN, USA
- Department of Psychological and Brain Sciences, Indiana University, Bloomington, IN, USA
- Vermont Complex Systems Center, University of Vermont, Burlington, VT, USA
- Daniel Havert
- Department of Physics, Indiana University, Bloomington, IN, USA
- Leandro Fosque
- Department of Physics, Indiana University, Bloomington, IN, USA
- Abolfazl Alipour
- Department of Psychological and Brain Sciences, Indiana University, Bloomington, IN, USA
- Program in Neuroscience, Indiana University, Bloomington, IN, USA
- Maria Pope
- School of Informatics, Computing, and Engineering, Indiana University, Bloomington, IN, USA
- Program in Neuroscience, Indiana University, Bloomington, IN, USA
- John Beggs
- Department of Physics, Indiana University, Bloomington, IN, USA
- Program in Neuroscience, Indiana University, Bloomington, IN, USA
2
Varley TF. A Synergistic Perspective on Multivariate Computation and Causality in Complex Systems. Entropy (Basel) 2024; 26:883. PMID: 39451959; PMCID: PMC11507062; DOI: 10.3390/e26100883.
Abstract
What does it mean for a complex system to "compute" or perform "computations"? Intuitively, we can understand complex "computation" as occurring when a system's state is a function of multiple inputs (potentially including its own past state). Here, we discuss how computational processes in complex systems can be generally studied using the concept of statistical synergy, which is information about an output that can only be learned when the joint state of all inputs is known. Building on prior work, we show that this approach naturally leads to a link between multivariate information theory and topics in causal inference, specifically, the phenomenon of causal colliders. We begin by showing how Berkson's paradox implies a higher-order, synergistic interaction between multidimensional inputs and outputs. We then discuss how causal structure learning can refine and orient analyses of synergies in empirical data, and when empirical synergies meaningfully reflect computation versus when they may be spurious. We end by proposing that this conceptual link between synergy, causal colliders, and computation can serve as a foundation on which to build a mathematically rich general theory of computation in complex systems.
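The collider/synergy link described in this abstract has a textbook instance in the XOR gate: neither input alone carries any information about the output, but jointly they determine it completely. A toy check using a plug-in mutual information estimator over the exhaustive truth table (illustrative only, not the paper's formalism):

```python
import numpy as np
from itertools import product

def mutual_information(xs, ys):
    """Plug-in mutual information (bits) between two discrete sequences."""
    xs, ys = np.asarray(xs), np.asarray(ys)
    mi = 0.0
    for x in np.unique(xs):
        for y in np.unique(ys):
            pxy = np.mean((xs == x) & (ys == y))
            px, py = np.mean(xs == x), np.mean(ys == y)
            if pxy > 0:
                mi += pxy * np.log2(pxy / (px * py))
    return mi

# Exhaustive enumeration of the XOR collider: Z is a function of (X, Y)
xy = list(product([0, 1], repeat=2))
x = [a for a, b in xy]
y = [b for a, b in xy]
z = [a ^ b for a, b in xy]
joint = [2 * a + b for a, b in xy]  # (X, Y) treated as a single variable
```

Here I(X;Z) = I(Y;Z) = 0 while I(X,Y;Z) = 1 bit, the purely synergistic case; conditioning on the output Z induces a dependence between the otherwise independent inputs, which is Berkson's paradox in miniature.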
Affiliation(s)
- Thomas F Varley
- Vermont Complex Systems Center, University of Vermont, Burlington, VT 05405, USA
3
Luppi AI, Rosas FE, Mediano PAM, Demertzi A, Menon DK, Stamatakis EA. Unravelling consciousness and brain function through the lens of time, space, and information. Trends Neurosci 2024; 47:551-568. PMID: 38824075; DOI: 10.1016/j.tins.2024.05.007.
Abstract
Disentangling how cognitive functions emerge from the interplay of brain dynamics and network architecture is among the major challenges that neuroscientists face. Pharmacological and pathological perturbations of consciousness provide a lens through which to investigate these complex challenges. Here, we review how recent advances in the study of consciousness and the brain's functional organisation have been driven by a common denominator: decomposing brain function into fundamental constituents of time, space, and information. Whereas unconsciousness increases structure-function coupling across scales, psychedelics may decouple brain function from structure. Convergent effects also emerge: anaesthetics, psychedelics, and disorders of consciousness can exhibit similar reconfigurations of the brain's unimodal-transmodal functional axis. Decomposition approaches reveal the potential to translate discoveries across species, with computational modelling providing a path towards mechanistic integration.
Affiliation(s)
- Andrea I Luppi
- Division of Anaesthesia, University of Cambridge, Cambridge, UK; Department of Clinical Neurosciences, University of Cambridge, Cambridge, UK; Montreal Neurological Institute, McGill University, Montreal, QC, Canada; St John's College, University of Cambridge, Cambridge, UK; Center for Eudaimonia and Human Flourishing, Linacre College, University of Oxford, Oxford, UK.
- Fernando E Rosas
- Center for Eudaimonia and Human Flourishing, Linacre College, University of Oxford, Oxford, UK; Department of Informatics, University of Sussex, Brighton, UK; Center for Psychedelic Research, Imperial College London, London, UK
- Athena Demertzi
- Physiology of Cognition Lab, GIGA-Cyclotron Research Center In Vivo Imaging, University of Liège, Liège 4000, Belgium; Psychology and Neuroscience of Cognition Research Unit, University of Liège, Liège 4000, Belgium; National Fund for Scientific Research (FNRS), Brussels 1000, Belgium
- David K Menon
- Division of Anaesthesia, University of Cambridge, Cambridge, UK
- Emmanuel A Stamatakis
- Division of Anaesthesia, University of Cambridge, Cambridge, UK; Department of Clinical Neurosciences, University of Cambridge, Cambridge, UK
4
Ehrlich DA, Schick-Poland K, Makkeh A, Lanfermann F, Wollstadt P, Wibral M. Partial information decomposition for continuous variables based on shared exclusions: Analytical formulation and estimation. Phys Rev E 2024; 110:014115. PMID: 39161017; DOI: 10.1103/PhysRevE.110.014115.
Abstract
Describing statistical dependencies is foundational to empirical scientific research. For uncovering intricate and possibly nonlinear dependencies between a single target variable and several source variables within a system, a principled and versatile framework can be found in the theory of partial information decomposition (PID). Nevertheless, the majority of existing PID measures are restricted to categorical variables, while many systems of interest in science are continuous. In this paper, we present a novel analytic formulation for continuous redundancy, a generalization of mutual information, drawing inspiration from the concept of shared exclusions in probability space as in the discrete PID definition of I_{∩}^{sx}. Furthermore, we introduce a nearest-neighbor-based estimator for continuous PID and showcase its effectiveness by applying it to a simulated energy management system provided by the Honda Research Institute Europe GmbH. This work bridges the gap between the measure-theoretically postulated existence proofs for a continuous I_{∩}^{sx} and its practical application to real-world scientific problems.
5
Varley TF, Bongard J. Evolving higher-order synergies reveals a trade-off between stability and information-integration capacity in complex systems. Chaos 2024; 34:063127. PMID: 38865092; DOI: 10.1063/5.0200425.
Abstract
There has recently been an explosion of interest in how "higher-order" structures emerge in complex systems comprised of many interacting elements (often called "synergistic" information). This "emergent" organization has been found in a variety of natural and artificial systems, although at present, the field lacks a unified understanding of what the consequences of higher-order synergies and redundancies are for systems under study. Typical research treats the presence (or absence) of synergistic information as a dependent variable and reports changes in the level of synergy in response to some change in the system. Here, we attempt to flip the script: rather than treating higher-order information as a dependent variable, we use evolutionary optimization to evolve Boolean networks with significant higher-order redundancies, synergies, or statistical complexity. We then analyze these evolved populations of networks using established tools for characterizing discrete dynamics: the number of attractors, the average transient length, and the Derrida coefficient. We also assess the capacity of the systems to integrate information. We find that high-synergy systems are unstable and chaotic, but with a high capacity to integrate information. In contrast, evolved redundant systems are extremely stable, but have negligible capacity to integrate information. Finally, the complex systems that balance integration and segregation (known as Tononi-Sporns-Edelman complexity) show features of both chaoticity and stability, with a greater capacity to integrate information than the redundant systems while being more stable than the random and synergistic systems. We conclude that there may be a fundamental trade-off between the robustness of a system's dynamics and its capacity to integrate information (which inherently requires flexibility and sensitivity) and that certain kinds of complexity naturally balance this trade-off.
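The Derrida coefficient cited in this abstract measures how a one-bit perturbation spreads through a Boolean network in a single synchronous update. A sketch under simple assumptions (random N-K network construction, hypothetical function names; not the authors' implementation):

```python
import numpy as np

def random_boolean_network(n, k, rng):
    """Random N-K Boolean network: node i reads k distinct inputs
    through its own random truth table."""
    inputs = np.array([rng.choice(n, size=k, replace=False) for _ in range(n)])
    tables = rng.integers(0, 2, size=(n, 2 ** k))
    return inputs, tables

def step(state, inputs, tables):
    """One synchronous update of all nodes."""
    idx = (state[inputs] * (2 ** np.arange(inputs.shape[1]))).sum(axis=1)
    return tables[np.arange(len(state)), idx]

def derrida_coefficient(inputs, tables, n_trials, rng):
    """Mean one-step Hamming spread of a single-bit flip (the slope at the
    origin of the Derrida map): > 1 suggests chaotic, < 1 ordered dynamics."""
    n = tables.shape[0]
    spread = 0.0
    for _ in range(n_trials):
        s = rng.integers(0, 2, size=n)
        t = s.copy()
        t[rng.integers(n)] ^= 1
        spread += np.sum(step(s, inputs, tables) != step(t, inputs, tables))
    return spread / n_trials
```

A network in which every node simply copies itself has a coefficient of exactly 1 (the critical value), while a network of constant functions has a coefficient of 0; the paper's finding is that evolved high-synergy networks sit on the chaotic side of this boundary.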
Affiliation(s)
- Thomas F Varley
- Department of Computer Science, University of Vermont, Burlington, Vermont 05405, USA
- Vermont Complex Systems Center, University of Vermont, Burlington, Vermont 05405, USA
- Josh Bongard
- Department of Computer Science, University of Vermont, Burlington, Vermont 05405, USA
- Vermont Complex Systems Center, University of Vermont, Burlington, Vermont 05405, USA
6
Varley TF. Generalized decomposition of multivariate information. PLoS One 2024; 19:e0297128. PMID: 38315691; PMCID: PMC10843128; DOI: 10.1371/journal.pone.0297128.
Abstract
Since its introduction, the partial information decomposition (PID) has emerged as a powerful, information-theoretic technique useful for studying the structure of (potentially higher-order) interactions in complex systems. Despite its utility, the applicability of the PID is restricted by the need to assign elements as either "sources" or "targets", as well as the specific structure of the mutual information itself. Here, I introduce a generalized information decomposition that relaxes the source/target distinction while still satisfying the basic intuitions about information. This approach is based on the decomposition of the Kullback-Leibler divergence, and consequently allows for the analysis of any information gained when updating from an arbitrary prior to an arbitrary posterior. As a result, any information-theoretic measure that can be written as a linear combination of Kullback-Leibler divergences admits a decomposition in the style of Williams and Beer, including the total correlation, the negentropy, and the mutual information as special cases. This paper explores how the generalized information decomposition can reveal novel insights into existing measures, as well as the nature of higher-order synergies. We show that synergistic information is intimately related to the well-known Tononi-Sporns-Edelman (TSE) complexity, and that synergistic information requires a similar integration/segregation balance as a high TSE complexity. Finally, I end with a discussion of how this approach fits into other attempts to generalize the PID and the possibilities for empirical applications.
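The paper's starting point, that the mutual information is itself a Kullback-Leibler divergence from a product-of-marginals prior to the joint posterior, can be checked numerically on a toy distribution (the distributions below are hypothetical examples, not from the paper):

```python
import numpy as np

def kl_divergence(p, q):
    """KL divergence D(p||q) in bits between distributions on the same support."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    mask = p > 0
    return np.sum(p[mask] * np.log2(p[mask] / q[mask]))

# Joint distribution of two perfectly correlated fair bits, flattened over (x, y)
joint = np.array([0.5, 0.0, 0.0, 0.5])        # p(0,0), p(0,1), p(1,0), p(1,1)
marg_x = np.array([0.5, 0.5])
marg_y = np.array([0.5, 0.5])
product = np.outer(marg_x, marg_y).ravel()    # independent "prior"

# Mutual information as KL(joint || product of marginals): 1 bit here
mi = kl_divergence(joint, product)
```

Total correlation and negentropy fit the same mold (a KL divergence from a structured prior), which is why the generalized decomposition applies to all of them.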
Affiliation(s)
- Thomas F. Varley
- Department of Computer Science, University of Vermont, Burlington, VT, United States of America
- Vermont Complex Systems Center, University of Vermont, Burlington, VT, United States of America
7
Varley TF, Pope M, Faskowitz J, Sporns O. Multivariate information theory uncovers synergistic subsystems of the human cerebral cortex. Commun Biol 2023; 6:451. PMID: 37095282; PMCID: PMC10125999; DOI: 10.1038/s42003-023-04843-w.
Abstract
One of the most well-established tools for modeling the brain is the functional connectivity network, which is constructed from pairs of interacting brain regions. While powerful, the network model is limited by the restriction that only pairwise dependencies are considered and potentially higher-order structures are missed. Here, we explore how multivariate information theory reveals higher-order dependencies in the human brain. We begin with a mathematical analysis of the O-information, showing analytically and numerically how it is related to previously established information theoretic measures of complexity. We then apply the O-information to brain data, showing that synergistic subsystems are widespread in the human brain. Highly synergistic subsystems typically sit between canonical functional networks, and may serve an integrative role. We then use simulated annealing to find maximally synergistic subsystems, finding that such systems typically comprise ≈10 brain regions, recruited from multiple canonical brain systems. Though ubiquitous, highly synergistic subsystems are invisible when considering pairwise functional connectivity, suggesting that higher-order dependencies form a kind of shadow structure that has been unrecognized by established network-based analyses. We assert that higher-order interactions in the brain represent an under-explored space that, accessible with tools of multivariate information theory, may offer novel scientific insights.
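For jointly Gaussian variables, the O-information discussed in this abstract has a closed form in terms of the covariance matrix (O-information = total correlation minus dual total correlation). A minimal sketch under that Gaussian assumption, not the authors' fMRI estimation pipeline; positive values indicate redundancy-dominated and negative values synergy-dominated subsystems:

```python
import numpy as np

def gaussian_o_information(cov):
    """O-information (nats) of a multivariate Gaussian with covariance `cov`:
    Omega = TC - DTC. Positive => redundancy-dominated, negative => synergy."""
    cov = np.asarray(cov, dtype=float)
    _, logdet = np.linalg.slogdet(cov)
    tc = 0.5 * (np.sum(np.log(np.diag(cov))) - logdet)    # total correlation
    prec = np.linalg.inv(cov)
    dtc = 0.5 * (logdet + np.sum(np.log(np.diag(prec))))  # dual total correlation
    return tc - dtc
```

As a sanity check, independent variables give zero; a "collider" triple (two independent sources plus their noisy sum) comes out negative (synergy), while three noisy copies of one source come out positive (redundancy).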
Affiliation(s)
- Thomas F Varley
- School of Informatics, Computing & Engineering, Indiana University, Bloomington, IN, 47405, USA
- Department of Psychological & Brain Sciences, Indiana University, Bloomington, IN, 47405, USA
- Maria Pope
- School of Informatics, Computing & Engineering, Indiana University, Bloomington, IN, 47405, USA
- Program in Neuroscience, Indiana University, Bloomington, IN, 47405, USA
- Joshua Faskowitz
- Department of Psychological & Brain Sciences, Indiana University, Bloomington, IN, 47405, USA
- Program in Neuroscience, Indiana University, Bloomington, IN, 47405, USA
- Olaf Sporns
- School of Informatics, Computing & Engineering, Indiana University, Bloomington, IN, 47405, USA
- Department of Psychological & Brain Sciences, Indiana University, Bloomington, IN, 47405, USA
- Program in Neuroscience, Indiana University, Bloomington, IN, 47405, USA
8
Varley TF. Decomposing past and future: Integrated information decomposition based on shared probability mass exclusions. PLoS One 2023; 18:e0282950. PMID: 36952508; PMCID: PMC10035902; DOI: 10.1371/journal.pone.0282950.
Abstract
A core feature of complex systems is that the interactions between elements in the present causally constrain their own futures, and the futures of other elements as the system evolves through time. To fully model all of these interactions (between elements, as well as ensembles of elements), it is possible to decompose the total information flowing from past to future into a set of non-overlapping temporal interactions that describe all the different modes by which information can be stored, transferred, or modified. To achieve this, I propose a novel information-theoretic measure of temporal dependency (I_{τ}^{sx}) based on the logic of local probability mass exclusions. This integrated information decomposition can reveal emergent and higher-order interactions within the dynamics of a system, as well as refining existing measures. To demonstrate the utility of this framework, I apply the decomposition to spontaneous spiking activity recorded from dissociated neural cultures of rat cerebral cortex to show how different modes of information processing are distributed over the system. Furthermore, being a localizable analysis, I_{τ}^{sx} can provide insight into the computational structure of single moments. I explore the time-resolved computational structure of neuronal avalanches and find that different types of information atoms have distinct profiles over the course of an avalanche, with the majority of non-trivial information dynamics happening before the first half of the cascade is completed. These analyses allow us to move beyond the historical focus on single measures of dependency such as information transfer or information integration, and explore a panoply of different relationships between elements (and groups of elements) in complex systems.
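The "localizable" quality this abstract emphasizes rests on pointwise (local) information measures, which assign a value to each single moment rather than an average over the whole recording. A toy sketch of the simplest such measure, the local mutual information (plug-in estimator, illustrative only, not the I_{τ}^{sx} decomposition itself):

```python
import numpy as np

def local_mutual_information(xs, ys):
    """Pointwise mutual information i(x;y) = log2 p(x,y)/(p(x)p(y)) at each
    time step. It averages to the usual mutual information, and negative
    values flag locally 'misinformative' moments."""
    xs, ys = np.asarray(xs), np.asarray(ys)
    vals = np.empty(len(xs))
    for i in range(len(xs)):
        pxy = np.mean((xs == xs[i]) & (ys == ys[i]))
        px, py = np.mean(xs == xs[i]), np.mean(ys == ys[i])
        vals[i] = np.log2(pxy / (px * py))
    return vals
```

Because every time step gets its own value, measures built this way can be read out moment by moment across a neuronal avalanche, which is what makes the time-resolved profiles in the paper possible.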
Affiliation(s)
- Thomas F. Varley
- Department of Psychological and Brain Sciences, Indiana University Bloomington, Bloomington, IN, United States of America
- School of Informatics, Computing, and Engineering, Indiana University Bloomington, Bloomington, IN, United States of America