1. Luppi AI, Sanz Perl Y, Vohryzek J, Mediano PAM, Rosas FE, Milisav F, Suarez LE, Gini S, Gutierrez-Barragan D, Gozzi A, Misic B, Deco G, Kringelbach ML. Competitive interactions shape brain dynamics and computation across species. bioRxiv 2024:2024.10.19.619194. [PMID: 39484469] [PMCID: PMC11526968] [DOI: 10.1101/2024.10.19.619194] [Indexed: 11/03/2024]
Abstract
Adaptive cognition relies on cooperation across anatomically distributed brain circuits. However, specialised neural systems are also in constant competition for limited processing resources. How does the brain's network architecture enable it to balance these cooperative and competitive tendencies? Here we use computational whole-brain modelling to examine the dynamical and computational relevance of cooperative and competitive interactions in the mammalian connectome. Across human, macaque, and mouse, we show that the architecture of the models that most faithfully reproduce brain activity consistently combines modular cooperative interactions with diffuse, long-range competitive interactions. The model with competitive interactions consistently outperforms the cooperative-only model, providing an excellent fit to both spatial and dynamical properties of the living brain that were not explicitly optimised but rather emerge spontaneously. Competitive interactions in the effective connectivity produce greater levels of synergistic information and local-global hierarchy, and lead to superior computational capacity when used for neuromorphic computing. Altogether, this work provides a mechanistic link between network architecture, dynamical properties, and computation in the mammalian brain.
Affiliation(s)
- Andrea I. Luppi
- University of Oxford, Oxford, UK
- St John’s College, Cambridge, UK
- Montreal Neurological Institute, Montreal, Canada
- Silvia Gini
- Italian Institute of Technology, Rovereto, Italy
- Centre for Mind/Brain Sciences, University of Trento, Italy
2. Varley TF. A Synergistic Perspective on Multivariate Computation and Causality in Complex Systems. Entropy (Basel) 2024; 26:883. [PMID: 39451959] [PMCID: PMC11507062] [DOI: 10.3390/e26100883] [Received: 08/15/2024] [Revised: 10/17/2024] [Accepted: 10/18/2024] [Indexed: 10/26/2024]
Abstract
What does it mean for a complex system to "compute" or perform "computations"? Intuitively, we can understand complex "computation" as occurring when a system's state is a function of multiple inputs (potentially including its own past state). Here, we discuss how computational processes in complex systems can be generally studied using the concept of statistical synergy, which is information about an output that can only be learned when the joint state of all inputs is known. Building on prior work, we show that this approach naturally leads to a link between multivariate information theory and topics in causal inference, specifically, the phenomenon of causal colliders. We begin by showing how Berkson's paradox implies a higher-order, synergistic interaction between multidimensional inputs and outputs. We then discuss how causal structure learning can refine and orient analyses of synergies in empirical data, and when empirical synergies meaningfully reflect computation versus when they may be spurious. We end by proposing that this conceptual link between synergy, causal colliders, and computation can serve as a foundation on which to build a mathematically rich general theory of computation in complex systems.
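The statistical synergy this abstract builds on can be made concrete with the classic XOR example (a standard illustration, not taken from the paper itself): neither input alone carries any information about the output, yet the joint state of both inputs determines it completely. A minimal plug-in estimate in plain Python:

```python
from collections import Counter
from math import log2

def mutual_info(pairs):
    """Mutual information I(A;B) in bits from a list of (a, b) samples."""
    n = len(pairs)
    p_ab = Counter(pairs)
    p_a = Counter(a for a, _ in pairs)
    p_b = Counter(b for _, b in pairs)
    return sum((c / n) * log2((c / n) / ((p_a[a] / n) * (p_b[b] / n)))
               for (a, b), c in p_ab.items())

# XOR: the output is a function of the *joint* state of both inputs.
states = [(x1, x2, x1 ^ x2) for x1 in (0, 1) for x2 in (0, 1)]

i_x1 = mutual_info([(x1, y) for x1, _, y in states])            # single input
i_x2 = mutual_info([(x2, y) for _, x2, y in states])            # single input
i_joint = mutual_info([((x1, x2), y) for x1, x2, y in states])  # both inputs
```

Here `i_x1` and `i_x2` come out to 0 bits while `i_joint` is 1 bit: all of the information about the output is synergistic, learnable only from the joint state.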
Affiliation(s)
- Thomas F Varley
- Vermont Complex Systems Center, University of Vermont, Burlington, VT 05405, USA
3. Menesse G, Torres JJ. Information dynamics of in silico EEG Brain Waves: Insights into oscillations and functions. PLoS Comput Biol 2024; 20:e1012369. [PMID: 39236071] [PMCID: PMC11407780] [DOI: 10.1371/journal.pcbi.1012369] [Received: 12/04/2023] [Revised: 09/17/2024] [Accepted: 07/26/2024] [Indexed: 09/07/2024]
Abstract
The relation between electroencephalography (EEG) rhythms, brain functions, and behavioral correlates is well established. Some physiological mechanisms underlying rhythm generation are understood, enabling the replication of brain rhythms in silico. This offers a pathway to explore connections between neural oscillations and specific neuronal circuits, potentially yielding fundamental insights into the functional properties of brain waves. Information theory frameworks, such as Integrated Information Decomposition (Φ-ID), relate dynamical regimes with informational properties, providing deeper insights into neuronal dynamic functions. Here, we investigate wave emergence in an excitatory/inhibitory (E/I) balanced network of integrate-and-fire neurons with short-term synaptic plasticity. This model produces a diverse range of EEG-like rhythms, from low δ waves to high-frequency oscillations. Through Φ-ID, we analyze the network's information dynamics and their relation with different emergent rhythms, elucidating the system's suitability for functions such as robust information transfer, storage, and parallel operation. Furthermore, our study helps to identify regimes that may resemble pathological states due to poor informational properties and high randomness. We found, for example, that in silico β and δ waves are associated with maximum information transfer in inhibitory and excitatory neuron populations, respectively, and that the coexistence of excitatory θ, α, and β waves is associated with information storage. Additionally, we observed that high-frequency oscillations can exhibit either high or poor informational properties, potentially shedding light on ongoing discussions regarding physiological versus pathological high-frequency oscillations. In summary, our study demonstrates that dynamical regimes with similar oscillations may exhibit vastly different information dynamics. Characterizing information dynamics within these regimes serves as a potent tool for gaining insights into the functions of complex neuronal networks. Finally, our findings suggest that the use of information dynamics in both model and experimental data analysis could help discriminate between oscillations associated with cognitive functions and those linked to neuronal disorders.
Affiliation(s)
- Gustavo Menesse
- Department of Electromagnetism and Physics of the Matter & Institute Carlos I for Theoretical and Computational Physics, University of Granada, Granada, Spain
- Departamento de Física, Facultad de Ciencias Exactas y Naturales, Universidad Nacional de Asunción, San Lorenzo, Paraguay
- Joaquín J Torres
- Department of Electromagnetism and Physics of the Matter & Institute Carlos I for Theoretical and Computational Physics, University of Granada, Granada, Spain
4. Luppi AI, Mediano PAM, Rosas FE, Allanson J, Pickard J, Carhart-Harris RL, Williams GB, Craig MM, Finoia P, Owen AM, Naci L, Menon DK, Bor D, Stamatakis EA. A synergistic workspace for human consciousness revealed by Integrated Information Decomposition. eLife 2024; 12:RP88173. [PMID: 39022924] [PMCID: PMC11257694] [DOI: 10.7554/eLife.88173] [Indexed: 07/20/2024]
Abstract
How is the information-processing architecture of the human brain organised, and how does its organisation support consciousness? Here, we combine network science and a rigorous information-theoretic notion of synergy to delineate a 'synergistic global workspace', comprising gateway regions that gather synergistic information from specialised modules across the human brain. This information is then integrated within the workspace and widely distributed via broadcaster regions. Through functional MRI analysis, we show that gateway regions of the synergistic workspace correspond to the human brain's default mode network, whereas broadcasters coincide with the executive control network. We find that loss of consciousness due to general anaesthesia or disorders of consciousness corresponds to diminished ability of the synergistic workspace to integrate information, which is restored upon recovery. Thus, loss of consciousness coincides with a breakdown of information integration within the synergistic workspace of the human brain. This work contributes to conceptual and empirical reconciliation between two prominent scientific theories of consciousness, the Global Neuronal Workspace and Integrated Information Theory, while also advancing our understanding of how the human brain supports consciousness through the synergistic integration of information.
Affiliation(s)
- Andrea I Luppi
- Department of Clinical Neurosciences, University of Cambridge, Cambridge, United Kingdom
- University Division of Anaesthesia, School of Clinical Medicine, University of Cambridge, Cambridge, United Kingdom
- Pedro AM Mediano
- Department of Psychology, University of Cambridge, Cambridge, United Kingdom
- Fernando E Rosas
- Center for Psychedelic Research, Department of Brain Science, Imperial College London, London, United Kingdom
- Center for Complexity Science, Imperial College London, London, United Kingdom
- Data Science Institute, Imperial College London, London, United Kingdom
- Judith Allanson
- Department of Clinical Neurosciences, University of Cambridge, Cambridge, United Kingdom
- Department of Neurosciences, Cambridge University Hospitals NHS Foundation, Addenbrooke's Hospital, Cambridge, United Kingdom
- John Pickard
- Department of Clinical Neurosciences, University of Cambridge, Cambridge, United Kingdom
- Wolfson Brain Imaging Centre, University of Cambridge, Cambridge, United Kingdom
- Division of Neurosurgery, School of Clinical Medicine, University of Cambridge, Addenbrooke's Hospital, Cambridge, United Kingdom
- Robin L Carhart-Harris
- Center for Psychedelic Research, Department of Brain Science, Imperial College London, London, United Kingdom
- Psychedelics Division - Neuroscape, Department of Neurology, University of California, San Francisco, United States
- Guy B Williams
- Department of Clinical Neurosciences, University of Cambridge, Cambridge, United Kingdom
- Wolfson Brain Imaging Centre, University of Cambridge, Cambridge, United Kingdom
- Michael M Craig
- Department of Clinical Neurosciences, University of Cambridge, Cambridge, United Kingdom
- University Division of Anaesthesia, School of Clinical Medicine, University of Cambridge, Cambridge, United Kingdom
- Paola Finoia
- Department of Clinical Neurosciences, University of Cambridge, Cambridge, United Kingdom
- Adrian M Owen
- Department of Psychology and Department of Physiology and Pharmacology, The Brain and Mind Institute, University of Western Ontario, London, Canada
- Lorina Naci
- Trinity College Institute of Neuroscience, School of Psychology, Lloyd Building, Trinity College, Dublin, Ireland
- David K Menon
- University Division of Anaesthesia, School of Clinical Medicine, University of Cambridge, Cambridge, United Kingdom
- Wolfson Brain Imaging Centre, University of Cambridge, Cambridge, United Kingdom
- Daniel Bor
- Department of Psychology, University of Cambridge, Cambridge, United Kingdom
- Emmanuel A Stamatakis
- University Division of Anaesthesia, School of Clinical Medicine, University of Cambridge, Cambridge, United Kingdom
5. Luppi AI, Rosas FE, Mediano PAM, Demertzi A, Menon DK, Stamatakis EA. Unravelling consciousness and brain function through the lens of time, space, and information. Trends Neurosci 2024; 47:551-568. [PMID: 38824075] [DOI: 10.1016/j.tins.2024.05.007] [Received: 02/15/2024] [Revised: 04/29/2024] [Accepted: 05/09/2024] [Indexed: 06/03/2024]
Abstract
Disentangling how cognitive functions emerge from the interplay of brain dynamics and network architecture is among the major challenges that neuroscientists face. Pharmacological and pathological perturbations of consciousness provide a lens to investigate these complex challenges. Here, we review how recent advances in the study of consciousness and the brain's functional organisation have been driven by a common denominator: decomposing brain function into fundamental constituents of time, space, and information. Whereas unconsciousness increases structure-function coupling across scales, psychedelics may decouple brain function from structure. Convergent effects also emerge: anaesthetics, psychedelics, and disorders of consciousness can exhibit similar reconfigurations of the brain's unimodal-transmodal functional axis. Decomposition approaches reveal the potential to translate discoveries across species, with computational modelling providing a path towards mechanistic integration.
Affiliation(s)
- Andrea I Luppi
- Division of Anaesthesia, University of Cambridge, Cambridge, UK; Department of Clinical Neurosciences, University of Cambridge, Cambridge, UK; Montreal Neurological Institute, McGill University, Montreal, QC, Canada; St John's College, University of Cambridge, Cambridge, UK; Center for Eudaimonia and Human Flourishing, Linacre College, University of Oxford, Oxford, UK
- Fernando E Rosas
- Center for Eudaimonia and Human Flourishing, Linacre College, University of Oxford, Oxford, UK; Department of Informatics, University of Sussex, Brighton, UK; Center for Psychedelic Research, Imperial College London, London, UK
- Athena Demertzi
- Physiology of Cognition Lab, GIGA-Cyclotron Research Center In Vivo Imaging, University of Liège, Liège 4000, Belgium; Psychology and Neuroscience of Cognition Research Unit, University of Liège, Liège 4000, Belgium; National Fund for Scientific Research (FNRS), Brussels 1000, Belgium
- David K Menon
- Division of Anaesthesia, University of Cambridge, Cambridge, UK
- Emmanuel A Stamatakis
- Division of Anaesthesia, University of Cambridge, Cambridge, UK; Department of Clinical Neurosciences, University of Cambridge, Cambridge, UK
6. Varley TF, Bongard J. Evolving higher-order synergies reveals a trade-off between stability and information-integration capacity in complex systems. Chaos 2024; 34:063127. [PMID: 38865092] [DOI: 10.1063/5.0200425] [Received: 01/26/2024] [Accepted: 05/21/2024] [Indexed: 06/13/2024]
Abstract
There has recently been an explosion of interest in how "higher-order" structures emerge in complex systems composed of many interacting elements (often called "synergistic" information). This "emergent" organization has been found in a variety of natural and artificial systems, although at present, the field lacks a unified understanding of what the consequences of higher-order synergies and redundancies are for systems under study. Typical research treats the presence (or absence) of synergistic information as a dependent variable and reports changes in the level of synergy in response to some change in the system. Here, we attempt to flip the script: rather than treating higher-order information as a dependent variable, we use evolutionary optimization to evolve Boolean networks with significant higher-order redundancies, synergies, or statistical complexity. We then analyze these evolved populations of networks using established tools for characterizing discrete dynamics: the number of attractors, the average transient length, and the Derrida coefficient. We also assess the capacity of the systems to integrate information. We find that high-synergy systems are unstable and chaotic, but with a high capacity to integrate information. In contrast, evolved redundant systems are extremely stable, but have negligible capacity to integrate information. Finally, the complex systems that balance integration and segregation (known as Tononi-Sporns-Edelman complexity) show features of both chaoticity and stability, with a greater capacity to integrate information than the redundant systems while being more stable than the random and synergistic systems. We conclude that there may be a fundamental trade-off between the robustness of a system's dynamics and its capacity to integrate information (which inherently requires flexibility and sensitivity) and that certain kinds of complexity naturally balance this trade-off.
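The Derrida coefficient mentioned in this abstract has a simple operational core: flip one bit of a random state, advance both copies one step, and measure how far apart they end up (an average one-step distance above 1 indicates chaotic spreading, below 1 ordered damping). A rough sketch of one common operationalisation, assuming a standard Kauffman-style random Boolean network rather than the authors' evolved networks:

```python
import random

def make_rbn(n, k, rng):
    """Random Boolean network: each node reads k distinct random inputs
    through a random lookup table over all 2**k input patterns."""
    inputs = [rng.sample(range(n), k) for _ in range(n)]
    tables = [[rng.randint(0, 1) for _ in range(2 ** k)] for _ in range(n)]
    return inputs, tables

def step(state, inputs, tables):
    """Synchronous update of every node."""
    new = []
    for i in range(len(state)):
        idx = 0
        for j in inputs[i]:
            idx = (idx << 1) | state[j]  # pack inputs into a table index
        new.append(tables[i][idx])
    return new

def derrida_coefficient(n=50, k=2, trials=2000, seed=0):
    """Average one-step Hamming distance between a random state and a copy
    with a single bit flipped (initial distance 1), over fresh networks."""
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        inputs, tables = make_rbn(n, k, rng)
        a = [rng.randint(0, 1) for _ in range(n)]
        b = list(a)
        b[rng.randrange(n)] ^= 1  # perturb one node
        total += sum(x != y for x, y in zip(step(a, inputs, tables),
                                            step(b, inputs, tables)))
    return total / trials
```

For K = 2 with unbiased random truth tables, this estimate hovers near the critical value of 1: each perturbed node is read by about two others, and each reader's output differs with probability one half.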
Affiliation(s)
- Thomas F Varley
- Department of Computer Science, University of Vermont, Burlington, Vermont 05405, USA
- Vermont Complex Systems Center, University of Vermont, Burlington, Vermont 05405, USA
- Josh Bongard
- Department of Computer Science, University of Vermont, Burlington, Vermont 05405, USA
- Vermont Complex Systems Center, University of Vermont, Burlington, Vermont 05405, USA
7. Menesse G, Houben AM, Soriano J, Torres JJ. Integrated information decomposition unveils major structural traits of in silico and in vitro neuronal networks. Chaos 2024; 34:053139. [PMID: 38809907] [DOI: 10.1063/5.0201454] [Received: 01/30/2024] [Accepted: 05/06/2024] [Indexed: 05/31/2024]
Abstract
The properties of complex networked systems arise from the interplay between the dynamics of their elements and the underlying topology. Thus, to understand their behavior, it is crucial to gather as much information as possible about their topological organization. However, in large systems, such as neuronal networks, the reconstruction of such topology is usually carried out from the information encoded in the dynamics on the network, such as spike train time series, and by measuring the transfer entropy between system elements. The topological information recovered by these methods does not necessarily capture the connectivity layout, but rather the causal flow of information between elements. New theoretical frameworks, such as Integrated Information Decomposition (Φ-ID), allow one to explore the modes in which information can flow between parts of a system, opening a rich landscape of interactions between network topology, dynamics, and information. Here, we apply Φ-ID on in silico and in vitro data to decompose the usual transfer entropy measure into different modes of information transfer, namely, synergistic, redundant, or unique. We demonstrate that the unique information transfer is the most relevant measure to uncover structural topological details from network activity data, while redundant information only introduces residual information for this application. Although the retrieved network connectivity is still functional, it captures more details of the underlying structural topology because it leaves out emergent higher-order interactions and information redundancy between elements, which are important for functional behavior but mask the detection of the direct pairwise interactions that constitute the structural network topology.
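The transfer entropy that Φ-ID decomposes is, at history length one, the conditional mutual information TE(X→Y) = I(Y_t ; X_{t-1} | Y_{t-1}). A minimal plug-in estimator for binary time series (an illustrative sketch, not the estimator used in the paper):

```python
import random
from collections import Counter
from math import log2

def transfer_entropy(x, y):
    """TE(X -> Y) = I(Y_t ; X_{t-1} | Y_{t-1}) in bits, estimated by
    plug-in counts over a pair of discrete time series."""
    triples = list(zip(y[1:], x[:-1], y[:-1]))  # (Y_t, X_{t-1}, Y_{t-1})
    n = len(triples)
    c_xyz = Counter(triples)
    c_yz = Counter((a, c) for a, _, c in triples)  # (Y_t, Y_{t-1})
    c_xz = Counter((b, c) for _, b, c in triples)  # (X_{t-1}, Y_{t-1})
    c_z = Counter(c for _, _, c in triples)        # Y_{t-1}
    te = 0.0
    for (a, b, c), k in c_xyz.items():
        p_abc = k / n
        te += p_abc * log2((p_abc * (c_z[c] / n)) /
                           ((c_yz[(a, c)] / n) * (c_xz[(b, c)] / n)))
    return te

# Y copies X with a one-step lag: all apparent information flow is X -> Y.
rng = random.Random(1)
x = [rng.randint(0, 1) for _ in range(5000)]
y = [0] + x[:-1]

te_xy = transfer_entropy(x, y)  # large: X drives Y
te_yx = transfer_entropy(y, x)  # near zero: no feedback
```

With Y a lagged copy of a fair-coin series X, `te_xy` comes out close to 1 bit while `te_yx` stays near 0, which is exactly the directionality that makes transfer entropy attractive for topology reconstruction.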
Affiliation(s)
- Gustavo Menesse
- Department of Electromagnetism and Physics of the Matter & Institute Carlos I for Theoretical and Computational Physics, University of Granada, 18071 Granada, Spain
- Departamento de Física, Facultad de Ciencias Exactas y Naturales, Universidad Nacional de Asunción, 111451 San Lorenzo, Paraguay
- Akke Mats Houben
- Departament de Física de la Matèria Condensada, Universitat de Barcelona and Universitat de Barcelona Institute of Complex Systems (UBICS), E-08028 Barcelona, Spain
- Jordi Soriano
- Departament de Física de la Matèria Condensada, Universitat de Barcelona and Universitat de Barcelona Institute of Complex Systems (UBICS), E-08028 Barcelona, Spain
- Joaquín J Torres
- Department of Electromagnetism and Physics of the Matter & Institute Carlos I for Theoretical and Computational Physics, University of Granada, 18071 Granada, Spain
8. Luppi AI, Rosas FE, Mediano PAM, Menon DK, Stamatakis EA. Information decomposition and the informational architecture of the brain. Trends Cogn Sci 2024; 28:352-368. [PMID: 38199949] [DOI: 10.1016/j.tics.2023.11.005] [Received: 09/12/2023] [Revised: 11/09/2023] [Accepted: 11/17/2023] [Indexed: 01/12/2024]
Abstract
To explain how the brain orchestrates information processing for cognition, we must understand information itself. Importantly, information is not a monolithic entity. Information decomposition techniques provide a way to split information into its constituent elements: unique, redundant, and synergistic information. We review how disentangling synergistic and redundant interactions is redefining our understanding of integrative brain function and its neural organisation. To explain how the brain navigates the trade-offs between redundancy and synergy, we review converging evidence integrating the structural, molecular, and functional underpinnings of synergy and redundancy; their roles in cognition and computation; and how they might arise over evolution and development. Overall, disentangling synergistic and redundant information provides a guiding principle for understanding the informational architecture of the brain and cognition.
Affiliation(s)
- Andrea I Luppi
- Division of Anaesthesia, University of Cambridge, Cambridge, UK; Department of Clinical Neurosciences, University of Cambridge, Cambridge, UK; Montreal Neurological Institute, McGill University, Montreal, QC, Canada
- Fernando E Rosas
- Department of Informatics, University of Sussex, Brighton, UK; Centre for Psychedelic Research, Department of Brain Sciences, Imperial College London, London, UK; Centre for Complexity Science, Imperial College London, London, UK; Centre for Eudaimonia and Human Flourishing, University of Oxford, Oxford, UK
- Pedro A M Mediano
- Department of Computing, Imperial College London, London, UK; Department of Psychology, University of Cambridge, Cambridge, UK
- David K Menon
- Department of Medicine, University of Cambridge, Cambridge, UK; Wolfson Brain Imaging Centre, University of Cambridge, Cambridge, UK
- Emmanuel A Stamatakis
- Division of Anaesthesia, University of Cambridge, Cambridge, UK; Department of Clinical Neurosciences, University of Cambridge, Cambridge, UK
9. Varley TF. Generalized decomposition of multivariate information. PLoS One 2024; 19:e0297128. [PMID: 38315691] [PMCID: PMC10843128] [DOI: 10.1371/journal.pone.0297128] [Received: 09/14/2023] [Accepted: 12/28/2023] [Indexed: 02/07/2024]
Abstract
Since its introduction, the partial information decomposition (PID) has emerged as a powerful, information-theoretic technique useful for studying the structure of (potentially higher-order) interactions in complex systems. Despite its utility, the applicability of the PID is restricted by the need to assign elements as either "sources" or "targets", as well as the specific structure of the mutual information itself. Here, I introduce a generalized information decomposition that relaxes the source/target distinction while still satisfying the basic intuitions about information. This approach is based on the decomposition of the Kullback-Leibler divergence, and consequently allows for the analysis of any information gained when updating from an arbitrary prior to an arbitrary posterior. As a result, any information-theoretic measure that can be written as a linear combination of Kullback-Leibler divergences admits a decomposition in the style of Williams and Beer, including the total correlation, the negentropy, and the mutual information as special cases. This paper explores how the generalized information decomposition can reveal novel insights into existing measures, as well as the nature of higher-order synergies. I show that synergistic information is intimately related to the well-known Tononi-Sporns-Edelman (TSE) complexity, and that synergistic information requires a similar integration/segregation balance as a high TSE complexity. Finally, I end with a discussion of how this approach fits into other attempts to generalize the PID and the possibilities for empirical applications.
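The pivot of this abstract — that mutual information is itself a Kullback-Leibler divergence, namely the information gained when updating from the independent "prior" p(x)p(y) to the joint "posterior" p(x,y) — is easy to verify numerically. A small sketch with a hand-picked discrete distribution (illustrative only, not the generalized decomposition itself):

```python
from math import log2

def kl(p, q):
    """D_KL(p || q) in bits, over distributions given as aligned lists."""
    return sum(pi * log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def h(probs):
    """Shannon entropy in bits."""
    return -sum(p * log2(p) for p in probs if p > 0)

# A small joint distribution p(x, y) over {0,1} x {0,1} with correlation.
joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}
px = {x: sum(v for (a, _), v in joint.items() if a == x) for x in (0, 1)}
py = {y: sum(v for (_, b), v in joint.items() if b == y) for y in (0, 1)}

support = sorted(joint)
p = [joint[s] for s in support]          # posterior: the true joint
q = [px[x] * py[y] for x, y in support]  # prior: product of marginals

# I(X;Y) as the KL divergence from the independent prior to the joint...
mi = kl(p, q)
# ...agrees with the classic entropy form I(X;Y) = H(X) + H(Y) - H(X,Y).
mi_entropy = h(px.values()) + h(py.values()) - h(joint.values())
```

For this distribution both routes give I(X;Y) ≈ 0.278 bits; the KL form is the one that generalizes to arbitrary prior/posterior pairs in the paper's framework.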
Affiliation(s)
- Thomas F. Varley
- Department of Computer Science, University of Vermont, Burlington, VT, United States of America
- Vermont Complex Systems Center, University of Vermont, Burlington, VT, United States of America
10. Varley TF, Pope M, Puxeddu MG, Faskowitz J, Sporns O. Partial entropy decomposition reveals higher-order information structures in human brain activity. Proc Natl Acad Sci U S A 2023; 120:e2300888120. [PMID: 37467265] [PMCID: PMC10372615] [DOI: 10.1073/pnas.2300888120] [Received: 01/16/2023] [Accepted: 06/06/2023] [Indexed: 07/21/2023]
Abstract
The standard approach to modeling the human brain as a complex system is with a network, where the basic unit of interaction is a pairwise link between two brain regions. While powerful, this approach is limited by the inability to assess higher-order interactions involving three or more elements directly. In this work, we explore a method for capturing higher-order dependencies in multivariate data: the partial entropy decomposition (PED). Our approach decomposes the joint entropy of the whole system into a set of nonnegative atoms that describe the redundant, unique, and synergistic interactions that compose the system's structure. PED gives insight into the mathematics of functional connectivity and its limitations. When applied to resting-state fMRI data, we find robust evidence of higher-order synergies that are largely invisible to standard functional connectivity analyses. Our approach can also be localized in time, allowing a frame-by-frame analysis of how the distributions of redundancies and synergies change over the course of a recording. We find that different ensembles of regions can transiently change from being redundancy-dominated to synergy-dominated and that the temporal pattern is structured in time. These results provide strong evidence that there exists a large space of unexplored structures in human brain data that have been largely missed by a focus on bivariate network connectivity models. This synergistic structure is dynamic in time and likely will illuminate interesting links between brain and behavior. Beyond brain-specific applications, the PED provides a very general approach for understanding higher-order structures in a variety of complex systems.
Affiliation(s)
- Thomas F. Varley
- School of Informatics, Computing and Engineering, Indiana University, Bloomington, IN 47405
- Department of Psychological and Brain Sciences, Indiana University, Bloomington, IN 47405
- Maria Pope
- School of Informatics, Computing and Engineering, Indiana University, Bloomington, IN 47405
- Program in Neuroscience, Indiana University, Bloomington, IN 47405
- Maria Grazia Puxeddu
- Department of Psychological and Brain Sciences, Indiana University, Bloomington, IN 47405
- Joshua Faskowitz
- School of Informatics, Computing and Engineering, Indiana University, Bloomington, IN 47405
- Program in Neuroscience, Indiana University, Bloomington, IN 47405
- Olaf Sporns
- School of Informatics, Computing and Engineering, Indiana University, Bloomington, IN 47405
- Department of Psychological and Brain Sciences, Indiana University, Bloomington, IN 47405
- Program in Neuroscience, Indiana University, Bloomington, IN 47405