1. Li J, Bauer R, Rentzeperis I, van Leeuwen C. Adaptive rewiring: a general principle for neural network development. Frontiers in Network Physiology 2024; 4:1410092. PMID: 39534101; PMCID: PMC11554485; DOI: 10.3389/fnetp.2024.1410092.
Abstract
The nervous system, especially the human brain, is characterized by its highly complex network topology. The neurodevelopment of some of its features has been described in terms of dynamic optimization rules. We discuss the principle of adaptive rewiring, i.e., the dynamic reorganization of a network according to the intensity of internal signal communication as measured by synchronization or diffusion, and its recent generalization for applications in directed networks. These have extended the principle of adaptive rewiring from highly oversimplified networks to more neurally plausible ones. Adaptive rewiring captures all the key features of the complex brain topology: it transforms initially random or regular networks into networks with a modular small-world structure and a rich-club core. This effect is specific in the sense that it can be tailored to computational needs, robust in the sense that it does not depend on a critical regime, and flexible in the sense that parametric variation generates a range of variant network configurations. Extreme variant networks can be associated at macroscopic level with disorders such as schizophrenia, autism, and dyslexia, and suggest a relationship between dyslexia and creativity. Adaptive rewiring cooperates with network growth and interacts constructively with spatial organization principles in the formation of topographically distinct modules and structures such as ganglia and chains. At the mesoscopic level, adaptive rewiring enables the development of functional architectures, such as convergent-divergent units, and sheds light on the early development of divergence and convergence in, for example, the visual system. Finally, we discuss future prospects for the principle of adaptive rewiring.
Affiliation(s)
- Jia Li: Brain and Cognition, KU Leuven, Leuven, Belgium; Cognitive Science, RPTU Kaiserslautern, Kaiserslautern, Germany
- Roman Bauer: NICE Research Group, Computer Science Research Centre, University of Surrey, Guildford, United Kingdom
- Ilias Rentzeperis: Institute of Optics, Spanish National Research Council (CSIC), Madrid, Spain
- Cees van Leeuwen: Brain and Cognition, KU Leuven, Leuven, Belgium; Cognitive Science, RPTU Kaiserslautern, Kaiserslautern, Germany
2. Duswald T, Breitwieser L, Thorne T, Wohlmuth B, Bauer R. Calibration of stochastic, agent-based neuron growth models with approximate Bayesian computation. J Math Biol 2024; 89:50. PMID: 39379537; PMCID: PMC11461709; DOI: 10.1007/s00285-024-02144-2.
Abstract
Understanding how genetically encoded rules drive and guide complex neuronal growth processes is essential to comprehending the brain's architecture, and agent-based models (ABMs) offer a powerful simulation approach to further develop this understanding. However, accurately calibrating these models remains a challenge. Here, we present a novel application of Approximate Bayesian Computation (ABC) to address this issue. ABMs are based on parametrized stochastic rules that describe the time evolution of small components (the so-called agents) discretizing the system, leading to stochastic simulations that require appropriate treatment. Mathematically, the calibration defines a stochastic inverse problem. We propose to address it in a Bayesian setting using ABC. We facilitate the repeated comparison between data and simulations by quantifying the morphological information of single neurons with so-called morphometrics and resort to statistical distances to measure discrepancies between populations thereof. We conduct experiments on synthetic as well as experimental data. We find that ABC utilizing Sequential Monte Carlo sampling and the Wasserstein distance finds accurate posterior parameter distributions for representative ABMs. We further demonstrate that these ABMs capture specific features of pyramidal cells of the hippocampus (CA1). Overall, this work establishes a robust framework for calibrating agent-based neuronal growth models and opens the door for future investigations using Bayesian techniques for model building, verification, and adequacy assessment.
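The calibration loop described above reduces to: draw parameters from a prior, run the stochastic growth model, summarize each simulated population with morphometrics, and keep parameters whose simulated morphometric distribution lies close to the observed one. Below is a minimal sketch of that idea using plain rejection ABC with a 1D Wasserstein distance; the toy simulator, prior bounds, and acceptance threshold are illustrative assumptions, not the paper's BioDynaMo model or its Sequential Monte Carlo sampler.

```python
# Hedged sketch: rejection ABC over a toy stochastic "growth" simulator.
# The real pipeline in the paper uses BioDynaMo ABMs, morphometric summaries,
# and Sequential Monte Carlo; this illustrates only the core accept/reject idea.
import numpy as np
from scipy.stats import wasserstein_distance

rng = np.random.default_rng(0)

def simulate_morphometrics(branch_rate, n_cells=200):
    """Toy stand-in for an agent-based growth model: returns one
    morphometric (e.g. a branch count) per simulated neuron."""
    return rng.poisson(branch_rate, size=n_cells).astype(float)

# "Observed" population, generated here with a known parameter for testing.
observed = simulate_morphometrics(branch_rate=6.0)

def abc_rejection(observed, prior_low, prior_high, n_draws=5000, eps=0.5):
    accepted = []
    for _ in range(n_draws):
        theta = rng.uniform(prior_low, prior_high)   # draw from the prior
        simulated = simulate_morphometrics(theta)    # run the forward model
        # Compare simulated vs. observed morphometric distributions.
        if wasserstein_distance(observed, simulated) < eps:
            accepted.append(theta)
    return np.array(accepted)

posterior = abc_rejection(observed, prior_low=1.0, prior_high=12.0)
print(f"accepted {posterior.size} draws, "
      f"posterior mean ~ {posterior.mean():.2f} (true value 6.0)")
```

Sequential Monte Carlo ABC, as used in the paper, replaces the fixed threshold with a decreasing schedule and reweights the accepted particles between rounds.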
Affiliation(s)
- Tobias Duswald: CERN, Geneva, Switzerland; School of Computation, Information, and Technology, Technical University of Munich, Munich, Germany
- Lukas Breitwieser: Department of Information Technology and Electrical Engineering, ETH Zurich, Zurich, Switzerland
- Thomas Thorne: School of Computer Science and Electronic Engineering, University of Surrey, Guildford, UK
- Barbara Wohlmuth: School of Computation, Information, and Technology, Technical University of Munich, Munich, Germany
- Roman Bauer: School of Computer Science and Electronic Engineering, University of Surrey, Guildford, UK
3. de Montigny J, Sernagor E, Bauer R. Retinal self-organization: a model of retinal ganglion cells and starburst amacrine cells mosaic formation. Open Biol 2023; 13:220217. PMID: 37015288; PMCID: PMC10072945; DOI: 10.1098/rsob.220217.
Abstract
Individual retinal cell types exhibit semi-regular spatial patterns called retinal mosaics. Retinal ganglion cells (RGCs) and starburst amacrine cells (SACs) are known to exhibit such layouts. The mechanisms responsible for mosaic formation are not well understood but follow three main principles: (i) homotypic cells prevent nearby cells from adopting the same type, (ii) tangential cell migration and (iii) cell death. Alongside experiments in mouse, we use BioDynaMo, an agent-based simulation framework, to build a detailed and mechanistic model of mosaic formation. We investigate the implications of the three principles for RGC mosaic formation. We report that the cell migration mechanism yields the most regular mosaics. In addition, we propose that low-density RGC type mosaics exhibit on average low regularities, and thus we question the relevance of regular spacing as a criterion for a group of RGCs to form an RGC type. We investigate SAC mosaic formation and interactions between the ganglion cell layer (GCL) and inner nuclear layer (INL) populations. We propose that homotypic interactions between the GCL and INL populations during mosaic creation are required to reproduce the observed characteristics of SAC mosaics. This suggests that the GCL and INL populations of SACs might not be independent during retinal development.
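Mosaic "regularity" in this literature is commonly scored with the nearest-neighbour regularity index (mean nearest-neighbour distance divided by its standard deviation). The sketch below computes that index for a random point pattern and for a jittered grid standing in for a mosaic shaped by homotypic exclusion or migration; the point patterns and parameters are illustrative, not the paper's BioDynaMo simulations or mouse data.

```python
# Hedged sketch: nearest-neighbour regularity index (RI = mean NND / SD NND),
# a standard mosaic-regularity measure; illustrative only, not the paper's code.
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(1)

def regularity_index(points):
    """RI of a 2D point pattern; random (Poisson) patterns score roughly 1.9,
    regular retinal mosaics typically score noticeably higher."""
    tree = cKDTree(points)
    # k=2: the nearest "neighbour" of each point is the point itself (distance 0).
    dists, _ = tree.query(points, k=2)
    nnd = dists[:, 1]
    return nnd.mean() / nnd.std()

# Random ("no interaction") pattern vs. a jittered grid standing in for a
# mosaic shaped by homotypic exclusion or tangential migration.
random_pts = rng.uniform(0, 1000, size=(400, 2))
grid = np.stack(np.meshgrid(np.arange(20), np.arange(20)), -1).reshape(-1, 2) * 50.0
mosaic_pts = grid + rng.normal(0, 5.0, size=grid.shape)

print(f"random pattern RI ~ {regularity_index(random_pts):.2f}")
print(f"mosaic-like RI    ~ {regularity_index(mosaic_pts):.2f}")
```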
Affiliation(s)
- Jean de Montigny: Biosciences Institute, Newcastle University, Newcastle upon Tyne NE1 7RU, UK
- Evelyne Sernagor: Biosciences Institute, Newcastle University, Newcastle upon Tyne NE1 7RU, UK
- Roman Bauer: Department of Computer Science, University of Surrey, Guildford GU2 7XH, UK
4. Bauer R, Clowry GJ, Kaiser M. Creative Destruction: A Basic Computational Model of Cortical Layer Formation. Cereb Cortex 2021; 31:3237-3253. PMID: 33625496; PMCID: PMC8196252; DOI: 10.1093/cercor/bhab003.
Abstract
One of the most characteristic properties of many vertebrate neural systems is the layered organization of different cell types. This cytoarchitecture exists in the cortex, the retina, the hippocampus, and many other parts of the central nervous system. The developmental mechanisms of neural layer formation have been subject to substantial experimental efforts. Here, we provide a general computational model for cortical layer formation in 3D physical space. We show that this multiscale, agent-based model, comprising two distinct stages of apoptosis, can account for the wide range of neuronal numbers encountered in different cortical areas and species. Our results demonstrate the phenotypic richness of a basic state diagram structure. Importantly, apoptosis allows for changing the thickness of one layer without automatically affecting other layers. Therefore, apoptosis increases the flexibility for evolutionary change in layer architecture. Notably, slightly changed gene regulatory dynamics recapitulate the characteristic properties observed in neurodevelopmental diseases. Overall, we propose a novel computational model using gene-type rules, exhibiting many characteristics of normal and pathological cortical development.
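A toy illustration of the abstract's central point that layer-specific apoptosis acts as an independent knob on layer thickness: below, progenitors are allocated to layers and each layer then loses cells with its own death probability, so raising apoptosis in one layer thins it without touching the others. The single apoptosis stage, layer fractions, and probabilities here are invented for illustration and are far simpler than the paper's two-stage, gene-type rules.

```python
# Hedged sketch: layer-specific apoptosis as an independent "knob" on layer
# thickness. Progenitors are assigned to layers, then each layer loses cells
# with its own apoptosis probability. All numbers here are illustrative.
import numpy as np

rng = np.random.default_rng(2)

def grow_cortex(n_progenitors, layer_fractions, apoptosis_p):
    """Return surviving neuron counts per layer.

    layer_fractions: fraction of progenitors fated to each layer.
    apoptosis_p:     per-layer probability that a neuron dies.
    """
    counts = {}
    for layer, frac in layer_fractions.items():
        born = int(n_progenitors * frac)
        survived = rng.binomial(born, 1.0 - apoptosis_p[layer])
        counts[layer] = survived
    return counts

layers = {"L2/3": 0.35, "L4": 0.25, "L5": 0.25, "L6": 0.15}

baseline = grow_cortex(100_000, layers, {"L2/3": 0.3, "L4": 0.3, "L5": 0.3, "L6": 0.3})
# Raising apoptosis only in L4 thins that layer while leaving the others intact.
thin_l4  = grow_cortex(100_000, layers, {"L2/3": 0.3, "L4": 0.7, "L5": 0.3, "L6": 0.3})

print("baseline:", baseline)
print("thin L4: ", thin_l4)
```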
Affiliation(s)
- Roman Bauer: Department of Computer Science, University of Surrey, Guildford GU2 7XH, UK
- Gavin J Clowry: Biosciences Institute, Newcastle University, Newcastle upon Tyne NE2 4HH, UK
- Marcus Kaiser: School of Computing, Newcastle University, Newcastle upon Tyne NE4 5TG, UK; Precision Imaging Beacon, School of Medicine, University of Nottingham, Nottingham NG7 2UH, UK; Rui Jin Hospital, Shanghai Jiao Tong University, Shanghai 200025, China
5. Rentzeperis I, van Leeuwen C. Adaptive Rewiring in Weighted Networks Shows Specificity, Robustness, and Flexibility. Front Syst Neurosci 2021; 15:580569. PMID: 33737871; PMCID: PMC7960922; DOI: 10.3389/fnsys.2021.580569.
Abstract
Brain network connections rewire adaptively in response to neural activity. Adaptive rewiring may be understood as a process which, at its every step, is aimed at optimizing the efficiency of signal diffusion. In evolving model networks, this amounts to creating shortcut connections in regions with high diffusion and pruning where diffusion is low. Adaptive rewiring leads over time to topologies akin to brain anatomy: small worlds with rich club and modular or centralized structures. We continue our investigation of adaptive rewiring by focusing on three desiderata: specificity of evolving model network architectures, robustness of dynamically maintained architectures, and flexibility of network evolution to stochastically deviate from specificity and robustness. Our adaptive rewiring model simulations show that specificity and robustness characterize alternative modes of network operation, controlled by a single parameter, the rewiring interval. Small control parameter shifts across a critical transition zone allow switching between the two modes. Adaptive rewiring exhibits greater flexibility for skewed, lognormal connection weight distributions than for normally distributed ones. The results qualify adaptive rewiring as a key principle of self-organized complexity in network architectures, in particular of those that characterize the variety of functional architectures in the brain.
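A generic form of the rewiring step described above can be sketched as follows: measure diffusion between all node pairs with a graph-Laplacian heat kernel, add a shortcut where diffusion between unconnected nodes is strongest, and prune the existing edge where it is weakest. The heat-kernel time constant and graph size below are illustrative assumptions, and the sketch omits the rewiring-interval control parameter that the paper studies.

```python
# Hedged sketch: one diffusion-based adaptive rewiring pass on a weighted graph.
# Diffusion is measured with a heat kernel exp(-tau * L); where diffusion to an
# unconnected node is high we add a shortcut, and we prune the existing edge
# with the lowest diffusion. Parameter choices are illustrative assumptions.
import numpy as np
import networkx as nx
from scipy.linalg import expm

rng = np.random.default_rng(3)

def rewire_step(G, tau=1.0):
    nodes = list(G.nodes)
    L = nx.laplacian_matrix(G, nodelist=nodes, weight="weight").toarray()
    H = expm(-tau * L)                             # heat kernel: pairwise diffusion
    i = rng.choice(len(nodes))

    non_nb = [j for j in range(len(nodes))
              if j != i and not G.has_edge(nodes[i], nodes[j])]
    nb = [j for j in range(len(nodes)) if G.has_edge(nodes[i], nodes[j])]
    if not non_nb or len(nb) < 2:
        return
    add = max(non_nb, key=lambda j: H[i, j])       # strongest unused diffusion path
    drop = min(nb, key=lambda j: H[i, j])          # weakest existing connection
    w = G[nodes[i]][nodes[drop]]["weight"]
    G.remove_edge(nodes[i], nodes[drop])
    G.add_edge(nodes[i], nodes[add], weight=w)     # keep total edge count fixed

G = nx.gnm_random_graph(60, 300, seed=3)
nx.set_edge_attributes(G, 1.0, "weight")
for _ in range(500):
    rewire_step(G)
print("average clustering after rewiring:", round(nx.average_clustering(G), 3))
```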
Affiliation(s)
- Cees van Leeuwen: Brain and Cognition Research Unit, KU Leuven, Leuven, Belgium; Department of Cognitive and Developmental Psychology, University of Technology Kaiserslautern, Kaiserslautern, Germany
6. Wright JJ, Bourke PD. The growth of cognition: Free energy minimization and the embryogenesis of cortical computation. Phys Life Rev 2020; 36:83-99. PMID: 32527680; DOI: 10.1016/j.plrev.2020.05.004.
Abstract
The assumption that during cortical embryogenesis neurons and synaptic connections are selected to form an ensemble maximising synchronous oscillation explains mesoscopic cortical development, and a mechanism for cortical information processing is implied by consistency with the Free Energy Principle and Dynamic Logic. A heteroclinic network emerges, with stable and unstable fixed points of oscillation corresponding to activity in symmetrically connected, versus asymmetrically connected, sets of neurons. Simulations of growth explain a wide range of anatomical observations for columnar and non-columnar cortex, superficial patch connections, and the organization and dynamic interactions of neurone response properties. An antenatal scaffold is created, upon which postnatal learning can establish continuously ordered neuronal representations, permitting matching of co-synchronous fields in multiple cortical areas to solve optimization problems as in Dynamic Logic. Fast synaptic competition partitions equilibria, minimizing "the curse of dimensionality", while perturbations between imperfectly partitioned synchronous fields, under internal reinforcement, enable the cortex to become adaptively self-directed. As learning progresses variational free energy is minimized and entropy bounded.
Affiliation(s)
- J J Wright: Centre for Brain Research and Department of Psychological Medicine, School of Medicine, University of Auckland, Auckland, New Zealand
- P D Bourke: School of Social Sciences, Faculty of Arts, Business, Law and Education, University of Western Australia, Perth, Australia
7. Rentzeperis I, van Leeuwen C. Adaptive rewiring evolves brain-like structure in weighted networks. Sci Rep 2020; 10:6075. PMID: 32269235; PMCID: PMC7142112; DOI: 10.1038/s41598-020-62204-7.
Abstract
Activity-dependent plasticity refers to a range of mechanisms for adaptively reshaping neuronal connections. We model their common principle in terms of adaptive rewiring of network connectivity, while representing neural activity by diffusion on the network: Where diffusion is intensive, shortcut connections are established, while underused connections are pruned. In binary networks, this process is known to steer initially random networks robustly to high levels of structural complexity, reflecting the global characteristics of brain anatomy: modular or centralized small world topologies. We investigate whether this result extends to more realistic, weighted networks. Both normally- and lognormally-distributed weighted networks evolve either modular or centralized topologies. Which of these prevails depends on a single control parameter, representing global homeostatic or normalizing regulation mechanisms. Intermediate control parameter values exhibit the greatest levels of network complexity, incorporating both modular and centralized tendencies. The simulation results allow us to propose diffusion based adaptive rewiring as a parsimonious model for activity-dependent reshaping of brain connectivity structure.
Affiliation(s)
- Cees van Leeuwen: KU Leuven, Leuven, Belgium; University of Technology Kaiserslautern, Kaiserslautern, Germany
8. Kassraian-Fard P, Pfeiffer M, Bauer R. A generative growth model for thalamocortical axonal branching in primary visual cortex. PLoS Comput Biol 2020; 16:e1007315. PMID: 32053598; PMCID: PMC7018004; DOI: 10.1371/journal.pcbi.1007315.
Abstract
Axonal morphology displays large variability and complexity, yet the canonical regularities of the cortex suggest that such wiring is based on the repeated initiation of a small set of genetically encoded rules. Extracting underlying developmental principles can hence shed light on what genetically encoded instructions must be available during cortical development. Within a generative model, we investigate growth rules for axonal branching patterns in cat area 17, originating from the lateral geniculate nucleus of the thalamus. This target area of synaptic connections is characterized by extensive ramifications and a high bouton density, characteristics thought to preserve the spatial resolution of receptive fields and to enable connections for the ocular dominance columns. We compare individual and global statistics, such as a newly introduced length-weighted asymmetry index and the global segment-length distribution, of generated and biological branching patterns as the benchmark for growth rules. We show that the proposed model surpasses the statistical accuracy of the Galton-Watson model, which is the most commonly employed model for biological growth processes. In contrast to the Galton-Watson model, our model can recreate the log-normal segment-length distribution of the experimental dataset and is considerably more accurate in recreating individual axonal morphologies. To provide a biophysical interpretation for statistical quantifications of the axonal branching patterns, the generative model is ported into the physically accurate simulation framework of Cx3D. In this 3D simulation environment we demonstrate how the proposed growth process can be formulated as an interactive process between genetic growth rules and chemical cues in the local environment.
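For orientation, the Galton-Watson baseline the authors compare against can be written in a few lines: each segment either terminates or bifurcates with a fixed probability, independently of its history. The sketch below adds log-normally distributed segment lengths so that population-level segment-length statistics can be collected; the branching probability and length parameters are illustrative and not fitted to the cat LGN arbors.

```python
# Hedged sketch: a Galton-Watson branching tree with log-normal segment
# lengths, the kind of baseline the paper compares its generative model to.
# Branching probability and length parameters are illustrative only.
import numpy as np

rng = np.random.default_rng(4)

def galton_watson_tree(p_branch=0.45, max_depth=12):
    """Each segment terminates or splits into two daughters with prob p_branch.
    Returns the list of segment lengths in one simulated axonal arbor."""
    lengths = []
    stack = [0]                                   # depths of segments to grow
    while stack:
        depth = stack.pop()
        lengths.append(rng.lognormal(mean=2.0, sigma=0.8))  # segment length (um)
        if depth < max_depth and rng.random() < p_branch:
            stack.extend([depth + 1, depth + 1])  # bifurcation: two daughters
    return lengths

# Population statistics over many simulated arbors, since the paper compares
# global segment-length distributions rather than single trees.
all_lengths = np.concatenate([galton_watson_tree() for _ in range(200)])
print(f"{all_lengths.size} segments, "
      f"median length {np.median(all_lengths):.1f}, mean {all_lengths.mean():.1f}")
```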
Affiliation(s)
- Pegah Kassraian-Fard: Institute of Neuroinformatics, University and ETH Zurich, Zurich, Switzerland; Division of Biology and Biological Engineering, California Institute of Technology, Pasadena, California, United States of America
- Michael Pfeiffer: Institute of Neuroinformatics, University and ETH Zurich, Zurich, Switzerland
- Roman Bauer: Interdisciplinary Computing and Complex BioSystems Research Group (ICOS), School of Computing Science, Newcastle University, Newcastle upon Tyne, United Kingdom; Biosciences Institute, Newcastle University, Newcastle upon Tyne, United Kingdom
9. Plebe A. The search of "canonical" explanations for the cerebral cortex. History and Philosophy of the Life Sciences 2018; 40:40. PMID: 29905901; DOI: 10.1007/s40656-018-0205-2.
Abstract
This paper addresses a fundamental line of research in neuroscience: the identification of a putative neural processing core of the cerebral cortex, often claimed to be "canonical". This "canonical" core would be shared by the entire cortex, and would explain why it is so powerful and diversified in tasks and functions, yet so uniform in architecture. The purpose of this paper is to analyze the search for canonical explanations over the past 40 years, discussing the theoretical frameworks informing this research. It will highlight a bias that, in my opinion, has limited the success of this research project, that of overlooking the dimension of cortical development. The earliest explanation of the cerebral cortex as canonical was attempted by David Marr, deriving putative cortical circuits from general mathematical laws, loosely following a deductive-nomological account. Although Marr's theory turned out to be incorrect, one of its merits was to have put the issue of cortical circuit development at the top of his agenda. This aspect has been largely neglected in much of the research on canonical models that has followed. Models proposed in the 1980s were conceived as mechanistic. They identified a small number of components that interacted as a basic circuit, with each component defined as a function. More recent models have been presented as idealized canonical computations, distinct from mechanistic explanations, due to the lack of identifiable cortical components. Currently, the entire enterprise of coming up with a single canonical explanation has been criticized as being misguided, and the premise of the uniformity of the cortex has been strongly challenged. This debate is analyzed here. The legacy of the canonical circuit concept is reflected in both positive and negative ways in recent large-scale brain projects, such as the Human Brain Project. One positive aspect is that these projects might achieve the aim of producing detailed simulations of cortical electrical activity, a negative one regards whether they will be able to find ways of simulating how circuits actually develop.
Affiliation(s)
- Alessio Plebe: Department of Cognitive Science, Università degli Studi di Messina, v. Concezione 8, Messina, Italy
10. Kaiser M. Mechanisms of Connectome Development. Trends Cogn Sci 2017; 21:703-717. PMID: 28610804; DOI: 10.1016/j.tics.2017.05.010.
Abstract
At the centenary of D'Arcy Thompson's seminal work 'On Growth and Form', pioneering the description of principles of morphological changes during development and evolution, recent experimental advances allow us to study change in anatomical brain networks. Here, we outline potential principles for connectome development. We will describe recent results on how spatial and temporal factors shape connectome development in health and disease. Understanding the developmental origins of brain diseases in individuals will be crucial for deciding on personalized treatment options. We argue that longitudinal studies, experimentally derived parameters for connection formation, and biologically realistic computational models are needed to better understand the link between brain network development, network structure, and network function.
Affiliation(s)
- Marcus Kaiser: ICOS Research Group, School of Computing Science, Newcastle University, Newcastle upon Tyne, UK; Institute of Neuroscience, Newcastle University, Newcastle upon Tyne, UK
11. Bauer R, Kaiser M. Nonlinear growth: an origin of hub organization in complex networks. Royal Society Open Science 2017; 4:160691. PMID: 28405356; PMCID: PMC5383813; DOI: 10.1098/rsos.160691.
Abstract
Many real-world networks contain highly connected nodes called hubs. Hubs are often crucial for network function and spreading dynamics. However, classical models of how hubs originate during network development unrealistically assume that new nodes attain information about the connectivity (for example the degree) of existing nodes. Here, we introduce hub formation through nonlinear growth where the number of nodes generated at each stage increases over time and new nodes form connections independent of target node features. Our model reproduces variation in number of connections, hub occurrence time, and rich-club organization of networks ranging from protein-protein, neuronal and fibre tract brain networks to airline networks. Moreover, nonlinear growth gives a more generic representation of these networks compared with previous preferential attachment or duplication-divergence models. Overall, hub creation through nonlinear network expansion can serve as a benchmark model for studying the development of many real-world networks.
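The growth rule itself is simple to state: at each stage a growing number of new nodes is added, and each new node wires to a fixed number of existing nodes chosen uniformly at random, with no access to their degree. A minimal sketch is below; the geometric stage-size schedule and the attachment count m are illustrative choices, not the parameters fitted in the paper.

```python
# Hedged sketch: nonlinear growth. The number of nodes added per stage grows
# over time, and each new node connects to existing nodes chosen uniformly at
# random (no knowledge of their degree). Early nodes still end up as hubs.
# Growth schedule and m are illustrative choices, not the paper's fits.
import numpy as np
import networkx as nx

rng = np.random.default_rng(5)

def nonlinear_growth(stages=10, growth=1.0, m=3):
    G = nx.complete_graph(m + 1)                  # small seed network
    for t in range(stages):
        n_new = int(np.ceil((1 + growth) ** t))   # stage size grows nonlinearly
        for _ in range(n_new):
            existing = list(G.nodes)
            new = G.number_of_nodes()
            # attach to m distinct existing nodes, ignoring their degrees
            targets = rng.choice(existing, size=min(m, len(existing)), replace=False)
            G.add_node(new)
            G.add_edges_from((new, int(t_)) for t_ in targets)
    return G

G = nonlinear_growth()
degrees = sorted((d for _, d in G.degree()), reverse=True)
print(f"{G.number_of_nodes()} nodes; top-5 degrees (hubs): {degrees[:5]}")
```

Because early nodes are available as targets throughout growth while late nodes arrive when the network is already large, the oldest nodes accumulate far more connections than the rest, which is the hub effect the abstract describes.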
Affiliation(s)
- Roman Bauer: Institute of Neuroscience, Newcastle University, Newcastle upon Tyne NE2 4HH, UK; Interdisciplinary Computing and Complex BioSystems Research Group (ICOS), School of Computing Science, Newcastle University, Newcastle upon Tyne NE1 7RU, UK
- Marcus Kaiser: Institute of Neuroscience, Newcastle University, Newcastle upon Tyne NE2 4HH, UK; Interdisciplinary Computing and Complex BioSystems Research Group (ICOS), School of Computing Science, Newcastle University, Newcastle upon Tyne NE1 7RU, UK
12. Bauer R, Breitwieser L, Di Meglio A, Johard L, Kaiser M, Manca M, Mazzara M, Rademakers F, Talanov M, Tchitchigin AD. The BioDynaMo Project. Advanced Research on Biologically Inspired Cognitive Architectures 2017. DOI: 10.4018/978-1-5225-1947-8.ch006.
Abstract
Computer simulations have become a very powerful tool for scientific research. Given the vast complexity that comes with many open scientific questions, a purely analytical or experimental approach is often not viable. For example, biological systems comprise an extremely complex organization and heterogeneous interactions across different spatial and temporal scales. In order to facilitate research on such problems, the BioDynaMo project aims at a general platform for computer simulations for biological research. Since scientific investigations require extensive computer resources, this platform should be executable on hybrid cloud computing systems, allowing for the efficient use of state-of-the-art computing technology. This chapter describes challenges during the early stages of the software development process. In particular, we describe issues regarding the implementation and the highly interdisciplinary as well as international nature of the collaboration. Moreover, we explain the methodologies, the approach, and the lessons learned by the team during these first stages.
Affiliation(s)
- Manuel Mazzara: Service Science and Engineering Lab, Innopolis University, Russia
13. Bauer R, Kaiser M. Organisational Principles of Connectomes: Changes During Evolution and Development. Diversity and Commonality in Animals 2017. DOI: 10.1007/978-4-431-56469-0_17.
14. Sergi PN, Cavalcanti-Adam EA. Biomaterials and computation: a strategic alliance to investigate emergent responses of neural cells. Biomater Sci 2017; 5:648-657. DOI: 10.1039/c6bm00871b.
Abstract
Synergistic use of biomaterials and computation makes it possible to identify and unravel neural cell responses.
Affiliation(s)
- Pier Nicola Sergi: The Biorobotics Institute, Sant'Anna Scuola Universitaria Superiore, Pontedera, 56025, Italy
- Elisabetta Ada Cavalcanti-Adam: Max Planck Institute for Medical Research, Dept Cellular Biophysics, and Heidelberg University, Dept Biophysical Chemistry, Heidelberg, Germany
15. Wright JJ, Bourke PD. Further Work on the Shaping of Cortical Development and Function by Synchrony and Metabolic Competition. Front Comput Neurosci 2016; 10:127. PMID: 28018202; PMCID: PMC5145869; DOI: 10.3389/fncom.2016.00127.
Abstract
This paper furthers our attempts to resolve two major controversies: whether gamma synchrony plays a role in cognition, and whether cortical columns are functionally important. We have previously argued that the configuration of cortical cells that emerges in development is that which maximizes the magnitude of synchronous oscillation and minimizes metabolic cost. Here we analyze the separate effects in development of minimization of axonal lengths, and of early Hebbian learning and selective distribution of resources to growing synapses, by showing in simulations that these effects are partially antagonistic, but their interaction during development produces accurate anatomical and functional properties for both columnar and non-columnar cortex. The resulting embryonic anatomical order can provide a cortex-wide scaffold for postnatal learning that is dimensionally consistent with the representation of moving sensory objects, and, as learning progressively overwrites the embryonic order, further associations also occur in a dimensionally consistent framework. The role ascribed to cortical synchrony does not demand specific frequency, amplitude or phase variation of pulses to mediate "feature linking." Instead, the concerted interactions of pulse synchrony with short-term synaptic dynamics and synaptic resource competition can further explain cortical information processing in analogy to Hopfield networks and quantum computation.
Affiliation(s)
- James J. Wright: Department of Psychological Medicine, School of Medicine, The University of Auckland, Auckland, New Zealand
- Paul D. Bourke: EPICentre, The University of New South Wales, Sydney, Australia
16. Gafarov FM, Gafarova VR. The effect of the neural activity on topological properties of growing neural networks. J Integr Neurosci 2016; 15:305-319. PMID: 27507003; DOI: 10.1142/s0219635216500187.
Abstract
The connectivity structure of cortical networks defines how information is transmitted and processed; it is a source of the complex spatiotemporal patterns of network development, and connections are created and deleted continuously throughout the organism's life. In this paper, we study how neural activity influences the growth process in neural networks. Using a two-dimensional activity-dependent growth model, we demonstrate the growth of a neural network from disconnected neurons to a fully connected network. To quantify the influence of network activity on topological properties, we compare the model with a random growth network that does not depend on network activity. Analysing the connection structure with methods from random graph theory, we show that growth in neural networks results in the formation of a well-known "small-world" network.
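The small-world analysis mentioned here typically boils down to comparing clustering and characteristic path length against a size-matched random graph. A minimal sketch of that diagnostic is below, with a Watts-Strogatz graph merely standing in for a grown activity-dependent network; the graph and parameters are illustrative, not the authors' model.

```python
# Hedged sketch: the standard small-world diagnostic used in this kind of
# analysis: compare clustering and characteristic path length of a network
# against an Erdos-Renyi random graph with the same number of nodes and edges.
# The Watts-Strogatz graph below merely stands in for a grown network.
import networkx as nx

def small_world_index(G, seed=0):
    n, m = G.number_of_nodes(), G.number_of_edges()
    R = nx.gnm_random_graph(n, m, seed=seed)          # size-matched random graph
    if not nx.is_connected(R):
        R = R.subgraph(max(nx.connected_components(R), key=len)).copy()
    C, C_rand = nx.average_clustering(G), nx.average_clustering(R)
    L, L_rand = (nx.average_shortest_path_length(G),
                 nx.average_shortest_path_length(R))
    # Small-world regime: clustering well above random, path length comparable.
    return (C / C_rand) / (L / L_rand)

grown = nx.connected_watts_strogatz_graph(200, k=8, p=0.1, seed=1)  # stand-in net
print(f"small-world index ~ {small_world_index(grown):.2f} (>1 suggests small-world)")
```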
Affiliation(s)
- F M Gafarov: Institute of Computational Mathematics and Information Technologies, Laboratory of Neurobiology, Kazan Federal University, Kremlevskaya 35, Kazan, 420008, Russia
- V R Gafarova: Institute of Philology and Intercultural Communication, Kazan Federal University, Kremlevskaya 35, Kazan, 420008, Russia
17. Sweeney Y, Clopath C. Emergent spatial synaptic structure from diffusive plasticity. Eur J Neurosci 2016; 45:1057-1067. DOI: 10.1111/ejn.13279.
Affiliation(s)
- Yann Sweeney: Department of Bioengineering, Imperial College London, South Kensington Campus, London SW7 2AZ, UK
- Claudia Clopath: Department of Bioengineering, Imperial College London, South Kensington Campus, London SW7 2AZ, UK