1. Gebicke-Haerter PJ. The computational power of the human brain. Front Cell Neurosci 2023; 17:1220030. PMID: 37608987; PMCID: PMC10441807; DOI: 10.3389/fncel.2023.1220030.
Abstract
At the end of the 20th century, analog systems in computer science were widely replaced by digital systems because of their greater computing power. Nevertheless, the question remains intriguing: is the brain analog or digital? Initially, the latter was favored, the brain being viewed as a Turing machine that works like a digital computer. More recently, however, digital and analog processes have been combined to implement human behavior in robots, endowing them with artificial intelligence (AI). We therefore think it is timely to compare mathematical models with the biology of computation in the brain. To this end, digital and analog processes clearly identified in cellular and molecular interactions in the central nervous system are highlighted. Beyond that, we try to pinpoint the features that distinguish in silico computation from biological computation. First, genuinely analog information processing has been observed at electrical synapses and through gap junctions, the latter in both neurons and astrocytes. Apparently opposed to that, neuronal action potentials (APs, or spikes) are clearly digital events, like the yes/no or 1/0 of a Turing machine. However, spikes are rarely uniform: they vary in amplitude and width, which has significant, differential effects on transmitter release at the presynaptic terminal, even though quantal (vesicular) release itself is digital. Conversely, at the dendrites of the postsynaptic neuron, numerous computational events are analog. Moreover, synaptic transmission of information is not only neuronal but heavily influenced by astrocytes, which tightly ensheathe the majority of synapses in the brain (the tripartite synapse). At this point, LTP and LTD, which modify synaptic plasticity and are believed to underlie short- and long-term memory processes including consolidation (roughly equivalent to RAM and ROM in electronic devices), have to be discussed.
The present knowledge of how the brain stores and retrieves memories includes a variety of options (e.g., neuronal network oscillations, engram cells, the astrocytic syncytium). Epigenetic features also play crucial roles in memory formation and its consolidation, which necessarily leads to molecular events such as gene transcription and translation. In conclusion, brain computation is not only digital or analog, or a combination of both, but encompasses features operating in parallel and at higher orders of complexity.
Affiliation(s)
- Peter J. Gebicke-Haerter
- Institute of Psychopharmacology, Central Institute of Mental Health, Faculty of Medicine, University of Heidelberg, Mannheim, Germany
2. Kastellakis G, Tasciotti S, Pandi I, Poirazi P. The dendritic engram. Front Behav Neurosci 2023; 17:1212139. PMID: 37576932; PMCID: PMC10412934; DOI: 10.3389/fnbeh.2023.1212139.
Abstract
Accumulating evidence from a wide range of studies, including behavioral, cellular, molecular and computational findings, supports a key role for dendrites in the encoding and recall of new memories. Dendrites can integrate synaptic inputs in non-linear ways, provide the substrate for local protein synthesis, and facilitate the orchestration of signaling pathways that regulate local synaptic plasticity. These capabilities allow them to act as a second layer of computation within the neuron and to serve as the fundamental unit of plasticity. As such, dendrites are integral parts of the memory engram, namely the physical representation of memories in the brain, and are increasingly studied during learning tasks. Here, we review experimental and computational studies that support a novel, dendritic view of the memory engram centered on non-linear dendritic branches as elementary memory units. We highlight the potential implications of dendritic engrams for the learning and memory field and discuss future research directions.
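The "second layer of computation" described in this abstract is often formalized as a two-layer model, in which each dendritic branch applies its own nonlinearity before the soma sums the branch outputs. A minimal sketch of that idea follows; all function names, thresholds and gains are illustrative, not taken from the paper:

```python
import math

def branch_nonlinearity(x, threshold=1.0, gain=4.0):
    """Sigmoidal branch activation: subthreshold input sums are
    suppressed, suprathreshold sums are amplified (toy parameters)."""
    return 1.0 / (1.0 + math.exp(-gain * (x - threshold)))

def two_layer_neuron(branch_inputs, soma_threshold=1.0):
    """Layer 1: each branch integrates its own synapses through a
    nonlinearity. Layer 2: the soma sums the branch outputs and
    thresholds the result (1 = spike, 0 = silence)."""
    branch_outputs = [branch_nonlinearity(sum(syns)) for syns in branch_inputs]
    return 1 if sum(branch_outputs) >= soma_threshold else 0

# The same total synaptic drive fires the cell when clustered on one
# branch but not when scattered across four branches.
clustered = [[0.5, 0.5, 0.5, 0.5], [], [], []]
scattered = [[0.5], [0.5], [0.5], [0.5]]
```

In this toy setting the branch, not the synapse, is the elementary unit whose nonlinear state decides the cell's output, which is the intuition behind treating branches as elementary memory units.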
Affiliation(s)
- George Kastellakis
- Institute of Molecular Biology and Biotechnology, Foundation for Research and Technology, Heraklion, Greece
- Simone Tasciotti
- Institute of Molecular Biology and Biotechnology, Foundation for Research and Technology, Heraklion, Greece
- Department of Biology, University of Crete, Heraklion, Greece
- Ioanna Pandi
- Institute of Molecular Biology and Biotechnology, Foundation for Research and Technology, Heraklion, Greece
- Department of Biology, University of Crete, Heraklion, Greece
- Panayiota Poirazi
- Institute of Molecular Biology and Biotechnology, Foundation for Research and Technology, Heraklion, Greece
3. Sáray S, Rössert CA, Appukuttan S, Migliore R, Vitale P, Lupascu CA, Bologna LL, Van Geit W, Romani A, Davison AP, Muller E, Freund TF, Káli S. HippoUnit: A software tool for the automated testing and systematic comparison of detailed models of hippocampal neurons based on electrophysiological data. PLoS Comput Biol 2021; 17:e1008114. PMID: 33513130; PMCID: PMC7875359; DOI: 10.1371/journal.pcbi.1008114.
Abstract
Anatomically and biophysically detailed data-driven neuronal models have become widely used tools for understanding and predicting the behavior and function of neurons. Due to the increasing availability of experimental data from anatomical and electrophysiological measurements as well as the growing number of computational and software tools that enable accurate neuronal modeling, there are now a large number of different models of many cell types available in the literature. These models were usually built to capture a few important or interesting properties of the given neuron type, and it is often unknown how they would behave outside their original context. In addition, there is currently no simple way of quantitatively comparing different models regarding how closely they match specific experimental observations. This limits the evaluation, re-use and further development of the existing models. Further, the development of new models could also be significantly facilitated by the ability to rapidly test the behavior of model candidates against the relevant collection of experimental data. We address these problems for the representative case of the CA1 pyramidal cell of the rat hippocampus by developing an open-source Python test suite, which makes it possible to automatically and systematically test multiple properties of models by making quantitative comparisons between the models and electrophysiological data. The tests cover various aspects of somatic behavior, and signal propagation and integration in apical dendrites. To demonstrate the utility of our approach, we applied our tests to compare the behavior of several different rat hippocampal CA1 pyramidal cell models from the ModelDB database against electrophysiological data available in the literature, and evaluated how well these models match experimental observations in different domains. 
We also show how we employed the test suite to aid the development of models within the European Human Brain Project (HBP), and describe the integration of the tests into the validation framework developed in the HBP, with the aim of facilitating more reproducible and transparent model building in the neuroscience community.
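Without reproducing HippoUnit's actual API, its core operation, scoring a model's feature value against an experimental mean and standard deviation, can be sketched as follows. The class names, feature names and numbers below are hypothetical illustrations, not HippoUnit's interface:

```python
from dataclasses import dataclass

@dataclass
class Observation:
    """Experimental target for one electrophysiological feature."""
    name: str
    mean: float
    std: float

def feature_error(model_value, obs):
    """Absolute z-score: how many experimental SDs the model's
    feature value lies from the experimental mean."""
    return abs(model_value - obs.mean) / obs.std

def validate(model_features, observations, tolerance=2.0):
    """Return per-feature errors and a verdict: the model 'passes'
    only if every feature lies within `tolerance` SDs of the data."""
    errors = {o.name: feature_error(model_features[o.name], o)
              for o in observations}
    return errors, all(e <= tolerance for e in errors.values())

# Hypothetical targets and model values for a CA1 pyramidal cell.
obs = [Observation("input_resistance_MOhm", 100.0, 20.0),
       Observation("spike_amplitude_mV", 95.0, 10.0)]
model = {"input_resistance_MOhm": 120.0, "spike_amplitude_mV": 70.0}
errors, ok = validate(model, obs)
```

In the actual test suite, feature errors of this kind are computed for many somatic and dendritic properties and aggregated per test, so that models can be compared domain by domain rather than given a single verdict.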
Affiliation(s)
- Sára Sáray
- Faculty of Information Technology and Bionics, Pázmány Péter Catholic University, Budapest, Hungary
- Institute of Experimental Medicine, Budapest, Hungary
- * E-mail: (SS); (SK)
- Christian A. Rössert
- Blue Brain Project, École Polytechnique Fédérale de Lausanne, Geneva, Switzerland
- Shailesh Appukuttan
- Paris-Saclay Institute of Neuroscience, Centre National de la Recherche Scientifique/Université Paris-Saclay, Gif-sur-Yvette, France
- Rosanna Migliore
- Institute of Biophysics, National Research Council, Palermo, Italy
- Paola Vitale
- Institute of Biophysics, National Research Council, Palermo, Italy
- Luca L. Bologna
- Institute of Biophysics, National Research Council, Palermo, Italy
- Werner Van Geit
- Blue Brain Project, École Polytechnique Fédérale de Lausanne, Geneva, Switzerland
- Armando Romani
- Blue Brain Project, École Polytechnique Fédérale de Lausanne, Geneva, Switzerland
- Andrew P. Davison
- Paris-Saclay Institute of Neuroscience, Centre National de la Recherche Scientifique/Université Paris-Saclay, Gif-sur-Yvette, France
- Eilif Muller
- Blue Brain Project, École Polytechnique Fédérale de Lausanne, Geneva, Switzerland
- Department of Neurosciences, Faculty of Medicine, University of Montreal, Montreal, Canada
- CHU Sainte-Justine Research Center, Montreal, Canada
- Quebec Artificial Intelligence Institute (Mila), Montreal, Canada
- Tamás F. Freund
- Faculty of Information Technology and Bionics, Pázmány Péter Catholic University, Budapest, Hungary
- Institute of Experimental Medicine, Budapest, Hungary
- Szabolcs Káli
- Faculty of Information Technology and Bionics, Pázmány Péter Catholic University, Budapest, Hungary
- Institute of Experimental Medicine, Budapest, Hungary
- * E-mail: (SS); (SK)
4. Risi N, Aimar A, Donati E, Solinas S, Indiveri G. A Spike-Based Neuromorphic Architecture of Stereo Vision. Front Neurorobot 2020; 14:568283. PMID: 33304262; PMCID: PMC7693562; DOI: 10.3389/fnbot.2020.568283.
Abstract
The problem of finding stereo correspondences in binocular vision is solved effortlessly in nature, yet it remains a critical bottleneck for artificial machine-vision systems. As temporal information is a crucial feature in this process, the advent of event-based vision sensors and dedicated event-based processors promises an effective approach to the stereo-matching problem. Indeed, event-based neuromorphic hardware provides an optimal substrate for fast, asynchronous computation that can make explicit use of precise temporal coincidences. However, although several biologically inspired solutions have already been proposed, the performance benefits of combining event-based sensing with asynchronous and parallel computation are yet to be explored. Here we present a hardware spike-based stereo-vision system that leverages the advantages of brain-inspired neuromorphic computing by interfacing two event-based vision sensors to an event-based mixed-signal analog/digital neuromorphic processor. We describe a prototype interface designed to enable the emulation of a stereo-vision system on neuromorphic hardware, and we quantify the stereo-matching performance on two datasets. Our results provide a path toward the realization of low-latency, end-to-end event-based neuromorphic architectures for stereo vision.
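The computational primitive at the heart of this approach is temporal coincidence: events from the two sensors that occur nearly simultaneously are candidate stereo matches. A toy software stand-in for the on-chip spiking coincidence detectors follows; the event format, window size and greedy pairing are all illustrative assumptions, not the system's actual design:

```python
def match_stereo_events(left, right, window_us=500):
    """Greedy temporal coincidence matching: each left event
    (timestamp_us, x) is paired with the first unused right event
    whose timestamp falls within `window_us`; the x-shift of a pair
    is its disparity."""
    used = set()
    disparities = []
    for t_l, x_l in left:
        for i, (t_r, x_r) in enumerate(right):
            if i in used:
                continue
            if abs(t_l - t_r) <= window_us:
                used.add(i)
                disparities.append(x_l - x_r)
                break
    return disparities

# An edge seen at x=30 (left) and x=20 (right) almost simultaneously
# yields disparity 10; the late right event finds no partner.
left_events = [(1000, 30), (5000, 31)]
right_events = [(1200, 20), (9000, 21)]
```

Real systems additionally enforce geometric constraints (e.g. epipolar lines) and resolve ambiguity across many detectors in parallel, which is precisely what the asynchronous hardware is suited for.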
Affiliation(s)
- Nicoletta Risi
- Institute of Neuroinformatics, University of Zurich, Eidgenössische Technische Hochschule Zurich, Zurich, Switzerland
- Alessandro Aimar
- Institute of Neuroinformatics, University of Zurich, Eidgenössische Technische Hochschule Zurich, Zurich, Switzerland
- Elisa Donati
- Institute of Neuroinformatics, University of Zurich, Eidgenössische Technische Hochschule Zurich, Zurich, Switzerland
- Sergio Solinas
- Institute of Neuroinformatics, University of Zurich, Eidgenössische Technische Hochschule Zurich, Zurich, Switzerland
- Giacomo Indiveri
- Institute of Neuroinformatics, University of Zurich, Eidgenössische Technische Hochschule Zurich, Zurich, Switzerland
5. Poirazi P, Papoutsi A. Illuminating dendritic function with computational models. Nat Rev Neurosci 2020; 21:303-321. PMID: 32393820; DOI: 10.1038/s41583-020-0301-7.
Abstract
Dendrites have always fascinated researchers: from the artistic drawings of Ramón y Cajal to the beautiful recordings of today, neuroscientists have striven to unravel the mysteries of these structures. Theoretical work in the 1960s predicted important dendritic effects on neuronal processing, establishing computational modelling as a powerful technique for their investigation. Since then, modelling of dendrites has been instrumental in driving neuroscience research in a targeted manner, providing experimentally testable predictions that range from the subcellular to the systems level, and its relevance extends to fields beyond neuroscience, such as machine learning and artificial intelligence. Validation of modelling predictions often requires, and drives, new technological advances, thus closing the loop with theory-driven experimentation that moves the field forward. This Review features what we consider the most important contributions of modelling of dendritic computations, including those pending experimental verification, and highlights studies of successful interactions between the modelling and experimental neuroscience communities.
Affiliation(s)
- Panayiota Poirazi
- Institute of Molecular Biology & Biotechnology, Foundation for Research & Technology - Hellas, Heraklion, Crete, Greece
- Athanasia Papoutsi
- Institute of Molecular Biology & Biotechnology, Foundation for Research & Technology - Hellas, Heraklion, Crete, Greece
6. Kastellakis G, Poirazi P. Synaptic Clustering and Memory Formation. Front Mol Neurosci 2019; 12:300. PMID: 31866824; PMCID: PMC6908852; DOI: 10.3389/fnmol.2019.00300.
Abstract
In the study of memory engrams, synaptic memory allocation is a newly emerged theme that focuses on how specific synapses are engaged in the storage of a given memory. Accumulating evidence from imaging and molecular experiments indicates that the recruitment of synapses participating in the encoding and expression of memory is neither random nor uniform. A hallmark observation is the emergence of groups of synapses that share similar response and/or input properties and are located within a stretch of a dendritic branch. This grouping has been termed "synapse clustering" and has been shown to emerge in many different memory-related paradigms, as well as in in vitro studies. Clustering may arise from synapses receiving similar input, or via processes that allow cross-talk between nearby synapses within a dendritic branch, leading to cooperative plasticity. Clustered synapses can act in concert to maximally exploit the nonlinear integration potential of the dendritic branches in which they reside. Their main contribution is to facilitate the induction of dendritic spikes and dendritic plateau potentials, which provide advanced computational and memory-related capabilities to dendrites and single neurons. This review focuses on recent evidence investigating the role of synapse clustering in dendritic integration, sensory perception, learning and memory, as well as in brain dysfunction. We also discuss recent theoretical work exploring the computational advantages of synapse clustering, leading to novel and revised theories of memory. As an eminent phenomenon during memory allocation, synapse clustering both shapes memory engrams and is shaped by the parallel plasticity mechanisms upon which it relies.
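The "cross-talk between nearby synapses" invoked above can be caricatured by a rule in which potentiation requires co-active neighbors within a short stretch of branch. Everything below (the distances, counts and the rule itself) is an illustrative toy, not a model taken from the literature:

```python
def cooperative_potentiation(positions, active, radius=5.0, n_required=2):
    """Toy cross-talk rule: an active synapse is potentiated only if
    at least `n_required` co-active synapses (itself included) lie
    within `radius` micrometers on the same branch. Clustered activity
    cooperates; isolated activity does not."""
    potentiated = []
    for i, p in enumerate(positions):
        if not active[i]:
            potentiated.append(False)
            continue
        neighbors = sum(
            1 for j, q in enumerate(positions)
            if active[j] and abs(p - q) <= radius
        )
        potentiated.append(neighbors >= n_required)
    return potentiated

# Three co-active synapses clustered at 10-14 um cooperate; the
# co-active synapse isolated at 60 um fails the neighborhood test.
pos = [10.0, 12.0, 14.0, 60.0]
act = [True, True, True, True]
```

Rules of this flavor make clustering self-reinforcing: once a few related synapses land on the same branch, subsequent related inputs are preferentially stabilized there.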
Affiliation(s)
- Panayiota Poirazi
- Institute of Molecular Biology and Biotechnology (IMBB), Foundation for Research and Technology Hellas (FORTH), Heraklion, Greece
7. Synaptic plasticity in dendrites: complications and coping strategies. Curr Opin Neurobiol 2017; 43:177-186. PMID: 28453975; DOI: 10.1016/j.conb.2017.03.012.
Abstract
The elaborate morphology, nonlinear membrane mechanisms and spatiotemporally varying synaptic activation patterns of dendrites complicate the expression, compartmentalization and modulation of synaptic plasticity. To grapple with this complexity, we start with the observation that neurons in different brain areas face markedly different learning problems, and dendrites of different neuron types contribute to the cell's input-output function in markedly different ways. By committing to specific assumptions regarding a neuron's learning problem and its input-output function, specific inferences can be drawn regarding the synaptic plasticity mechanisms and outcomes that we 'ought' to expect for that neuron. Exploiting this assumption-driven approach can help both in interpreting existing experimental data and designing future experiments aimed at understanding the brain's myriad learning processes.
8. Singh MF, Zald DH. A simple transfer function for nonlinear dendritic integration. Front Comput Neurosci 2015; 9:98. PMID: 26321940; PMCID: PMC4530314; DOI: 10.3389/fncom.2015.00098.
Abstract
Relatively recent advances in patch-clamp recording and iontophoresis have enabled unprecedented study of neuronal post-synaptic integration ("dendritic integration"). Findings support a separate layer of integration in the dendritic branches before potentials reach the cell's soma. While integration between branches obeys previous linear assumptions, proximal inputs within a branch produce a threshold nonlinearity, which some authors have likened to the sigmoid function. Here we show the implausibility of a sigmoidal relation and present a more realistic transfer function, in both an elegant artificial form and a biophysically derived form that further considers input locations along the dendritic arbor. As the distance between input locations determines their ability to produce nonlinear interactions, models incorporating dendritic topology are essential to understanding the computational power afforded by these early stages of integration. We use the biophysical transfer function to emulate empirical data with biophysical parameters and describe the conditions under which the artificial and biophysically derived forms are equivalent.
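The paper's own transfer functions are not reproduced here, but its key point, that the nonlinear interaction between two inputs falls off with their separation along the dendrite, can be sketched with an illustrative pairwise model (the exponential decay and length constant are assumptions for illustration only):

```python
import math

def pairwise_interaction(w_i, w_j, distance_um, length_const_um=50.0):
    """Nonlinear cross-term between two inputs that decays with their
    separation along the arbor (illustrative exponential decay)."""
    return w_i * w_j * math.exp(-distance_um / length_const_um)

def branch_response(inputs):
    """Total response = linear sum + distance-weighted pairwise
    nonlinear terms, so only nearby inputs interact appreciably.
    `inputs` is a list of (weight, position_um) pairs."""
    linear = sum(w for w, _ in inputs)
    cross = sum(
        pairwise_interaction(inputs[i][0], inputs[j][0],
                             abs(inputs[i][1] - inputs[j][1]))
        for i in range(len(inputs)) for j in range(i + 1, len(inputs))
    )
    return linear + cross
```

Two co-located unit inputs yield 3.0 (supralinear), while the same pair separated by many length constants sums almost exactly linearly, which is why models that ignore dendritic topology miss these interactions.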
Affiliation(s)
- Matthew F Singh
- Department of Psychology, Vanderbilt University, Nashville, TN, USA; Department of Psychiatry, Vanderbilt University, Nashville, TN, USA
- David H Zald
- Department of Psychology, Vanderbilt University, Nashville, TN, USA; The Program in Neurosciences, Washington University School of Medicine in St. Louis, St. Louis, MO, USA
9. Tran-Van-Minh A, Cazé RD, Abrahamsson T, Cathala L, Gutkin BS, DiGregorio DA. Contribution of sublinear and supralinear dendritic integration to neuronal computations. Front Cell Neurosci 2015; 9:67. PMID: 25852470; PMCID: PMC4371705; DOI: 10.3389/fncel.2015.00067.
Abstract
Nonlinear dendritic integration is thought to increase the computational ability of neurons. Most studies focus on how supralinear summation of excitatory synaptic responses arising from clustered inputs within single dendrites results in the enhancement of neuronal firing, enabling simple computations such as feature detection. Recent reports have shown that sublinear summation is also a prominent dendritic operation, extending the range of subthreshold input-output (sI/O) transformations conferred by dendrites. Like supralinear operations, sublinear dendritic operations also increase the repertoire of neuronal computations, but feature extraction requires different synaptic connectivity strategies for each of these operations. In this article we review the experimental and theoretical findings describing the biophysical determinants of the three primary classes of dendritic operations: linear, sublinear, and supralinear. We then review a Boolean-algebra-based analysis of simplified neuron models, which provides insight into how dendritic operations influence neuronal computations. We highlight how neuronal computations critically depend on the interplay of dendritic properties (morphology and voltage-gated channel expression), spiking threshold and the distribution of synaptic inputs carrying particular sensory features. Finally, we describe how global (scattered) and local (clustered) integration strategies permit the implementation of similar classes of computations, one example being the object feature binding problem.
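One biophysical source of sublinear summation needs no active channels at all: co-located synaptic conductances reduce each other's driving force. A single-compartment steady-state sketch makes this concrete (the conductance and potential values are illustrative):

```python
def epsp_peak(g_syn_nS, g_leak_nS=10.0, e_syn_mV=0.0, e_rest_mV=-70.0):
    """Steady-state depolarization (mV from rest) of a passive
    compartment carrying total synaptic conductance g_syn. As g_syn
    grows, the membrane approaches the synaptic reversal potential
    and each extra unit of conductance adds less: sublinearity."""
    v = (g_leak_nS * e_rest_mV + g_syn_nS * e_syn_mV) / (g_leak_nS + g_syn_nS)
    return v - e_rest_mV

one = epsp_peak(2.0)   # a single input
two = epsp_peak(4.0)   # two co-located inputs: less than twice `one`
```

Supralinearity, by contrast, requires regenerative mechanisms (NMDA, Na+ or Ca2+ spikes) that outweigh this driving-force saturation, which is why the two operation classes rely on different biophysical determinants.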
Affiliation(s)
- Alexandra Tran-Van-Minh
- Unit of Dynamic Neuronal Imaging, Department of Neuroscience, CNRS UMR 3571, Institut Pasteur, Paris, France
- Romain D Cazé
- Group for Neural Theory, LNC INSERM U960, Institut d'Etude de la Cognition de l'Ecole normale supérieure, Ecole normale supérieure, Paris, France; Department of Bioengineering, Imperial College London, London, UK
- Therése Abrahamsson
- Unit of Dynamic Neuronal Imaging, Department of Neuroscience, CNRS UMR 3571, Institut Pasteur, Paris, France; Center for Research in Neuroscience, Department of Neurology and Neurosurgery, The Research Institute of the McGill University Health Centre, Montreal General Hospital, Montreal, QC, Canada
- Laurence Cathala
- Sorbonne Universités, UPMC Univ Paris 6, UMR 8256 B2A, Team Brain Development, Repair and Aging, Paris, France
- Boris S Gutkin
- Group for Neural Theory, LNC INSERM U960, Institut d'Etude de la Cognition de l'Ecole normale supérieure, Ecole normale supérieure, Paris, France; Federal Research University Higher School of Economics, Moscow, Russia
- David A DiGregorio
- Unit of Dynamic Neuronal Imaging, Department of Neuroscience, CNRS UMR 3571, Institut Pasteur, Paris, France
10. Kastellakis G, Cai DJ, Mednick SC, Silva AJ, Poirazi P. Synaptic clustering within dendrites: an emerging theory of memory formation. Prog Neurobiol 2015; 126:19-35. PMID: 25576663; PMCID: PMC4361279; DOI: 10.1016/j.pneurobio.2014.12.002.
Abstract
It is generally accepted that complex memories are stored in distributed representations throughout the brain; however, the mechanisms underlying these representations are not understood. Here, we review recent findings regarding the subcellular mechanisms implicated in memory formation, which provide evidence for a dendrite-centered theory of memory. Plasticity-related phenomena that affect synaptic properties, such as synaptic tagging and capture, synaptic clustering, branch-strength potentiation and spinogenesis, provide the foundation for a model of memory storage that relies heavily on processes operating at the dendrite level. The emerging picture suggests that clusters of functionally related synapses may serve as key computational and memory storage units in the brain. We discuss both experimental evidence and theoretical models that support this hypothesis and explore its advantages for neuronal function.
Affiliation(s)
- George Kastellakis
- Institute of Molecular Biology and Biotechnology (IMBB), Foundation for Research and Technology Hellas (FORTH), P.O. Box 1385, GR 70013 Heraklion, Greece
- Denise J Cai
- Departments of Neurobiology, Psychology, Psychiatry, Integrative Center for Learning and Memory and Brain Research Institute, UCLA, 2554 Gonda Center, Los Angeles, CA 90095, United States
- Sara C Mednick
- Department of Psychology, University of California, 900 University Avenue, Riverside, CA 92521, United States
- Alcino J Silva
- Departments of Neurobiology, Psychology, Psychiatry, Integrative Center for Learning and Memory and Brain Research Institute, UCLA, 2554 Gonda Center, Los Angeles, CA 90095, United States
- Panayiota Poirazi
- Institute of Molecular Biology and Biotechnology (IMBB), Foundation for Research and Technology Hellas (FORTH), P.O. Box 1385, GR 70013 Heraklion, Greece
11. Araya R. Input transformation by dendritic spines of pyramidal neurons. Front Neuroanat 2014; 8:141. PMID: 25520626; PMCID: PMC4251451; DOI: 10.3389/fnana.2014.00141.
Abstract
In the mammalian brain, most inputs received by a neuron are formed on the dendritic tree. In the neocortex, the dendrites of pyramidal neurons are covered by thousands of tiny protrusions known as dendritic spines, which are the major recipient sites for excitatory synaptic information in the brain. Their peculiar morphology, with a small head connected to the dendritic shaft by a slender neck, has inspired decades of theoretical and, more recently, experimental work in an attempt to understand how excitatory synaptic inputs are processed, stored and integrated in pyramidal neurons. Advances in electrophysiological, optical and genetic tools are now enabling us to unravel the biophysical and molecular mechanisms controlling spine function in health and disease. Here I highlight relevant findings, challenges and hypotheses on spine function, with an emphasis on the electrical properties of spines and on how these affect the storage and integration of excitatory synaptic inputs in pyramidal neurons. In an attempt to make sense of the published data, I propose that the raison d'être of dendritic spines lies in their ability to undergo activity-dependent structural and molecular changes that can modify synaptic strength, and hence alter the gain of the linearly integrated sub-threshold depolarizations in pyramidal neuron dendrites, before the generation of a dendritic spike.
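The electrical consequence of the slender neck can be caricatured with Ohm's law: synaptic current injected at the head must cross the neck resistance before reaching the dendrite, so the head is depolarized far more than the shaft. A passive steady-state sketch with illustrative values (actual spine neck resistances are debated, with estimates spanning a wide range):

```python
def spine_voltages(i_syn_nA, r_neck_MOhm, r_dend_MOhm=100.0):
    """Ohmic spine sketch: synaptic current flows through the neck
    resistance into the dendritic input resistance, so the head is
    depolarized by I*(R_neck + R_dend) while the shaft sees only
    I*R_dend. Units: nA x MOhm = mV. Passive, steady-state, toy."""
    v_dend = i_syn_nA * r_dend_MOhm
    v_head = i_syn_nA * (r_neck_MOhm + r_dend_MOhm)
    return v_head, v_dend

# 50 pA through a 500 MOhm neck: 30 mV locally, 5 mV at the shaft.
head, dend = spine_voltages(0.05, r_neck_MOhm=500.0)
```

This local amplification is one reason activity-dependent changes in neck geometry can act as a synaptic gain control, which connects to the proposal made in the abstract.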
Affiliation(s)
- Roberto Araya
- Department of Neurosciences, Faculty of Medicine, University of Montreal, Montreal, QC, Canada
12. DeBello WM, McBride TJ, Nichols GS, Pannoni KE, Sanculi D, Totten DJ. Input clustering and the microscale structure of local circuits. Front Neural Circuits 2014; 8:112. PMID: 25309336; PMCID: PMC4162353; DOI: 10.3389/fncir.2014.00112.
Abstract
The recent development of powerful tools for high-throughput mapping of synaptic networks promises major advances in understanding brain function. One open question is how circuits integrate and store information. Competing models based on random vs. structured connectivity make distinct predictions regarding the dendritic addressing of synaptic inputs. In this article we review recent experimental tests of one of these models, the input clustering hypothesis. Across circuits, brain regions and species, there is growing evidence of a link between synaptic co-activation and dendritic location, although this finding is not universal. The functional implications of input clustering and future challenges are discussed.
Affiliation(s)
- William M DeBello
- Center for Neuroscience, Department of Neurobiology, Physiology and Behavior, University of California-Davis, Davis, CA, USA
- Thomas J McBride
- Center for Neuroscience, Department of Neurobiology, Physiology and Behavior, University of California-Davis, Davis, CA, USA; PLOS Medicine, San Francisco, CA, USA
- Grant S Nichols
- Center for Neuroscience, Department of Neurobiology, Physiology and Behavior, University of California-Davis, Davis, CA, USA
- Katy E Pannoni
- Center for Neuroscience, Department of Neurobiology, Physiology and Behavior, University of California-Davis, Davis, CA, USA
- Daniel Sanculi
- Center for Neuroscience, Department of Neurobiology, Physiology and Behavior, University of California-Davis, Davis, CA, USA
- Douglas J Totten
- Center for Neuroscience, Department of Neurobiology, Physiology and Behavior, University of California-Davis, Davis, CA, USA
13. Farinella M, Ruedt DT, Gleeson P, Lanore F, Silver RA. Glutamate-bound NMDARs arising from in vivo-like network activity extend spatio-temporal integration in a L5 cortical pyramidal cell model. PLoS Comput Biol 2014; 10:e1003590. PMID: 24763087; PMCID: PMC3998913; DOI: 10.1371/journal.pcbi.1003590.
Abstract
In vivo, cortical pyramidal cells are bombarded by asynchronous synaptic input arising from ongoing network activity. However, little is known about how such 'background' synaptic input interacts with nonlinear dendritic mechanisms. We have modified an existing model of a layer 5 (L5) pyramidal cell to explore how dendritic integration in the apical dendritic tuft could be altered by the levels of network activity observed in vivo. Here we show that asynchronous background excitatory input increases neuronal gain and extends both temporal and spatial integration of stimulus-evoked synaptic input onto the dendritic tuft. Addition of fast and slow inhibitory synaptic conductances, with properties similar to those from dendritic targeting interneurons, that provided a 'balanced' background configuration, partially counteracted these effects, suggesting that inhibition can tune spatio-temporal integration in the tuft. Excitatory background input lowered the threshold for NMDA receptor-mediated dendritic spikes, extended their duration and increased the probability of additional regenerative events occurring in neighbouring branches. These effects were also observed in a passive model where all the non-synaptic voltage-gated conductances were removed. Our results show that glutamate-bound NMDA receptors arising from ongoing network activity can provide a powerful spatially distributed nonlinear dendritic conductance. This may enable L5 pyramidal cells to change their integrative properties as a function of local network activity, potentially allowing both clustered and spatially distributed synaptic inputs to be integrated over extended timescales.
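The voltage dependence that makes glutamate-bound NMDARs a nonlinear conductance comes from Mg2+ block, commonly fitted with the Jahr-Stevens expression. The sketch below shows how modest depolarization, such as that produced by background activity, relieves the block; the 1 mM Mg2+ and fit constants are the values typically used in models, quoted here for illustration:

```python
import math

def nmda_open_fraction(v_mV, mg_mM=1.0):
    """Fraction of glutamate-bound NMDA receptors relieved of Mg2+
    block at membrane potential v, using the widely used
    Jahr-Stevens fit: depolarization unblocks the channels,
    creating a voltage-dependent dendritic conductance."""
    return 1.0 / (1.0 + (mg_mM / 3.57) * math.exp(-0.062 * v_mV))

rest = nmda_open_fraction(-70.0)   # mostly blocked near rest
depol = nmda_open_fraction(-30.0)  # substantially unblocked
```

Because background synaptic input keeps the dendrite depolarized and keeps receptors glutamate-bound, the pool of unblockable conductance grows with network activity, which is the mechanism the abstract describes for extending spatio-temporal integration.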
Affiliation(s)
- Matteo Farinella
- Department of Neuroscience, Physiology and Pharmacology, University College London, London, United Kingdom
- Daniel T. Ruedt
- Department of Neuroscience, Physiology and Pharmacology, University College London, London, United Kingdom
- Padraig Gleeson
- Department of Neuroscience, Physiology and Pharmacology, University College London, London, United Kingdom
- Frederic Lanore
- Department of Neuroscience, Physiology and Pharmacology, University College London, London, United Kingdom
- R. Angus Silver
- Department of Neuroscience, Physiology and Pharmacology, University College London, London, United Kingdom
|
14
|
Hasler J, Marr B. Finding a roadmap to achieve large neuromorphic hardware systems. Front Neurosci 2013; 7:118. [PMID: 24058330 PMCID: PMC3767911 DOI: 10.3389/fnins.2013.00118] [Citation(s) in RCA: 87] [Impact Index Per Article: 7.9] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/04/2012] [Accepted: 06/20/2013] [Indexed: 11/13/2022] Open
Abstract
Neuromorphic systems are gaining importance in an era where CMOS digital computing techniques are reaching physical limits. These silicon systems mimic extremely energy-efficient neural computing structures, potentially both for solving engineering applications and for understanding neural computation. Toward this end, the authors provide a glimpse of what the technology evolution roadmap looks like for these systems, so that neuromorphic engineers may gain the same benefit of anticipation and foresight that IC designers gained from Moore's law many years ago. Scaling of energy efficiency, performance, and size is discussed, as well as how the implementation and application space of neuromorphic systems is expected to evolve over time.
Affiliation(s)
- Jennifer Hasler
- School of Electrical and Computer Engineering, Georgia Institute of Technology, Atlanta, GA, USA
|
15
|
Tripp BP, Orchard J. Population coding in sparsely connected networks of noisy neurons. Front Comput Neurosci 2012; 6:23. [PMID: 22586391 PMCID: PMC3345527 DOI: 10.3389/fncom.2012.00023] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.1] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/16/2012] [Accepted: 04/03/2012] [Indexed: 11/13/2022] Open
Abstract
This study examines the relationship between population coding and spatial connection statistics in networks of noisy neurons. Encoding of sensory information in the neocortex is thought to require coordinated neural populations, because individual cortical neurons respond to a wide range of stimuli, and exhibit highly variable spiking in response to repeated stimuli. Population coding is rooted in network structure, because cortical neurons receive information only from other neurons, and because the information they encode must be decoded by other neurons, if it is to affect behavior. However, population coding theory has often ignored network structure, or assumed discrete, fully connected populations (in contrast with the sparsely connected, continuous sheet of the cortex). In this study, we modeled a sheet of cortical neurons with sparse, primarily local connections, and found that a network with this structure could encode multiple internal state variables with high signal-to-noise ratio. However, we were unable to create high-fidelity networks by instantiating connections at random according to spatial connection probabilities. In our models, high-fidelity networks required additional structure, with higher cluster factors and correlations between the inputs to nearby neurons.
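The sparse, distance-dependent wiring described above (connections instantiated at random according to spatial connection probabilities) can be sketched as follows. This is an illustrative construction on a 1-D sheet with made-up parameter values, not the authors' network model:

```python
import math
import random

def local_random_connections(n, width, sigma=1.0, p_max=0.2, seed=0):
    """Instantiate directed connections at random, with a connection
    probability that decays with distance between neurons placed
    evenly on a 1-D sheet of length `width`."""
    rng = random.Random(seed)
    positions = [i * width / n for i in range(n)]
    edges = []
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            d = abs(positions[i] - positions[j])
            p = p_max * math.exp(-(d * d) / (2.0 * sigma * sigma))
            if rng.random() < p:
                edges.append((i, j))
    return edges

edges = local_random_connections(n=100, width=20.0)
print(f"{len(edges)} directed connections among 100 neurons")
```

The study's point is that this purely random instantiation was not sufficient for high-fidelity coding; extra structure (clustering and correlated inputs to nearby neurons) was needed on top of it.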
Affiliation(s)
- Bryan P Tripp
- Department of Systems Design Engineering, Centre for Theoretical Neuroscience, University of Waterloo, Waterloo, ON, Canada
|
16
|
Predictive features of persistent activity emergence in regular spiking and intrinsic bursting model neurons. PLoS Comput Biol 2012; 8:e1002489. [PMID: 22570601 PMCID: PMC3343116 DOI: 10.1371/journal.pcbi.1002489] [Citation(s) in RCA: 21] [Impact Index Per Article: 1.8] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/20/2011] [Accepted: 03/08/2012] [Indexed: 11/19/2022] Open
Abstract
Proper functioning of working memory involves the expression of stimulus-selective persistent activity in pyramidal neurons of the prefrontal cortex (PFC), which refers to neural activity that persists for seconds beyond the end of the stimulus. The mechanisms which PFC pyramidal neurons use to discriminate between preferred vs. neutral inputs at the cellular level are largely unknown. Moreover, the presence of pyramidal cell subtypes with different firing patterns, such as regular spiking and intrinsic bursting, raises the question as to what their distinct role might be in persistent firing in the PFC. Here, we use a compartmental modeling approach to search for discriminatory features in the properties of incoming stimuli to a PFC pyramidal neuron and/or its response that signal which of these stimuli will result in persistent activity emergence. Furthermore, we use our modeling approach to study cell-type specific differences in persistent activity properties, via implementing a regular spiking (RS) and an intrinsic bursting (IB) model neuron. We identify synaptic location within the basal dendrites as a feature of stimulus selectivity. Specifically, persistent activity-inducing stimuli consist of activated synapses that are located more distally from the soma compared to non-inducing stimuli, in both model cells. In addition, the action potential (AP) latency and the first few inter-spike-intervals of the neuronal response can be used to reliably detect inducing vs. non-inducing inputs, suggesting a potential mechanism by which downstream neurons can rapidly decode the upcoming emergence of persistent activity. While the two model neurons did not differ in the coding features of persistent activity emergence, the properties of persistent activity, such as the firing pattern and the duration of temporally-restricted persistent activity were distinct. 
Collectively, our results pinpoint specific features of the neuronal response to a given stimulus that code for its ability to induce persistent activity, and predict differential roles of RS and IB neurons in persistent activity expression.
Memory, the ability to retain, store and recall information, represents one of the most fundamental cognitive functions in daily life. A significant feature of memory processes is selectivity to particular events or items that are important to our survival and relevant to specific situations. For long-term memory, selectivity to a specific stimulus is seen both at the behavioral and at the cellular level. For working memory, a type of short-term memory involved in decision making and attention, stimulus selectivity has been observed in vivo using spatial working memory tasks. In addition, persistent activity, the cellular correlate of working memory, is also selective to specific stimuli for each neuron, suggesting that each neuron has a ‘memory field’. Our study proposes that both the location of incoming inputs on the neuronal dendritic tree and specific temporal features of the neuronal response can be used to predict the emergence of persistent activity in two neuron models with different firing patterns, revealing possible mechanisms for generating and propagating stimulus selectivity in working memory processes. The study also reveals that neurons with different firing patterns may have different roles in persistent activity expression.
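The response features the study proposes as predictors, AP latency and the first few inter-spike intervals, are straightforward to extract from a spike train. A minimal sketch (the spike times and feature count below are hypothetical, for illustration only):

```python
def response_features(spike_times_ms, stimulus_onset_ms=0.0, n_isis=3):
    """Extract AP latency and the first few inter-spike intervals (ISIs)
    from a list of spike times, relative to stimulus onset."""
    rel = [t - stimulus_onset_ms for t in spike_times_ms if t >= stimulus_onset_ms]
    if not rel:
        return None  # no spikes after stimulus onset
    latency = rel[0]
    isis = [b - a for a, b in zip(rel, rel[1:])][:n_isis]
    return {"latency_ms": latency, "isis_ms": isis}

# Hypothetical response: first spike 12 ms after onset, then slowing ISIs.
print(response_features([12.0, 25.0, 41.0, 60.0, 85.0]))
```

A downstream decoder comparing such features against learned thresholds could, as the study suggests, anticipate whether persistent activity will emerge before it does.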
|