1
Gleeson P, Crook S, Cannon RC, Hines ML, Billings GO, Farinella M, Morse TM, Davison AP, Ray S, Bhalla US, Barnes SR, Dimitrova YD, Silver RA. NeuroML: a language for describing data driven models of neurons and networks with a high degree of biological detail. PLoS Comput Biol 2010; 6:e1000815. [PMID: 20585541] [PMCID: PMC2887454] [DOI: 10.1371/journal.pcbi.1000815] [Received: 02/25/2010] [Accepted: 05/13/2010]
Abstract
Biologically detailed single neuron and network models are important for understanding how ion channels, synapses and anatomical connectivity underlie the complex electrical behavior of the brain. While neuronal simulators such as NEURON, GENESIS, MOOSE, NEST, and PSICS facilitate the development of these data-driven neuronal models, the specialized languages they employ are generally not interoperable, limiting model accessibility and preventing reuse of model components and cross-simulator validation. To overcome these problems we have used an Open Source software approach to develop NeuroML, a neuronal model description language based on XML (Extensible Markup Language). This enables these detailed models and their components to be defined in a standalone form, allowing them to be used across multiple simulators and archived in a standardized format. Here we describe the structure of NeuroML and demonstrate its scope by converting into NeuroML models of a number of different voltage- and ligand-gated conductances, models of electrical coupling, synaptic transmission and short-term plasticity, together with morphologically detailed models of individual neurons. We have also used these NeuroML-based components to develop a highly detailed cortical network model. NeuroML-based model descriptions were validated by demonstrating similar model behavior across five independently developed simulators. Although our results confirm that simulations run on different simulators converge, they reveal limits to model interoperability by showing that for some models convergence only occurs at high levels of spatial and temporal discretisation, when the computational overhead is high. Our development of NeuroML as a common description language for biophysically detailed neuronal and network models enables interoperability across multiple simulation environments, thereby improving model transparency, accessibility and reuse in computational neuroscience.
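The abstract describes NeuroML as an XML-based, simulator-independent way to declare model components such as ion channels once and reuse them across simulators. As a rough illustration of that idea only (the element and attribute names below are invented placeholders and do not follow the actual NeuroML schema), such a declarative channel description can be built and serialized with Python's standard library:

```python
import xml.etree.ElementTree as ET

# Hypothetical sketch in the spirit of NeuroML: the tag and attribute
# names here are illustrative only, not the real NeuroML schema.
# A sodium channel is declared once, as data rather than simulator script.
channel = ET.Element("ionChannel", id="na_fast", species="na")
ET.SubElement(channel, "gate", id="m", instances="3")  # activation gate
ET.SubElement(channel, "gate", id="h", instances="1")  # inactivation gate

xml_text = ET.tostring(channel, encoding="unicode")
print(xml_text)
```

Because the description is plain XML rather than simulator-specific code, any tool with a suitable importer can parse it back (e.g. with `ET.fromstring`) and map the gates onto its own channel machinery, which is the interoperability argument the paper makes.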
Affiliation(s)
- Padraig Gleeson
- Department of Neuroscience, Physiology and Pharmacology, University College London, London, United Kingdom
- Sharon Crook
- School of Mathematical and Statistical Sciences, School of Life Sciences, and Center for Adaptive Neural Systems, Arizona State University, Tempe, Arizona, United States of America
- Michael L. Hines
- Department of Computer Science, Yale University, New Haven, Connecticut, United States of America
- Guy O. Billings
- Department of Neuroscience, Physiology and Pharmacology, University College London, London, United Kingdom
- Matteo Farinella
- Department of Neuroscience, Physiology and Pharmacology, University College London, London, United Kingdom
- Thomas M. Morse
- Department of Neurobiology, Yale University School of Medicine, New Haven, Connecticut, United States of America
- Andrew P. Davison
- Unité de Neurosciences, Information et Complexité, CNRS, Gif sur Yvette, France
- Subhasis Ray
- National Centre for Biological Sciences, TIFR, UAS-GKVK Campus, Bangalore, India
- Upinder S. Bhalla
- National Centre for Biological Sciences, TIFR, UAS-GKVK Campus, Bangalore, India
- Simon R. Barnes
- Department of Neuroscience, Physiology and Pharmacology, University College London, London, United Kingdom
- Yoana D. Dimitrova
- Department of Neuroscience, Physiology and Pharmacology, University College London, London, United Kingdom
- R. Angus Silver
- Department of Neuroscience, Physiology and Pharmacology, University College London, London, United Kingdom
2
Barrett AB, Billings GO, Morris RGM, van Rossum MCW. State based model of long-term potentiation and synaptic tagging and capture. PLoS Comput Biol 2009; 5:e1000259. [PMID: 19148264] [PMCID: PMC2603667] [DOI: 10.1371/journal.pcbi.1000259] [Received: 07/28/2008] [Accepted: 11/24/2008]
Abstract
Recent data indicate that plasticity protocols have not only synapse-specific but also more widespread effects. In particular, in synaptic tagging and capture (STC), tagged synapses can capture plasticity-related proteins, synthesized in response to strong stimulation of other synapses. This leads to long-lasting modification of only weakly stimulated synapses. Here we present a biophysical model of synaptic plasticity in the hippocampus that incorporates several key results from experiments on STC. The model specifies a set of physical states in which a synapse can exist, together with transition rates that are affected by high- and low-frequency stimulation protocols. In contrast to most standard plasticity models, the model exhibits both early- and late-phase LTP/D, de-potentiation, and STC. As such, it provides a useful starting point for further theoretical work on the role of STC in learning and memory.
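The abstract's core idea is that a synapse occupies one of a discrete set of physical states, with stimulation-dependent transition rates between them, and that a weakly stimulated ("tagged") synapse can be potentiated by capturing plasticity-related proteins synthesized elsewhere. A minimal toy version of such a state-based scheme (the states and numeric rates below are invented for illustration and are not those of the Barrett et al. model) might look like:

```python
import random

# Toy state-based synapse, loosely inspired by the abstract; the states
# and rate constants are invented placeholders, not the published model.
STATES = ("basal", "tagged", "potentiated")

def step(state, proteins_available, rng=random.random):
    """Advance the synapse one time step.

    Weak stimulation can tag a basal synapse; a tagged synapse that
    captures available plasticity-related proteins becomes potentiated
    (the synaptic tagging and capture effect described in the abstract).
    """
    if state == "basal" and rng() < 0.3:  # illustrative tagging rate
        return "tagged"
    if state == "tagged" and proteins_available and rng() < 0.8:  # capture rate
        return "potentiated"
    return state

# Deterministic check: with rng always returning 0, every eligible
# transition fires, so a basal synapse reaches the potentiated state.
s = "basal"
s = step(s, proteins_available=True, rng=lambda: 0.0)  # basal -> tagged
s = step(s, proteins_available=True, rng=lambda: 0.0)  # tagged -> potentiated
print(s)  # prints "potentiated"
```

Note that without available proteins the synapse stalls in the tagged state, which mirrors the paper's observation that weak stimulation alone produces only a transient change unless protein synthesis triggered by strong stimulation of other synapses rescues it.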
Affiliation(s)
- Adam B Barrett
- Institute for Adaptive and Neural Computation, University of Edinburgh, Edinburgh, United Kingdom.