1. Ashida G, Wang T, Kretzberg J. Integrate-and-fire-type models of the lateral superior olive. PLoS One 2024; 19:e0304832. PMID: 38900820; PMCID: PMC11189240; DOI: 10.1371/journal.pone.0304832.
Abstract
Neurons of the lateral superior olive (LSO) in the auditory brainstem play a fundamental role in binaural sound localization. Previous theoretical studies developed various types of neuronal models to study the physiological functions of the LSO. These models were usually tuned to a small set of physiological data with specific aims in mind. Therefore, it is unclear whether and how they can be related to each other, how widely applicable they are, and which model is suitable for what purposes. In this study, we address these questions for six different single-compartment integrate-and-fire (IF) type LSO models. The models are divided into two groups depending on their subthreshold responses: passive (linear) models with only the leak conductance and active (nonlinear) models with an additional low-voltage-activated potassium conductance that is prevalent in the auditory system. Each of these two groups is further subdivided into three subtypes according to the spike generation mechanism: one with simple threshold-crossing detection and voltage reset, one with threshold-crossing detection plus a current to mimic spike shapes, and one with a depolarizing exponential current for spiking. In our simulations, all six models were driven by identical synaptic inputs and calibrated with common criteria for binaural tuning. The resulting spike rates of the passive models were higher for intense inputs and lower for temporally structured inputs than those of the active models, confirming the active function of the potassium current. Within each passive or active group, the simulated responses resembled each other, regardless of the spike generation type. These results, in combination with an analysis of computational costs, indicate that an active IF model is more suitable than a passive model for accurately reproducing temporal coding in the LSO. Simulating realistic spike shapes with an extended spiking mechanism added relatively little computational cost.
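To make the passive-versus-active distinction of this abstract concrete, here is a minimal Python sketch of a single-compartment IF neuron with and without a low-voltage-activated potassium (KLT) conductance. All parameters and the KLT activation curve are illustrative assumptions, not the calibrated LSO models of the paper; with these toy values the KLT current clamps the active model below threshold during a sustained current step, mirroring the lower firing of the active models for intense inputs.

```python
import numpy as np

def simulate(i_stim, use_klt=False, dt=0.01):
    """Single-compartment IF neuron, forward-Euler integration.

    Units: mV, ms, nS, pF, pA. Parameters are illustrative only,
    not those of the calibrated LSO models in the paper.
    """
    c_m, g_leak, e_leak = 20.0, 20.0, -65.0       # pF, nS, mV
    g_klt_max, e_k = 30.0, -90.0                  # active model's KLT conductance
    v_th, v_reset = -45.0, -65.0
    v = e_leak
    w = 1.0 / (1.0 + np.exp(-(v + 57.0) / 6.0))   # KLT activation at rest
    spikes = 0
    for i in i_stim:
        w_inf = 1.0 / (1.0 + np.exp(-(v + 57.0) / 6.0))
        w += dt * (w_inf - w) / 1.0               # 1 ms activation time constant
        g_klt = g_klt_max * w if use_klt else 0.0
        v += dt * (-g_leak * (v - e_leak) - g_klt * (v - e_k) + i) / c_m
        if v >= v_th:                             # threshold crossing + reset
            spikes += 1
            v = v_reset
    return spikes

step = np.full(int(100 / 0.01), 600.0)            # 600 pA step, 100 ms
passive = simulate(step, use_klt=False)
active = simulate(step, use_klt=True)
```

With these values the passive model fires throughout the step while the active model is held subthreshold, a qualitative (not quantitative) echo of the comparison reported above.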
Affiliation(s)
- Go Ashida
- Faculty 6, Department of Neuroscience, Carl von Ossietzky Universität Oldenburg, Oldenburg, Germany
- Cluster of Excellence "Hearing4all", Carl von Ossietzky Universität Oldenburg, Oldenburg, Germany
- Tiezhi Wang
- Faculty 6, Department of Neuroscience, Carl von Ossietzky Universität Oldenburg, Oldenburg, Germany
- Faculty 6, Department of Health Services Research, Carl von Ossietzky Universität Oldenburg, Oldenburg, Germany
- Jutta Kretzberg
- Faculty 6, Department of Neuroscience, Carl von Ossietzky Universität Oldenburg, Oldenburg, Germany
- Cluster of Excellence "Hearing4all", Carl von Ossietzky Universität Oldenburg, Oldenburg, Germany
- Research Center Neurosensory Science, Carl von Ossietzky Universität Oldenburg, Oldenburg, Germany
2. Zeldenrust F, Calcini N, Yan X, Bijlsma A, Celikel T. The tuning of tuning: How adaptation influences single cell information transfer. PLoS Comput Biol 2024; 20:e1012043. PMID: 38739640; PMCID: PMC11115315; DOI: 10.1371/journal.pcbi.1012043.
Abstract
Sensory neurons reconstruct the world from action potentials (spikes) impinging on them. To effectively transfer information about the stimulus to the next processing level, a neuron needs to be able to adapt its working range to the properties of the stimulus. Here, we focus on the intrinsic neural properties that influence information transfer in cortical neurons and how tightly their properties need to be tuned to the stimulus statistics to be effective. We start by measuring the intrinsic information encoding properties of putative excitatory and inhibitory neurons in L2/3 of the mouse barrel cortex. Excitatory neurons show high thresholds and strong adaptation, making them fire sparsely and resulting in a strong compression of information, whereas inhibitory neurons that favour fast spiking transfer more information. Next, we turn to computational modelling and ask how two properties influence information transfer: 1) spike-frequency adaptation and 2) the shape of the IV-curve. We find that subthreshold (but not threshold) adaptation, the 'h-current', and a properly tuned leak conductance can increase the information transfer of a neuron, whereas threshold adaptation can increase its working range. Finally, we verify the effect of the IV-curve slope in our experimental recordings and show that excitatory neurons form a more heterogeneous population than inhibitory neurons. These previously unquantified relationships between intrinsic neural features and neural coding will aid computational, theoretical and systems neuroscientists in understanding how neuronal populations can alter their coding properties, for example through the impact of neuromodulators. Why the intrinsic properties of excitatory neurons are more variable than those of inhibitory ones is an exciting question that calls for future research.
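The spike-frequency adaptation discussed in this abstract can be sketched with a leaky IF neuron carrying a spike-triggered adaptation current. All parameters are illustrative assumptions, not fits to the barrel-cortex recordings; the sketch only shows the generic mechanism: each spike increments an inhibitory current that decays slowly, so interspike intervals lengthen and the overall rate drops.

```python
import numpy as np

def lif_adapt(i_amp, b=0.0, dt=0.1, t_max=500.0):
    """Leaky IF neuron with a spike-triggered adaptation current.

    i_amp in nA, voltages in mV, times in ms. Parameters are
    illustrative assumptions, not fits to the recordings above.
    """
    tau_m, tau_w, r_m = 10.0, 100.0, 10.0        # ms, ms, MOhm
    v_rest, v_th, v_reset = -65.0, -50.0, -65.0
    v, w, spike_times = v_rest, 0.0, []
    for k in range(int(t_max / dt)):
        v += dt / tau_m * (-(v - v_rest) + r_m * (i_amp - w))
        w += dt / tau_w * (-w)                   # adaptation decays between spikes
        if v >= v_th:
            v = v_reset
            w += b                               # spike-triggered increment
            spike_times.append(k * dt)
    return np.array(spike_times)

no_adapt = lif_adapt(2.0, b=0.0)
adapt = lif_adapt(2.0, b=0.1)
```

The adapting neuron fires fewer spikes for the same step, and its later interspike intervals are longer than its early ones: the firing-rate compression described above.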
Affiliation(s)
- Fleur Zeldenrust
- Donders Institute for Brain, Cognition, and Behaviour, Radboud University, Nijmegen, the Netherlands
- Niccolò Calcini
- Maastricht Centre for Systems Biology (MaCSBio), University of Maastricht, Maastricht, the Netherlands
- Xuan Yan
- Institute of Neuroscience, Chinese Academy of Sciences, Beijing, China
- Ate Bijlsma
- Department of Population Health Sciences / Department of Biology, Universiteit Utrecht, the Netherlands
- Tansu Celikel
- School of Psychology, Georgia Institute of Technology, Atlanta, GA, United States of America
3. Liu MB, Ajijola OA. The promise of cardiac neuromodulation: can computational modelling bridge the gap? J Physiol 2023; 601:3693-3694. PMID: 37535053; PMCID: PMC10529398; DOI: 10.1113/jp285309.
Affiliation(s)
- Michael B Liu
- Division of Cardiovascular Medicine, Stanford University, Palo Alto, CA, USA
- Olujimi A Ajijola
- UCLA Cardiac Arrhythmia Center, University of California, Los Angeles, CA, USA
4. Yang PC, Rose A, DeMarco KR, Dawson JRD, Han Y, Jeng MT, Harvey RD, Santana LF, Ripplinger CM, Vorobyov I, Lewis TJ, Clancy CE. A multiscale predictive digital twin for neurocardiac modulation. J Physiol 2023; 601:3789-3812. PMID: 37528537; PMCID: PMC10528740; DOI: 10.1113/jp284391.
Abstract
Cardiac function is tightly regulated by the autonomic nervous system (ANS). Activation of the sympathetic nervous system increases cardiac output by increasing heart rate and stroke volume, while parasympathetic nerve stimulation instantly slows heart rate. Importantly, imbalance in autonomic control of the heart has been implicated in the development of arrhythmias and heart failure. Understanding of the mechanisms and effects of autonomic stimulation is a major challenge because synapses in different regions of the heart result in multiple changes to heart function. For example, nerve synapses on the sinoatrial node (SAN) impact pacemaking, while synapses on contractile cells alter contraction and arrhythmia vulnerability. Here, we present a multiscale neurocardiac modelling and simulator tool that predicts the effect of efferent stimulation of the sympathetic and parasympathetic branches of the ANS on the cardiac SAN and ventricular myocardium. The model includes a layered representation of the ANS and reproduces firing properties measured experimentally. Model parameters are derived from experiments and atomistic simulations. The model is a first prototype of a digital twin that is applied to make predictions across all system scales, from subcellular signalling to pacemaker frequency to tissue level responses. We predict conditions under which autonomic imbalance induces proarrhythmia and can be modified to prevent or inhibit arrhythmia. In summary, the multiscale model constitutes a predictive digital twin framework to test and guide high-throughput prediction of novel neuromodulatory therapy.
KEY POINTS:
- A multi-layered model representation of the autonomic nervous system that includes sympathetic and parasympathetic branches, each with sparse random intralayer connectivity, synaptic dynamics and conductance-based integrate-and-fire neurons, generates firing patterns in close agreement with experiment.
- A key feature of the neurocardiac computational model is the connection between the autonomic nervous system and both pacemaker and contractile cells, where modification to pacemaker frequency drives initiation of electrical signals in the contractile cells.
- We utilized atomic-scale molecular dynamics simulations to predict the association and dissociation rates of noradrenaline with the β-adrenergic receptor.
- Multiscale predictions demonstrate how autonomic imbalance may increase proclivity to arrhythmias or be used to terminate arrhythmias.
- The model serves as a first step towards a digital twin for predicting neuromodulation to prevent or reduce disease.
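The key-points phrase "sparse random intralayer connectivity … and conductance-based integrate-and-fire neurons" can be illustrated with a toy single-layer network. Everything below (sizes, conductances, rates) is an illustrative assumption, not the published ANS model; the sketch only shows the two named ingredients: a sparse random weight matrix and conductance-based LIF dynamics driven by Poisson input.

```python
import numpy as np

rng = np.random.default_rng(0)

# One sparse, randomly connected layer of conductance-based
# integrate-and-fire neurons driven by Poisson input: a toy analogue
# of a single ANS layer. All numbers are illustrative assumptions.
n, p_conn = 50, 0.1
w = np.where(rng.random((n, n)) < p_conn, 2.0, 0.0)   # nS, sparse random
np.fill_diagonal(w, 0.0)                              # no self-connections

dt, n_steps = 0.1, 5000                               # ms; 500 ms total
c_m, g_l, e_l, e_syn = 200.0, 10.0, -70.0, 0.0        # pF, nS, mV
tau_syn, v_th, v_reset = 5.0, -50.0, -70.0

v = np.full(n, e_l)
g = np.zeros(n)                                       # excitatory conductance
counts = np.zeros(n, dtype=int)
for _ in range(n_steps):
    g += 3.0 * (rng.random(n) < 0.04)                 # external Poisson drive
    v += dt / c_m * (-g_l * (v - e_l) - g * (v - e_syn))
    g -= dt * g / tau_syn
    fired = v >= v_th
    counts += fired
    v[fired] = v_reset
    g += w @ fired                                    # sparse intralayer input
```

Conductance-based (rather than current-based) synapses make the input drive depend on the momentary driving force `v - e_syn`, one reason such models match measured firing patterns better than simpler IF variants.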
Affiliation(s)
- Pei-Chi Yang
- Department of Physiology and Membrane Biology, University of California Davis, Davis, CA
- Adam Rose
- Department of Mathematics, University of California Davis, Davis, CA
- Kevin R. DeMarco
- Department of Physiology and Membrane Biology, University of California Davis, Davis, CA
- John R. D. Dawson
- Department of Physiology and Membrane Biology, University of California Davis, Davis, CA
- Yanxiao Han
- Department of Physiology and Membrane Biology, University of California Davis, Davis, CA
- Mao-Tsuen Jeng
- Department of Physiology and Membrane Biology, University of California Davis, Davis, CA
- L. Fernando Santana
- Department of Physiology and Membrane Biology, University of California Davis, Davis, CA
- Igor Vorobyov
- Department of Physiology and Membrane Biology, University of California Davis, Davis, CA
- Timothy J. Lewis
- Department of Mathematics, University of California Davis, Davis, CA
- Colleen E. Clancy
- Department of Physiology and Membrane Biology, University of California Davis, Davis, CA
- Center for Precision Medicine and Data Science, University of California Davis, Sacramento, CA
5. Chialva U, González Boscá V, Rotstein HG. Low-dimensional models of single neurons: a review. Biol Cybern 2023; 117:163-183. PMID: 37060453; DOI: 10.1007/s00422-023-00960-1.
Abstract
The classical Hodgkin-Huxley (HH) point-neuron model of action potential generation is four-dimensional. It consists of four ordinary differential equations describing the dynamics of the membrane potential and of three gating variables associated with a transient sodium current and a delayed-rectifier potassium current. Conductance-based models of HH type are higher-dimensional extensions of the classical HH model. They include a number of supplementary state variables associated with other ionic current types, and are able to describe additional phenomena such as subthreshold oscillations, mixed-mode oscillations (subthreshold oscillations interspersed with spikes), clustering and bursting. In this manuscript we discuss biophysically plausible and phenomenological reduced models that preserve the biophysical and/or dynamic description of models of HH type and the ability to produce complex phenomena, while having a lower number of effective dimensions (state variables). We describe several representative models, as well as systematic and heuristic methods for deriving reduced models from models of HH type.
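For reference, the four-dimensional HH system summarized in this abstract can be written in its standard form (conventional notation; the voltage-dependent rate functions α_x, β_x are the usual ones):

```latex
C_m \frac{dV}{dt} = I_{\mathrm{ext}}
  - \bar{g}_{\mathrm{Na}}\, m^3 h\, (V - E_{\mathrm{Na}})
  - \bar{g}_{\mathrm{K}}\, n^4\, (V - E_{\mathrm{K}})
  - g_L\, (V - E_L),
\qquad
\frac{dx}{dt} = \alpha_x(V)\,(1 - x) - \beta_x(V)\, x,
\quad x \in \{m, h, n\}
```

The reductions discussed in the review shrink this system, for instance by slaving the fast variable m to V or exploiting the near-linear relation between h and n.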
Affiliation(s)
- Ulises Chialva
- Departamento de Matemática, Universidad Nacional del Sur and CONICET, Bahía Blanca, Buenos Aires, Argentina
- Horacio G Rotstein
- Federated Department of Biological Sciences, New Jersey Institute of Technology and Rutgers University, Newark, NJ, USA
- Behavioral Neurosciences Program, Rutgers University, Newark, NJ, USA
- Corresponding Investigators Group, CONICET, Buenos Aires, Argentina
6. Garnier Artiñano T, Andalibi V, Atula I, Maestri M, Vanni S. Biophysical parameters control signal transfer in spiking network. Front Comput Neurosci 2023; 17:1011814. PMID: 36761840; PMCID: PMC9905747; DOI: 10.3389/fncom.2023.1011814.
Abstract
Introduction: Information transmission and representation in both natural and artificial networks depend on connectivity between units. Biological neurons, in addition, modulate synaptic dynamics and post-synaptic membrane properties, but how these relate to information transmission in a population of neurons is still poorly understood. A recent study investigated local learning rules and showed how a spiking neural network can learn to represent continuous signals. Our study builds on their model to explore how basic membrane properties and synaptic delays affect information transfer.
Methods: The system consisted of three input and output units and a hidden layer of 300 excitatory and 75 inhibitory leaky integrate-and-fire (LIF) or adaptive exponential integrate-and-fire (AdEx) units. After optimizing the connectivity to accurately replicate the input patterns in the output units, we transformed the model to more biologically accurate units and included synaptic delay and concurrent action potential generation in distinct neurons. We examined three parameter regimes, which comprised either identical physiological values for both excitatory and inhibitory units (Comrade), more biologically accurate values (Bacon), or the Comrade regime with output units optimized for low reconstruction error (HiFi). We evaluated information transmission and classification accuracy of the network with four distinct metrics: coherence, Granger causality, transfer entropy, and reconstruction error.
Results: Biophysical parameters had a major impact on the information transfer metrics. Classification was surprisingly robust, surviving very low firing and information rates, whereas overall information transmission, and particularly low reconstruction error, depended more on higher firing rates in LIF units. In AdEx units, firing rates were lower and less information was transferred, but interestingly the highest information transmission rates no longer overlapped with the highest firing rates.
Discussion: Our findings can be related to the predictive coding theory of the cerebral cortex and may point to information transfer as a phenomenological property of biological cells.
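The AdEx unit type compared against LIF in this abstract can be sketched as follows. The parameters are generic illustrative values (not the optimized units of the study); the sketch shows the two defining AdEx ingredients, an exponential spike-initiation term and a spike-triggered adaptation current, which together produce the lower firing rates noted above.

```python
import numpy as np

def adex_spikes(i_amp, dt=0.05, t_max=300.0):
    """Adaptive exponential integrate-and-fire (AdEx) neuron.

    i_amp in pA; voltages in mV; times in ms. Parameters are generic
    illustrative values, not the optimized units of the study.
    """
    c, g_l, e_l = 200.0, 10.0, -70.0      # pF, nS, mV
    d_t, v_t = 2.0, -50.0                 # slope factor, soft threshold
    a, b, tau_w = 2.0, 60.0, 100.0        # adaptation: nS, pA, ms
    v, w, n_spikes = e_l, 0.0, 0
    for _ in range(int(t_max / dt)):
        dv = (-g_l * (v - e_l) + g_l * d_t * np.exp((v - v_t) / d_t)
              - w + i_amp) / c
        v += dt * dv
        if v >= 0.0:                      # upswing detected: spike and reset
            n_spikes += 1
            v = e_l
            w += b                        # spike-triggered adaptation
        w += dt * (a * (v - e_l) - w) / tau_w
    return n_spikes

n_strong, n_weak = adex_spikes(500.0), adex_spikes(300.0)
```

Because the adaptation current grows with the firing rate, the f-I curve flattens relative to a plain LIF unit, one mechanism behind the lower AdEx rates reported above.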
Affiliation(s)
- Tomás Garnier Artiñano
- Helsinki University Hospital (HUS) Neurocenter, Neurology, Helsinki University Hospital, Helsinki, Finland; Department of Neurosciences, Clinicum, University of Helsinki, Helsinki, Finland
- Vafa Andalibi
- Department of Computer Science, Indiana University Bloomington, Bloomington, IN, United States
- Iiris Atula
- Helsinki University Hospital (HUS) Neurocenter, Neurology, Helsinki University Hospital, Helsinki, Finland; Department of Neurosciences, Clinicum, University of Helsinki, Helsinki, Finland
- Matteo Maestri
- Helsinki University Hospital (HUS) Neurocenter, Neurology, Helsinki University Hospital, Helsinki, Finland; Department of Neurosciences, Clinicum, University of Helsinki, Helsinki, Finland; Department of Biomedical and Neuromotor Sciences, University of Bologna, Bologna, Italy
- Simo Vanni (correspondence)
- Helsinki University Hospital (HUS) Neurocenter, Neurology, Helsinki University Hospital, Helsinki, Finland; Department of Neurosciences, Clinicum, University of Helsinki, Helsinki, Finland; Department of Physiology, Medicum, University of Helsinki, Helsinki, Finland
7. Koren V, Bondanelli G, Panzeri S. Computational methods to study information processing in neural circuits. Comput Struct Biotechnol J 2023; 21:910-922. PMID: 36698970; PMCID: PMC9851868; DOI: 10.1016/j.csbj.2023.01.009.
Abstract
The brain is an information processing machine and thus naturally lends itself to be studied using computational tools based on the principles of information theory. For this reason, computational methods based on or inspired by information theory have been a cornerstone of practical and conceptual progress in neuroscience. In this Review, we address how concepts and computational tools related to information theory are spurring the development of principled theories of information processing in neural circuits and the development of influential mathematical methods for the analyses of neural population recordings. We review how these computational approaches reveal mechanisms of essential functions performed by neural circuits. These functions include efficiently encoding sensory information and facilitating the transmission of information to downstream brain areas to inform and guide behavior. Finally, we discuss how further progress and insights can be achieved, in particular by studying how competing requirements of neural encoding and readout may be optimally traded off to optimize neural information processing.
Affiliation(s)
- Veronika Koren
- Department of Excellence for Neural Information Processing, Center for Molecular Neurobiology (ZMNH), University Medical Center Hamburg-Eppendorf (UKE), Falkenried 94, Hamburg 20251, Germany
- Stefano Panzeri
- Department of Excellence for Neural Information Processing, Center for Molecular Neurobiology (ZMNH), University Medical Center Hamburg-Eppendorf (UKE), Falkenried 94, Hamburg 20251, Germany
- Istituto Italiano di Tecnologia, Via Melen 83, Genova 16152, Italy
8. Nanami T, Kohno T. Piecewise quadratic neuron model: A tool for close-to-biology spiking neuronal network simulation on dedicated hardware. Front Neurosci 2023; 16:1069133. PMID: 36699524; PMCID: PMC9870328; DOI: 10.3389/fnins.2022.1069133.
Abstract
Spiking neuron models simulate neuronal activities and allow us to analyze and reproduce the information processing of the nervous system. However, ionic-conductance models, which can faithfully reproduce neuronal activities, require a huge computational cost, while integral-firing models, which are computationally inexpensive, have some difficulties in reproducing neuronal activities. Here we propose a Piecewise Quadratic Neuron (PQN) model based on a qualitative modeling approach that aims to reproduce only the key dynamics behind neuronal activities. We demonstrate that PQN models can accurately reproduce the responses of ionic-conductance models of major neuronal classes to stimulus inputs of various magnitudes. In addition, the PQN model is designed to support the efficient implementation on digital arithmetic circuits for use as silicon neurons, and we confirm that the PQN model consumes much fewer circuit resources than the ionic-conductance models. This model intends to serve as a tool for building a large-scale closer-to-biology spiking neural network.
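The qualitative-modeling idea behind quadratic-nonlinearity neurons can be illustrated with Izhikevich's well-known quadratic IF model, a close relative of the piecewise-quadratic approach. This is an accessible stand-in for illustration only, not the PQN model itself (the regular-spiking parameter set below is Izhikevich's, not the paper's):

```python
def izhikevich(i_amp, a=0.02, b=0.2, c=-65.0, d=8.0, dt=0.25, t_max=200.0):
    """Izhikevich's quadratic IF model (regular-spiking parameters).

    A relative of the piecewise-quadratic approach, shown as an
    accessible stand-in, not the PQN model itself.
    """
    v, u, spikes = -65.0, b * -65.0, 0
    for _ in range(int(t_max / dt)):
        v += dt * (0.04 * v * v + 5.0 * v + 140.0 - u + i_amp)
        u += dt * a * (b * v - u)
        if v >= 30.0:                  # quadratic upstroke reached its peak
            v, u, spikes = c, u + d, spikes + 1
    return spikes
```

A quadratic (or piecewise quadratic) nonlinearity reproduces the spike upstroke with only additions and multiplications, which is why such models map so cheaply onto digital arithmetic circuits.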
Affiliation(s)
- Takuya Nanami
- Institute of Industrial Science, The University of Tokyo, Tokyo, Japan
- Takashi Kohno
- Institute of Industrial Science, The University of Tokyo, Tokyo, Japan
9. Sidhu RS, Johnson EC, Jones DL, Ratnam R. A dynamic spike threshold with correlated noise predicts observed patterns of negative interval correlations in neuronal spike trains. Biol Cybern 2022; 116:611-633. PMID: 36244004; PMCID: PMC9691502; DOI: 10.1007/s00422-022-00946-5.
Abstract
Negative correlations in the sequential evolution of interspike intervals (ISIs) are a signature of memory in neuronal spike trains. They provide coding benefits including firing-rate stabilization, improved detectability of weak sensory signals, and enhanced transmission of information by improving the signal-to-noise ratio. Primary electrosensory afferent spike trains in weakly electric fish fall into two categories based on the pattern of ISI correlations: non-bursting units have negative correlations which remain negative but decay to zero with increasing lags (Type I ISI correlations), and bursting units have oscillatory (alternating-sign) correlations which damp to zero with increasing lags (Type II ISI correlations). Here, we predict and match observed ISI correlations in these afferents using a stochastic dynamic threshold model. We determine the ISI correlation function as a function of an arbitrary discrete noise correlation function [Formula: see text], where k is a multiple of the mean ISI. The function permits forward and inverse calculations of the correlation function. Both types of correlation functions can be generated by adding colored noise to the spike threshold, with Type I correlations generated by slow noise and Type II correlations by fast noise. A first-order autoregressive (AR) process with a single parameter is sufficient to predict and accurately match both types of afferent ISI correlation functions, with the type determined by the sign of the AR parameter. The predicted and experimentally observed correlations are in geometric progression. The theory predicts that the limiting sum of ISI correlations is -1/2, yielding a perfect DC block in the power spectrum of the spike train. Observed ISI correlations from afferents have a limiting sum that is slightly larger at [Formula: see text] ([Formula: see text]). We conclude that the underlying process for generating ISIs may be a simple combination of low-order AR and moving average processes, and we discuss the results from the perspective of optimal coding.
Affiliation(s)
- Robin S Sidhu
- Department of Electrical and Computer Engineering, University of Illinois at Urbana-Champaign, Urbana, IL, USA
- Erik C Johnson
- The Johns Hopkins University Applied Physics Laboratory, Laurel, MD, USA
- Douglas L Jones
- Department of Electrical and Computer Engineering, University of Illinois at Urbana-Champaign, Urbana, IL, USA
- Rama Ratnam
- Division of Biological and Life Sciences, School of Arts and Sciences, Ahmedabad University, Ahmedabad, Gujarat, India
10. Zeng Y, Bao W, Tao L, Hu D, Yang Z, Yang L, Shang D. Regularized Spectral Spike Response Model: A Neuron Model for Robust Parameter Reduction. Brain Sci 2022; 12:1008. PMID: 36009071; PMCID: PMC9405574; DOI: 10.3390/brainsci12081008.
Abstract
The modeling procedure of current biological neuron models is hindered by either hyperparameter optimization or overparameterization, which limits their application to a variety of biologically realistic tasks. This article proposes a novel neuron model called the Regularized Spectral Spike Response Model (RSSRM) to address these issues. The selection of hyperparameters is avoided by the model structure and fitting strategy, while the number of parameters is constrained by regularization techniques. Twenty firing simulation experiments indicate the superiority of RSSRM. In particular, after pruning more than 99% of its parameters, RSSRM with 100 parameters achieves an RMSE of 5.632 in membrane potential prediction, a VRD of 47.219, and an F1-score of 0.95 in spike train forecasting with correct timing (±1.4 ms), which are 25%, 99%, 55%, and 24% better than the average of other neuron models with the same number of parameters in RMSE, VRD, F1-score, and correct timing, respectively. Moreover, RSSRM with 100 parameters achieves a memory use of 10 KB and a runtime of 1 ms during inference, which is more efficient than the Izhikevich model.
Affiliation(s)
- Yinuo Zeng
- Nanjing Institute of Intelligent Technology, Nanjing 210000, China
- Wendi Bao
- Nanjing Institute of Intelligent Technology, Nanjing 210000, China
- Liying Tao
- Institute of Microelectronics of the Chinese Academy of Sciences, Beijing 100000, China
- University of Chinese Academy of Sciences, Beijing 100000, China
- Die Hu
- Nanjing Institute of Intelligent Technology, Nanjing 210000, China
- Zonglin Yang
- Nanjing Institute of Intelligent Technology, Nanjing 210000, China
- Liren Yang
- Nanjing Institute of Intelligent Technology, Nanjing 210000, China
- Delong Shang
- Nanjing Institute of Intelligent Technology, Nanjing 210000, China
- Institute of Microelectronics of the Chinese Academy of Sciences, Beijing 100000, China
- University of Chinese Academy of Sciences, Beijing 100000, China
11. Thorn JT, Chenais NAL, Hinrichs S, Chatelain M, Ghezzi D. Virtual reality validation of naturalistic modulation strategies to counteract fading in retinal stimulation. J Neural Eng 2022; 19. PMID: 35240583; DOI: 10.1088/1741-2552/ac5a5c.
Abstract
OBJECTIVE: Temporal resolution is a key challenge in artificial vision. Several prosthetic approaches are limited by the perceptual fading of evoked phosphenes upon repeated stimulation from the same electrode. Therefore, implanted patients are forced to perform active scanning, via head movements, to refresh the visual field viewed by the camera. However, active scanning is a draining task, and it is crucial to find compensatory strategies to reduce it. APPROACH: To address this question, we implemented perceptual fading in simulated prosthetic vision using virtual reality. We then quantified the effect of fading on two indicators: the time to complete a reading task and the head rotation during the task. We also tested whether stimulation strategies previously proposed to increase the persistence of retinal ganglion cell responses to electrical stimulation could improve these indicators. MAIN RESULTS: This study shows that stimulation strategies based on interrupted pulse trains and randomisation of the pulse duration allow a significant reduction of both the time to complete the task and the head rotation during the task. SIGNIFICANCE: The stimulation strategy used in retinal implants is crucial to counteract perceptual fading and to reduce active head scanning during prosthetic vision. In turn, less active scanning might improve the patient's comfort in artificial vision.
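The two strategies named in the results, interrupted pulse trains and randomised pulse durations, can be sketched as a stimulus-waveform generator. All timing constants below are illustrative assumptions, not the stimulation protocol of the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

def pulse_train(t_max_ms=1000.0, dt=0.1, rate_hz=20.0,
                pulses_on=10, pulses_skipped=3, jitter_duration=True):
    """Binary stimulation waveform: a periodic pulse train that is
    regularly interrupted (a few pulse slots skipped) and whose
    pulse durations are randomised. Illustrative sketch only."""
    n = int(t_max_ms / dt)
    wave = np.zeros(n)
    period = 1000.0 / rate_hz                 # ms between pulse onsets
    k, t = 0, 0.0
    while t < t_max_ms:
        in_gap = (k % (pulses_on + pulses_skipped)) >= pulses_on
        if not in_gap:
            dur = rng.uniform(0.5, 4.0) if jitter_duration else 1.0  # ms
            i0 = int(t / dt)
            wave[i0:min(n, i0 + int(dur / dt))] = 1.0
        k += 1
        t += period
    return wave

wave = pulse_train()
```

Both manipulations break the strict periodicity of the stimulus, which is the property hypothesized to slow the desensitization of retinal ganglion cells and hence the perceptual fading.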
Affiliation(s)
- Jacob Thomas Thorn
- Neuroengineering, EPFL STI, Chemin des Mines 9, Geneva, 1202, Switzerland
- Sandrine Hinrichs
- École Polytechnique Fédérale de Lausanne, Chemin des Mines 9, Geneva, 1202, Switzerland
- Marion Chatelain
- École Polytechnique Fédérale de Lausanne, Chemin des Mines 9, Geneva, 1202, Switzerland
- Diego Ghezzi
- Medtronic Chair in Neuroengineering, École Polytechnique Fédérale de Lausanne, EPFL STI IBI LNE, Lausanne, 1015, Switzerland
12.
Abstract
Neurophysiological measurements suggest that human information processing is evinced by neuronal activity. However, the quantitative relationship between the activity of a brain region and its information processing capacity remains unclear. We introduce and validate a mathematical model of the information processing capacity of a brain region in terms of neuronal activity, input storage capacity, and the arrival rate of afferent information. We applied the model to fMRI data obtained from a flanker paradigm in young and old subjects. Our analysis showed that—for a given cognitive task and subject—higher information processing capacity leads to lower neuronal activity and faster responses. Crucially, processing capacity—as estimated from fMRI data—predicted task and age-related differences in reaction times, speaking to the model’s predictive validity. This model offers a framework for modelling of brain dynamics in terms of information processing capacity, and may be exploited for studies of predictive coding and Bayes-optimal decision-making.
13. Huang C, Zeldenrust F, Celikel T. Cortical Representation of Touch in Silico. Neuroinformatics 2022; 20:1013-1039. PMID: 35486347; PMCID: PMC9588483; DOI: 10.1007/s12021-022-09576-5.
Abstract
With its six layers and ~12,000 neurons, a cortical column is a complex network whose function is plausibly greater than the sum of its constituents'. Functional characterization of its network components will require going beyond the brute-force modulation of the neural activity of a small group of neurons. Here we introduce an open-source, biologically inspired, computationally efficient network model of the somatosensory cortex's granular and supragranular layers after reconstructing the barrel cortex in soma resolution. Comparisons of the network activity to empirical observations showed that the in silico network replicates the known properties of touch representations and the changes in synaptic strength induced in vivo by whisker deprivation. Simulations show that the history of the membrane potential acts as a spatial filter that determines the presynaptic population of neurons contributing to a post-synaptic action potential; this spatial filtering might be critical for the synaptic integration of top-down and bottom-up information.
Affiliation(s)
- Chao Huang
- Department of Biology, University of Leipzig, Leipzig, Germany
- Fleur Zeldenrust
- Donders Institute for Brain, Cognition, and Behaviour, Radboud University, Nijmegen, the Netherlands
- Tansu Celikel
- Donders Institute for Brain, Cognition, and Behaviour, Radboud University, Nijmegen, the Netherlands; School of Psychology, Georgia Institute of Technology, Atlanta, GA, USA
14
Ahmed K. Brain-Inspired Spiking Neural Networks. Biomimetics (Basel) 2021. DOI: 10.5772/intechopen.93435.
Abstract
The brain is a very efficient computing system. It performs highly complex tasks while occupying about two liters of volume and consuming very little energy. Computation is carried out by specialized cells in the brain called neurons, which compute using electrical pulses and exchange information through chemicals called neurotransmitters. With this as inspiration, several compute models exist today that try to exploit the inherent efficiencies demonstrated by nature. The compute models representing spiking neural networks (SNNs) are biologically plausible and are therefore used to study and understand the workings of the brain and nervous system. More importantly, they are used to solve a wide variety of problems in the field of artificial intelligence (AI), and they are uniquely suited to modelling temporal and spatio-temporal data. This chapter explores the fundamental concepts of SNNs, a few of the popular neuron models, how information is represented, learning methodologies, and state-of-the-art platforms for implementing and evaluating SNNs, along with a discussion of their applications and broader role in AI and data networks.
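As a concrete illustration of the simplest of the neuron models such chapters survey, here is a minimal leaky integrate-and-fire simulation. This is an editorial sketch, not code from the chapter; all parameter values are generic textbook choices.

```python
import numpy as np

def simulate_lif(i_input, dt=0.1, tau=10.0, v_rest=-65.0,
                 v_thresh=-50.0, v_reset=-65.0, r_m=10.0):
    """Euler integration of a leaky integrate-and-fire neuron.

    i_input: array of input current (nA), one value per time step of dt (ms).
    Returns the voltage trace (mV) and the indices of spike times.
    """
    v = np.full(len(i_input), v_rest)
    spikes = []
    for t in range(1, len(i_input)):
        dv = (-(v[t - 1] - v_rest) + r_m * i_input[t - 1]) / tau
        v[t] = v[t - 1] + dt * dv
        if v[t] >= v_thresh:  # threshold crossing: emit a spike, then reset
            spikes.append(t)
            v[t] = v_reset
    return v, spikes

v, spikes = simulate_lif(np.full(2000, 2.0))  # 200 ms of constant 2 nA drive
```

With this drive the steady-state voltage (-45 mV) sits above threshold, so the model fires regularly; drive whose steady state stays below threshold produces no spikes at all.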
15
An Integrate-and-Fire Spiking Neural Network Model Simulating Artificially Induced Cortical Plasticity. eNeuro 2021; 8:ENEURO.0333-20.2021. PMID: 33632810; PMCID: PMC7986529; DOI: 10.1523/eneuro.0333-20.2021.
Abstract
We describe an integrate-and-fire (IF) spiking neural network that incorporates spike-timing-dependent plasticity (STDP) and simulates the experimental outcomes of four different conditioning protocols that produce cortical plasticity. The original conditioning experiments were performed in freely moving non-human primates (NHPs) with an autonomous head-fixed bidirectional brain-computer interface (BCI). Three protocols involved closed-loop stimulation triggered from (1) spike activity of single cortical neurons, (2) electromyographic (EMG) activity from forearm muscles, and (3) cycles of spontaneous cortical beta activity. A fourth protocol involved open-loop delivery of pairs of stimuli at neighboring cortical sites. The IF network that replicates the experimental results consists of 360 units whose simulated membrane potentials are driven by synaptic inputs and trigger a spike on reaching threshold. The 240 cortical units produce either excitatory or inhibitory postsynaptic potentials (PSPs) in their target units. In addition to the experimentally observed conditioning effects, the model also allows computation of underlying network behavior not originally documented. Furthermore, the model makes predictions about outcomes from protocols not yet investigated, including spike-triggered inhibition, γ-triggered stimulation, and disynaptic conditioning. The success of the simulations suggests that a simple voltage-based IF model incorporating STDP can capture the essential mechanisms mediating targeted plasticity with closed-loop stimulation.
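The pair-based exponential STDP rule at the heart of models like this one can be sketched as follows. This is an illustrative reconstruction with generic parameters, not the authors' code.

```python
import numpy as np

def stdp_weight_change(pre_times, post_times, a_plus=0.01, a_minus=0.012,
                       tau_plus=20.0, tau_minus=20.0):
    """Net weight change over all pre/post spike pairs under the classic
    exponential STDP rule: potentiation when the presynaptic spike precedes
    the postsynaptic one, depression when the order is reversed.
    Times are in ms; the window decays with tau_plus / tau_minus."""
    dw = 0.0
    for t_pre in pre_times:
        for t_post in post_times:
            delta = t_post - t_pre
            if delta > 0:
                dw += a_plus * np.exp(-delta / tau_plus)
            elif delta < 0:
                dw -= a_minus * np.exp(delta / tau_minus)
    return dw

dw = stdp_weight_change([0.0], [10.0])  # pre leads post by 10 ms
```

Here the pre-before-post pairing yields a positive (potentiating) change; reversing the spike order yields depression, which is what closed-loop stimulation protocols exploit to steer connectivity.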
16
Fehrman C, Robbins TD, Meliza CD. Nonlinear effects of intrinsic dynamics on temporal encoding in a model of avian auditory cortex. PLoS Comput Biol 2021; 17:e1008768. PMID: 33617539; PMCID: PMC7932506; DOI: 10.1371/journal.pcbi.1008768.
Abstract
Neurons exhibit diverse intrinsic dynamics, which govern how they integrate synaptic inputs to produce spikes. Intrinsic dynamics are often plastic during development and learning, but the effects of these changes on stimulus encoding properties are not well known. To examine this relationship, we simulated auditory responses to zebra finch song using a linear-dynamical cascade model, which combines a linear spectrotemporal receptive field with a dynamical, conductance-based neuron model, then used generalized linear models to estimate encoding properties from the resulting spike trains. We focused on the effects of a low-threshold potassium current (KLT) that is present in a subset of cells in the zebra finch caudal mesopallium and is affected by early auditory experience. We found that KLT affects both spike adaptation and the temporal filtering properties of the receptive field. The direction of the effects depended on the temporal modulation tuning of the linear (input) stage of the cascade model, indicating a strongly nonlinear relationship. These results suggest that small changes in intrinsic dynamics in tandem with differences in synaptic connectivity can have dramatic effects on the tuning of auditory neurons. Experience-dependent developmental plasticity involves changes not only to synaptic connections, but to voltage-gated currents as well. Using biophysical models, it is straightforward to predict the effects of this intrinsic plasticity on the firing patterns of individual neurons, but it remains difficult to understand the consequences for sensory coding. We investigated this in the context of the zebra finch auditory cortex, where early exposure to a complex acoustic environment causes increased expression of a low-threshold potassium current. We simulated responses to song using a detailed biophysical model and then characterized encoding properties using generalized linear models. 
This analysis revealed that this potassium current has strong, nonlinear effects on how the model encodes the song's temporal structure, and that the sign of these effects depends on the temporal tuning of the synaptic inputs. This nonlinearity gives intrinsic plasticity broad scope as a mechanism for developmental learning in the auditory system.
Affiliation(s)
- Christof Fehrman
- Psychology Department, University of Virginia, Charlottesville, Virginia, United States of America
- Tyler D. Robbins
- Cognitive Science Program, University of Virginia, Charlottesville, Virginia, United States of America
- C. Daniel Meliza
- Psychology Department, University of Virginia, Charlottesville, Virginia, United States of America
- Neuroscience Graduate Program, University of Virginia, Charlottesville, Virginia, United States of America
17
Van Pottelbergh T, Drion G, Sepulchre R. From Biophysical to Integrate-and-Fire Modeling. Neural Comput 2021; 33:563-589. PMID: 33400899; DOI: 10.1162/neco_a_01353.
Abstract
This article proposes a methodology to extract a low-dimensional integrate-and-fire model from an arbitrarily detailed single-compartment biophysical model. The method aims at relating the modulation of maximal conductance parameters in the biophysical model to the modulation of parameters in the proposed integrate-and-fire model. The approach is illustrated on two well-documented examples of cellular neuromodulation: the transition between type I and type II excitability and the transition between spiking and bursting.
Affiliation(s)
- Guillaume Drion
- Department of Electrical Engineering and Computer Science, University of Liège, 4000 Liège, Belgium
- Rodolphe Sepulchre
- Department of Engineering, University of Cambridge, Cambridge CB2 1PZ, U.K.
18
Whitwell HJ, Bacalini MG, Blyuss O, Chen S, Garagnani P, Gordleeva SY, Jalan S, Ivanchenko M, Kanakov O, Kustikova V, Mariño IP, Meyerov I, Ullner E, Franceschi C, Zaikin A. The Human Body as a Super Network: Digital Methods to Analyze the Propagation of Aging. Front Aging Neurosci 2020; 12:136. PMID: 32523526; PMCID: PMC7261843; DOI: 10.3389/fnagi.2020.00136.
Abstract
Biological aging is a complex phenomenon involving multiple biological processes. These can be understood theoretically through considering them as individual networks, e.g., epigenetic networks, cell-cell networks (such as astroglial networks), and population genetics. Mathematical modeling allows such networks to be combined and studied in unison, to better understand how the so-called "seven pillars of aging" interact and to generate hypotheses for treating aging as a condition at relatively early biological ages. In this review, we consider how recent progress in mathematical modeling can be utilized to investigate aging, particularly, but not exclusively, in the context of degenerative neuronal disease. We also consider how the latest techniques for generating biomarker models for disease prediction, such as longitudinal analysis and parenclitic analysis, can be applied both as biomarker platforms for aging and as means to better understand this inescapable condition. This review is written by a highly diverse, multi-disciplinary team of scientists from across the globe and calls for greater collaboration between diverse fields of research.
Affiliation(s)
- Harry J Whitwell
- Department of Chemical Engineering, Imperial College London, London, United Kingdom
- Oleg Blyuss
- School of Physics, Astronomy and Mathematics, University of Hertfordshire, Hatfield, United Kingdom; Department of Paediatrics and Paediatric Infectious Diseases, Sechenov First Moscow State Medical University (Sechenov University), Moscow, Russia
- Shangbin Chen
- Britton Chance Centre for Biomedical Photonics, Wuhan National Laboratory for Optoelectronics-Huazhong University of Science and Technology, Wuhan, China
- Paolo Garagnani
- Department of Experimental, Diagnostic and Specialty Medicine (DIMES), University of Bologna, Bologna, Italy
- Susan Yu Gordleeva
- Laboratory of Systems Medicine of Healthy Aging, Lobachevsky State University of Nizhny Novgorod, Nizhny Novgorod, Russia
- Sarika Jalan
- Complex Systems Laboratory, Discipline of Physics, Indian Institute of Technology Indore, Indore, India; Centre for Bio-Science and Bio-Medical Engineering, Indian Institute of Technology Indore, Indore, India
- Mikhail Ivanchenko
- Institute of Information Technologies, Mathematics and Mechanics, Lobachevsky State University of Nizhny Novgorod, Nizhny Novgorod, Russia
- Oleg Kanakov
- Laboratory of Systems Medicine of Healthy Aging, Lobachevsky State University of Nizhny Novgorod, Nizhny Novgorod, Russia
- Valentina Kustikova
- Institute of Information Technologies, Mathematics and Mechanics, Lobachevsky State University of Nizhny Novgorod, Nizhny Novgorod, Russia
- Ines P Mariño
- Department of Biology and Geology, Physics and Inorganic Chemistry, Universidad Rey Juan Carlos, Madrid, Spain
- Iosif Meyerov
- Institute of Information Technologies, Mathematics and Mechanics, Lobachevsky State University of Nizhny Novgorod, Nizhny Novgorod, Russia
- Ekkehard Ullner
- Department of Physics (SUPA), Institute for Complex Systems and Mathematical Biology, University of Aberdeen, Aberdeen, United Kingdom
- Claudio Franceschi
- Laboratory of Systems Medicine of Healthy Aging, Lobachevsky State University of Nizhny Novgorod, Nizhny Novgorod, Russia; Institute of Information Technologies, Mathematics and Mechanics, Lobachevsky State University of Nizhny Novgorod, Nizhny Novgorod, Russia
- Alexey Zaikin
- Department of Paediatrics and Paediatric Infectious Diseases, Sechenov First Moscow State Medical University (Sechenov University), Moscow, Russia; Institute of Information Technologies, Mathematics and Mechanics, Lobachevsky State University of Nizhny Novgorod, Nizhny Novgorod, Russia; Department of Mathematics, Institute for Women's Health, University College London, London, United Kingdom
19
Matzner A, Gorodetski L, Korngreen A, Bar-Gad I. Dynamic input-dependent encoding of individual basal ganglia neurons. Sci Rep 2020; 10:5833. PMID: 32242059; PMCID: PMC7118110; DOI: 10.1038/s41598-020-62750-0.
Abstract
Computational models are crucial to studying the encoding of individual neurons. Static models are composed of a fixed set of parameters, thus resulting in static encoding properties that do not change under different inputs. Here, we challenge this basic concept that underlies these models. Using generalized linear models, we quantify the encoding and information processing properties of basal ganglia neurons recorded in vitro. These properties are highly sensitive to the internal state of the neuron due to factors such as dependency on the baseline firing rate. Verification of these experimental results with simulations provides insights into the mechanisms underlying this input-dependent encoding. Thus, static models, which are not context dependent, represent only part of the neuronal encoding capabilities and are not sufficient to represent the dynamics of a neuron over varying inputs. Input-dependent encoding is crucial for expanding our understanding of neuronal behavior in health and disease and underscores the need for a new generation of dynamic neuronal models.
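A toy version of the Poisson-GLM encoding approach: for a single binary stimulus covariate the maximum-likelihood estimate has a closed form, so the fit reduces to two sample means. The synthetic data, rates, and variable names here are invented for illustration, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic neuron: 0.2 expected spikes/bin at baseline, 0.8 with stimulus on
n = 20000
x = rng.integers(0, 2, size=n)      # binary stimulus per time bin
rate = np.where(x == 1, 0.8, 0.2)   # expected spike count per bin
y = rng.poisson(rate)               # observed spike counts

# Poisson GLM: log rate = b + w*x. With one binary covariate the MLE is
# b = log(mean count | x=0) and w = the log rate ratio between conditions.
b_hat = np.log(y[x == 0].mean())
w_hat = np.log(y[x == 1].mean()) - b_hat
```

Recovering `w_hat` close to log(0.8/0.2) ≈ 1.39 is the GLM's quantification of how strongly the stimulus modulates firing; with richer covariates the same likelihood is maximized numerically instead.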
Affiliation(s)
- Ayala Matzner
- The Leslie & Susan Goldschmied (Gonda) Multidisciplinary Brain Research Center, Bar-Ilan University, Ramat-Gan, Israel
- Lilach Gorodetski
- Goodman Faculty of Life Sciences, Bar-Ilan University, Ramat-Gan, Israel
- Alon Korngreen
- The Leslie & Susan Goldschmied (Gonda) Multidisciplinary Brain Research Center, Bar-Ilan University, Ramat-Gan, Israel; Goodman Faculty of Life Sciences, Bar-Ilan University, Ramat-Gan, Israel
- Izhar Bar-Gad
- The Leslie & Susan Goldschmied (Gonda) Multidisciplinary Brain Research Center, Bar-Ilan University, Ramat-Gan, Israel
20
Amsalem O, Eyal G, Rogozinski N, Gevaert M, Kumbhar P, Schürmann F, Segev I. An efficient analytical reduction of detailed nonlinear neuron models. Nat Commun 2020; 11:288. PMID: 31941884; PMCID: PMC6962154; DOI: 10.1038/s41467-019-13932-6.
Abstract
Detailed conductance-based nonlinear neuron models consisting of thousands of synapses are key for understanding the computational properties of single neurons and large neuronal networks, and for interpreting experimental results. Simulations of these models are computationally expensive, considerably curtailing their utility. Neuron_Reduce is a new analytical approach to reduce the morphological complexity and computational time of nonlinear neuron models. Synapses and active membrane channels are mapped to the reduced model preserving their transfer impedance to the soma; synapses with identical transfer impedance are merged into one NEURON process while still retaining their individual activation times. Neuron_Reduce accelerates the simulations 40- to 250-fold for a variety of cell types and realistic numbers (10,000-100,000) of synapses while closely replicating voltage dynamics and specific dendritic computations. The reduced neuron models will enable realistic simulations of neural networks at unprecedented scale, including networks emerging from micro-connectomics efforts and biologically inspired "deep networks". Neuron_Reduce is publicly available and is straightforward to implement.
Affiliation(s)
- Oren Amsalem
- Department of Neurobiology, Hebrew University of Jerusalem, 9190401, Jerusalem, Israel.
- Guy Eyal
- Department of Neurobiology, Hebrew University of Jerusalem, 9190401, Jerusalem, Israel
- Noa Rogozinski
- Department of Neurobiology, Hebrew University of Jerusalem, 9190401, Jerusalem, Israel
- Michael Gevaert
- Blue Brain Project, École polytechnique fédérale de Lausanne (EPFL), Campus Biotech, 1202, Geneva, Switzerland
- Pramod Kumbhar
- Blue Brain Project, École polytechnique fédérale de Lausanne (EPFL), Campus Biotech, 1202, Geneva, Switzerland
- Felix Schürmann
- Blue Brain Project, École polytechnique fédérale de Lausanne (EPFL), Campus Biotech, 1202, Geneva, Switzerland
- Idan Segev
- Department of Neurobiology, Hebrew University of Jerusalem, 9190401, Jerusalem, Israel
- Department of Neurobiology, Hebrew University of Jerusalem, 9190401, Jerusalem, Israel
- Edmond and Lily Safra Center for Brain Sciences, Hebrew University of Jerusalem, 9190401, Jerusalem, Israel
21
Schmidt H, Knösche TR. Action potential propagation and synchronisation in myelinated axons. PLoS Comput Biol 2019; 15:e1007004. PMID: 31622338; PMCID: PMC6818808; DOI: 10.1371/journal.pcbi.1007004.
Abstract
With the advent of advanced MRI techniques it has become possible to study axonal white matter non-invasively and in great detail. Measuring the various parameters of the long-range connections of the brain opens up the possibility to build and refine detailed models of large-scale neuronal activity. One particular challenge is to find a mathematical description of action potential propagation that is sufficiently simple, yet still biologically plausible to model signal transmission across entire axonal fibre bundles. We develop a mathematical framework in which we replace the Hodgkin-Huxley dynamics by a spike-diffuse-spike model with passive sub-threshold dynamics and explicit, threshold-activated ion channel currents. This allows us to study in detail the influence of the various model parameters on the action potential velocity and on the entrainment of action potentials between ephaptically coupled fibres without having to resort to numerical simulations. Specifically, we recover known results regarding the influence of axon diameter, node of Ranvier length and internode length on the velocity of action potentials. Additionally, we find that the velocity depends more strongly on the thickness of the myelin sheath than was suggested by previous theoretical studies. We further explain the slowing down and synchronisation of action potentials in ephaptically coupled fibres by their dynamic interaction. In summary, this study presents a solution to incorporate detailed axonal parameters into a whole-brain modelling framework. With more and more data becoming available on white-matter tracts, the need arises to develop modelling frameworks that incorporate these data at the whole-brain level. This requires the development of efficient mathematical schemes to study parameter dependencies that can then be matched with data, in particular the speed of action potentials that cause delays between brain regions.
Here, we develop a method that describes the formation of action potentials by threshold activated currents, often referred to as spike-diffuse-spike modelling. A particular focus of our study is the dependence of the speed of action potentials on structural parameters. We find that the diameter of axons and the thickness of the myelin sheath have a strong influence on the speed, whereas the length of myelinated segments and node of Ranvier length have a lesser effect. In addition to examining single axons, we demonstrate that action potentials between nearby axons can synchronise and slow down their propagation speed.
Affiliation(s)
- Helmut Schmidt
- Max Planck Institute for Human Cognitive and Brain Sciences, Leipzig, Germany
- Thomas R. Knösche
- Max Planck Institute for Human Cognitive and Brain Sciences, Leipzig, Germany
- Institute of Biomedical Engineering and Informatics, Ilmenau University of Technology, Ilmenau, Germany
22
Chartrand T, Goldman MS, Lewis TJ. Synchronization of Electrically Coupled Resonate-and-Fire Neurons. SIAM Journal on Applied Dynamical Systems 2019; 18:1643-1693. PMID: 33273894; PMCID: PMC7709966; DOI: 10.1137/18m1197412.
Abstract
Electrical coupling between neurons is broadly present across brain areas and is typically assumed to synchronize network activity. However, intrinsic properties of the coupled cells can complicate this simple picture. Many cell types with electrical coupling show a diversity of post-spike subthreshold fluctuations, often linked to subthreshold resonance, which are transmitted through electrical synapses in addition to action potentials. Using the theory of weakly coupled oscillators, we explore the effect of both subthreshold and spike-mediated coupling on synchrony in small networks of electrically coupled resonate-and-fire neurons, a hybrid neuron model with damped subthreshold oscillations and a range of post-spike voltage dynamics. We calculate the phase response curve using an extension of the adjoint method that accounts for the discontinuous post-spike reset rule. We find that both spikes and subthreshold fluctuations can jointly promote synchronization. The subthreshold contribution is strongest when the voltage exhibits a significant post-spike elevation in voltage, or plateau potential. Additionally, we show that the geometry of trajectories approaching the spiking threshold causes a "reset-induced shear" effect that can oppose synchrony in the presence of network asymmetry, despite having no effect on the phase-locking of symmetrically coupled pairs.
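The resonate-and-fire model discussed here combines damped subthreshold oscillations with a threshold-and-reset rule; a minimal sketch follows (illustrative parameters and state variables, not the paper's implementation, which also includes the coupling analysis):

```python
import numpy as np

def resonate_and_fire(i_input, dt=0.1, b=-0.1, w=0.25,
                      thresh=1.0, reset=(0.0, -0.5)):
    """Resonate-and-fire dynamics: the two-dimensional state (x, y)
    spirals toward rest (damped oscillation with eigenvalues b ± i*w);
    a spike is fired when y crosses `thresh`, after which the state
    jumps to `reset`, shaping the post-spike subthreshold fluctuation."""
    x, y = 0.0, 0.0
    trace, spikes = [], []
    for t, i in enumerate(i_input):
        dx = b * x - w * y
        dy = w * x + b * y + i
        x, y = x + dt * dx, y + dt * dy
        if y >= thresh:
            spikes.append(t)
            x, y = reset
        trace.append(y)
    return np.array(trace), spikes

i = np.zeros(1000)
i[:50] = 2.0                       # a brief depolarizing pulse
trace, spikes = resonate_and_fire(i)
```

The choice of `reset` controls whether the voltage shows a post-spike plateau or undershoot, which is exactly the feature the abstract identifies as governing the subthreshold contribution to synchrony.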
Affiliation(s)
- Thomas Chartrand
- Graduate Group in Applied Mathematics, University of California-Davis, Davis, CA 95616. Current address: Allen Institute for Brain Science, Seattle, WA
- Mark S Goldman
- Center for Neuroscience, Department of Neurobiology, Physiology and Behavior, Department of Ophthalmology and Vision Science, and Graduate Group in Applied Mathematics, University of California-Davis, Davis, CA 95616
- Timothy J Lewis
- Department of Mathematics and Graduate Group in Applied Mathematics, University of California-Davis, Davis, CA 95616
23
MacGregor DJ, Leng G. Emergent decision-making behaviour and rhythm generation in a computational model of the ventromedial nucleus of the hypothalamus. PLoS Comput Biol 2019; 15:e1007092. PMID: 31158265; PMCID: PMC6564049; DOI: 10.1371/journal.pcbi.1007092.
Abstract
The ventromedial nucleus of the hypothalamus (VMN) has an important role in diverse behaviours. The common involvement in these of sex steroids, nutritionally-related signals, and emotional inputs from other brain areas, suggests that, at any given time, its output is in one of a discrete number of possible states corresponding to discrete motivational drives. Here we explored how networks of VMN neurons might generate such a decision-making architecture. We began with minimalist assumptions about the intrinsic properties of VMN neurons inferred from electrophysiological recordings of these neurons in rats in vivo, using an integrate-and-fire based model modified to simulate activity-dependent post-spike changes in neuronal excitability. We used a genetic algorithm based method to fit model parameters to the statistical features of spike patterning in each cell. The spike patterns in both recorded cells and model cells were assessed by analysis of interspike interval distributions and of the index of dispersion of firing rate over different binwidths. Simpler patterned cells could be closely matched by single neuron models incorporating a hyperpolarising afterpotential and either a slow afterhyperpolarisation or a depolarising afterpotential, but many others could not. We then constructed network models with the challenge of explaining the more complex patterns. We assumed that neurons of a given type (with heterogeneity introduced by independently random patterns of external input) were mutually interconnected at random by excitatory synaptic connections (with a variable delay and a random chance of failure). Simple network models of one or two cell types were able to explain the more complex patterns. We then explored the information processing features of such networks that might be relevant for a decision-making network. 
We concluded that rhythm generation (in the slow theta range) and bistability arise as emergent properties of networks of heterogeneous VMN neurons.
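The modified integrate-and-fire mechanism described above (activity-dependent post-spike changes in excitability) can be sketched roughly as below. The HAP/DAP magnitudes and time constants are placeholders for illustration, not fitted values from the paper.

```python
import numpy as np

def if_with_afterpotentials(i_input, dt=1.0, tau=20.0, thresh=1.0,
                            hap=-2.0, tau_hap=5.0, dap=0.6, tau_dap=30.0):
    """Integrate-and-fire unit with post-spike excitability changes:
    each spike adds a fast hyperpolarising afterpotential (h) and a
    slower depolarising afterpotential (d) to the effective membrane
    variable, so recent spiking shapes the timing of the next spike."""
    v = 0.0
    h = d = 0.0  # afterpotential state variables, decaying exponentially
    spikes = []
    for t, i in enumerate(i_input):
        h *= np.exp(-dt / tau_hap)
        d *= np.exp(-dt / tau_dap)
        v += dt * (-v / tau + i)
        if v + h + d >= thresh:
            spikes.append(t)
            v = 0.0
            h += hap   # suppresses firing immediately after a spike
            d += dap   # transiently raises excitability later
    return spikes

spikes = if_with_afterpotentials(np.full(1000, 0.1))
```

With the HAP dominating shortly after each spike and the DAP later, interspike-interval distributions acquire the refractory-then-facilitated structure that the authors fit with their genetic-algorithm procedure.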
Affiliation(s)
- Duncan J. MacGregor
- Centre for Discovery Brain Sciences, University of Edinburgh, Edinburgh, United Kingdom
- Gareth Leng
- Centre for Discovery Brain Sciences, University of Edinburgh, Edinburgh, United Kingdom
24
Ocker GK, Doiron B. Training and Spontaneous Reinforcement of Neuronal Assemblies by Spike Timing Plasticity. Cereb Cortex 2019; 29:937-951. PMID: 29415191; PMCID: PMC7963120; DOI: 10.1093/cercor/bhy001.
Abstract
The synaptic connectivity of cortex is plastic, with experience shaping the ongoing interactions between neurons. Theoretical studies of spike timing-dependent plasticity (STDP) have focused on either just pairs of neurons or large-scale simulations. A simple analytic account for how fast spike time correlations affect both microscopic and macroscopic network structure is lacking. We develop a low-dimensional mean field theory for STDP in recurrent networks and show the emergence of assemblies of strongly coupled neurons with shared stimulus preferences. After training, this connectivity is actively reinforced by spike train correlations during the spontaneous dynamics. Furthermore, the stimulus coding by cell assemblies is actively maintained by these internally generated spiking correlations, suggesting a new role for noise correlations in neural coding. Assembly formation has often been associated with firing rate-based plasticity schemes; our theory provides an alternative and complementary framework, where fine temporal correlations and STDP form and actively maintain learned structure in cortical networks.
Affiliation(s)
- Gabriel Koch Ocker
- Department of Neuroscience, University of Pittsburgh, Pittsburgh, PA, USA
- Center for the Neural Basis of Cognition, University of Pittsburgh and Carnegie Mellon University, Pittsburgh, PA, USA
- Allen Institute for Brain Science, Seattle, WA, USA
- Brent Doiron
- Center for the Neural Basis of Cognition, University of Pittsburgh and Carnegie Mellon University, Pittsburgh, PA, USA
- Department of Mathematics, University of Pittsburgh, Pittsburgh, PA, USA
25
Bjoring MC, Meliza CD. A low-threshold potassium current enhances sparseness and reliability in a model of avian auditory cortex. PLoS Comput Biol 2019; 15:e1006723. PMID: 30689626; PMCID: PMC6366721; DOI: 10.1371/journal.pcbi.1006723.
Abstract
Birdsong is a complex vocal communication signal, and like humans, birds need to discriminate between similar sequences of sound with different meanings. The caudal mesopallium (CM) is a cortical-level auditory area implicated in song discrimination. CM neurons respond sparsely to conspecific song and are tolerant of production variability. Intracellular recordings in CM have identified a diversity of intrinsic membrane dynamics, which could contribute to the emergence of these higher-order functional properties. We investigated this hypothesis using a novel linear-dynamical cascade model that incorporated detailed biophysical dynamics to simulate auditory responses to birdsong. Neuron models that included a low-threshold potassium current present in a subset of CM neurons showed increased selectivity and coding efficiency relative to models without this current. These results demonstrate the impact of intrinsic dynamics on sensory coding and the importance of including the biophysical characteristics of neural populations in simulation studies. Maintaining a stable mental representation of an object is an important task for sensory systems, requiring both recognizing the features required for identification and ignoring incidental changes in its presentation. The prevailing explanation for these processes emphasizes precise sets of connections between neurons that capture only the essential features of an object. However, the intrinsic dynamics of the neurons themselves, which determine how these inputs are transformed into spiking outputs, may also contribute to the neural computations underlying object recognition. To understand how intrinsic dynamics contribute to sensory coding, we constructed a computational model capable of simulating a neural response to an auditory stimulus using a detailed description of different intrinsic dynamics in a higher-order avian auditory area. 
The results of our simulation showed that intrinsic dynamics can have a profound effect on processes underlying object recognition. These findings challenge the view that patterns of connectivity alone account for the emergence of stable object representations and encourage greater consideration of the functional implications of the diversity of neurons in the brain.
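The linear-dynamical cascade idea (a linear receptive-field stage feeding a dynamical spiking stage) can be caricatured as below. For brevity the conductance-based stage is replaced here by a simple leaky integrator, and the filter shape and constants are invented, so this is a structural sketch rather than the authors' model.

```python
import numpy as np

def cascade_response(stim, rf, dt=0.1, tau=10.0, thresh=1.0):
    """Linear stage: convolve the stimulus with a temporal receptive
    field (rf) to obtain a driving current. Dynamical stage: a leaky
    integrate-and-fire unit (standing in for the conductance model)
    converts that drive into spike times."""
    drive = np.convolve(stim, rf, mode="full")[:len(stim)]
    v, spikes = 0.0, []
    for t, i in enumerate(drive):
        v += dt * (-v / tau + i)
        if v >= thresh:
            spikes.append(t)
            v = 0.0
    return spikes

rf = np.exp(-np.arange(50) / 10.0)           # exponential temporal filter
spikes = cascade_response(np.ones(500), rf)  # sustained stimulus
```

The point of the cascade is that the intrinsic dynamics of the second stage (here trivially leaky, in the paper conductance-based with KLT) reshape how the filtered stimulus is converted into spikes.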
Affiliation(s)
- Margot C. Bjoring
- Department of Psychology, University of Virginia, Charlottesville, VA, USA
| | - C. Daniel Meliza
- Department of Psychology, University of Virginia, Charlottesville, VA, USA
- Neuroscience Graduate Program, University of Virginia, Charlottesville, VA, USA
- * E-mail:
| |
Collapse
|
26
Clarke SE. Analog Signaling With the "Digital" Molecular Switch CaMKII. Front Comput Neurosci 2018; 12:92. [PMID: 30524260 PMCID: PMC6262075 DOI: 10.3389/fncom.2018.00092] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Received: 07/18/2018] [Accepted: 10/31/2018] [Indexed: 11/13/2022]
Abstract
Molecular switches, such as the protein kinase CaMKII, play a fundamental role in cell signaling by decoding inputs into either high or low states of activity; because the high activation state can be turned on and persist after the input ceases, these switches have earned a reputation as "digital." Although this on/off, binary perspective has been valuable for understanding long timescale synaptic plasticity, accumulating experimental evidence suggests that the CaMKII switch can also control plasticity on short timescales. To investigate this idea further, a non-autonomous, nonlinear ordinary differential equation, representative of a general bistable molecular switch, is analyzed. The results suggest that switch activity in regions surrounding either the high- or low-stable states of activation could act as a reliable analog signal, whose short timescale fluctuations relative to equilibrium track instantaneous input frequency. The model makes intriguing predictions and is validated against previous work demonstrating its suitability as a minimal representation of switch dynamics; in combination with existing experimental evidence, the theory suggests a multiplexed encoding of instantaneous frequency information over short timescales, with integration of total activity over longer timescales.
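The coexistence of digital (bistable) and analog (graded, near-equilibrium) behaviour can be seen in a minimal deterministic sketch of a generic bistable switch. The Hill-type autocatalysis and all rate constants below are invented for illustration and are not the paper's model:

```python
def switch_activity(a0, stimulus=0.0, dt=0.01, steps=5000):
    """Minimal bistable molecular switch: basal activity plus
    cooperative (Hill) self-activation, opposed by linear
    inactivation.  Parameters are illustrative, not CaMKII-specific."""
    a = a0
    for _ in range(steps):
        hill = 0.95 * a**4 / (0.5**4 + a**4)   # cooperative autocatalysis
        a += dt * (0.05 + hill - a + stimulus)
    return a

# Digital aspect: the same dynamics settle into distinct low/high states
# depending on the initial condition.
low = switch_activity(0.1)
high = switch_activity(0.8)
# Analog aspect: a weak sustained input shifts activity around the
# high state without switching the system off.
shifted = switch_activity(0.8, stimulus=0.05)
```

The graded displacement of `shifted` relative to `high` is the sense in which fluctuations around a stable state can track input as an analog signal.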
Affiliation(s)
- Stephen E Clarke
- Department of Bioengineering, Department of Neurosurgery, Stanford University, Stanford, CA, United States
27
D'Onofrio G, Tamborrino M, Lansky P. The Jacobi diffusion process as a neuronal model. Chaos 2018; 28:103119. [PMID: 30384666 DOI: 10.1063/1.5051494] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.5] [Received: 08/09/2018] [Accepted: 10/01/2018] [Indexed: 06/08/2023]
Abstract
The Jacobi process is a stochastic diffusion characterized by a linear drift and a special form of multiplicative noise which keeps the process confined between two boundaries. One example of such a process can be obtained as the diffusion limit of Stein's model of membrane depolarization, which includes both excitatory and inhibitory reversal potentials. The reversal potentials create the two boundaries between which the process is confined. Solving the first-passage-time problem for the Jacobi process, we found closed-form expressions for the mean, variance, and third moment that are easy to implement numerically. The first two moments are used here to determine the role played by the parameters of the neuronal model; namely, the effect of multiplicative noise on the output of the Jacobi neuronal model with input-dependent parameters is examined in detail and compared with the properties of the generic Jacobi diffusion. The dependence of the model parameters on the rate of inhibition turns out to be of primary importance for observing a change in the slope of the response curves. This dependence also affects the variability of the output as reflected by the coefficient of variation, which often takes values larger than one and is not always a monotonic function of the rate of excitation.
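The confinement and first-passage behaviour of such a process can be explored with a direct Euler-Maruyama simulation. The parametrisation below (unit interval, illustrative drift and noise values) is a generic Jacobi diffusion, not the neuronal calibration used in the paper:

```python
import random, math

def jacobi_first_passage(x0=0.2, threshold=0.7, theta=1.0, mu=0.5,
                         sigma=0.4, dt=1e-4, t_max=100.0, rng=None):
    """Euler-Maruyama simulation of a Jacobi diffusion
        dX = -theta*(X - mu) dt + sigma*sqrt(X*(1-X)) dW,
    confined to (0, 1), run until X first crosses `threshold`.
    Returns the first-passage time, or None if t_max is reached.
    Parameter values are illustrative, not those of the paper."""
    rng = rng or random.Random(1)
    x, t = x0, 0.0
    while t < t_max:
        if x >= threshold:
            return t
        dw = rng.gauss(0.0, math.sqrt(dt))
        x += -theta * (x - mu) * dt + sigma * math.sqrt(max(x * (1 - x), 0.0)) * dw
        x = min(max(x, 0.0), 1.0)   # numerical guard at the boundaries
        t += dt
    return None
```

Averaging such first-passage times over many trajectories is the Monte Carlo counterpart of the closed-form moment expressions the paper derives.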
Affiliation(s)
- Giuseppe D'Onofrio
- Institute of Physiology of the Czech Academy of Sciences, Videnska 1083, 14220 Prague 4, Czech Republic
- Massimiliano Tamborrino
- Johannes Kepler University Linz, Institute for Stochastics, Altenbergerstraße 69, 4040 Linz, Austria
- Petr Lansky
- Institute of Physiology of the Czech Academy of Sciences, Videnska 1083, 14220 Prague 4, Czech Republic
28
Baram Y. Circuit Polarity Effect of Cortical Connectivity, Activity, and Memory. Neural Comput 2018; 30:3037-3071. [PMID: 30216139 DOI: 10.1162/neco_a_01128] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.5] [Indexed: 11/04/2022]
Abstract
Experimental constraints have traditionally implied separate studies of different cortical functions, such as memory and sensory-motor control. Yet certain cortical modalities, while repeatedly observed and reported, have not been clearly identified with one cortical function or another. Specifically, while neuronal membrane and synapse polarities with respect to a certain potential value have been attracting considerable interest in recent years, the purposes of such polarities have largely remained a subject for speculation and debate. Formally identifying these polarities as on-off neuronal polarity gates, we analytically show that cortical circuit structure, behavior, and memory are all governed by the combined potent effect of these gates, which we collectively term circuit polarity. Employing widely accepted and biologically validated firing rate and plasticity paradigms, we show that circuit polarity is mathematically embedded in the corresponding models. Moreover, we show that the firing rate dynamics implied by these models are driven by ongoing circuit polarity gating dynamics. Furthermore, circuit polarity is shown to segregate cortical circuits into internally synchronous, externally asynchronous subcircuits, defining their firing rate modes in accordance with different cortical tasks. In contrast to the Hebbian paradigm, which is shown to be susceptible to mutual neuronal interference in the face of asynchrony, circuit polarity is shown to block such interference. Noting convergence of synaptic weights, we show that circuit polarity holds the key to cortical memory, having a segregated capacity linear in the number of neurons. While memory concealment is implied by complete neuronal silencing, memory is restored by reactivating the original circuit polarity. Finally, we show that incomplete deterioration or restoration of circuit polarity results in memory modification, which may be associated with partial or false recall, or novel innovation.
Affiliation(s)
- Yoram Baram
- Computer Science Department, Technion-Israel Institute of Technology, Haifa 32000, Israel
29
Ullner E, Politi A, Torcini A. Ubiquity of collective irregular dynamics in balanced networks of spiking neurons. Chaos 2018; 28:081106. [PMID: 30180628 DOI: 10.1063/1.5049902] [Citation(s) in RCA: 6] [Impact Index Per Article: 1.0] [Received: 07/26/2018] [Accepted: 08/09/2018] [Indexed: 06/08/2023]
Abstract
We revisit the dynamics of a prototypical model of balanced activity in networks of spiking neurons. A detailed investigation of the thermodynamic limit for fixed density of connections (massive coupling) shows that, when inhibition prevails, the asymptotic regime is not asynchronous but rather characterized by a self-sustained irregular, macroscopic (collective) dynamics. So long as the connectivity is massive, this regime is found in many different setups: leaky as well as quadratic integrate-and-fire neurons; large and small coupling strength; and weak and strong external currents.
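A toy version of a balanced spiking network can be sketched as follows. The network here is small and sparsely coupled with invented parameters, far from the massive-coupling thermodynamic limit the paper analyses, but it shows the basic ingredients: dominant inhibition and suprathreshold external drive:

```python
import random

def balanced_lif_network(n=100, k=20, t_end=0.5, dt=0.001, rng=None):
    """Sparse network of leaky integrate-and-fire neurons in a balanced
    regime: 80% excitatory, 20% inhibitory, each neuron driven above
    threshold externally while strong inhibition roughly cancels the
    recurrent excitation.  Illustrative parameters, not the paper's
    massive-coupling setup.  Returns per-neuron spike counts."""
    rng = rng or random.Random(7)
    n_exc = int(0.8 * n)
    pre = [rng.sample(range(n), k) for _ in range(n)]   # presynaptic partners
    j_exc, j_inh = 0.4, -2.0       # PSP sizes (mV); inhibition dominates
    tau_m, thresh, i_ext = 0.02, 10.0, 12.0
    v = [rng.uniform(0.0, thresh) for _ in range(n)]
    counts = [0] * n
    for _ in range(int(t_end / dt)):
        fired = {i for i in range(n) if v[i] >= thresh}
        for i in fired:
            counts[i] += 1
            v[i] = 0.0             # reset after a spike
        for i in range(n):
            syn = sum(j_exc if p < n_exc else j_inh
                      for p in pre[i] if p in fired)
            v[i] += dt * (i_ext - v[i]) / tau_m + syn
    return counts
```

The uneven per-neuron spike counts reflect the irregular activity that the balance of excitation and inhibition produces even in this deterministic toy network.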
Affiliation(s)
- Ekkehard Ullner
- Institute for Complex Systems and Mathematical Biology and Department of Physics (SUPA), Old Aberdeen, Aberdeen AB24 3UE, United Kingdom
- Antonio Politi
- Institute for Complex Systems and Mathematical Biology and Department of Physics (SUPA), Old Aberdeen, Aberdeen AB24 3UE, United Kingdom
- Alessandro Torcini
- Max Planck Institut für Physik komplexer Systeme, Nöthnitzer Str. 38, 01187 Dresden, Germany
30
Verzi SJ, Rothganger F, Parekh OD, Quach TT, Miner NE, Vineyard CM, James CD, Aimone JB. Computing with Spikes: The Advantage of Fine-Grained Timing. Neural Comput 2018; 30:2660-2690. [PMID: 30021083 DOI: 10.1162/neco_a_01113] [Citation(s) in RCA: 13] [Impact Index Per Article: 2.2] [Indexed: 11/04/2022]
Abstract
Neural-inspired spike-based computing machines often claim to achieve considerable advantages in terms of energy and time efficiency by using spikes for computation and communication. However, fundamental questions about spike-based computation remain unanswered. For instance, how much advantage do spike-based approaches have over conventional methods, and under what circumstances does spike-based computing provide a comparative advantage? Simply implementing existing algorithms using spikes as the medium of computation and communication is not guaranteed to yield an advantage. Here, we demonstrate that spike-based communication and computation within algorithms can increase throughput, and they can decrease energy cost in some cases. We present several spiking algorithms, including sorting a set of numbers in ascending/descending order, as well as finding the maximum or minimum or median of a set of numbers. We also provide an example application: a spiking median-filtering approach for image processing providing a low-energy, parallel implementation. The algorithms and analyses presented here demonstrate that spiking algorithms can provide performance advantages and offer efficient computation of fundamental operations useful in more complex algorithms.
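A flavour of how spike timing can carry out such computations is a time-to-first-spike sort: units charge at a rate proportional to their value, so larger values reach threshold and "spike" earlier, and the readout order is a descending sort. This is a simplified discrete-time sketch in the spirit of the paper's sorting algorithms, not their implementation:

```python
def spike_sort(values):
    """Time-to-first-spike sort: each positive value charges an
    integrate-and-fire unit at a rate proportional to the value, so
    larger values cross threshold sooner; reading out spikes in time
    order yields a descending sort (reverse it for ascending).
    Simplified sketch, assumes strictly positive inputs."""
    assert all(v > 0 for v in values)
    threshold = 1.0
    potentials = {i: 0.0 for i in range(len(values))}
    order = []
    while potentials:
        fired = []
        for i in list(potentials):
            potentials[i] += values[i]          # one time step of charging
            if potentials[i] >= threshold:
                fired.append(i)
        # simultaneous spikes are ordered by membrane potential,
        # which within a step is proportional to the value
        for i in sorted(fired, key=lambda i: (-potentials[i], i)):
            order.append(values[i])
            del potentials[i]
    return order
```

Here the "computation" is carried entirely by spike times; no comparisons between values are ever made explicitly.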
Affiliation(s)
- Stephen J Verzi
- Energy, Earth and Complex Systems Center, Sandia National Laboratories, NM 87185-1138, U.S.A.
- Fredrick Rothganger
- Center for Computing Research, Sandia National Laboratories, NM 87185-1326, U.S.A.
- Ojas D Parekh
- Center for Computing Research, Sandia National Laboratories, NM 87185-1326, U.S.A.
- Tu-Thach Quach
- Threat Intelligence Center, Sandia National Laboratories, NM 87185-1248, U.S.A.
- Nadine E Miner
- System Mission Engineering Center, Sandia National Laboratories, NM 87185-9405, U.S.A.
- Craig M Vineyard
- Center for Computing Research, Sandia National Laboratories, NM 87185-1327, U.S.A.
- Conrad D James
- Microsystems Science, Technology and Components Center, Sandia National Laboratories, NM 87185-1425, U.S.A.
- James B Aimone
- Center for Computing Research, Sandia National Laboratories, NM 87185-1327, U.S.A.
31
Bird AD, Richardson MJE. Transmission of temporally correlated spike trains through synapses with short-term depression. PLoS Comput Biol 2018; 14:e1006232. [PMID: 29933363 PMCID: PMC6039054 DOI: 10.1371/journal.pcbi.1006232] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.5] [Received: 08/15/2017] [Revised: 07/10/2018] [Accepted: 05/24/2018] [Indexed: 11/18/2022]
Abstract
Short-term synaptic depression, caused by depletion of releasable neurotransmitter, modulates the strength of neuronal connections in a history-dependent manner. Quantifying the statistics of synaptic transmission requires stochastic models that link probabilistic neurotransmitter release with presynaptic spike-train statistics. Common approaches are to model the presynaptic spike train as either regular or a memory-less Poisson process: few analytical results are available that describe depressing synapses when the afferent spike train has more complex, temporally correlated statistics such as bursts. Here we present a series of analytical results—from vesicle release-site occupancy statistics, via neurotransmitter release, to the post-synaptic voltage mean and variance—for depressing synapses driven by correlated presynaptic spike trains. The class of presynaptic drive considered is that fully characterised by the inter-spike-interval distribution and encompasses a broad range of models used for neuronal circuit and network analyses, such as integrate-and-fire models with a complete post-spike reset and receiving sufficiently short-time correlated drive. We further demonstrate that the derived post-synaptic voltage mean and variance allow for a simple and accurate approximation of the firing rate of the post-synaptic neuron, using the exponential integrate-and-fire model as an example. These results extend the level of biological detail included in models of synaptic transmission and will allow for the incorporation of more complex and physiologically relevant firing patterns into future studies of neuronal networks. Synapses between neurons transmit signals with strengths that vary with the history of their activity, over scales from milliseconds to decades. Short-term changes in synaptic strength modulate and sculpt ongoing neuronal activity, whereas long-term changes underpin memory formation. 
Here we focus on changes of strength over timescales of less than a second caused by transitory depletion of the neurotransmitters that carry signals across the synapse. Neurotransmitters are stored in small vesicles that release their contents, with a certain probability, when the presynaptic neuron is active. Once a vesicle has been used, it is replenished after a variable delay. There is therefore a complex interaction between the pattern of incoming signals to the synapse and the probabilistic release and restock of packaged neurotransmitter. We extend existing models to examine how correlated synaptic activity is transmitted through synapses and affects the voltage fluctuations and firing rate of the target neuron. Our results provide a framework that will allow for the inclusion of biophysically realistic synaptic behaviour in studies of neuronal circuits.
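The release-site occupancy picture described above can be sketched with a standard stochastic depletion model. The site count, release probability, and recovery time below are illustrative, and the Monte Carlo simulation stands in for the paper's analytical treatment:

```python
import random

def depressing_synapse(spike_times, n_sites=5, p_release=0.5,
                       tau_recovery=0.5, rng=None):
    """Stochastic vesicle-depletion sketch of short-term depression:
    each of `n_sites` release sites releases independently with
    probability `p_release` on a presynaptic spike, then restocks
    after an exponential delay with mean `tau_recovery` seconds.
    A standard depletion model, not the paper's analytical framework.
    Returns the number of vesicles released per spike."""
    rng = rng or random.Random(0)
    ready_at = [0.0] * n_sites    # time at which each site is restocked
    released = []
    for t in spike_times:
        n = 0
        for i in range(n_sites):
            if ready_at[i] <= t and rng.random() < p_release:
                n += 1
                ready_at[i] = t + rng.expovariate(1.0 / tau_recovery)
        released.append(n)
    return released

# A 100 Hz burst depletes the synapse: later spikes in the burst
# release fewer vesicles on average than the first.
burst = [0.0, 0.01, 0.02, 0.03, 0.04]
```

Feeding bursty rather than Poisson spike trains into such a model is exactly where the temporal correlations studied in the paper start to matter.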
Affiliation(s)
- Alex D. Bird
- Warwick Systems Biology Centre, University of Warwick, Coventry, United Kingdom
- Ernst Strüngmann Institute for Neuroscience, Max Planck Society, Frankfurt, Germany
- Frankfurt Institute for Advanced Studies, Frankfurt, Germany
- Magnus J. E. Richardson
- Warwick Mathematics Institute, University of Warwick, Coventry, United Kingdom
32
Leugering J, Pipa G. A Unifying Framework of Synaptic and Intrinsic Plasticity in Neural Populations. Neural Comput 2018; 30:945-986. [PMID: 29342400 DOI: 10.1162/neco_a_01057] [Citation(s) in RCA: 7] [Impact Index Per Article: 1.2] [Indexed: 11/04/2022]
Abstract
A neuronal population is a computational unit that receives a multivariate, time-varying input signal and creates a related multivariate output. These neural signals are modeled as stochastic processes that transmit information in real time, subject to stochastic noise. In a stationary environment, where the input signals can be characterized by constant statistical properties, the systematic relationship between its input and output processes determines the computation carried out by a population. When these statistical characteristics unexpectedly change, the population needs to adapt to its new environment if it is to maintain stable operation. Based on the general concept of homeostatic plasticity, we propose a simple compositional model of adaptive networks that achieve invariance with regard to undesired changes in the statistical properties of their input signals and maintain outputs with well-defined joint statistics. To achieve such invariance, the network model combines two functionally distinct types of plasticity. An abstract stochastic process neuron model implements a generalized form of intrinsic plasticity that adapts marginal statistics, relying only on mechanisms locally confined within each neuron and operating continuously in time, while a simple form of Hebbian synaptic plasticity operates on synaptic connections, thus shaping the interrelation between neurons as captured by a copula function. The combined effect of both mechanisms allows a neuron population to discover invariant representations of its inputs that remain stable under a wide range of transformations (e.g., shifting, scaling and (affine linear) mixing). The probabilistic model of homeostatic adaptation on a population level as presented here allows us to isolate and study the individual and the interaction dynamics of both mechanisms of plasticity and could guide the future search for computationally beneficial types of adaptation.
Affiliation(s)
- Johannes Leugering
- Neuroinformatics Group, Institute of Cognitive Science, Osnabrück University, D-49069 Osnabrück, Germany
- Gordon Pipa
- Neuroinformatics Group, Institute of Cognitive Science, Osnabrück University, D-49069 Osnabrück, Germany
33
Ashida G, Tollin DJ, Kretzberg J. Physiological models of the lateral superior olive. PLoS Comput Biol 2017; 13:e1005903. [PMID: 29281618 PMCID: PMC5744914 DOI: 10.1371/journal.pcbi.1005903] [Citation(s) in RCA: 24] [Impact Index Per Article: 3.4] [Received: 06/11/2017] [Accepted: 11/28/2017] [Indexed: 01/09/2023]
Abstract
In computational biology, modeling is a fundamental tool for formulating, analyzing and predicting complex phenomena. Most neuron models, however, are designed to reproduce certain small sets of empirical data. Hence their outcome is usually not compatible or comparable with other models or datasets, making it unclear how widely applicable such models are. In this study, we investigate these aspects of modeling, namely credibility and generalizability, with a specific focus on auditory neurons involved in the localization of sound sources. The primary cues for binaural sound localization are comprised of interaural time and level differences (ITD/ILD), which are the timing and intensity differences of the sound waves arriving at the two ears. The lateral superior olive (LSO) in the auditory brainstem is one of the locations where such acoustic information is first computed. An LSO neuron receives temporally structured excitatory and inhibitory synaptic inputs that are driven by ipsi- and contralateral sound stimuli, respectively, and changes its spike rate according to binaural acoustic differences. Here we examine seven contemporary models of LSO neurons with different levels of biophysical complexity, from predominantly functional ones (‘shot-noise’ models) to those with more detailed physiological components (variations of integrate-and-fire and Hodgkin-Huxley-type). These models, calibrated to reproduce known monaural and binaural characteristics of LSO, generate largely similar results to each other in simulating ITD and ILD coding. 
Our comparisons of physiological detail, computational efficiency, predictive performances, and further expandability of the models demonstrate (1) that the simplistic, functional LSO models are suitable for applications where low computational costs and mathematical transparency are needed, (2) that more complex models with detailed membrane potential dynamics are necessary for simulation studies where sub-neuronal nonlinear processes play important roles, and (3) that, for general purposes, intermediate models might be a reasonable compromise between simplicity and biological plausibility. Computational models help our understanding of complex biological systems, by identifying their key elements and revealing their operational principles. Close comparisons between model predictions and empirical observations ensure our confidence in a model as a building block for further applications. Most current neuronal models, however, are constructed to replicate only a small specific set of experimental data. Thus, it is usually unclear how these models can be generalized to different datasets and how they compare with each other. In this paper, seven neuronal models are examined that are designed to reproduce known physiological characteristics of auditory neurons involved in the detection of sound source location. Despite their different levels of complexity, the models generate largely similar results when their parameters are tuned with common criteria. Comparisons show that simple models are computationally more efficient and theoretically transparent, and therefore suitable for rigorous mathematical analyses and engineering applications including real-time simulations. In contrast, complex models are necessary for investigating the relationship between underlying biophysical processes and sub- and suprathreshold spiking properties, although they have a large number of unconstrained, unverified parameters. 
Having identified their advantages and drawbacks, these auditory neuron models may readily be used for future studies and applications.
Affiliation(s)
- Go Ashida
- Cluster of Excellence "Hearing4all", Department of Neuroscience, University of Oldenburg, Oldenburg, Germany
- Daniel J Tollin
- Department of Physiology and Biophysics, University of Colorado School of Medicine, Aurora, Colorado, United States of America
- Jutta Kretzberg
- Cluster of Excellence "Hearing4all", Department of Neuroscience, University of Oldenburg, Oldenburg, Germany
34
Kobayashi R, Nishimaru H, Nishijo H, Lansky P. A single spike deteriorates synaptic conductance estimation. Biosystems 2017; 161:41-45. [PMID: 28756162 DOI: 10.1016/j.biosystems.2017.07.007] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.1] [Received: 02/28/2017] [Revised: 07/19/2017] [Accepted: 07/20/2017] [Indexed: 11/19/2022]
Abstract
We investigated the estimation accuracy of synaptic conductances by analyzing simulated voltage traces generated by a Hodgkin-Huxley type model. We show that even a single spike substantially deteriorates the estimation. We also demonstrate that two approaches, namely, negative current injection and spike removal, can ameliorate this deterioration.
Affiliation(s)
- Ryota Kobayashi
- Principles of Informatics Research Division, National Institute of Informatics, 2-1-2 Hitotsubashi, Chiyoda-ku, Tokyo, Japan; Department of Informatics, Graduate University for Advanced Studies (Sokendai), 2-1-2 Hitotsubashi, Chiyoda-ku, Tokyo, Japan
- Hiroshi Nishimaru
- System Emotional Science, Graduate School of Medicine and Pharmaceutical Sciences, University of Toyama, Sugitani 2630, Toyama 930-0194, Japan
- Hisao Nishijo
- System Emotional Science, Graduate School of Medicine and Pharmaceutical Sciences, University of Toyama, Sugitani 2630, Toyama 930-0194, Japan
- Petr Lansky
- Institute of Physiology, The Czech Academy of Sciences, 142 20 Prague 4, Czech Republic
35
Vich C, Berg RW, Guillamon A, Ditlevsen S. Estimation of Synaptic Conductances in Presence of Nonlinear Effects Caused by Subthreshold Ionic Currents. Front Comput Neurosci 2017; 11:69. [PMID: 28790909 PMCID: PMC5524927 DOI: 10.3389/fncom.2017.00069] [Citation(s) in RCA: 10] [Impact Index Per Article: 1.4] [Received: 10/07/2016] [Accepted: 07/07/2017] [Indexed: 11/13/2022]
Abstract
Subthreshold fluctuations in neuronal membrane potential traces contain nonlinear components, and employing nonlinear models might improve the statistical inference. We propose a new strategy to estimate synaptic conductances, which has been tested using in silico data and applied to in vivo recordings. The model is constructed to capture the nonlinearities caused by subthreshold-activated currents, and the estimation procedure can discern between excitatory and inhibitory conductances using only one membrane potential trace. More precisely, we perform second-order approximations of biophysical models to capture the subthreshold nonlinearities, resulting in quadratic integrate-and-fire models, and apply approximate maximum likelihood estimation, assuming only that conductances are stationary in a 50–100 ms time window. The results show an improvement compared to existing procedures for the models tested here.
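The quadratic integrate-and-fire form that such second-order approximations produce can be sketched as follows. Parameter values here are illustrative, not conductances estimated in the paper:

```python
def qif_spike_count(i_ext, v_rest=-65.0, v_crit=-50.0, tau=0.01,
                    v_peak=0.0, v_reset=-65.0, dt=1e-5, t_end=1.0):
    """Quadratic integrate-and-fire sketch:
        tau * dV/dt = k*(V - v_rest)*(V - v_crit) + R*i_ext.
    Below v_crit the quadratic term is restorative (the subthreshold
    nonlinearity); above it, V runs away to v_peak and is reset.
    Illustrative parameters, not the paper's fitted model."""
    k, r_in = 1.0 / (v_crit - v_rest), 1.0
    v, t, spikes = v_rest, 0.0, 0
    while t < t_end:
        v += dt * (k * (v - v_rest) * (v - v_crit) + r_in * i_ext) / tau
        if v >= v_peak:
            spikes += 1
            v = v_reset
        t += dt
    return spikes
```

With these numbers the rheobase works out to 3.75 input units, so a drive of 5.0 produces repetitive firing while 2.0 leaves the model resting at a stable subthreshold fixed point, which is exactly the regime in which the estimation procedure operates.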
Affiliation(s)
- Catalina Vich
- Departament de Matemàtiques i Informàtica, Universitat de les Illes Balears, Palma, Spain
- Rune W Berg
- Center for Neuroscience, University of Copenhagen, Copenhagen, Denmark
- Antoni Guillamon
- Departament de Matemàtiques, Universitat Politècnica de Catalunya, Barcelona, Spain
- Susanne Ditlevsen
- Department of Mathematical Sciences, University of Copenhagen, Copenhagen, Denmark
36
Leng T, Leng G, MacGregor DJ. Spike patterning in oxytocin neurons: Capturing physiological behaviour with Hodgkin-Huxley and integrate-and-fire models. PLoS One 2017; 12:e0180368. [PMID: 28683135 PMCID: PMC5500322 DOI: 10.1371/journal.pone.0180368] [Citation(s) in RCA: 9] [Impact Index Per Article: 1.3] [Received: 01/12/2017] [Accepted: 06/14/2017] [Indexed: 01/08/2023]
Abstract
Integrate-and-fire (IF) models can provide close matches to the discharge activity of neurons, but do they oversimplify the biophysical properties of the neurons? A single compartment Hodgkin-Huxley (HH) model of the oxytocin neuron has previously been developed, incorporating biophysical measurements of channel properties obtained in vitro. A simpler modified integrate-and-fire model has also been developed, which can match well the characteristic spike patterning of oxytocin neurons as observed in vivo. Here, we extended the HH model to incorporate synaptic input, to enable us to compare spike activity in the model with experimental data obtained in vivo. We refined the HH model parameters to closely match the data, and then matched the same experimental data with a modified IF model, using an evolutionary algorithm to optimise parameter matching. Finally we compared the properties of the modified HH model with those of the IF model to seek an explanation for differences between spike patterning in vitro and in vivo. We show that, with slight modifications, the original HH model, like the IF model, is able to closely match both the interspike interval (ISI) distributions of oxytocin neurons and the observed variability of spike firing rates in vivo and in vitro. This close match of both models to data depends on the presence of a slow activity-dependent hyperpolarisation (AHP); this is represented in both models and the parameters used in the HH model representation match well with optimal parameters of the IF model found by an evolutionary algorithm. The ability of both models to fit data closely also depends on a shorter hyperpolarising after potential (HAP); this is explicitly represented in the IF model, but in the HH model, it emerges from a combination of several components. The critical elements of this combination are identified.
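The modified integrate-and-fire structure described above, with a fast HAP and a slow AHP each incremented at a spike and decaying exponentially, can be sketched as follows. All parameters and the Poisson EPSP drive are invented for illustration, not the fitted oxytocin-model values:

```python
import random

def if_ahp_isis(rate_in=300.0, t_end=20.0, dt=0.001, rng=None):
    """Modified integrate-and-fire sketch with two post-spike
    hyperpolarising variables: a fast HAP (short time constant, large
    increment) and a slow AHP (long time constant, small increment).
    Driven by Poisson excitatory inputs; returns interspike intervals.
    Illustrative parameters, not the paper's fitted values."""
    rng = rng or random.Random(42)
    v, hap, ahp = 0.0, 0.0, 0.0            # mV relative to rest
    tau_m, tau_hap, tau_ahp = 0.01, 0.03, 0.5
    k_hap, k_ahp, thresh = 30.0, 2.0, 8.0
    last_spike, isis, t = None, [], 0.0
    while t < t_end:
        if rng.random() < rate_in * dt:    # Poisson EPSP arrivals
            v += 2.0
        v -= dt * v / tau_m
        hap -= dt * hap / tau_hap
        ahp -= dt * ahp / tau_ahp
        if v - hap - ahp > thresh:         # spike
            if last_spike is not None:
                isis.append(t - last_spike)
            last_spike = t
            hap += k_hap                   # fast post-spike refractoriness
            ahp += k_ahp                   # slow activity-dependent brake
            v = 0.0
        t += dt
    return isis
```

The HAP enforces a relative refractory period that shapes the short end of the ISI distribution, while the accumulating AHP limits sustained firing rates, the two roles the paper identifies in the data.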
Affiliation(s)
- Trystan Leng
- Centre for Integrative Physiology, University of Edinburgh, Edinburgh, United Kingdom
- Gareth Leng
- Centre for Integrative Physiology, University of Edinburgh, Edinburgh, United Kingdom
- Duncan J. MacGregor
- Centre for Integrative Physiology, University of Edinburgh, Edinburgh, United Kingdom
38
Dodla R, Wilson CJ. Effect of Phase Response Curve Shape and Synaptic Driving Force on Synchronization of Coupled Neuronal Oscillators. Neural Comput 2017; 29:1769-1814. [PMID: 28562223 DOI: 10.1162/neco_a_00978] [Citation(s) in RCA: 5] [Impact Index Per Article: 0.7] [Indexed: 11/04/2022]
Abstract
The role of the phase response curve (PRC) shape on the synchrony of synaptically coupled oscillating neurons is examined. If the PRC is independent of the phase, because of the synaptic form of the coupling, synchrony is found to be stable for both excitatory and inhibitory coupling at all rates, whereas the antisynchrony becomes stable at low rates. A faster synaptic rise helps extend the stability of antisynchrony to higher rates. If the PRC is not constant but has a profile like that of a leaky integrate-and-fire model, then, in contrast to earlier reports that did not include the voltage effects, mutual excitation could lead to stable synchrony provided the synaptic reversal potential is below the voltage level the neuron would have reached in the absence of the interaction and threshold reset. This level is controlled by the applied current and the leakage parameters. Such synchrony is contingent on a significant phase response (as would result, for example, from a sharp PRC jump) occurring during the synaptic rising phase. The rising phase, however, does not contribute significantly if it occurs before the voltage spike reaches its peak. Even so, a stable near-synchronous state can still exist between type 1 PRC neurons if the PRC has a left-skewed shape. These results are examined comprehensively using perfect integrate-and-fire, leaky integrate-and-fire, and skewed PRC shapes under the assumption of the weakly coupled oscillator theory applied to synaptically coupled neuron models.
Affiliation(s)
- Ramana Dodla
- Department of Biology, University of Texas at San Antonio, San Antonio, TX 78249, U.S.A.
- Charles J Wilson
- Department of Biology, University of Texas at San Antonio, San Antonio, TX 78249, U.S.A.
39
Butzin NC, Hochendoner P, Ogle CT, Mather WH. Entrainment of a Bacterial Synthetic Gene Oscillator through Proteolytic Queueing. ACS Synth Biol 2017; 6:455-462. [PMID: 27935286 DOI: 10.1021/acssynbio.6b00157] [Citation(s) in RCA: 11] [Impact Index Per Article: 1.6] [Indexed: 12/21/2022]
Abstract
Internal chemical oscillators (chemical clocks) direct the behavior of numerous biological systems, and maintenance of a given period and phase among many such oscillators may be important for their proper function. However, both environmental variability and fundamental molecular noise can cause biochemical oscillators to lose coherence. One solution to maintaining coherence is entrainment, where an external signal provides a cue that resets the phase of the oscillators. In this work, we study the entrainment of gene networks by a queueing interaction established by competition between proteins for a common proteolytic pathway. Principles of queueing entrainment are investigated for an established synthetic oscillator in Escherichia coli. We first explore this theoretically using a standard chemical reaction network model and a map-based model, both of which suggest that queueing entrainment can be achieved through pulsatile production of an additional protein competing for a common degradation pathway with the oscillator proteins. We then use a combination of microfluidics and fluorescence microscopy to verify that pulse trains modulating the production rate of a fluorescent protein targeted to the same protease (ClpXP) as the synthetic oscillator can entrain the oscillator.
Affiliation(s)
- Nicholas C. Butzin
- Department of Physics, Virginia Tech, 850 West Campus Drive, Blacksburg, Virginia 24061-0435, United States
- Philip Hochendoner
- Department of Physics, Virginia Tech, 850 West Campus Drive, Blacksburg, Virginia 24061-0435, United States
- Curtis T. Ogle
- Department of Physics, Virginia Tech, 850 West Campus Drive, Blacksburg, Virginia 24061-0435, United States
- William H. Mather
- Department of Physics, Virginia Tech, 850 West Campus Drive, Blacksburg, Virginia 24061-0435, United States
- Department of Biological Sciences, Virginia Tech, 1405 Perry Street, Blacksburg, Virginia 24061-0406, United States
40
Baram Y. Developmental metaplasticity in neural circuit codes of firing and structure. Neural Netw 2016; 85:182-196. [PMID: 27890605 DOI: 10.1016/j.neunet.2016.09.007] [Citation(s) in RCA: 5] [Impact Index Per Article: 0.6] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/24/2016] [Revised: 09/02/2016] [Accepted: 09/20/2016] [Indexed: 11/29/2022]
Abstract
Firing-rate dynamics have been hypothesized to mediate inter-neural information transfer in the brain. While the Hebbian paradigm, relating learning and memory to firing activity, has put synaptic efficacy variation at the center of cortical plasticity, we suggest that the external expression of plasticity by changes in the firing-rate dynamics represents a more general notion of plasticity. Hypothesizing that time constants of plasticity and firing dynamics increase with age, and employing the filtering property of the neuron, we obtain the elementary code of global attractors associated with the firing-rate dynamics in each developmental stage. We define a neural circuit connectivity code as an indivisible set of circuit structures generated by membrane and synapse activation and silencing. Synchronous firing patterns under parameter uniformity, and asynchronous circuit firing are shown to be driven, respectively, by membrane and synapse silencing and reactivation, and maintained by the neuronal filtering property. Analytic, graphical and simulation representation of the discrete iteration maps and of the global attractor codes of neural firing rate are found to be consistent with previous empirical neurobiological findings, which have lacked, however, a specific correspondence between firing modes, time constants, circuit connectivity and cortical developmental stages.
Affiliation(s)
- Yoram Baram
- Computer Science Department, Technion - Israel Institute of Technology, Haifa 32000, Israel.
41
Aspart F, Ladenbauer J, Obermayer K. Extending Integrate-and-Fire Model Neurons to Account for the Effects of Weak Electric Fields and Input Filtering Mediated by the Dendrite. PLoS Comput Biol 2016; 12:e1005206. [PMID: 27893786 PMCID: PMC5125569 DOI: 10.1371/journal.pcbi.1005206] [Citation(s) in RCA: 17] [Impact Index Per Article: 2.1] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/23/2016] [Accepted: 10/17/2016] [Indexed: 12/31/2022] Open
Abstract
Transcranial brain stimulation and evidence of ephaptic coupling have recently sparked strong interests in understanding the effects of weak electric fields on the dynamics of brain networks and of coupled populations of neurons. The collective dynamics of large neuronal populations can be efficiently studied using single-compartment (point) model neurons of the integrate-and-fire (IF) type as their elements. These models, however, lack the dendritic morphology required to biophysically describe the effect of an extracellular electric field on the neuronal membrane voltage. Here, we extend the IF point neuron models to accurately reflect morphology dependent electric field effects extracted from a canonical spatial “ball-and-stick” (BS) neuron model. Even in the absence of an extracellular field, neuronal morphology by itself strongly affects the cellular response properties. We, therefore, derive additional components for leaky and nonlinear IF neuron models to reproduce the subthreshold voltage and spiking dynamics of the BS model exposed to both fluctuating somatic and dendritic inputs and an extracellular electric field. We show that an oscillatory electric field causes spike rate resonance, or equivalently, pronounced spike to field coherence. Its resonance frequency depends on the location of the synaptic background inputs. For somatic inputs the resonance appears in the beta and gamma frequency range, whereas for distal dendritic inputs it is shifted to even higher frequencies. Irrespective of an external electric field, the presence of a dendritic cable attenuates the subthreshold response at the soma to slowly-varying somatic inputs while implementing a low-pass filter for distal dendritic inputs. Our point neuron model extension is straightforward to implement and is computationally much more efficient compared to the original BS model. 
It is well suited for studying the dynamics of large populations of neurons with heterogeneous dendritic morphology with (and without) the influence of weak external electric fields.

How extracellular electric fields—as generated endogenously or through transcranial brain stimulation—affect the dynamics of neuronal populations is of great interest but not well understood. To study neuronal activity at the network level single-compartment neuron models have been proven very successful, because of their computational efficiency and analytical tractability. Unfortunately, these models lack the dendritic morphology to biophysically account for the effects of electric fields, and for changes in synaptic integration due to morphology alone. Here, we consider a canonical, spatially extended model neuron and characterize its responses to fluctuating synaptic input as well as an oscillatory, weak electric field. In order to accurately reproduce these responses we analytically derive an extension for the popular integrate-and-fire point neuron models. We show that the dendritic cable acts as a filter for the synaptic input current, which depends on the input location, and that an electric field modulates the neuronal spike rate strongest at a certain (preferred) field frequency. These phenomena can be successfully reproduced using integrate-and-fire models, extended by a small number of components that are straightforward to implement. The extended point models are thus well suited for studying populations of coupled neurons with different morphology, exposed to extracellular electric fields.
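The input-location-dependent filtering can be mimicked in a point model by passing "dendritic" input through a first-order low-pass stage before it reaches the membrane equation, in the spirit of (but much simpler than) the extension derived in the paper; all constants below are illustrative.

```python
import numpy as np

def filtered_response(freq, tau_dend, tau_m=0.02, dt=1e-4, T=2.0):
    # Subthreshold sketch: a sinusoidal "dendritic" input is low-pass filtered
    # (time constant tau_dend, mimicking cable filtering) before driving a
    # passive membrane with time constant tau_m. Returns the steady-state
    # response amplitude at the soma.
    t = np.arange(0, T, dt)
    inp = np.sin(2 * np.pi * freq * t)
    i_soma, v = 0.0, 0.0
    vs = np.empty_like(t)
    for k in range(len(t)):
        i_soma += dt * (inp[k] - i_soma) / tau_dend  # dendritic low-pass stage
        v += dt * (-v + i_soma) / tau_m              # passive soma
        vs[k] = v
    return np.ptp(vs[len(t) // 2:]) / 2              # steady-state amplitude

amp_slow = filtered_response(2.0, tau_dend=0.01)
amp_fast = filtered_response(50.0, tau_dend=0.01)
# The dendritic stage attenuates fast inputs much more strongly.
print(amp_slow, amp_fast)
```

This reproduces only the low-pass character of distal input integration; the paper's derivation additionally yields the field-driven resonance terms.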
Affiliation(s)
- Florian Aspart
- Department of Software Engineering and Theoretical Computer Science, Technische Universität Berlin, Berlin, Germany
- Bernstein Center for Computational Neuroscience Berlin, Berlin, Germany
- * E-mail: (FA); (JL); (KO)
- Josef Ladenbauer
- Department of Software Engineering and Theoretical Computer Science, Technische Universität Berlin, Berlin, Germany
- Bernstein Center for Computational Neuroscience Berlin, Berlin, Germany
- Klaus Obermayer
- Department of Software Engineering and Theoretical Computer Science, Technische Universität Berlin, Berlin, Germany
- Bernstein Center for Computational Neuroscience Berlin, Berlin, Germany
42
42
|
Neymotin SA, Suter BA, Dura-Bernal S, Shepherd GMG, Migliore M, Lytton WW. Optimizing computer models of corticospinal neurons to replicate in vitro dynamics. J Neurophysiol 2016; 117:148-162. [PMID: 27760819 DOI: 10.1152/jn.00570.2016] [Citation(s) in RCA: 22] [Impact Index Per Article: 2.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/14/2016] [Accepted: 10/13/2016] [Indexed: 11/22/2022] Open
Abstract
Corticospinal neurons (SPI), thick-tufted pyramidal neurons in motor cortex layer 5B that project caudally via the medullary pyramids, display distinct class-specific electrophysiological properties in vitro: strong sag with hyperpolarization, lack of adaptation, and a nearly linear frequency-current (F-I) relationship. We used our electrophysiological data to produce a pair of large archives of SPI neuron computer models in two model classes: 1) detailed models with full reconstruction; and 2) simplified models with six compartments. We used PRAXIS and evolutionary multiobjective optimization (EMO) in sequence to determine ion channel conductances. EMO selected good models from each of the two model classes to form the two model archives. Archived models showed tradeoffs across fitness functions. For example, parameters that produced an excellent F-I fit produced a less optimal fit for the interspike voltage trajectory. Because of these tradeoffs, there was no single best model but rather models that would be best for particular usages, whether single-neuron or network explorations. Further exploration of exemplar models with strong F-I fit demonstrated that both the detailed and simple models produced excellent matches to the experimental data. Although dendritic ion channel identities and densities cannot yet be fully determined experimentally, we explored the consequences of a demonstrated proximal-to-distal density gradient of Ih, demonstrating that this would lead to a gradient of resonance properties with increased resonant frequencies more distally. We suggest that this dynamical feature could serve to make the cell particularly responsive to major frequency bands that differ by cortical layer.

NEW & NOTEWORTHY We developed models of motor cortex corticospinal neurons that replicate in vitro dynamics, including hyperpolarization-induced sag and realistic firing patterns.
Models demonstrated resonance in response to synaptic stimulation, with resonance frequency increasing in apical dendrites with increasing distance from soma, matching the increasing oscillation frequencies spanning deep to superficial cortical layers. This gradient may enable specific corticospinal neuron dendrites to entrain to relevant oscillations in different cortical layers, contributing to appropriate motor output commands.
Affiliation(s)
- Samuel A Neymotin
- Department of Physiology and Pharmacology, State University of New York (SUNY) Downstate Medical Center, Brooklyn, New York
- Benjamin A Suter
- Department of Physiology, Northwestern University, Chicago, Illinois
- Salvador Dura-Bernal
- Department of Physiology and Pharmacology, State University of New York (SUNY) Downstate Medical Center, Brooklyn, New York
- Michele Migliore
- Institute of Biophysics, National Research Council, Palermo, Italy
- William W Lytton
- Department of Physiology and Pharmacology, State University of New York (SUNY) Downstate Medical Center, Brooklyn, New York
- Department of Neurology, SUNY Downstate Medical Center, Brooklyn, New York
- Department of Neurology, Kings County Hospital Center, Brooklyn, New York
- The Robert F. Furchgott Center for Neural and Behavioral Science, Brooklyn, New York
43
Lansky P, Sacerdote L, Zucca C. The Gamma renewal process as an output of the diffusion leaky integrate-and-fire neuronal model. BIOLOGICAL CYBERNETICS 2016; 110:193-200. [PMID: 27246170 DOI: 10.1007/s00422-016-0690-x] [Citation(s) in RCA: 5] [Impact Index Per Article: 0.6] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 08/26/2015] [Accepted: 05/18/2016] [Indexed: 06/05/2023]
Abstract
Statistical properties of spike trains, as well as other neurophysiological data, suggest a number of mathematical models of neurons. These models range from entirely descriptive ones to those deduced from the properties of real neurons. One of the latter, the diffusion leaky integrate-and-fire neuronal model, which is based on the Ornstein-Uhlenbeck (OU) stochastic process restricted by an absorbing barrier, can describe a wide range of neuronal activity in terms of its parameters. These parameters are readily associated with known physiological mechanisms. The other model considered here, the Gamma renewal process, is descriptive, and its parameters only reflect the observed experimental data or assumed theoretical properties. Both of these commonly used models are related here. We show under which conditions the Gamma model is an output of the diffusion OU model. In some cases, the Gamma distribution cannot realistically be achieved for the employed parameters of the OU process.
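The relation between the two models can be explored numerically: simulate the OU diffusion with an absorbing threshold, then moment-match a Gamma renewal model to the resulting interspike intervals. A minimal sketch (arbitrary parameters, Euler-Maruyama integration, not the paper's analytical conditions):

```python
import numpy as np

rng = np.random.default_rng(0)

def ou_first_passage(n, mu=1.5, tau=1.0, sigma=0.5, vth=1.0, v0=0.0, dt=1e-3):
    # Diffusion LIF: dV = ((mu - V)/tau) dt + sigma dW; a spike occurs when V
    # crosses vth, after which V resets to v0. Returns n simulated ISIs.
    isis = []
    for _ in range(n):
        v, t = v0, 0.0
        while v < vth:
            v += dt * (mu - v) / tau + sigma * np.sqrt(dt) * rng.standard_normal()
            t += dt
        isis.append(t)
    return np.array(isis)

isis = ou_first_passage(500)
# Moment-match a Gamma renewal model Gamma(shape, scale) to the simulated ISIs:
m, var = isis.mean(), isis.var()
shape, scale = m**2 / var, var / m   # mean = shape * scale, var = shape * scale**2
print(m, shape, scale)
```

Comparing the empirical ISI histogram against the fitted Gamma density would then show in which parameter regimes the Gamma description is (or is not) adequate.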
Affiliation(s)
- Petr Lansky
- Institute of Physiology, Academy of Sciences of Czech Republic, Videnská 1083, 142 20, Prague 4, Czech Republic
- Laura Sacerdote
- Department of Mathematics "G. Peano", University of Torino, Via Carlo Alberto 10, 10123, Torino, Italy
- Cristina Zucca
- Department of Mathematics "G. Peano", University of Torino, Via Carlo Alberto 10, 10123, Torino, Italy
44
44
|
Kobayashi R, Kitano K. Impact of slow K(+) currents on spike generation can be described by an adaptive threshold model. J Comput Neurosci 2016; 40:347-62. [PMID: 27085337 PMCID: PMC4860204 DOI: 10.1007/s10827-016-0601-0] [Citation(s) in RCA: 16] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/26/2015] [Revised: 03/06/2016] [Accepted: 04/01/2016] [Indexed: 12/01/2022]
Abstract
A neuron that is stimulated by rectangular current injections initially responds with a high firing rate, followed by a decrease in the firing rate. This phenomenon is called spike-frequency adaptation and is usually mediated by slow K(+) currents, such as the M-type K(+) current (I_M) or the Ca(2+)-activated K(+) current (I_AHP). It is not clear how these detailed biophysical mechanisms regulate spike generation in a cortical neuron. In this study, we investigated the impact of slow K(+) currents on the spike generation mechanism by reducing a detailed conductance-based neuron model. We showed that the detailed model can be reduced to a multi-timescale adaptive threshold model, and derived formulae that describe the relationship between the slow K(+) current parameters and the reduced model parameters. Our analysis of the reduced model suggests that slow K(+) currents have a differential effect on the noise tolerance in neural coding.
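A minimal sketch of such an adaptive threshold model (with illustrative constants, not the fitted values from the paper): the membrane is a non-resetting leaky integrator, and the threshold jumps after each spike and relaxes with one fast and one slow time constant, producing spike-frequency adaptation to a current step.

```python
import numpy as np

def mat_spikes(I=0.35, T=1.0, dt=1e-4, tau_m=5e-3,
               omega=0.02, alphas=(0.03, 0.01), taus=(0.01, 0.2)):
    # Multi-timescale adaptive threshold sketch: non-resetting leaky
    # integrator v; the threshold is omega plus two components h that jump
    # by alphas after each spike and decay with time constants taus.
    v = 0.0
    h = [0.0, 0.0]                # fast and slow threshold components
    spikes, t = [], 0.0
    refractory_until = -1.0
    for _ in range(int(T / dt)):
        v += dt * (-v + I) / tau_m
        h = [hj * np.exp(-dt / tj) for hj, tj in zip(h, taus)]
        theta = omega + sum(h)
        if v >= theta and t >= refractory_until:
            spikes.append(t)
            h = [hj + aj for hj, aj in zip(h, alphas)]
            refractory_until = t + 0.002   # 2 ms absolute refractory period
        t += dt
    return np.array(spikes)

s = mat_spikes()
isis = np.diff(s)
# Spike-frequency adaptation: interspike intervals lengthen over the step.
print(len(s), isis[0], isis[-1])
```

The slow component plays the role the abstract attributes to I_M/I_AHP: its cumulative build-up converts an initially high firing rate into a lower adapted rate.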
Affiliation(s)
- Ryota Kobayashi
- Principles of Informatics Research Division, National Institute of Informatics, 2-1-2 Hitotsubashi, Chiyoda-ku, Tokyo, Japan
- Department of Informatics, SOKENDAI (The Graduate University for Advanced Studies), 2-1-2 Hitotsubashi, Chiyoda-ku, Tokyo, Japan
- Katsunori Kitano
- Department of Human and Computer Intelligence, Ritsumeikan University, 1-1-1 Nojihigashi, Kusatsu, Shiga, 525-8577, Japan
45
45
|
Banerjee A. Learning Precise Spike Train-to-Spike Train Transformations in Multilayer Feedforward Neuronal Networks. Neural Comput 2016; 28:826-48. [PMID: 26942750 DOI: 10.1162/neco_a_00829] [Citation(s) in RCA: 4] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/04/2022]
Abstract
We derive a synaptic weight update rule for learning temporally precise spike train-to-spike train transformations in multilayer feedforward networks of spiking neurons. The framework, aimed at seamlessly generalizing error backpropagation to the deterministic spiking neuron setting, is based strictly on spike timing and avoids invoking concepts pertaining to spike rates or probabilistic models of spiking. The derivation is founded on two innovations. First, an error functional is proposed that compares the spike train emitted by the output neuron of the network to the desired spike train by way of their putative impact on a virtual postsynaptic neuron. This formulation sidesteps the need for spike alignment and leads to closed-form solutions for all quantities of interest. Second, virtual assignment of weights to spikes rather than synapses enables a perturbation analysis of individual spike times and synaptic weights of the output, as well as all intermediate neurons in the network, which yields the gradients of the error functional with respect to the said entities. Learning proceeds via a gradient descent mechanism that leverages these quantities. Simulation experiments demonstrate the efficacy of the proposed learning framework. The experiments also highlight asymmetries between synapses on excitatory and inhibitory neurons.
Affiliation(s)
- Arunava Banerjee
- Computer and Information Science and Engineering Department, University of Florida, Gainesville, FL 32611-6120, U.S.A.
46
46
|
Mensi S, Hagens O, Gerstner W, Pozzorini C. Enhanced Sensitivity to Rapid Input Fluctuations by Nonlinear Threshold Dynamics in Neocortical Pyramidal Neurons. PLoS Comput Biol 2016; 12:e1004761. [PMID: 26907675 PMCID: PMC4764342 DOI: 10.1371/journal.pcbi.1004761] [Citation(s) in RCA: 24] [Impact Index Per Article: 3.0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/29/2015] [Accepted: 01/19/2016] [Indexed: 11/25/2022] Open
Abstract
The way in which single neurons transform input into output spike trains has fundamental consequences for network coding. Theories and modeling studies based on standard Integrate-and-Fire models implicitly assume that, in response to increasingly strong inputs, neurons modify their coding strategy by progressively reducing their selective sensitivity to rapid input fluctuations. Combining mathematical modeling with in vitro experiments, we demonstrate that, in L5 pyramidal neurons, the firing threshold dynamics adaptively adjust the effective timescale of somatic integration in order to preserve sensitivity to rapid signals over a broad range of input statistics. For that, a new Generalized Integrate-and-Fire model featuring nonlinear firing threshold dynamics and conductance-based adaptation is introduced that outperforms state-of-the-art neuron models in predicting the spiking activity of neurons responding to a variety of in vivo-like fluctuating currents. Our model allows for efficient parameter extraction and can be analytically mapped to a Generalized Linear Model in which both the input filter—describing somatic integration—and the spike-history filter—accounting for spike-frequency adaptation—dynamically adapt to the input statistics, as experimentally observed. Overall, our results provide new insights on the computational role of different biophysical processes known to underlie adaptive coding in single neurons and support previous theoretical findings indicating that the nonlinear dynamics of the firing threshold due to Na+-channel inactivation regulate the sensitivity to rapid input fluctuations.

Over the last decades, a variety of simplified spiking models have been shown to achieve a surprisingly high performance in predicting the neuronal responses to in vitro somatic current injections.
Because of the complex adaptive behavior featured by cortical neurons, this success is however restricted to limited stimulus ranges: model parameters optimized for a specific input regime are often inappropriate to describe the response to input currents with different statistical properties. In the present study, a new spiking neuron model is introduced that captures single-neuron computation over a wide range of input statistics and explains different aspects of the neuronal dynamics within a single framework. Our results indicate that complex forms of single neuron adaptation are mediated by the nonlinear dynamics of the firing threshold and that the input-output transformation performed by cortical pyramidal neurons can be intuitively understood in terms of an enhanced Generalized Linear Model in which both the input filter and the spike-history filter adapt to the input statistics.
Affiliation(s)
- Skander Mensi
- Laboratory of Computational Neuroscience (LCN), Brain Mind Institute, School of Computer and Communication Sciences and School of Life Sciences, École Polytechnique Fédérale de Lausanne, Lausanne, Switzerland
- Olivier Hagens
- Laboratory of Neural Microcircuitry (LNMC), Brain Mind Institute, School of Life Sciences, École Polytechnique Fédérale de Lausanne, Lausanne, Switzerland
- Wulfram Gerstner
- Laboratory of Computational Neuroscience (LCN), Brain Mind Institute, School of Computer and Communication Sciences and School of Life Sciences, École Polytechnique Fédérale de Lausanne, Lausanne, Switzerland
- Christian Pozzorini
- Laboratory of Computational Neuroscience (LCN), Brain Mind Institute, School of Computer and Communication Sciences and School of Life Sciences, École Polytechnique Fédérale de Lausanne, Lausanne, Switzerland
47
47
|
Luo X, Gee S, Sohal V, Small D. A point-process response model for spike trains from single neurons in neural circuits under optogenetic stimulation. Stat Med 2016; 35:455-74. [PMID: 26411923 PMCID: PMC4713323 DOI: 10.1002/sim.6742] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/20/2014] [Accepted: 09/02/2015] [Indexed: 11/12/2022]
Abstract
Optogenetics is a new tool to study neuronal circuits that have been genetically modified to allow stimulation by flashes of light. We study recordings from single neurons within neural circuits under optogenetic stimulation. The data from these experiments present a statistical challenge of modeling a high-frequency point process (neuronal spikes) while the input is another high-frequency point process (light flashes). We further develop a generalized linear model approach to model the relationships between two point processes, employing additive point-process response functions. The resulting model, point-process responses for optogenetics (PRO), provides explicit nonlinear transformations to link the input point process with the output one. Such response functions may provide important and interpretable scientific insights into the properties of the biophysical process that governs neural spiking in response to optogenetic stimulation. We validate and compare the PRO model using a real dataset and simulations, and our model yields a superior area-under-the-curve value as high as 93% for predicting every future spike. For our experiment on the recurrent layer V circuit in the prefrontal cortex, the PRO model provides evidence that neurons integrate their inputs in a sophisticated manner. Another use of the model is that it enables understanding how neural circuits are altered under various disease conditions and/or experimental conditions by comparing the PRO parameters.
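The flavor of such a model can be conveyed with a forward simulation (a hypothetical sketch, not the fitted PRO model): light flashes form an input point process whose exponential response functions modulate the log spike rate, and output spikes are drawn as an inhomogeneous Poisson process.

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate_pro(flash_times, T=2.0, dt=1e-3, b=np.log(5.0), a=2.0, tau=0.05):
    # Forward sketch of a point-process response model: the spike rate is
    # lambda(t) = exp(b + a * sum_j exp(-(t - t_j)/tau)) over light flashes at
    # times t_j <= t; spikes are drawn by per-bin Bernoulli thinning.
    t = np.arange(0, T, dt)
    drive = np.zeros_like(t)
    for tf in flash_times:
        mask = t >= tf
        drive[mask] += np.exp(-(t[mask] - tf) / tau)
    lam = np.exp(b + a * drive)
    spikes = rng.random(len(t)) < lam * dt
    return t, lam, spikes

flashes = [0.5, 1.0, 1.5]
t, lam, spikes = simulate_pro(flashes)
base = lam[t < 0.5].mean()
evoked = lam[(t >= 0.5) & (t < 0.6)].mean()
print(base, evoked, spikes.sum())
```

Fitting would run this logic in reverse: given observed flash and spike times, estimate b, a, and the response-function shape within a generalized linear model.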
Affiliation(s)
- X. Luo
- Department of Biostatistics, Brown University, Providence, Rhode Island 02912, USA
- S. Gee
- Department of Psychiatry and Neuroscience Graduate Program, University of California, San Francisco, California 94143, USA
- V. Sohal
- Department of Psychiatry and Neuroscience Graduate Program, University of California, San Francisco, California 94143, USA
- D. Small
- Department of Statistics, The Wharton School, University of Pennsylvania, Philadelphia, Pennsylvania 19104, USA
48
48
|
Seydnejad SR. Reconstruction of the input signal of the leaky integrate-and-fire neuronal model from its interspike intervals. BIOLOGICAL CYBERNETICS 2016; 110:3-15. [PMID: 26658736 DOI: 10.1007/s00422-015-0671-5] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 07/12/2014] [Accepted: 11/23/2015] [Indexed: 06/05/2023]
Abstract
Extracting the input signal of a neuron by analyzing its spike output is an important step toward understanding how external information is coded into discrete events of action potentials and how this information is exchanged between different neurons in the nervous system. Most of the existing methods analyze this decoding problem in a stochastic framework and use probabilistic metrics such as the maximum-likelihood method to determine the parameters of the input signal, assuming a leaky integrate-and-fire (LIF) model. In this article, the input signal of the LIF model is considered as a combination of orthogonal basis functions. The coefficients of the basis functions are found by minimizing the norm of the difference between the observed spikes and those generated by the estimated signal. This approach gives rise to a deterministic reconstruction of the input signal and results in a simple matrix identity through which the coefficients of the basis functions, and therefore the neuronal stimulus, can be identified. The inherent noise of the neuron is considered as an additional factor in the membrane potential and is treated as a disturbance in the reconstruction algorithm. The performance of the proposed scheme is evaluated by numerical simulations, and it is shown that input signals with different characteristics can be well recovered by this algorithm.
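The deterministic reconstruction idea can be sketched end to end (an illustrative toy with an arbitrary three-function basis and parameters, not the paper's algorithm): each threshold crossing of a simulated LIF neuron yields one linear equation in the basis coefficients, and least squares recovers the stimulus.

```python
import numpy as np

def reconstruct_lif_input():
    # The stimulus is s(t) = sum_k c_k * phi_k(t). Each threshold crossing
    # gives one linear equation in c, since the membrane voltage at the
    # crossing is a weighted sum of the stimulus samples since the last reset.
    tau, vth, dt, T = 0.05, 1.0, 2e-5, 1.0
    t = np.arange(0.0, T, dt)
    basis = np.stack([np.ones_like(t),
                      np.sin(2 * np.pi * 3 * t),
                      np.cos(2 * np.pi * 3 * t)])
    c_true = np.array([30.0, 8.0, -5.0])
    s = c_true @ basis

    # Forward Euler simulation of dv/dt = -v/tau + s(t); reset to 0 on spikes.
    decay = 1.0 - dt / tau
    v, seg_start, rows = 0.0, 0, []
    for n in range(len(t)):
        v = v * decay + dt * s[n]
        if v >= vth:
            # v at the crossing equals sum_m w_m s_m over the segment,
            # which is linear in the coefficients c.
            m = np.arange(seg_start, n + 1)
            w = dt * decay ** (n - m)
            rows.append(basis[:, m] @ w)
            v, seg_start = 0.0, n + 1

    A = np.array(rows)
    c_hat, *_ = np.linalg.lstsq(A, np.full(len(rows), vth), rcond=None)
    return c_true, c_hat, len(rows)

c_true, c_hat, n_spikes = reconstruct_lif_input()
print(n_spikes, c_hat)
```

Because the threshold conditions are linear in the coefficients, the reconstruction reduces to a single matrix problem, which is the essence of the matrix identity the abstract mentions.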
Affiliation(s)
- Saeid R Seydnejad
- Department of Electrical Engineering, Shahid Bahonar University of Kerman, 22 Bahman Blvd, Kerman, 7616914111, Iran.
49
49
|
Harrison PM, Badel L, Wall MJ, Richardson MJE. Experimentally Verified Parameter Sets for Modelling Heterogeneous Neocortical Pyramidal-Cell Populations. PLoS Comput Biol 2015; 11:e1004165. [PMID: 26291316 PMCID: PMC4546387 DOI: 10.1371/journal.pcbi.1004165] [Citation(s) in RCA: 36] [Impact Index Per Article: 4.0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/13/2014] [Accepted: 01/30/2015] [Indexed: 11/19/2022] Open
Abstract
Models of neocortical networks are increasingly including the diversity of excitatory and inhibitory neuronal classes. Significant variability in cellular properties is also seen within a nominal neuronal class, and this heterogeneity can be expected to influence the population response and information processing in networks. Recent studies have examined the population and network effects of variability in a particular neuronal parameter with some plausibly chosen distribution. However, the empirical variability and covariance seen across multiple parameters are rarely included, partly due to the lack of data on parameter correlations in forms convenient for model construction. To address this, we quantify the heterogeneity within and between the neocortical pyramidal-cell classes in layers 2/3, 4, and the slender-tufted and thick-tufted pyramidal cells of layer 5 using a combination of intracellular recordings, single-neuron modelling and statistical analyses. From the response to both square-pulse and naturalistic fluctuating stimuli, we examined the class-dependent variance and covariance of electrophysiological parameters and identify the role of the h current in generating parameter correlations. A byproduct of the dynamic I-V method we employed is the straightforward extraction of reduced neuron models from experiment. Empirically these models took the refractory exponential integrate-and-fire form and provide an accurate fit to the perisomatic voltage responses of the diverse pyramidal-cell populations when the class-dependent statistics of the model parameters were respected. By quantifying the parameter statistics we obtained an algorithm which generates populations of model neurons, for each of the four pyramidal-cell classes, that adhere to experimentally observed marginal distributions and parameter correlations.
As well as providing this tool, which we hope will be of use for exploring the effects of heterogeneity in neocortical networks, we also provide the code for the dynamic I-V method and make the full electrophysiological data set available.

Neurons are the fundamental components of the nervous system and a quantitative description of their properties is a prerequisite to understanding the complex structures they comprise, from microcircuits to networks. Mathematical modelling provides an essential tool to this end and there has been intense effort directed at analysing networks constructed from different classes of neurons. However, even neurons from the same class show a broad variability in parameter values and the distributions and correlations between these parameters are likely to significantly affect network properties. To quantify this variability, we used a combination of intracellular recording, single-neuron modelling, and statistical analysis to measure the physiological variability in pyramidal-cell populations of the neocortex. We employ protocols that measure parameters from both square-pulse and naturalistic stimuli, characterising the perisomatic integration properties of these cells and allowing for the straightforward extraction of mathematically tractable reduced neuron models. We provide algorithms to generate populations of these neuron models that respect the parameter variability and co-variability observed in our experiments. These represent novel tools for exploring heterogeneity in neocortical networks that will be useful for subsequent theoretical and numerical studies. Finally, we make our full electrophysiological dataset available for other research groups to extend and improve on our analysis.
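The population-generation step can be sketched as follows (a generic recipe with made-up means, standard deviations, and correlations, not the paper's measured statistics): draw parameter vectors from a multivariate Gaussian whose covariance encodes the empirical correlations; in practice one would also transform the draws to respect positivity constraints and the observed marginals.

```python
import numpy as np

rng = np.random.default_rng(2)

# Illustrative parameter statistics for a model-neuron population (the names
# and numbers are hypothetical): membrane time constant tau_m (ms), spike
# threshold v_T (mV), and spike-onset sharpness delta_T (mV).
mean = np.array([20.0, -50.0, 1.5])
corr = np.array([[ 1.0, 0.4, -0.2],
                 [ 0.4, 1.0,  0.1],
                 [-0.2, 0.1,  1.0]])
sd = np.array([5.0, 3.0, 0.4])
cov = corr * np.outer(sd, sd)        # covariance from correlations and SDs

# Each row is one model neuron's parameter vector, respecting the correlations.
pop = rng.multivariate_normal(mean, cov, size=2000)

emp_corr = np.corrcoef(pop.T)
print(pop.shape, emp_corr[0, 1])
```

With enough draws the empirical correlation matrix of the generated population matches the target, which is the property the paper's generation algorithm guarantees for its fitted pyramidal-cell statistics.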
Affiliation(s)
- Paul M. Harrison
- MOAC Doctoral Training Centre, University of Warwick, Coventry, United Kingdom
- School of Life Sciences, University of Warwick, Coventry, United Kingdom
- Warwick Systems Biology Centre, University of Warwick, Coventry, United Kingdom
- Laurent Badel
- Laboratory for Circuit Mechanisms of Sensory Perception, RIKEN Brain Science Institute, Wako, Saitama, Japan
- Mark J. Wall
- School of Life Sciences, University of Warwick, Coventry, United Kingdom
50
50
|
Delarue F, Inglis J, Rubenthaler S, Tanré E. Global solvability of a networked integrate-and-fire model of McKean–Vlasov type. ANN APPL PROBAB 2015. [DOI: 10.1214/14-aap1044] [Citation(s) in RCA: 50] [Impact Index Per Article: 5.6] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/19/2022]