1. Kadakia N. Optimal control methods for nonlinear parameter estimation in biophysical neuron models. PLoS Comput Biol 2022; 18:e1010479. PMID: 36108045; PMCID: PMC9514669; DOI: 10.1371/journal.pcbi.1010479.
Abstract
Functional forms of biophysically-realistic neuron models are constrained by neurobiological and anatomical considerations, such as cell morphologies and the presence of known ion channels. Despite these constraints, neuron models still contain unknown static parameters which must be inferred from experiment. This inference task is most readily cast into the framework of state-space models, which systematically takes into account partial observability and measurement noise. Inferring only dynamical state variables such as membrane voltages is a well-studied problem, and has been approached with a wide range of techniques beginning with the well-known Kalman filter. Inferring both states and fixed parameters, on the other hand, is less straightforward. Here, we develop a method for joint parameter and state inference that combines traditional state space modeling with chaotic synchronization and optimal control. Our methods are tailored particularly to situations with considerable measurement noise, sparse observability, very nonlinear or chaotic dynamics, and highly uninformed priors. We illustrate our approach both in a canonical chaotic model and in a phenomenological neuron model, showing that many unknown parameters can be uncovered reliably and accurately from short and noisy observed time traces. Our method holds promise for estimation in larger-scale systems, given ongoing improvements in calcium reporters and genetically-encoded voltage indicators.

Systems neuroscience aims to understand how individual neurons and neural networks process external stimuli into behavioral responses. Underlying this characterization are mathematical models intimately shaped by experimental observations. But neural systems are high-dimensional and contain highly nonlinear interactions, so developing accurate models remains a challenge given current experimental capabilities.
In practice, this means that the dynamical equations characterizing neural activity have many unknown parameters, and these parameters must be inferred from data. This inference problem is nontrivial owing to model nonlinearity, system and measurement noise, and the sparsity of observations from electrode recordings. Here, we present a novel method for inferring model parameters of neural systems. Our technique combines ideas from control theory and optimization, and amounts to using data to “control” estimates toward the best fit. Our method compares well in accuracy against other state-of-the-art inference methods, both in phenomenological chaotic systems and biophysical neuron models. Our work shows that many unknown model parameters of interest can be inferred from voltage measurements, despite signaling noise, instrument noise, and low observability.
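The synchronization idea at the heart of this approach can be illustrated with a minimal toy example. This is not the paper's actual algorithm: below, a model copy of the chaotic Lorenz-63 system is nudged toward a noisy observation of its x-component, and the parameter value minimizing the residual synchronization error is taken as the estimate. The coupling gain, noise level, and parameter grid are all illustrative assumptions.

```python
import numpy as np

def lorenz_step(state, sigma, rho, beta, dt=0.01):
    """One Euler step of the Lorenz-63 system."""
    x, y, z = state
    return state + dt * np.array([sigma * (y - x),
                                  x * (rho - z) - y,
                                  x * y - beta * z])

def sync_error(obs_x, rho_guess, k=20.0, sigma=10.0, beta=8/3, dt=0.01):
    """Run a model copy nudged toward the observed x-component and
    return the mean squared synchronization error."""
    est = np.array([1.0, 1.0, 1.0])
    err = 0.0
    for xo in obs_x:
        x, y, z = est
        # nudging term k*(xo - x) couples the data into the model copy
        dx = sigma * (y - x) + k * (xo - x)
        dy = x * (rho_guess - z) - y
        dz = x * y - beta * z
        est = est + dt * np.array([dx, dy, dz])
        err += (xo - est[0]) ** 2
    return err / len(obs_x)

# simulate "data" with the true parameter rho = 28, observing only noisy x
rng = np.random.default_rng(0)
state = np.array([1.0, 1.0, 1.0])
obs = []
for _ in range(2000):
    state = lorenz_step(state, 10.0, 28.0, 8/3)
    obs.append(state[0] + 0.1 * rng.standard_normal())
obs = np.array(obs)

# scan rho: the synchronization error is smallest near the true value
rhos = np.arange(20.0, 36.0, 1.0)
errors = [sync_error(obs, r) for r in rhos]
best = rhos[int(np.argmin(errors))]
print("estimated rho:", best)
```

The full method replaces this grid scan with a variational optimization over states and parameters jointly, but the principle is the same: correct parameters let the controlled model synchronize to the data with minimal control effort.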
Affiliation(s)
- Nirag Kadakia
- Department of Molecular, Cellular, and Developmental Biology, Yale University, New Haven, CT, United States of America
- Quantitative Biology Institute, Yale University, New Haven, CT, United States of America
- Swartz Foundation for Theoretical Neuroscience, Yale University, New Haven, CT, United States of America
2. Optimal control for parameter estimation in partially observed hypoelliptic stochastic differential equations. Comput Stat 2022. DOI: 10.1007/s00180-022-01212-9.
3. Inoue H, Hukushima K, Omori T. Estimating Distributions of Parameters in Nonlinear State Space Models with Replica Exchange Particle Marginal Metropolis–Hastings Method. Entropy 2022; 24:115. PMID: 35052141; PMCID: PMC8774595; DOI: 10.3390/e24010115.
Abstract
Extracting latent nonlinear dynamics from observed time-series data is important for understanding a dynamic system against the background of the observed data. A state space model is a probabilistic graphical model for time-series data, which describes the probabilistic dependence between latent variables at subsequent times and between latent variables and observations. Since, in many situations, the values of the parameters in the state space model are unknown, estimating the parameters from observations is an important task. The particle marginal Metropolis–Hastings (PMMH) method estimates the marginal posterior distribution of parameters obtained by marginalization over the distribution of latent variables in the state space model. Although, in principle, this method can recover the marginal posterior distribution of parameters if iterated indefinitely, in practice the estimate obtained after a finite number of iterations depends on the initial values. In this paper, we propose a replica exchange particle marginal Metropolis–Hastings (REPMMH) method that addresses this problem by combining the PMMH method with the replica exchange method. The proposed method simultaneously realizes a global search at high temperature and a local fine search at low temperature. We evaluate the proposed method using simulated data obtained from the Izhikevich neuron model and a Lévy-driven stochastic volatility model, and show that REPMMH mitigates the initial-value dependence of the PMMH method and realizes more efficient sampling of parameters in state space models than existing methods.
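The replica-exchange ingredient can be sketched in isolation. The toy below runs plain Metropolis-Hastings chains (standing in for the particle-marginal likelihood evaluations of the full REPMMH method) on a bimodal "posterior" at several temperatures and occasionally swaps adjacent replicas, so the cold chain can escape the mode it starts in. The target density, temperature ladder, and step sizes are illustrative assumptions.

```python
import numpy as np

def log_target(theta):
    """Toy bimodal log-density standing in for a marginal posterior."""
    return np.logaddexp(-0.5 * ((theta + 3.0) / 0.5) ** 2,
                        -0.5 * ((theta - 3.0) / 0.5) ** 2)

def replica_exchange_mh(n_iter=20000, temps=(1.0, 4.0, 16.0), step=1.0, seed=1):
    rng = np.random.default_rng(seed)
    thetas = np.full(len(temps), -3.0)   # all chains start in the left mode
    samples = np.empty(n_iter)
    for it in range(n_iter):
        # within-chain Metropolis update at each temperature
        for i, T in enumerate(temps):
            prop = thetas[i] + step * np.sqrt(T) * rng.standard_normal()
            if np.log(rng.random()) < (log_target(prop) - log_target(thetas[i])) / T:
                thetas[i] = prop
        # propose swapping a random adjacent pair of replicas
        i = rng.integers(len(temps) - 1)
        a = (1/temps[i] - 1/temps[i+1]) * (log_target(thetas[i+1]) - log_target(thetas[i]))
        if np.log(rng.random()) < a:
            thetas[i], thetas[i+1] = thetas[i+1], thetas[i]
        samples[it] = thetas[0]          # keep only the T = 1 chain
    return samples

s = replica_exchange_mh()
frac_right = np.mean(s[5000:] > 0)       # fraction of cold-chain samples in the right mode
print(f"fraction in right mode: {frac_right:.2f}")
```

A single chain at T = 1 with the same step size would almost never cross the low-probability region between the modes; the hot replicas do so easily and pass their states down through swaps.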
Affiliation(s)
- Hiroaki Inoue
- Graduate School of Engineering, Kobe University, 1-1 Rokkodai-cho, Nada-ku, Kobe 657-8501, Japan
- Koji Hukushima
- Graduate School of Arts and Sciences, The University of Tokyo, 3-8-1 Komaba, Meguro-ku, Tokyo 153-8902, Japan
- Komaba Institute for Science, The University of Tokyo, 3-8-1 Komaba, Meguro-ku, Tokyo 153-8902, Japan
- Toshiaki Omori
- Graduate School of Engineering, Kobe University, 1-1 Rokkodai-cho, Nada-ku, Kobe 657-8501, Japan
- Organization for Advanced and Integrated Research, Kobe University, 1-1 Rokkodai-cho, Nada-ku, Kobe 657-8501, Japan
- Center for Mathematical and Data Sciences, Kobe University, 1-1 Rokkodai-cho, Nada-ku, Kobe 657-8501, Japan
4. Müller-Komorowska D, Parabucki A, Elyasaf G, Katz Y, Beck H, Lampl I. A novel theoretical framework for simultaneous measurement of excitatory and inhibitory conductances. PLoS Comput Biol 2021; 17:e1009725. PMID: 34962935; PMCID: PMC8746761; DOI: 10.1371/journal.pcbi.1009725.
Abstract
The firing of neurons throughout the brain is determined by the precise relations between excitatory and inhibitory inputs, and disruption of their balance underlies many psychiatric diseases. Whether or not these inputs covary over time or between repeated stimuli remains unclear due to the lack of experimental methods for measuring both inputs simultaneously. We developed a new analytical framework for instantaneous and simultaneous measurements of both the excitatory and inhibitory neuronal inputs during a single trial under current clamp recording. This can be achieved by injecting a current composed of two high frequency sinusoidal components followed by analytical extraction of the conductances. We demonstrate the ability of this method to measure both inputs in a single trial under realistic recording constraints and from morphologically realistic CA1 pyramidal model cells. Future experimental implementation of our new method will facilitate the understanding of fundamental questions about the health and disease of the nervous system.

Most neurons in the brain receive synaptic inputs from both excitatory and inhibitory neurons. Together, these inputs determine neuronal activity: excitatory synapses shift the electrical potential across the membrane towards the threshold for generation of action potentials, whereas inhibitory synapses lower this potential away from the threshold. Action potentials are the rapid electrochemical signals that transmit information to other neurons and they are critical for the information processing abilities of the brain. Although there are many ways to measure either excitatory or inhibitory inputs, these methods have been unable to measure both at the same time. Measuring both inputs together is essential towards understanding how neurons integrate information.
We developed a new analytical method to measure excitatory and inhibitory inputs at the same time from the voltage response to injection of an alternating current into a neuron. We describe the foundation of this new method and find that it works in biologically realistic simulations of neurons. By using this technique in real neurons, scientists could investigate basic principles of information processing in the healthy and diseased brain.
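The core idea, that the voltage response to two injected sinusoids pins down both the total conductance and its excitatory/inhibitory split, can be sketched for a passive point neuron. This is a simplified stand-in for the paper's framework, not its actual derivation: lock-in amplitudes at the two frequencies yield the impedance, hence total conductance and capacitance, and the DC voltage balance then separates gE from gI. All parameter values are illustrative.

```python
import numpy as np

# passive membrane with fixed E/I conductances (all values illustrative)
C, gL, EL, EE, EI = 100.0, 10.0, -70.0, 0.0, -80.0    # pF, nS, mV
gE_true, gI_true = 5.0, 15.0                           # nS
f1, f2 = 0.05, 0.2                                     # kHz (50 Hz and 200 Hz)
A = 100.0                                              # pA per sinusoid
dt = 0.005                                             # ms

t = np.arange(0.0, 400.0, dt)
I = A * np.sin(2*np.pi*f1*t) + A * np.sin(2*np.pi*f2*t)

# Euler-integrate C dV/dt = -gL(V-EL) - gE(V-EE) - gI(V-EI) + I(t)
V = np.empty_like(t)
V[0] = EL
for n in range(len(t) - 1):
    dV = (-gL*(V[n]-EL) - gE_true*(V[n]-EE) - gI_true*(V[n]-EI) + I[n]) / C
    V[n+1] = V[n] + dt * dV

# analysis window: skip the transient, keep whole cycles of both frequencies
win = t >= 100.0
tw, Vw = t[win], V[win]

def lockin_amp(sig, tt, f):
    """Amplitude of the frequency-f component of `sig` (lock-in detection)."""
    return 2.0 * np.abs(np.mean(sig * np.exp(-2j*np.pi*f*tt)))

# impedance magnitude at two frequencies: 1/|Z|^2 = g_tot^2 + (2*pi*f*C)^2
Z1 = lockin_amp(Vw, tw, f1) / A
Z2 = lockin_amp(Vw, tw, f2) / A
w1, w2 = 2*np.pi*f1, 2*np.pi*f2
C_hat = np.sqrt((1/Z2**2 - 1/Z1**2) / (w2**2 - w1**2))
g_tot = np.sqrt(1/Z1**2 - (w1*C_hat)**2)

# DC balance separates excitation from inhibition:
#   gE + gI = g_tot - gL,   gE*EE + gI*EI = g_tot*V0 - gL*EL
V0 = Vw.mean()
M = np.array([[1.0, 1.0], [EE, EI]])
b = np.array([g_tot - gL, g_tot*V0 - gL*EL])
gE_hat, gI_hat = np.linalg.solve(M, b)
print(f"gE ~ {gE_hat:.1f} nS, gI ~ {gI_hat:.1f} nS")
```

In this static toy the conductances are constant over the analysis window; the paper's contribution is doing the extraction instantaneously, on sliding windows, while the conductances vary within a single trial.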
Affiliation(s)
- Daniel Müller-Komorowska
- Institute of Experimental Epileptology and Cognition Research, Life and Brain Center, University of Bonn Medical Center, Bonn, Germany; International Max Planck Research School for Brain and Behavior, University of Bonn, Bonn, Germany
- Ana Parabucki
- Department of Neurobiology, Weizmann Institute of Science, Rehovot, Israel
- Gal Elyasaf
- Department of Neurobiology, Weizmann Institute of Science, Rehovot, Israel
- Yonatan Katz
- Department of Neurobiology, Weizmann Institute of Science, Rehovot, Israel
- Heinz Beck
- Institute of Experimental Epileptology and Cognition Research, Life and Brain Center, University of Bonn Medical Center, Bonn, Germany
- Ilan Lampl
- Department of Neurobiology, Weizmann Institute of Science, Rehovot, Israel
5. Latimer KW, Rieke F, Pillow JW. Inferring synaptic inputs from spikes with a conductance-based neural encoding model. eLife 2019; 8:e47012. PMID: 31850846; PMCID: PMC6989090; DOI: 10.7554/eLife.47012.
Abstract
Descriptive statistical models of neural responses generally aim to characterize the mapping from stimuli to spike responses while ignoring biophysical details of the encoding process. Here, we introduce an alternative approach, the conductance-based encoding model (CBEM), which describes a mapping from stimuli to excitatory and inhibitory synaptic conductances governing the dynamics of sub-threshold membrane potential. Remarkably, we show that the CBEM can be fit to extracellular spike train data and then used to predict excitatory and inhibitory synaptic currents. We validate these predictions with intracellular recordings from macaque retinal ganglion cells. Moreover, we offer a novel quasi-biophysical interpretation of the Poisson generalized linear model (GLM) as a special case of the CBEM in which excitation and inhibition are perfectly balanced. This work forges a new link between statistical and biophysical models of neural encoding and sheds new light on the biophysical variables that underlie spiking in the early visual pathway.
Affiliation(s)
- Kenneth W Latimer
- Department of Physiology and Biophysics, University of Washington, Seattle, United States
- Fred Rieke
- Department of Physiology and Biophysics, University of Washington, Seattle, United States
- Jonathan W Pillow
- Princeton Neuroscience Institute, Department of Psychology, Princeton University, Princeton, United States
6. Clairon Q, Samson A. Optimal control for estimation in partially observed elliptic and hypoelliptic linear stochastic differential equations. Stat Inference Stoch Process 2019. DOI: 10.1007/s11203-019-09199-9.
7. Paninski L, Cunningham JP. Neural data science: accelerating the experiment-analysis-theory cycle in large-scale neuroscience. Curr Opin Neurobiol 2019; 50:232-241. PMID: 29738986; DOI: 10.1016/j.conb.2018.04.007.
Abstract
Modern large-scale multineuronal recording methodologies, including multielectrode arrays, calcium imaging, and optogenetic techniques, produce single-neuron resolution data of a magnitude and precision that were the realm of science fiction twenty years ago. The major bottlenecks in systems and circuit neuroscience no longer lie in simply collecting data from large neural populations, but in understanding these data: developing novel scientific questions, with corresponding analysis techniques and experimental designs to fully harness these new capabilities and meaningfully interrogate these questions. Advances in methods for signal processing, network analysis, dimensionality reduction, and optimal control, developed in lockstep with advances in experimental neurotechnology, promise major breakthroughs in multiple fundamental neuroscience problems. These trends are clear in a broad array of subfields of modern neuroscience; this review focuses on recent advances in methods for analyzing neural time-series data with single-neuronal precision.
Affiliation(s)
- L Paninski
- Department of Statistics and Department of Neuroscience, Grossman Center for the Statistics of Mind, Zuckerman Mind Brain Behavior Institute, Center for Theoretical Neuroscience, Columbia University, United States
- J P Cunningham
- Department of Statistics, Grossman Center for the Statistics of Mind, Zuckerman Mind Brain Behavior Institute, Center for Theoretical Neuroscience, Columbia University, United States
8. Evidence for Long-Timescale Patterns of Synaptic Inputs in CA1 of Awake Behaving Mice. J Neurosci 2017; 38:1821-1834. PMID: 29279309; DOI: 10.1523/JNEUROSCI.1519-17.2017.
Abstract
Repeated sequences of neural activity are a pervasive feature of neural networks in vivo and in vitro. In the hippocampus, sequential firing of many neurons over periods of 100-300 ms reoccurs during behavior and during periods of quiescence. However, it is not known whether the hippocampus produces longer sequences of activity or whether such sequences are restricted to specific network states. Furthermore, whether long repeated patterns of activity are transmitted to single cells downstream is unclear. To answer these questions, we recorded intracellularly from hippocampal CA1 of awake, behaving male mice to examine both subthreshold activity and spiking output in single neurons. In eight of nine recordings, we discovered long (≥900 ms) reoccurring subthreshold fluctuations or "repeats." Repeats generally were high-amplitude, nonoscillatory events reoccurring with 10 ms precision. Using statistical controls, we determined that repeats occurred more often than would be expected from unstructured network activity (e.g., by chance). Most spikes occurred during a repeat, and when a repeat contained a spike, the spike reoccurred with precision on the order of ≤20 ms, showing that long repeated patterns of subthreshold activity are strongly connected to spike output. Unexpectedly, we found that repeats occurred independently of classic hippocampal network states like theta oscillations or sharp-wave ripples. Together, these results reveal surprisingly long patterns of repeated activity in the hippocampal network that occur nonstochastically, are transmitted to single downstream neurons, and strongly shape their output. This suggests that the timescale of information transmission in the hippocampal network is much longer than previously thought.

SIGNIFICANCE STATEMENT: We found long (≥900 ms), repeated, subthreshold patterns of activity in CA1 of awake, behaving mice. These repeated patterns ("repeats") occurred more often than expected by chance and with 10 ms precision.
Most spikes occurred within repeats and reoccurred with a precision on the order of 20 ms. Surprisingly, there was no correlation between repeat occurrence and classical network states such as theta oscillations and sharp-wave ripples. These results provide strong evidence that long patterns of activity are repeated and transmitted to downstream neurons, suggesting that the hippocampus can generate longer sequences of repeated activity than previously thought.
9. Hu S, Zhang Q, Wang J, Chen Z. Real-time particle filtering and smoothing algorithms for detecting abrupt changes in neural ensemble spike activity. J Neurophysiol 2017; 119:1394-1410. PMID: 29357468; DOI: 10.1152/jn.00684.2017.
Abstract
Sequential change-point detection from time series data is a common problem in many neuroscience applications, such as seizure detection, anomaly detection, and pain detection. In our previous work (Chen Z, Zhang Q, Tong AP, Manders TR, Wang J. J Neural Eng 14: 036023, 2017), we developed a latent state-space model, known as the Poisson linear dynamical system, for detecting abrupt changes in neuronal ensemble spike activity. In online brain-machine interface (BMI) applications, a recursive filtering algorithm is used to track the changes in the latent variable. However, previous methods have been restricted to Gaussian dynamical noise and have used Gaussian approximation for the Poisson likelihood. To improve the detection speed, we introduce non-Gaussian dynamical noise for modeling a stochastic jump process in the latent state space. To efficiently estimate the state posterior that accommodates non-Gaussian noise and non-Gaussian likelihood, we propose particle filtering and smoothing algorithms for the change-point detection problem. To speed up the computation, we implement the proposed particle filtering algorithms using advanced graphics processing unit computing technology. We validate our algorithms using both computer simulations and experimental data for acute pain detection. Finally, we discuss several important practical issues in the context of real-time closed-loop BMI applications.

NEW & NOTEWORTHY: Sequential change-point detection is an important problem in closed-loop neuroscience experiments. This study proposes novel sequential Monte Carlo methods to quickly detect the onset and offset of a stochastic jump process that drives the population spike activity. This new approach is robust with respect to spike sorting noise and varying levels of signal-to-noise ratio. The GPU implementation of the computational algorithm allows for parallel processing in real time.
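A stripped-down version of the latent jump-process idea can be written as a bootstrap particle filter: a jump-diffusion transition prior on the log firing rate, Poisson-count observations, and multinomial resampling. This omits the smoothing pass and GPU parallelism of the full method, and the rates, jump probability, and noise scales are made-up illustrative values.

```python
import numpy as np

rng = np.random.default_rng(0)

# simulated data: latent log firing rate jumps from log 5 to log 20 at bin 100
T, t_jump = 200, 100
z_true = np.where(np.arange(T) < t_jump, np.log(5.0), np.log(20.0))
y = rng.poisson(np.exp(z_true))            # observed spike counts per time bin

# bootstrap particle filter with a jump-diffusion transition prior
N = 2000
particles = np.full(N, np.log(5.0))
z_hat = np.empty(T)
for t in range(T):
    # transition: small diffusion most of the time, occasional large jump
    jump = rng.random(N) < 0.02
    particles = particles + np.where(jump, 1.0, 0.02) * rng.standard_normal(N)
    # Poisson log-likelihood weights (particle value = log rate)
    logw = y[t] * particles - np.exp(particles)
    w = np.exp(logw - logw.max())
    w /= w.sum()
    z_hat[t] = np.sum(w * particles)
    # multinomial resampling
    particles = particles[rng.choice(N, size=N, p=w)]

rate_before = np.exp(z_hat[50:95]).mean()
rate_after = np.exp(z_hat[120:]).mean()
print(f"estimated rate before/after the change: {rate_before:.1f} / {rate_after:.1f}")
```

The heavy-tailed transition is what buys detection speed: when the rate jumps, the few particles that propose a large move immediately dominate the Poisson weights and are cloned by resampling, so the posterior relocates within a few bins.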
Affiliation(s)
- Sile Hu
- Department of Instrument Science and Technology, Zhejiang University, Hangzhou, Zhejiang, People's Republic of China; Department of Psychiatry, New York University School of Medicine, New York, New York
- Qiaosheng Zhang
- Department of Anesthesiology, Perioperative Care, and Pain Medicine, New York University School of Medicine, New York, New York
- Jing Wang
- Department of Anesthesiology, Perioperative Care, and Pain Medicine, New York University School of Medicine, New York, New York; Department of Neuroscience and Physiology, New York University School of Medicine, New York, New York
- Zhe Chen
- Department of Psychiatry, New York University School of Medicine, New York, New York; Department of Neuroscience and Physiology, New York University School of Medicine, New York, New York
10. Wright NC, Hoseini MS, Yasar TB, Wessel R. Coupling of synaptic inputs to local cortical activity differs among neurons and adapts after stimulus onset. J Neurophysiol 2017; 118:3345-3359. PMID: 28931610; DOI: 10.1152/jn.00398.2017.
Abstract
Cortical activity contributes significantly to the high variability of sensory responses of interconnected pyramidal neurons, which has crucial implications for sensory coding. Yet, largely because of technical limitations of in vivo intracellular recordings, the coupling of a pyramidal neuron's synaptic inputs to the local cortical activity has evaded full understanding. Here we obtained excitatory synaptic conductance (g) measurements from putative pyramidal neurons and local field potential (LFP) recordings from adjacent cortical circuits during visual processing in the turtle whole brain ex vivo preparation. We found a range of g-LFP coupling across neurons. Importantly, for a given neuron, g-LFP coupling increased at stimulus onset and then relaxed toward intermediate values during continued visual stimulation. A model network with clustered connectivity and synaptic depression reproduced both the diversity and the dynamics of g-LFP coupling. In conclusion, these results establish a rich dependence of single-neuron responses on anatomical, synaptic, and emergent network properties.

NEW & NOTEWORTHY: Cortical neurons are strongly influenced by the networks in which they are embedded. To understand sensory processing, we must identify the nature of this influence and its underlying mechanisms. Here we investigate synaptic inputs to cortical neurons, and the nearby local field potential, during visual processing. We find a range of neuron-to-network coupling across cortical neurons. This coupling is dynamically modulated during visual processing via biophysical and emergent network properties.
Affiliation(s)
- Nathaniel C Wright
- Department of Physics, Washington University in St. Louis, St. Louis, Missouri
- Mahmood S Hoseini
- Department of Physics, Washington University in St. Louis, St. Louis, Missouri
- Tansel Baran Yasar
- Department of Physics, Washington University in St. Louis, St. Louis, Missouri
- Ralf Wessel
- Department of Physics, Washington University in St. Louis, St. Louis, Missouri
11. Kobayashi R, Nishimaru H, Nishijo H, Lansky P. A single spike deteriorates synaptic conductance estimation. Biosystems 2017; 161:41-45. PMID: 28756162; DOI: 10.1016/j.biosystems.2017.07.007.
Abstract
We investigated the estimation accuracy of synaptic conductances by analyzing simulated voltage traces generated by a Hodgkin-Huxley type model. We show that even a single spike substantially deteriorates the estimation. We also demonstrate that two approaches, namely, negative current injection and spike removal, can ameliorate this deterioration.
Affiliation(s)
- Ryota Kobayashi
- Principles of Informatics Research Division, National Institute of Informatics, 2-1-2 Hitotsubashi, Chiyoda-ku, Tokyo, Japan; Department of Informatics, Graduate University for Advanced Studies (Sokendai), 2-1-2 Hitotsubashi, Chiyoda-ku, Tokyo, Japan
- Hiroshi Nishimaru
- System Emotional Science, Graduate School of Medicine and Pharmaceutical Sciences, University of Toyama, Sugitani 2630, Toyama 930-0194, Japan
- Hisao Nishijo
- System Emotional Science, Graduate School of Medicine and Pharmaceutical Sciences, University of Toyama, Sugitani 2630, Toyama 930-0194, Japan
- Petr Lansky
- Institute of Physiology, The Czech Academy of Sciences, 142 20 Prague 4, Czech Republic
12. Vich C, Berg RW, Guillamon A, Ditlevsen S. Estimation of Synaptic Conductances in Presence of Nonlinear Effects Caused by Subthreshold Ionic Currents. Front Comput Neurosci 2017; 11:69. PMID: 28790909; PMCID: PMC5524927; DOI: 10.3389/fncom.2017.00069.
Abstract
Subthreshold fluctuations in neuronal membrane potential traces contain nonlinear components, and employing nonlinear models might improve the statistical inference. We propose a new strategy to estimate synaptic conductances, which has been tested using in silico data and applied to in vivo recordings. The model is constructed to capture the nonlinearities caused by subthreshold activated currents, and the estimation procedure can discern between excitatory and inhibitory conductances using only one membrane potential trace. More precisely, we perform second order approximations of biophysical models to capture the subthreshold nonlinearities, resulting in quadratic integrate-and-fire models, and apply approximate maximum likelihood estimation where we only suppose that conductances are stationary in a 50–100 ms time window. The results show an improvement over existing procedures for the models tested here.
Affiliation(s)
- Catalina Vich
- Departament de Matemàtiques i Informàtica, Universitat de les Illes Balears, Palma, Spain
- Rune W Berg
- Center for Neuroscience, University of Copenhagen, Copenhagen, Denmark
- Antoni Guillamon
- Departament de Matemàtiques, Universitat Politècnica de Catalunya, Barcelona, Spain
- Susanne Ditlevsen
- Department of Mathematical Sciences, University of Copenhagen, Copenhagen, Denmark
13. A state space approach for piecewise-linear recurrent neural networks for identifying computational dynamics from neural measurements. PLoS Comput Biol 2017; 13:e1005542. PMID: 28574992; PMCID: PMC5456035; DOI: 10.1371/journal.pcbi.1005542.
Abstract
The computational and cognitive properties of neural systems are often thought to be implemented in terms of their (stochastic) network dynamics. Hence, recovering the system dynamics from experimentally observed neuronal time series, like multiple single-unit recordings or neuroimaging data, is an important step toward understanding its computations. Ideally, one would not only seek a (lower-dimensional) state space representation of the dynamics, but would wish to have access to its statistical properties and their generative equations for in-depth analysis. Recurrent neural networks (RNNs) are a computationally powerful and dynamically universal formal framework which has been extensively studied from both the computational and the dynamical systems perspective. Here we develop a semi-analytical maximum-likelihood estimation scheme for piecewise-linear RNNs (PLRNNs) within the statistical framework of state space models, which accounts for noise in both the underlying latent dynamics and the observation process. The Expectation-Maximization algorithm is used to infer the latent state distribution, through a global Laplace approximation, and the PLRNN parameters iteratively. After validating the procedure on toy examples, and using inference through particle filters for comparison, the approach is applied to multiple single-unit recordings from the rodent anterior cingulate cortex (ACC) obtained during performance of a classical working memory task, delayed alternation. Models estimated from kernel-smoothed spike time data were able to capture the essential computational dynamics underlying task performance, including stimulus-selective delay activity. The estimated models were rarely multi-stable, however, but rather were tuned to exhibit slow dynamics in the vicinity of a bifurcation point. 
In summary, the present work advances a semi-analytical (thus reasonably fast) maximum-likelihood estimation framework for PLRNNs that may enable recovering relevant aspects of the nonlinear dynamics underlying observed neuronal time series and linking these directly to computational properties.
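The generative model underlying this approach (not the EM inference itself, which is the paper's contribution) is compact enough to sketch: a latent piecewise-linear RNN with Gaussian process noise, observed through a linear-Gaussian map. The dimensions, noise scales, and the diagonal choice for the linear part are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

dz, dx, T = 3, 10, 500
A = np.diag([0.7, 0.6, 0.5])               # diagonal linear dynamics
W = 0.1 * rng.standard_normal((dz, dz))    # off-diagonal nonlinear coupling
np.fill_diagonal(W, 0.0)
h = 0.1 * rng.standard_normal(dz)          # bias term
B = rng.standard_normal((dx, dz))          # observation (loading) matrix

z = np.zeros(dz)
Z = np.empty((T, dz))
X = np.empty((T, dx))
for t in range(T):
    # piecewise-linear latent transition: z <- A z + W max(z, 0) + h + noise
    z = A @ z + W @ np.maximum(z, 0.0) + h + 0.05 * rng.standard_normal(dz)
    Z[t] = z
    # linear-Gaussian observation, e.g. of kernel-smoothed spike counts
    X[t] = B @ z + 0.1 * rng.standard_normal(dx)

print("latent shape:", Z.shape, "observed shape:", X.shape)
```

The ReLU term is what makes the model a universal approximator of dynamical systems while keeping each linear region analytically tractable, which is what the global Laplace approximation in the E-step exploits.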
14. Puggioni P, Jelitai M, Duguid I, van Rossum MCW. Extraction of Synaptic Input Properties in Vivo. Neural Comput 2017; 29:1745-1768. PMID: 28562220; DOI: 10.1162/neco_a_00975.
Abstract
Knowledge of synaptic input is crucial for understanding synaptic integration and ultimately neural function. However, in vivo, the rates at which synaptic inputs arrive are high, so that it is typically impossible to detect single events. We show here that it is nevertheless possible to extract the properties of the events and, in particular, to extract the event rate, the synaptic time constants, and the properties of the event size distribution from in vivo voltage-clamp recordings. Applied to cerebellar interneurons, our method reveals that the synaptic input rate increases from 600 Hz during rest to 1000 Hz during locomotion, while the amplitude and shape of the synaptic events are unaffected by this state change. This method thus complements existing methods to measure neural function in vivo.
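That event rates can be recovered without resolving single events can be illustrated with a moment-based toy estimator, much cruder than the paper's method (which also infers the time constant and the full amplitude distribution). For exponential shot noise with rate nu, amplitude a, and known decay time tau, Campbell's theorem gives mean = nu*a*tau and variance = nu*a^2*tau/2, so nu = mean^2/(2*variance*tau) and a = 2*variance/mean. All numerical values below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

nu, a, tau = 0.6, 10.0, 3.0        # events/ms (600 Hz), pA, ms (illustrative)
dt, T = 0.1, 20000.0               # ms
n = int(T / dt)
counts = rng.poisson(nu * dt, n)   # Poisson event counts per time bin

# exponential shot noise: each event adds a step of size `a` decaying with tau
x = np.empty(n)
x[0] = nu * a * tau                # start near the stationary mean
decay = np.exp(-dt / tau)
for i in range(1, n):
    x[i] = x[i-1] * decay + a * counts[i]

# Campbell's theorem: mean = nu*a*tau, var = nu*a^2*tau/2 (tau assumed known)
m, v = x.mean(), x.var()
a_hat = 2.0 * v / m
nu_hat = m**2 / (2.0 * v * tau)
print(f"estimated rate: {nu_hat*1000:.0f} Hz, amplitude: {a_hat:.1f} pA")
```

At 600 Hz individual events overlap completely, yet the first two moments of the summed trace still pin down both the rate and the mean amplitude; the paper's estimator extends this idea to the full characteristic function of the noise.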
Affiliation(s)
- Paolo Puggioni
- Neuroinformatics Doctoral Training Centre and Institute for Adaptive and Neural Computation, School of Informatics, University of Edinburgh, Edinburgh EH8 9AB, U.K.
- Marta Jelitai
- Centre for Integrative Physiology, University of Edinburgh, Edinburgh EH8 9XD, U.K.
- Ian Duguid
- Centre for Integrative Physiology, University of Edinburgh, Edinburgh EH8 9XD, U.K.
- Mark C W van Rossum
- Institute for Adaptive and Neural Computation, School of Informatics, University of Edinburgh, Edinburgh EH8 9AB, U.K.
15. Lankarany M, Heiss JE, Lampl I, Toyoizumi T. Simultaneous Bayesian Estimation of Excitatory and Inhibitory Synaptic Conductances by Exploiting Multiple Recorded Trials. Front Comput Neurosci 2016; 10:110. PMID: 27867353; PMCID: PMC5095134; DOI: 10.3389/fncom.2016.00110.
Abstract
Advanced statistical methods have enabled trial-by-trial inference of the underlying excitatory and inhibitory synaptic conductances (SCs) from membrane-potential recordings. Simultaneous inference of both excitatory and inhibitory SCs sheds light on the neural circuits underlying the neural activity and advances our understanding of neural information processing. Conventional Bayesian methods can infer excitatory and inhibitory SCs from a single trial of observed membrane potential. However, when multiple recorded trials are available, such single-trial methods are typically suboptimal because they neglect the statistics of synaptic inputs (SIs) that are common across trials. Here, we establish a new expectation maximization (EM) algorithm that improves on single-trial Bayesian methods by exploiting multiple recorded trials to extract common SI statistics across trials. The proposed EM algorithm embeds parallel Kalman filters or particle filters, one per recorded trial, and integrates their outputs to iteratively update the common SI statistics. These statistics are then used to infer the excitatory and inhibitory SCs of individual trials. We demonstrate the superior performance of multiple-trial Kalman filtering (MtKF) and particle filtering (MtPF) relative to that of the corresponding single-trial methods. While the relative estimation error of excitatory and inhibitory SCs is known to depend on the level of current injection into a cell, our numerical simulations using MtKF show that both excitatory and inhibitory SCs are reliably inferred at an optimal level of current injection. Finally, we validate the robustness and applicability of our technique through simulation studies, and we apply MtKF to in vivo data recorded from the rat barrel cortex.
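The statistical intuition behind pooling trials can be shown in a toy sketch. This is not the paper's EM/Kalman algorithm: it merely illustrates that when every trial observes the same underlying input statistic through independent noise, combining per-trial estimates shrinks the error roughly as one over the square root of the number of trials. All values below are invented for the demo.

```python
import random
import statistics

random.seed(3)
true_mean, noise_sd, n_samples, n_trials = 5.0, 2.0, 50, 20

def trial_estimate():
    """A single trial's noisy estimate of the shared input mean."""
    return statistics.fmean(random.gauss(true_mean, noise_sd) for _ in range(n_samples))

single = trial_estimate()                                             # one trial alone
pooled = statistics.fmean(trial_estimate() for _ in range(n_trials))  # pooled over trials
```

With these settings the single-trial estimate has a standard error of about 0.28, while the pooled one is near 0.06, which is the gain the multiple-trial methods exploit (here in closed form; the paper iterates because the latent conductances must be filtered first).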
Affiliation(s)
- Milad Lankarany: Neurosciences and Mental Health, Department of Physiology and the Institute of Biomaterials and Biomedical Engineering, University of Toronto, The Hospital for Sick Children, Toronto, ON, Canada; RIKEN Brain Science Institute, Saitama, Japan
- Jaime E Heiss: Center for Neuroscience, Biosciences Division, SRI International, Menlo Park, CA, USA
- Ilan Lampl: Department of Neurobiology, Weizmann Institute of Science, Rehovot, Israel

16
Teixeira da Silva JA, Dobránszki J. Problems with traditional science publishing and finding a wider niche for post-publication peer review. Account Res 2016; 22:22-40. [PMID: 25275622 DOI: 10.1080/08989621.2014.899909] [Citation(s) in RCA: 55] [Impact Index Per Article: 6.1] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 10/24/2022]
Abstract
Science affects multiple basic sectors of society, so the findings of science also shape what takes place at a commercial level. More specifically, errors in the literature, incorrect findings, fraudulent data, poorly written scientific reports, and studies that cannot be reproduced not only burden taxpayers' money but also diminish public trust in science and its findings. There is therefore every need to fortify the validity of data in the scientific literature: to build and sustain trust among peers, and to reestablish trust in the public and private academic sectors, which are witnessing a veritable battleground in the world of science publishing, spurred in part by the rapid evolution of the open access (OA) movement. Even though many science journals, traditional and OA, claim to be peer reviewed, in reality different levels of peer review occur, and in some cases no, insufficient, or pseudo-peer review takes place. This erodes the quality and importance of science, allowing essentially anything to become published, provided that an outlet can be found. In some cases, predatory OA journals serve this purpose, publishing papers often without any peer review or quality control. In light of an explosion of such cases in predatory OA publishing, and of severe inefficiencies and possible bias in the peer review of even respectable science journals, as evidenced by the increasing attention given to retractions, there is an urgent need to reform the way in which authors, editors, and publishers conduct the first line of quality control: peer review. One way to address the problem is through post-publication peer review (PPPR), an efficient complement to traditional peer review that allows for the continuous improvement and strengthening of the quality of science publishing. PPPR may also serve to renew trust in scientific findings by correcting the literature. This article explores what is broadly being said about PPPR in the literature, so as to establish awareness and a possible first-tier prototype for the sciences in which such a system is undeveloped or weak.
17
Kobayashi R, Nishimaru H, Nishijo H. Estimation of excitatory and inhibitory synaptic conductance variations in motoneurons during locomotor-like rhythmic activity. Neuroscience 2016; 335:72-81. [PMID: 27561702 DOI: 10.1016/j.neuroscience.2016.08.027] [Citation(s) in RCA: 8] [Impact Index Per Article: 0.9] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/13/2016] [Revised: 08/06/2016] [Accepted: 08/12/2016] [Indexed: 11/28/2022]
Abstract
The rhythmic activity of motoneurons (MNs) that underlies locomotion in mammals is generated by synaptic inputs from the locomotor network in the spinal cord. Thus, the quantitative estimation of excitatory and inhibitory synaptic conductances is essential to understand the mechanism by which the network generates the functional motor output. Conductance estimation is obtained from the voltage-current relationship measured by voltage-clamp or current-clamp recording, with knowledge of the leak parameters of the recorded neuron. However, it is often difficult to obtain sufficient data to estimate synaptic conductances due to technical difficulties in electrophysiological experiments using in vivo or in vitro preparations. To address this problem, we estimated the average variations in excitatory and inhibitory synaptic conductance during a locomotion cycle from a single voltage trace, without measuring the leak parameters. By analyzing synthetic data generated from a computational model, we found that the conductance variations can be accurately reconstructed from a voltage trace of 10 cycles. Next, the conductance variations were estimated from mouse spinal MNs in vitro during drug-induced locomotor-like activity. We found that the peak of excitatory conductance occurred during the depolarizing phase of the locomotor cycle, whereas the peak of inhibitory conductance occurred during the hyperpolarizing phase. These results suggest that the locomotor-like activity is generated by push-pull modulation via excitatory and inhibitory synaptic inputs.
Affiliation(s)
- Ryota Kobayashi: Principles of Informatics Research Division, National Institute of Informatics, 2-1-2 Hitotsubashi, Chiyoda-ku, Tokyo 101-0003, Japan; Department of Informatics, SOKENDAI (The Graduate University for Advanced Studies), 2-1-2 Hitotsubashi, Chiyoda-ku, Tokyo, Japan
- Hiroshi Nishimaru: System Emotional Science, Graduate School of Medicine and Pharmaceutical Sciences, University of Toyama, Sugitani 2630, Toyama 930-0194, Japan; Faculty of Medicine, University of Tsukuba, Tsukuba 305-8575, Japan
- Hisao Nishijo: System Emotional Science, Graduate School of Medicine and Pharmaceutical Sciences, University of Toyama, Sugitani 2630, Toyama 930-0194, Japan

18
Merel J, Shababo B, Naka A, Adesnik H, Paninski L. Bayesian methods for event analysis of intracellular currents. J Neurosci Methods 2016; 269:21-32. [DOI: 10.1016/j.jneumeth.2016.05.015] [Citation(s) in RCA: 6] [Impact Index Per Article: 0.7] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/22/2016] [Revised: 05/13/2016] [Accepted: 05/16/2016] [Indexed: 01/04/2023]
19
Yaşar TB, Wright NC, Wessel R. Inferring presynaptic population spiking from single-trial membrane potential recordings. J Neurosci Methods 2016; 259:13-21. [PMID: 26658223 DOI: 10.1016/j.jneumeth.2015.11.019] [Citation(s) in RCA: 8] [Impact Index Per Article: 0.9] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/23/2015] [Revised: 11/20/2015] [Accepted: 11/23/2015] [Indexed: 10/22/2022]
Abstract
BACKGROUND: The time-varying membrane potential of a cortical neuron contains important information about the network activity. Extracting this information requires separating excitatory and inhibitory synaptic inputs from single-trial membrane potential recordings, without averaging across trials.
NEW METHOD: We propose a method to extract the time course of excitatory and inhibitory synaptic inputs to a neuron from a single-trial membrane potential recording. The method takes advantage of the differences in the time constants and the reversal potentials of the excitatory and inhibitory synaptic currents, which allows the untangling of the two conductance types.
RESULTS: We evaluate the applicability of the method on a leaky integrate-and-fire model neuron and find high-quality estimation of excitatory synaptic conductance changes and presynaptic population spikes. Application of the method to a real cortical neuron with known synaptic inputs in a brain slice returns a high-quality estimate of the time course of the excitatory synaptic conductance. Application of the method to membrane potential recordings from a cortical pyramidal neuron of an intact brain reveals complex network activity.
COMPARISON WITH EXISTING METHODS: Existing methods are based on repeated trials and thus are limited to estimating the statistical features of synaptic conductance changes, or, when based on single trials, are limited to special cases, have low temporal resolution, or are impractically complicated.
CONCLUSIONS: We propose and test an efficient method for estimating the full time course of excitatory and inhibitory synaptic conductances from single-trial membrane potential recordings. The method is sufficiently simple to ensure widespread use in neuroscience.
Affiliation(s)
- Tansel Baran Yaşar: Department of Physics, Campus Box 1105, Washington University, Saint Louis, MO 63130-4899, USA
- Nathaniel Caleb Wright: Department of Physics, Campus Box 1105, Washington University, Saint Louis, MO 63130-4899, USA
- Ralf Wessel: Department of Physics, Campus Box 1105, Washington University, Saint Louis, MO 63130-4899, USA

20
Dissecting estimation of conductances in subthreshold regimes. J Comput Neurosci 2015; 39:271-87. [PMID: 26432075 DOI: 10.1007/s10827-015-0576-2] [Citation(s) in RCA: 9] [Impact Index Per Article: 0.9] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/06/2015] [Revised: 09/03/2015] [Accepted: 09/09/2015] [Indexed: 10/24/2022]
Abstract
We study the influence of subthreshold activity on the estimation of synaptic conductances. It is known that differences between actual conductances and those estimated using linear regression methods can be huge in spiking regimes, so care has been taken to remove spiking activity from experimental data before proceeding to linear estimation. However, not much attention has been paid to the influence of ionic currents active in the non-spiking regime, where such linear methods are still profusely used. In this paper, we use conductance-based models to test this influence, using several representative mechanisms to induce ionic subthreshold activity. In all cases, we show that the currents activated during subthreshold activity can lead to significant errors when estimating synaptic conductances linearly. Our results thus add a new note of caution about extracting conductance traces from intracellular recordings, and about the conclusions concerning neuronal activity that can be drawn from them. Additionally, we present, as a proof of concept, an alternative method that takes into account the main nonlinear effects of specific ionic subthreshold currents. This method, based on a quadratization of the subthreshold dynamics, allows us to reduce the relative errors of the estimated conductances by more than one order of magnitude. Under experimental conditions, with appropriate fitting to canonical models, it could also yield better estimates, even in the presence of noise.
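The linear estimator under scrutiny here fits the membrane equation by least squares. A minimal, noise-free sketch (not the paper's code; all parameter values invented): simulate C dV/dt = -gL(V-EL) - gE(V-EE) - gI(V-EI) with constant synaptic conductances, then recover gE and gI by regressing the membrane current on the two synaptic driving forces.

```python
# Membrane and synaptic parameters (illustrative values only)
C, gL, EL = 1.0, 0.1, -65.0      # capacitance, leak conductance, leak reversal
EE, EI = 0.0, -80.0              # excitatory / inhibitory reversal potentials
gE_true, gI_true = 0.05, 0.08    # conductances the estimator should recover
dt, n = 0.1, 2000

# Forward-Euler simulation; the relaxation transient from V=-60 mV to steady
# state provides the voltage variation the regression needs.
V = [-60.0]
for _ in range(n):
    dV = (-gL*(V[-1] - EL) - gE_true*(V[-1] - EE) - gI_true*(V[-1] - EI)) / C
    V.append(V[-1] + dt*dV)

# Linear estimation: regress y = C dV/dt + gL(V-EL) on the driving forces
# x1 = -(V-EE) and x2 = -(V-EI), solving the 2x2 normal equations by hand.
S11 = S12 = S22 = b1 = b2 = 0.0
for k in range(n):
    y = C*(V[k+1] - V[k])/dt + gL*(V[k] - EL)
    x1, x2 = -(V[k] - EE), -(V[k] - EI)
    S11 += x1*x1; S12 += x1*x2; S22 += x2*x2
    b1 += x1*y; b2 += x2*y
det = S11*S22 - S12*S12
gE_hat = (S22*b1 - S12*b2)/det
gI_hat = (S11*b2 - S12*b1)/det
```

In this purely linear, noise-free setting the recovery is exact; the paper's point is that adding an active subthreshold current to the simulation (not modeled above) biases exactly this estimator.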
21
Kobayashi R, He J, Lansky P. Estimation of the synaptic input firing rates and characterization of the stimulation effects in an auditory neuron. Front Comput Neurosci 2015; 9:59. [PMID: 26042025 PMCID: PMC4435043 DOI: 10.3389/fncom.2015.00059] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/18/2015] [Accepted: 04/30/2015] [Indexed: 11/15/2022] Open
Abstract
To understand information processing in neuronal circuits, it is important to infer how a sensory stimulus impacts the synaptic input to a neuron. An increase in neuronal firing during stimulation can result from pure excitation or from a combination of excitation and inhibition. Here, we develop a method for estimating the rates of the excitatory and inhibitory synaptic inputs from a membrane voltage trace of a neuron. The method is based on a modified Ornstein-Uhlenbeck neuronal model, which aims to describe the stimulation effects on the synaptic input. The method is tested using a single-compartment neuron model with a realistic description of synaptic inputs, and it is applied to an intracellular voltage trace recorded from an auditory neuron in vivo. We find that both the excitatory and inhibitory inputs increase during stimulation, suggesting that the acoustic stimuli are encoded by a combination of excitation and inhibition.
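The Ornstein-Uhlenbeck membrane model at the heart of such estimators is compact enough to sketch. Below, an Euler-Maruyama simulation of dV = ((mu - V)/tau) dt + sigma dW with invented parameters; the paper's modified model additionally lets the input statistics change with stimulation, which is not reproduced here.

```python
import math
import random

random.seed(1)
mu, tau, sigma = -55.0, 10.0, 1.0   # stationary mean, time constant, noise scale
dt, n_steps = 0.1, 200_000

# Euler-Maruyama integration of the OU voltage process
V, total, total_sq = mu, 0.0, 0.0
for _ in range(n_steps):
    V += (mu - V)/tau*dt + sigma*math.sqrt(dt)*random.gauss(0.0, 1.0)
    total += V
    total_sq += V*V

mean = total/n_steps
var = total_sq/n_steps - mean*mean
# Stationary theory predicts E[V] = mu and Var[V] = sigma^2 * tau / 2,
# which is what estimators of this family invert to infer input parameters.
```

Checking that the empirical mean and variance match the stationary formulas is the basic consistency test behind fitting such a model to a recorded voltage trace.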
Affiliation(s)
- Ryota Kobayashi: Principles of Informatics Research Division, National Institute of Informatics, Tokyo, Japan; Department of Informatics, SOKENDAI (The Graduate University for Advanced Studies), Tokyo, Japan
- Jufang He: Department of Biomedical Sciences, City University of Hong Kong, Hong Kong, China
- Petr Lansky: Institute of Physiology, The Czech Academy of Sciences, Prague, Czech Republic

22
O'Leary T, Sutton AC, Marder E. Computational models in the age of large datasets. Curr Opin Neurobiol 2015; 32:87-94. [PMID: 25637959 DOI: 10.1016/j.conb.2015.01.006] [Citation(s) in RCA: 49] [Impact Index Per Article: 4.9] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/01/2014] [Accepted: 01/10/2015] [Indexed: 10/24/2022]
Abstract
Technological advances in experimental neuroscience are generating vast quantities of data, from the dynamics of single molecules to the structure and activity patterns of large networks of neurons. How do we make sense of these voluminous, complex, disparate and often incomplete data? How do we find general principles in the morass of detail? Computational models are invaluable and necessary in this task and yield insights that cannot otherwise be obtained. However, building and interpreting good computational models is a substantial challenge, especially so in the era of large datasets. Fitting detailed models to experimental data is difficult and often requires onerous assumptions, while more loosely constrained conceptual models that explore broad hypotheses and principles can yield more useful insights.
Affiliation(s)
- Timothy O'Leary: Biology Department and Volen Center, Brandeis University, Waltham, MA 02454, United States
- Alexander C Sutton: Biology Department and Volen Center, Brandeis University, Waltham, MA 02454, United States
- Eve Marder: Biology Department and Volen Center, Brandeis University, Waltham, MA 02454, United States

23
Balaguer-Ballester E, Tabas-Diaz A, Budka M. Can we identify non-stationary dynamics of trial-to-trial variability? PLoS One 2014; 9:e95648. [PMID: 24769735 PMCID: PMC4000201 DOI: 10.1371/journal.pone.0095648] [Citation(s) in RCA: 5] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/01/2014] [Accepted: 03/28/2014] [Indexed: 11/19/2022] Open
Abstract
Identifying sources of the apparent variability in non-stationary scenarios is a fundamental problem in many biological data analysis settings. For instance, neurophysiological responses to the same task often vary from one repetition of the same experiment (trial) to the next. The origin and functional role of this observed variability is one of the fundamental questions in neuroscience, yet the nature of such trial-to-trial dynamics remains largely elusive to current data analysis approaches. A range of strategies have been proposed in modalities such as electroencephalography, but gaining a fundamental insight into latent sources of trial-to-trial variability in neural recordings is still a major challenge. In this paper, we present a proof-of-concept study of the analysis of trial-to-trial variability dynamics founded on non-autonomous dynamical systems. At this initial stage, we evaluate the capacity of a simple statistic based on the behaviour of trajectories in classification settings, the trajectory coherence, to identify trial-to-trial dynamics. First, we derive the conditions leading to observable changes in datasets generated by a compact dynamical system (the Duffing equation), a canonical model of non-stationary supervised classification problems. Second, we estimate the coherence of class trajectories in an empirically reconstructed space of system states. We show how this analysis can discern variations attributable to non-autonomous deterministic processes from stochastic fluctuations. The analyses are benchmarked using simulated data and two different real datasets which have been shown to exhibit attractor dynamics. As an illustrative example, we focus on the analysis of rat frontal cortex ensemble dynamics during a decision-making task. Results suggest that, in line with recent hypotheses, it is the deterministic trend, rather than internal noise, that most likely underlies the observed trial-to-trial variability. Thus, the empirical tool developed within this study potentially allows us to infer the source of variability in in vivo neural recordings.
Affiliation(s)
- Emili Balaguer-Ballester: Faculty of Science and Technology, Bournemouth University, United Kingdom; Bernstein Center for Computational Neuroscience, Medical Faculty Mannheim and Heidelberg University, Mannheim, Germany
- Marcin Budka: Faculty of Science and Technology, Bournemouth University, United Kingdom

24
Chizhov AV, Malinina E, Druzin M, Graham LJ, Johansson S. Firing clamp: a novel method for single-trial estimation of excitatory and inhibitory synaptic neuronal conductances. Front Cell Neurosci 2014; 8:86. [PMID: 24734000 PMCID: PMC3973923 DOI: 10.3389/fncel.2014.00086] [Citation(s) in RCA: 4] [Impact Index Per Article: 0.4] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/21/2013] [Accepted: 03/08/2014] [Indexed: 11/13/2022] Open
Abstract
Understanding non-stationary neuronal activity as seen in vivo requires estimation of both excitatory and inhibitory synaptic conductances from a single trial of recording. For this purpose, we propose a new intracellular recording method, called “firing clamp.” Synaptic conductances are estimated from the characteristics of artificially evoked probe spikes, namely the spike amplitude and the mean subthreshold potential, which are sensitive to both excitatory and inhibitory synaptic input signals. The probe spikes, timed at a fixed rate, are evoked in the dynamic-clamp mode by injected meander-like current steps, with the step duration depending on neuronal membrane voltage. We test the method with perforated-patch recordings from isolated cells stimulated by external application or synaptic release of transmitter, and validate the method with simulations of a biophysically-detailed neuron model. The results are compared with the conductance estimates based on conventional current-clamp recordings.
Affiliation(s)
- Anton V Chizhov: Computational Physics Laboratory, Division of Plasma Physics, Atomic Physics and Astrophysics, A.F. Ioffe Physical-Technical Institute of the Russian Academy of Sciences, St. Petersburg, Russia
- Evgenya Malinina: Section for Physiology, Department of Integrative Medical Biology, Umeå University, Umeå, Sweden
- Michael Druzin: Section for Physiology, Department of Integrative Medical Biology, Umeå University, Umeå, Sweden; Department of Neurodynamics and Neurobiology, Lobachevsky State University of Nizhny Novgorod, Nizhny Novgorod, Russia
- Lyle J Graham: Neurophysiology and New Microscopies Laboratory, INSERM U603 - CNRS UMR 8154, Université Paris Descartes, Paris, France
- Staffan Johansson: Section for Physiology, Department of Integrative Medical Biology, Umeå University, Umeå, Sweden

25
Fast state-space methods for inferring dendritic synaptic connectivity. J Comput Neurosci 2013; 36:415-43. [DOI: 10.1007/s10827-013-0478-0] [Citation(s) in RCA: 8] [Impact Index Per Article: 0.7] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/09/2012] [Revised: 07/22/2013] [Accepted: 08/14/2013] [Indexed: 02/06/2023]
26
Lankarany M, Zhu WP, Swamy MNS, Toyoizumi T. Inferring trial-to-trial excitatory and inhibitory synaptic inputs from membrane potential using Gaussian mixture Kalman filtering. Front Comput Neurosci 2013; 7:109. [PMID: 24027523 PMCID: PMC3759749 DOI: 10.3389/fncom.2013.00109] [Citation(s) in RCA: 18] [Impact Index Per Article: 1.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/23/2013] [Accepted: 07/24/2013] [Indexed: 12/03/2022] Open
Abstract
Time-varying excitatory and inhibitory synaptic inputs govern the activity of neurons and the processing of information in the brain. The importance of trial-to-trial fluctuations of synaptic inputs has recently been investigated in neuroscience. Such fluctuations are ignored by most conventional techniques because they are removed when trials are averaged during linear regression. Here, we propose a novel recursive algorithm based on Gaussian mixture Kalman filtering (GMKF) for estimating time-varying excitatory and inhibitory synaptic inputs from single trials of noisy membrane potential in current-clamp recordings. The Kalman filtering is followed by an expectation maximization (EM) algorithm to infer the statistical parameters (time-varying mean and variance) of the synaptic inputs in a non-parametric manner. As the proposed algorithm is repeated recursively, the inferred parameters of the mixtures are used to initiate the next iteration. Unlike other recent algorithms, our algorithm does not assume an a priori distribution from which the synaptic inputs are generated. Instead, it recursively estimates such a distribution by fitting a Gaussian mixture model (GMM). The performance of the proposed algorithm is compared to that of a previously proposed particle-filter-based algorithm (Paninski et al., 2012) in several illustrative examples, assuming that the distribution of the synaptic input is unknown. If the noise is small, the performance of our algorithm is similar to that of the previous one; if the noise is large, however, it can significantly outperform the previous proposal. These promising results suggest that our algorithm is a robust and efficient technique for estimating time-varying excitatory and inhibitory synaptic conductances from single trials of membrane potential recordings.
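For readers unfamiliar with the Kalman recursion that GMKF builds on, a bare scalar version is sketched below; the paper's algorithm adds the Gaussian-mixture input model and the EM loop on top of this predict/update cycle. The signal and all settings here are invented for illustration.

```python
import random

random.seed(2)
q, r = 0.01, 1.0          # process and observation noise variances

# Latent random-walk signal observed through additive Gaussian noise
x, xs, ys = 0.0, [], []
for _ in range(500):
    x += random.gauss(0.0, q**0.5)
    xs.append(x)
    ys.append(x + random.gauss(0.0, r**0.5))

# Scalar Kalman recursion: predict (variance grows by q), then update
m, p, est = 0.0, 1.0, []
for y in ys:
    p += q                 # predict step: prior variance
    K = p/(p + r)          # Kalman gain
    m += K*(y - m)         # posterior mean
    p *= (1.0 - K)         # posterior variance
    est.append(m)

# Filtering should track the latent signal far better than the raw data do
mse_raw = sum((y - x)**2 for x, y in zip(xs, ys))/len(xs)
mse_kf = sum((m - x)**2 for x, m in zip(xs, est))/len(xs)
```

With these noise levels the steady-state posterior variance is about 0.1 versus an observation variance of 1.0, which is the error reduction the filter delivers.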
Affiliation(s)
- M Lankarany: Department of Electrical and Computer Engineering, Concordia University, Montreal, QC, Canada; RIKEN Brain Science Institute, Saitama, Japan

27
|
Lankarany M, Zhu WP, Swamy MNS, Toyoizumi T. Trial-to-trial tracking of excitatory and inhibitory synaptic conductance using Gaussian-mixture Kalman filtering. BMC Neurosci 2013. [PMCID: PMC3704248 DOI: 10.1186/1471-2202-14-s1-o2] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.1] [Reference Citation Analysis] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/10/2022] Open
28
Odom SE, Borisyuk A. Estimating three synaptic conductances in a stochastic neural model. J Comput Neurosci 2012; 33:191-205. [PMID: 22322649 DOI: 10.1007/s10827-012-0382-z] [Citation(s) in RCA: 6] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/10/2011] [Revised: 12/13/2011] [Accepted: 01/12/2012] [Indexed: 11/28/2022]
Abstract
We present a method for the reconstruction of three stimulus-evoked time-varying synaptic input conductances from voltage recordings. Our approach is based on exploiting the stochastic nature of synaptic conductances and membrane voltage. Starting with the assumption that the variances of the conductances are known, we use a stochastic differential equation to model dynamics of membrane potential and derive equations for first and second moments that can be solved to find conductances. We successfully apply the new reconstruction method to simulated data. We also explore the robustness of the method as the assumptions of the underlying model are relaxed. We vary the noise levels, the reversal potentials, the number of stimulus repetitions, and the accuracy of conductance variance estimation to quantify the robustness of reconstruction. These studies pave the way for the application of the method to experimental data.
Affiliation(s)
- Stephen E Odom: Department of Mathematics, University of Utah, Salt Lake City, UT 84112, USA

29
Paninski L, Ahmadian Y, Ferreira DG, Koyama S, Rahnama Rad K, Vidne M, Vogelstein J, Wu W. A new look at state-space models for neural data. J Comput Neurosci 2010; 29:107-126. [PMID: 19649698 PMCID: PMC3712521 DOI: 10.1007/s10827-009-0179-x] [Citation(s) in RCA: 105] [Impact Index Per Article: 7.0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/22/2008] [Revised: 07/06/2009] [Accepted: 07/16/2009] [Indexed: 10/20/2022]
Abstract
State space methods have proven indispensable in neural data analysis. However, common methods for performing inference in state-space models with non-Gaussian observations rely on certain approximations which are not always accurate. Here we review direct optimization methods that avoid these approximations, but that nonetheless retain the computational efficiency of the approximate methods. We discuss a variety of examples, applying these direct optimization techniques to problems in spike train smoothing, stimulus decoding, parameter estimation, and inference of synaptic properties. Along the way, we point out connections to some related standard statistical methods, including spline smoothing and isotonic regression. Finally, we note that the computational methods reviewed here do not in fact depend on the state-space setting at all; instead, the key property we are exploiting involves the bandedness of certain matrices. We close by discussing some applications of this more general point of view, including Markov chain Monte Carlo methods for neural decoding and efficient estimation of spatially-varying firing rates.
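The "bandedness" the authors exploit is concrete: for a linear-Gaussian state-space model, MAP smoothing reduces to solving a tridiagonal (banded) linear system, which costs O(T) rather than O(T^3). As a sketch of that key primitive (with a generic diagonally dominant matrix, not a neural model), the classic Thomas algorithm in pure Python:

```python
def thomas_solve(sub, diag, sup, rhs):
    """Solve a tridiagonal system in O(n): forward elimination, back-substitution."""
    n = len(diag)
    cp, dp = [0.0]*n, [0.0]*n
    cp[0], dp[0] = sup[0]/diag[0], rhs[0]/diag[0]
    for i in range(1, n):
        denom = diag[i] - sub[i]*cp[i-1]
        cp[i] = sup[i]/denom if i < n - 1 else 0.0
        dp[i] = (rhs[i] - sub[i]*dp[i-1])/denom
    x = [0.0]*n
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):
        x[i] = dp[i] - cp[i]*x[i+1]
    return x

# Example system: -x[i-1] + 4*x[i] - x[i+1] = 1 for each row
n = 6
sub = [0.0] + [-1.0]*(n - 1)    # sub-diagonal (sub[0] unused)
diag = [4.0]*n                  # main diagonal
sup = [-1.0]*(n - 1) + [0.0]    # super-diagonal (sup[-1] unused)
x = thomas_solve(sub, diag, sup, [1.0]*n)
```

In the smoothing application, the three bands come from the prior coupling of consecutive states plus the per-time-step likelihood terms; wider Markov dependence simply widens the band.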
Affiliation(s)
- Liam Paninski: Department of Statistics and Center for Theoretical Neuroscience, Columbia University, New York, NY, USA
- Yashar Ahmadian: Department of Statistics and Center for Theoretical Neuroscience, Columbia University, New York, NY, USA
- Daniel Gil Ferreira: Department of Statistics and Center for Theoretical Neuroscience, Columbia University, New York, NY, USA
- Shinsuke Koyama: Department of Statistics, Carnegie Mellon University, Pittsburgh, PA, USA
- Kamiar Rahnama Rad: Department of Statistics and Center for Theoretical Neuroscience, Columbia University, New York, NY, USA
- Michael Vidne: Department of Statistics and Center for Theoretical Neuroscience, Columbia University, New York, NY, USA
- Joshua Vogelstein: Department of Neuroscience, Johns Hopkins University, Baltimore, MD, USA
- Wei Wu: Department of Statistics, Florida State University, Tallahassee, FL, USA