1
Schieferstein N, Schwalger T, Lindner B, Kempter R. Intra-ripple frequency accommodation in an inhibitory network model for hippocampal ripple oscillations. PLoS Comput Biol 2024;20:e1011886. PMID: 38377147; PMCID: PMC10923461; DOI: 10.1371/journal.pcbi.1011886.
Abstract
Hippocampal ripple oscillations have been implicated in important cognitive functions such as memory consolidation and planning. Multiple computational models have been proposed to explain the emergence of ripple oscillations, relying either on excitation or inhibition as the main pacemaker. Nevertheless, the generating mechanism of ripples remains unclear. An interesting dynamical feature of experimentally measured ripples, which may advance model selection, is intra-ripple frequency accommodation (IFA): a decay of the instantaneous ripple frequency over the course of a ripple event. So far, only a feedback-based inhibition-first model, which relies on delayed inhibitory synaptic coupling, has been shown to reproduce IFA. Here we use an analytical mean-field approach and numerical simulations of a leaky integrate-and-fire spiking network to explain the mechanism of IFA. We develop a drift-based approximation for the oscillation dynamics of the population rate and the mean membrane potential of interneurons under strong excitatory drive and strong inhibitory coupling. For IFA, the speed at which the excitatory drive changes is critical. We demonstrate that IFA arises due to a speed-dependent hysteresis effect in the dynamics of the mean membrane potential, when the interneurons receive transient, sharp wave-associated excitation. We thus predict that the IFA asymmetry vanishes in the limit of slowly changing drive, but is otherwise a robust feature of the feedback-based inhibition-first ripple model.
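As a concrete point of reference for the inhibition-first setting described above, here is a minimal Python sketch of leaky integrate-and-fire interneurons with delayed, all-to-all inhibitory coupling under a transient, sharp-wave-like drive. All parameter values are illustrative assumptions rather than the paper's, and may need tuning before the instantaneous ripple frequency shows a clear IFA asymmetry.

```python
import numpy as np

# Minimal sketch (illustrative parameters, not the paper's model): N inhibitory LIF
# neurons, all-to-all coupled with a synaptic delay, driven by a transient Gaussian
# "sharp-wave" input. The population rate should oscillate at ripple-like frequencies.
np.random.seed(1)
N, dt, T = 200, 0.05e-3, 0.12                 # neurons, time step (s), duration (s)
steps = int(T / dt)
tau_m, tau_syn, delay = 10e-3, 2e-3, 1.2e-3   # membrane, synaptic, delay times (s)
v_rest, v_thr, v_reset = -65e-3, -50e-3, -65e-3
J = -0.4e-3                                   # inhibitory jump per presynaptic spike (V)
d_steps = int(round(delay / dt))

t = np.arange(steps) * dt
drive = 30e-3 * np.exp(-0.5 * ((t - 0.06) / 0.02) ** 2)   # transient excitatory drive (V)
v = v_rest + 3e-3 * np.random.rand(N)
I_syn = 0.0                                   # shared recurrent inhibition (V)
pop_spikes = np.zeros(steps)                  # population spike count per time step

for k in range(steps):
    delayed = pop_spikes[k - d_steps] if k >= d_steps else 0.0
    I_syn += -dt * I_syn / tau_syn + J * delayed
    noise = 2e-3 * np.sqrt(dt / tau_m) * np.random.randn(N)
    v += dt * (v_rest - v + drive[k] + I_syn) / tau_m + noise
    fired = v >= v_thr
    pop_spikes[k] = fired.sum()
    v[fired] = v_reset

rate = pop_spikes / (N * dt)                  # instantaneous population rate (Hz)
print("peak population rate: %.0f Hz" % rate.max())
```

The instantaneous ripple frequency can then be read off from the intervals between successive peaks of the smoothed population rate, which is the quantity whose within-event decay defines IFA.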
Affiliation(s)
- Natalie Schieferstein: Institute for Theoretical Biology, Department of Biology, Humboldt-Universität zu Berlin, Berlin, Germany; Bernstein Center for Computational Neuroscience, Berlin, Germany
- Tilo Schwalger: Bernstein Center for Computational Neuroscience, Berlin, Germany; Institute for Mathematics, Technische Universität Berlin, Berlin, Germany
- Benjamin Lindner: Bernstein Center for Computational Neuroscience, Berlin, Germany; Department of Physics, Humboldt-Universität zu Berlin, Berlin, Germany
- Richard Kempter: Institute for Theoretical Biology, Department of Biology, Humboldt-Universität zu Berlin, Berlin, Germany; Bernstein Center for Computational Neuroscience, Berlin, Germany; Einstein Center for Neurosciences, Berlin, Germany
2
Mysin I. Phase relations of interneuronal activity relative to theta rhythm. Front Neural Circuits 2023;17:1198573. PMID: 37484208; PMCID: PMC10358363; DOI: 10.3389/fncir.2023.1198573.
Abstract
The theta rhythm plays a crucial role in synchronizing neural activity during attention and memory processes. However, the mechanisms behind the formation of neural activity during theta rhythm generation remain unknown. To address this, we propose a mathematical model that explains the distribution of interneuronal activity over the theta rhythm phase in the CA1 field. Our model consists of a network of seven types of interneurons in the CA1 field that receive inputs from the CA3 field, the entorhinal cortex, and local pyramidal neurons in the CA1 field. By adjusting the parameters of the connections in the model, we demonstrate that it is possible to replicate the experimentally observed phase relations between interneurons and the theta rhythm. Our model predicts that populations of interneurons receive unimodal excitation and inhibition with coinciding peaks, and that excitation dominates in determining the firing dynamics of interneurons.
3
Dumont G, Pérez-Cervera A, Gutkin B. A framework for macroscopic phase-resetting curves for generalised spiking neural networks. PLoS Comput Biol 2022;18:e1010363. PMID: 35913991; PMCID: PMC9371324; DOI: 10.1371/journal.pcbi.1010363.
Abstract
Brain rhythms emerge from synchronization among interconnected spiking neurons. Key properties of such rhythms can be gleaned from the phase-resetting curve (PRC). Inferring the PRC and developing a systematic phase reduction theory for large-scale brain rhythms remains an outstanding challenge. Here we present a theoretical framework and methodology to compute the PRC of generic spiking networks with emergent collective oscillations. We adopt a renewal approach where neurons are described by the time since their last action potential, a description that can reproduce the dynamical features of many cell types. For a sufficiently large number of neurons, the network dynamics are well captured by a continuity equation known as the refractory density equation. We develop an adjoint method for this equation, giving a semi-analytical expression of the infinitesimal PRC. We confirm the validity of our framework for specific examples of neural networks. Our theoretical framework can link key biological properties at the individual-neuron scale to the macroscopic oscillatory network properties. Beyond spiking networks, the approach is applicable to a broad class of systems that can be described by renewal processes. The formation of oscillatory neuronal assemblies at the network level has been hypothesized to be fundamental to many cognitive and motor functions. One prominent tool to understand how oscillatory activity responds to stimuli, and hence the neural code for which it is a substrate, is a nonlinear measure called the phase-resetting curve (PRC). At the network scale, the PRC quantifies how a given synaptic input perturbs the timing of the next volley of spikes: either advancing or delaying this timing. As a further application, one can use PRCs to make unambiguous predictions about whether communicating networks of neurons will phase-lock, as is often observed across cortical areas, and what the stable phase configuration would be: synchronous, asynchronous, or with asymmetric phase shifts. The latter configuration also implies a preferential flow of information from the leading network to the follower, thereby giving causal signatures of directed functional connectivity. Because of the key position of the PRC in studying synchrony, information flow, and entrainment to external forcing, it is crucial to move toward a theory that allows the PRCs of network-wide oscillations to be computed not only for a restricted class of models, as has been done in the past, but also for generalized network descriptions that can flexibly reflect single-cell properties. In this manuscript, we tackle this issue by showing how the PRC for network oscillations can be computed using the adjoint system of partial differential equations that defines the dynamics of the neural activity density.
Affiliation(s)
- Grégory Dumont: Group for Neural Theory, LNC INSERM U960, DEC, Ecole Normale Supérieure - PSL University, Paris, France
- Alberto Pérez-Cervera: Center for Cognition and Decision Making, Institute for Cognitive Neuroscience, National Research University Higher School of Economics, Moscow; Instituto de Matemática Interdisciplinar, Universidad Complutense de Madrid, Madrid, Spain
- Boris Gutkin: Group for Neural Theory, LNC INSERM U960, DEC, Ecole Normale Supérieure - PSL University, Paris, France; Center for Cognition and Decision Making, Institute for Cognitive Neuroscience, National Research University Higher School of Economics, Moscow
4
Chizhov AV, Amakhin DV, Smirnova EY, Zaitsev AV. Ictal wavefront propagation in slices and simulations with conductance-based refractory density model. PLoS Comput Biol 2022;18:e1009782. PMID: 35041661; PMCID: PMC8797236; DOI: 10.1371/journal.pcbi.1009782.
Abstract
The mechanisms determining ictal discharge (ID) propagation are still not clear. In the present study, we aimed to examine these mechanisms in animal and mathematical models of epileptiform activity. Using double-patch and extracellular potassium ion concentration recordings in rat hippocampal-cortical slices, we observed that IDs moved at a speed of about 1 mm/s or less. The mechanisms of such slow propagation have been studied with a mathematical, conductance-based refractory density (CBRD) model that describes the GABA- and glutamatergic neuronal populations’ interactions and ion dynamics in brain tissue. The modeling study reveals two main factors triggering IDs: (i) increased interneuronal activity leading to chloride ion accumulation and a consequent depolarizing GABAergic effect and (ii) the elevation of extracellular potassium ion concentration. The local synaptic transmission followed by local potassium ion extrusion and GABA receptor-mediated chloride ion accumulation underlies the ID wavefront’s propagation. In contrast, potassium ion diffusion in the extracellular space is slower and does not affect the ID’s speed. The short discharges constituting the ID propagate much faster than the ID front. The accumulation of sodium ions inside neurons due to their hyperactivity and glutamatergic currents boosts the Na+/K+ pump, which terminates the ID. Knowledge of the mechanism of ID generation and propagation contributes to the development of new treatments against epilepsy. During an epileptic seizure, neuronal excitation spreads across the brain tissue and is accompanied by significant changes in ionic concentrations. The ictal discharge front spreads at low speeds, less than 1 mm/s. The mechanisms underlying this phenomenon are not yet well understood. We study these mechanisms using electrophysiological recordings in brain slices and computer simulations. Our detailed biophysical model describing neuronal populations’ interactions, spatial propagation, and ionic dynamics reproduces the generation and propagation of spontaneously repeating ictal discharges. The simulations are consistent with our recordings of the electrical activity and the extracellular potassium ion concentration. We distinguished between two alternative mechanisms of ictal wavefront propagation: (i) the diffusion of potassium ions released from excited neurons, which depolarizes distant neurons and thus supports excitation, and (ii) the axonal spread of excitation followed by local extracellular potassium ion accumulation that supports the excitation. Our simulations provide evidence in favor of the latter mechanism. Our experiment-based modeling contributes to a mathematical description of brain tissue functioning and potentially to the development of new treatments against epilepsy.
Affiliation(s)
- Anton V. Chizhov: Laboratory of Molecular Mechanisms of Neural Interactions, Sechenov Institute of Evolutionary Physiology and Biochemistry of the Russian Academy of Sciences, Saint Petersburg, Russia; Computational Physics Laboratory, Ioffe Institute, Saint Petersburg, Russia
- Dmitry V. Amakhin: Laboratory of Molecular Mechanisms of Neural Interactions, Sechenov Institute of Evolutionary Physiology and Biochemistry of the Russian Academy of Sciences, Saint Petersburg, Russia
- Elena Yu. Smirnova: Laboratory of Molecular Mechanisms of Neural Interactions, Sechenov Institute of Evolutionary Physiology and Biochemistry of the Russian Academy of Sciences, Saint Petersburg, Russia; Computational Physics Laboratory, Ioffe Institute, Saint Petersburg, Russia; Institute of Experimental Medicine, Almazov National Medical Research Centre, Saint Petersburg, Russia
- Aleksey V. Zaitsev: Laboratory of Molecular Mechanisms of Neural Interactions, Sechenov Institute of Evolutionary Physiology and Biochemistry of the Russian Academy of Sciences, Saint Petersburg, Russia
5
Schwalger T. Mapping input noise to escape noise in integrate-and-fire neurons: a level-crossing approach. Biol Cybern 2021;115:539-562. PMID: 34668051; PMCID: PMC8551127; DOI: 10.1007/s00422-021-00899-1.
Abstract
Noise in spiking neurons is commonly modeled by a noisy input current or by generating output spikes stochastically with a voltage-dependent hazard rate ("escape noise"). While input noise lends itself to modeling biophysical noise processes, the phenomenological escape noise is mathematically more tractable. Using the level-crossing theory for differentiable Gaussian processes, we derive an approximate mapping between colored input noise and escape noise in leaky integrate-and-fire neurons. This mapping requires the first-passage-time (FPT) density of an overdamped Brownian particle driven by colored noise with respect to an arbitrarily moving boundary. Starting from the Wiener-Rice series for the FPT density, we apply the second-order decoupling approximation of Stratonovich to the case of moving boundaries and derive a simplified hazard-rate representation that is local in time and numerically efficient. This simplification requires the calculation of the non-stationary auto-correlation function of the level-crossing process: For exponentially correlated input noise (Ornstein-Uhlenbeck process), we obtain an exact formula for the zero-lag auto-correlation as a function of noise parameters, mean membrane potential and its speed, as well as an exponential approximation of the full auto-correlation function. The theory accurately predicts the FPT and interspike-interval densities, as well as the population activities, obtained from simulations with colored input noise and a time-dependent stimulus or boundary. The agreement with simulations is strongly enhanced across the sub- and suprathreshold firing regime compared to a first-order decoupling approximation that neglects correlations between level crossings. The second-order approximation also improves upon a previously proposed theory in the subthreshold regime. Depending on a simplicity-accuracy trade-off, all considered approximations represent useful mappings from colored input noise to escape noise, enabling progress in the theory of neuronal population dynamics.
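For readers unfamiliar with the escape-noise picture onto which the input noise is mapped, the sketch below simulates a leaky integrate-and-fire neuron with deterministic membrane dynamics and a generic exponential hazard of the membrane potential. The hazard form and all parameters are illustrative assumptions, not the specific mapping derived in the article.

```python
import numpy as np

# Escape-noise LIF sketch: deterministic membrane dynamics, stochastic spiking with an
# exponential hazard rho(v) = rho0 * exp((v - v_thr) / dv). Parameters are illustrative.
np.random.seed(2)
dt, T = 0.1e-3, 2.0
steps = int(T / dt)
tau_m = 10e-3
v_rest, v_thr, v_reset = -65e-3, -55e-3, -65e-3
mu = 12e-3                                    # constant input (V), slightly suprathreshold
rho0, dv = 50.0, 2e-3                         # hazard scale (1/s) and softness (V)

v = v_rest
spike_times = []
for k in range(steps):
    v += dt * (v_rest - v + mu) / tau_m       # noise-free subthreshold dynamics
    hazard = rho0 * np.exp((v - v_thr) / dv)  # instantaneous escape rate
    if np.random.rand() < 1.0 - np.exp(-hazard * dt):
        spike_times.append(k * dt)
        v = v_reset

isi = np.diff(spike_times)
print("%d spikes, mean ISI %.1f ms, CV %.2f" % (len(spike_times), 1e3 * isi.mean(), isi.std() / isi.mean()))
```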
Affiliation(s)
- Tilo Schwalger: Institute of Mathematics, Technical University Berlin, 10623 Berlin, Germany; Bernstein Center for Computational Neuroscience Berlin, 10115 Berlin, Germany
6
Chizhov AV, Graham LJ. A strategy for mapping biophysical to abstract neuronal network models applied to primary visual cortex. PLoS Comput Biol 2021;17:e1009007. PMID: 34398895; PMCID: PMC8389851; DOI: 10.1371/journal.pcbi.1009007.
Abstract
A fundamental challenge for the theoretical study of neuronal networks is to make the link between complex biophysical models based directly on experimental data and progressively simpler mathematical models that allow the derivation of general operating principles. We present a strategy that successively maps a relatively detailed biophysical population model, comprising conductance-based Hodgkin-Huxley type neuron models with connectivity rules derived from anatomical data, to various representations with fewer parameters, finishing with a firing-rate network model that permits analysis. We apply this methodology to the primary visual cortex of higher mammals, focusing on the functional property of stimulus orientation selectivity of the receptive fields of individual neurons. The mapping produces compact expressions for the parameters of the abstract model that clearly identify the impact of specific electrophysiological and anatomical parameters on the analytical results, in particular as manifested by specific functional signatures of visual cortex, including input-output sharpening, conductance invariance, virtual rotation and the tilt after-effect. Importantly, qualitative differences between model behaviours point out the consequences of various simplifications. The strategy may be applied to other neuronal systems with appropriate modifications. A hierarchy of theoretical approaches to studying a neuronal network depends on a tradeoff between biological fidelity and mathematical tractability. Biophysically detailed models consider cellular mechanisms and anatomically defined synaptic circuits but are often too complex to reveal insights into fundamental principles. In contrast, increasingly abstract reduced models facilitate analytical insights. To better ground the latter in the underlying biology, we describe a systematic procedure for moving across the model hierarchy that allows understanding of how changes in biological parameters (physiological, pathophysiological, or because of new data) impact the behaviour of the network. We apply this approach to mammalian primary visual cortex and examine how the different models in the hierarchy reproduce functional signatures of this area, in particular the tuning of neurons to the orientation of a visual stimulus. Our work provides a navigation of the complex parameter space of neural network models faithful to biology, as well as highlighting how simplifications made for mathematical convenience can fundamentally change their behaviour.
Affiliation(s)
- Anton V. Chizhov: Computational Physics Laboratory, Ioffe Institute, Saint Petersburg, Russia; Laboratory of Molecular Mechanisms of Neural Interactions, Sechenov Institute of Evolutionary Physiology and Biochemistry of the Russian Academy of Sciences, Saint Petersburg, Russia
- Lyle J. Graham: Centre Giovanni Borelli - CNRS UMR9010, Université de Paris, France
7
Pietras B, Gallice N, Schwalger T. Low-dimensional firing-rate dynamics for populations of renewal-type spiking neurons. Phys Rev E 2021;102:022407. PMID: 32942450; DOI: 10.1103/PhysRevE.102.022407.
Abstract
The macroscopic dynamics of large populations of neurons can be mathematically analyzed using low-dimensional firing-rate or neural-mass models. However, these models fail to capture spike synchronization effects and nonstationary responses of the population activity to rapidly changing stimuli. Here we derive low-dimensional firing-rate models for homogeneous populations of neurons modeled as time-dependent renewal processes. The class of renewal neurons includes integrate-and-fire models driven by white noise and has been frequently used to model neuronal refractoriness and spike synchronization dynamics. The derivation is based on an eigenmode expansion of the associated refractory density equation, which generalizes previous spectral methods for Fokker-Planck equations to arbitrary renewal models. We find a simple relation between the eigenvalues characterizing the timescales of the firing rate dynamics and the Laplace transform of the interspike interval density, for which explicit expressions are available for many renewal models. Retaining only the first eigenmode already yields a reliable low-dimensional approximation of the firing-rate dynamics that captures spike synchronization effects and fast transient dynamics at stimulus onset. We explicitly demonstrate the validity of our model for a large homogeneous population of Poisson neurons with absolute refractoriness and other renewal models that admit an explicit analytical calculation of the eigenvalues. The eigenmode expansion presented here provides a systematic framework for alternative firing-rate models in computational neuroscience based on spiking neuron dynamics with refractoriness.
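The "simple relation" between the eigenvalues and the Laplace transform of the ISI density can be sketched with the standard age-structured calculation below; the notation is generic, and sign and normalization conventions may differ from the paper.

```latex
% Generic sketch (notation may differ from the paper): eigenvalues of the
% refractory-density (age-structured) operator for a renewal neuron.
% q(\tau,t): density over age \tau since the last spike; \rho(\tau): hazard rate;
% S(\tau): survival function; P(\tau) = \rho(\tau) S(\tau): ISI density.
\begin{align}
  \partial_t q + \partial_\tau q &= -\rho(\tau)\, q,
  \qquad
  q(0,t) = A(t) = \int_0^\infty \rho(\tau)\, q(\tau,t)\, d\tau .
  \\
  \intertext{An eigenmode $q(\tau,t)=e^{\lambda t}\phi(\tau)$ gives
  $\phi(\tau)=\phi(0)\,S(\tau)\,e^{-\lambda\tau}$ with
  $S(\tau)=\exp\!\big(-\!\int_0^\tau \rho(s)\,ds\big)$; the boundary condition then
  yields the characteristic equation}
  1 &= \int_0^\infty \rho(\tau)\, S(\tau)\, e^{-\lambda\tau}\, d\tau
     = \hat{P}(\lambda) .
\end{align}
% That is, the eigenvalues are roots of \hat{P}(\lambda) = 1, with \hat{P} the Laplace
% transform of the ISI density. Example: a Poisson neuron with rate r and absolute
% refractory period \Delta has \hat{P}(\lambda) = e^{-\lambda\Delta}\, r/(r+\lambda),
% so the eigenvalues solve (r+\lambda)\, e^{\lambda\Delta} = r; \lambda = 0 is the
% stationary mode and the remaining complex roots set the relaxation timescales.
```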
Affiliation(s)
- Bastian Pietras: Institute of Mathematics, Technical University Berlin, 10623 Berlin, Germany; Bernstein Center for Computational Neuroscience Berlin, 10115 Berlin, Germany
- Noé Gallice: Brain Mind Institute, École polytechnique fédérale de Lausanne (EPFL), Station 15, CH-1015 Lausanne, Switzerland
- Tilo Schwalger: Institute of Mathematics, Technical University Berlin, 10623 Berlin, Germany; Bernstein Center for Computational Neuroscience Berlin, 10115 Berlin, Germany
8
Chizhov A, Merkulyeva N. Refractory density model of cortical direction selectivity: Lagged-nonlagged, transient-sustained, and On-Off thalamic neuron-based mechanisms and intracortical amplification. PLoS Comput Biol 2020;16:e1008333. PMID: 33052899; PMCID: PMC7605712; DOI: 10.1371/journal.pcbi.1008333.
Abstract
A biophysically detailed description of the mechanisms of primary vision is still being developed. We have incorporated a simplified, filter-based description of retino-thalamic visual signal processing into the detailed, conductance-based refractory density description of the neuronal population activity of the primary visual cortex. We compared four mechanisms of direction selectivity (DS), three of them being based on asymmetrical projections of different types of thalamic neurons to the cortex, distinguishing between (i) lagged and nonlagged, (ii) transient and sustained, and (iii) On and Off neurons. The fourth mechanism implies a lack of subcortical bias and is an epiphenomenon of intracortical interactions between orientation columns. The simulations of the cortical response to moving gratings have verified that the first three mechanisms provide DS to an extent comparable with experimental data and that the biophysical model realistically reproduces characteristics of visual cortex activity, such as membrane potential, firing rate, and synaptic conductances. The proposed model reveals the difference between the mechanisms in both the intact and the silenced cortex, favoring the second mechanism. In the fourth case, DS is weaker but significant; it completely vanishes in the silenced cortex. DS in the On-Off mechanism derives from the nonlinear interactions within the orientation map. The results of simulations can help to identify a prevailing mechanism of DS in V1. This is a step towards a comprehensive biophysical modeling of the primary visual system in the framework of the population rate coding concept. A major mechanism that underlies the tuning of cortical neurons to the direction of a moving stimulus is still debated. Considering the visual cortex structured with orientation-selective columns, we have implemented and compared in our biophysically detailed mathematical model four hypothetical mechanisms of direction selectivity (DS) known from experiments. The present model extends our previous model, which was tuned to experimental data on excitability in slices and reproduces orientation tuning effects in vivo. In simulations, we have found that the convergence of inputs from so-called transient and sustained (or lagged and nonlagged) thalamic neurons in the cortex provides an initial bias for DS, whereas cortical interactions amplify the tuning. In the absence of any bias, DS emerges as an epiphenomenon of the orientation map. In the case of a biased convergence of On and Off thalamic inputs, DS also emerges with the help of the intracortical interactions on the orientation map. Thus, we have proposed a comprehensive description of primary vision and revealed characteristic features of different mechanisms of DS in the visual cortex with columnar structure.
Affiliation(s)
- Anton Chizhov: Ioffe Institute, St.-Petersburg, Russia; Sechenov Institute of Evolutionary Physiology and Biochemistry of RAS, St.-Petersburg, Russia
9
René A, Longtin A, Macke JH. Inference of a Mesoscopic Population Model from Population Spike Trains. Neural Comput 2020;32:1448-1498. DOI: 10.1162/neco_a_01292.
Abstract
Understanding how rich dynamics emerge in neural populations requires models exhibiting a wide range of behaviors while remaining interpretable in terms of connectivity and single-neuron dynamics. However, it has been challenging to fit such mechanistic spiking networks at the single-neuron scale to empirical population data. To close this gap, we propose to fit such data at a mesoscale, using a mechanistic but low-dimensional and, hence, statistically tractable model. The mesoscopic representation is obtained by approximating a population of neurons as multiple homogeneous pools of neurons and modeling the dynamics of the aggregate population activity within each pool. We derive the likelihood of both single-neuron and connectivity parameters given this activity, which can then be used to optimize parameters by gradient ascent on the log likelihood or perform Bayesian inference using Markov chain Monte Carlo (MCMC) sampling. We illustrate this approach using a model of generalized integrate-and-fire neurons for which mesoscopic dynamics have been previously derived and show that both single-neuron and connectivity parameters can be recovered from simulated data. In particular, our inference method extracts posterior correlations between model parameters, which define parameter subsets able to reproduce the data. We compute the Bayesian posterior for combinations of parameters using MCMC sampling and investigate how the approximations inherent in a mesoscopic population model affect the accuracy of the inferred single-neuron parameters.
Affiliation(s)
- Alexandre René: Department of Physics, University of Ottawa, Ottawa K1N 6N5, Canada; Max Planck Research Group Neural Systems Analysis, Center of Advanced European Studies and Research (caesar), Bonn 53175, Germany; Institute of Neuroscience and Medicine (INM-6), Institute for Advanced Simulation (IAS-6), and JARA-Institute Brain Structure-Function Relationships (INM-10), Jülich Research Centre, Jülich 52425, Germany
- André Longtin: Department of Physics, University of Ottawa, Ottawa K1N 6N5, Canada; Brain and Mind Research Institute, University of Ottawa, Ottawa K1H 8M5, Canada
- Jakob H. Macke: Max Planck Research Group Neural Systems Analysis, Center of Advanced European Studies and Research (caesar), Bonn 53175, Germany; Computational Neuroengineering, Department of Electrical and Computer Engineering, Technical University of Munich, Munich 80333, Germany
10
Schwalger T, Chizhov AV. Mind the last spike - firing rate models for mesoscopic populations of spiking neurons. Curr Opin Neurobiol 2019;58:155-166. PMID: 31590003; DOI: 10.1016/j.conb.2019.08.003.
Abstract
The dominant modeling framework for understanding cortical computations is that of heuristic firing-rate models. Despite their success, these models fall short of capturing spike synchronization effects, linking to biophysical parameters, and describing finite-size fluctuations. In this opinion article, we propose that the refractory density method (RDM), also known as age-structured population dynamics or quasi-renewal theory, yields a powerful theoretical framework for building rate-based models of mesoscopic neural populations from realistic neuron dynamics at the microscopic level. We review recent advances achieved by the RDM in obtaining efficient population density equations for networks of generalized integrate-and-fire (GIF) neurons - a class of neuron models that has been successfully fitted to various cell types. The theory not only predicts the nonstationary dynamics of large populations of neurons but also permits an extension to finite-size populations and a systematic reduction to low-dimensional rate dynamics. The new types of rate models will allow a re-examination of models of cortical computations under biological constraints.
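As a minimal numerical illustration of the refractory density idea (not the GIF-based formulation reviewed here), the following Python sketch integrates an age-structured density with an assumed exponential escape hazard and tracks the population activity in response to a step input; all parameters are placeholders.

```python
import numpy as np

# Refractory-density sketch: q[i] is the density (1/s) of neurons at age i*dt since their
# last spike; an exponential escape hazard of the age-dependent membrane potential converts
# the density into a population activity A(t). Illustrative parameters only.
dt = 0.2e-3
n_age = 2000                                   # age grid: 0 .. 0.4 s
ages = np.arange(n_age) * dt
tau_m = 10e-3
v_rest, v_thr = -65e-3, -55e-3
rho0, dv = 100.0, 2e-3

q = np.zeros(n_age)
q[-1] = 1.0 / dt                               # start with all mass at large age
T = 0.3
steps = int(T / dt)
rate = np.zeros(steps)

for k in range(steps):
    mu = 15e-3 if k * dt > 0.1 else 5e-3       # step increase of the mean input at t = 0.1 s
    v = v_rest + mu * (1.0 - np.exp(-ages / tau_m))     # potential at age tau after reset
    rho = rho0 * np.exp((v - v_thr) / dv)               # escape hazard (1/s)
    A = np.sum(rho * q) * dt                            # population activity (1/s)
    rate[k] = A
    survived = q * np.exp(-rho * dt)                    # neurons that did not fire
    new_q = np.empty_like(q)
    new_q[0] = A                                        # just-fired neurons re-enter at age 0
    new_q[1:] = survived[:-1]                           # everyone else ages by dt
    new_q[-1] += survived[-1]                           # lump ages beyond the grid
    q = new_q

print("activity before/after step: %.1f Hz -> %.1f Hz" % (rate[steps // 4], rate[-1]))
```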
Affiliation(s)
- Tilo Schwalger: Bernstein Center for Computational Neuroscience, 10115 Berlin, Germany; Institut für Mathematik, Technische Universität Berlin, 10623 Berlin, Germany
- Anton V Chizhov: Ioffe Institute, 194021 Saint-Petersburg, Russia; Sechenov Institute of Evolutionary Physiology and Biochemistry of the Russian Academy of Sciences, 194223 Saint-Petersburg, Russia
11
Conductance-Based Refractory Density Approach for a Population of Bursting Neurons. Bull Math Biol 2019;81:4124-4143. PMID: 31313084; DOI: 10.1007/s11538-019-00643-8.
Abstract
The conductance-based refractory density (CBRD) approach is a parsimonious mathematical-computational framework for modelling interacting populations of regular-spiking neurons which, however, has not yet been extended to a population of bursting neurons. The canonical CBRD method allows one to describe the firing activity of a statistical ensemble of uncoupled Hodgkin-Huxley-like neurons (differentiated by noise) and has demonstrated its validity against experimental data. The present manuscript generalises the CBRD to a population of bursting neurons; however, in this pilot computational study, we consider the simplest setting in which each individual neuron is governed by piecewise linear bursting dynamics. The resulting population model makes use of slow-fast analysis, which leads to a novel methodology that combines the CBRD with the theory of multiple-timescale dynamics. The main prospect is that it opens novel avenues for mathematical exploration, as well as the derivation of more sophisticated population activity from Hodgkin-Huxley-like bursting neurons, which will make it possible to capture synchronised bursting activity in hyper-excitable brain states (e.g. the onset of epilepsy).
12
Qu G, Fan B, Fu X, Yu Y. The Impact of Frequency Scale on the Response Sensitivity and Reliability of Cortical Neurons to 1/fβ Input Signals. Front Cell Neurosci 2019;13:311. PMID: 31354432; PMCID: PMC6637762; DOI: 10.3389/fncel.2019.00311.
Abstract
Which intrinsic features of fluctuating input signals can drive neurons to maximal excitation is a crucial issue in neural coding. In this article, we examined both experimentally and theoretically the cortical neuronal responsivity (including firing rate and spike-timing reliability) to input signals with different intrinsic correlational statistics (e.g., white noise with a 1/f^0 power spectrum, pink 1/f noise, and brown 1/f^2 noise) and different frequency ranges. Our results revealed that the response sensitivity and reliability of cortical neurons are much higher in response to 1/f noise stimuli with long-term correlations than to 1/f^0 stimuli with short-term correlations over a broad frequency range, and also higher than to 1/f^2 stimuli for all frequency ranges. In addition, we found that neuronal sensitivity diverges in opposite directions for 1/f noise compared with 1/f^0 white noise as a function of the cutoff frequency of the input signal. As the cutoff frequency is progressively increased from 50 to 1,000 Hz, the neuronal responsiveness increased gradually for 1/f noise, while it decreased exponentially for white noise. Computational simulations of a general cortical model revealed that neuronal sensitivity and reliability to input signal statistics are dominated mainly by fast sodium inactivation, potassium activation, and membrane time constants.
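Band-limited 1/f^β stimuli of the kind used in this study can be generated with a standard spectral-shaping recipe, sketched below in Python; the sampling rate and cutoff values here are arbitrary choices, not those of the experiments.

```python
import numpy as np

def powerlaw_noise(n, dt, beta, f_cut):
    """Gaussian noise whose power spectrum ~ 1/f**beta up to a cutoff frequency f_cut (Hz)."""
    freqs = np.fft.rfftfreq(n, dt)
    amp = np.zeros_like(freqs)
    band = (freqs > 0) & (freqs <= f_cut)
    amp[band] = freqs[band] ** (-beta / 2.0)          # amplitude ~ f^(-beta/2) => power ~ f^(-beta)
    phases = np.exp(2j * np.pi * np.random.rand(freqs.size))
    x = np.fft.irfft(amp * phases, n)
    return (x - x.mean()) / x.std()                   # zero mean, unit variance

dt, n = 1e-4, 2 ** 16                                 # 10 kHz sampling, ~6.5 s of signal
np.random.seed(0)
for beta, label in [(0, "white (1/f^0)"), (1, "pink (1/f)"), (2, "brown (1/f^2)")]:
    sig = powerlaw_noise(n, dt, beta, f_cut=1000.0)   # cutoff could be swept from 50 to 1000 Hz
    print(label, "std =", round(sig.std(), 2))
```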
Affiliation(s)
- Yuguo Yu: State Key Laboratory of Medical Neurobiology, School of Life Science, Human Phenome Institute, Institute of Brain Science, Institute of Science and Technology for Brain-Inspired Intelligence, Fudan University, Shanghai, China
13
Chizhov AV, Amakhin DV, Zaitsev AV. Mathematical model of Na-K-Cl homeostasis in ictal and interictal discharges. PLoS One 2019;14:e0213904. PMID: 30875397; PMCID: PMC6420042; DOI: 10.1371/journal.pone.0213904.
Abstract
Despite extensive experimental data on the phenomena and mechanisms of the generation of ictal and interictal discharges (IDs and IIDs), mathematical models that can describe the synaptic interactions of neurons and the ionic dynamics in biophysical detail are not well established. Based on experimental recordings of combined hippocampal-entorhinal cortex slices from rats in a high-potassium and low-magnesium solution containing 4-aminopyridine, as well as previous observations of similar experimental models, such a mathematical model has been developed here. The model describes neuronal excitation through the application of the conductance-based refractory density approach for three neuronal populations: two populations of glutamatergic neurons with hyperpolarizing and depolarizing GABAergic synapses and one GABAergic population. The ionic dynamics account for the contributions of voltage-gated and synaptic channels, active and passive transporters, and diffusion. The relatively slow dynamics of potassium, chloride, and sodium ion concentrations determine the transitions from pure GABAergic IIDs to IDs and GABA-glutamatergic IIDs. The model reproduces different types of IIDs, including those initiated by interneurons; repetitive IDs; and tonic and bursting modes of an ID composed of clustered IID-like events. The simulations revealed the contributions of different ionic channels to the ion concentration dynamics before and during ID generation. The proposed model is a step toward an optimal mathematical description of the mechanisms of epileptic discharges.
Affiliation(s)
- Anton V. Chizhov: Computational Physics Laboratory, Ioffe Institute, Saint Petersburg, Russia; Laboratory of Molecular Mechanisms of Neural Interactions, Sechenov Institute of Evolutionary Physiology and Biochemistry of the Russian Academy of Sciences, Saint Petersburg, Russia
- Dmitry V. Amakhin: Laboratory of Molecular Mechanisms of Neural Interactions, Sechenov Institute of Evolutionary Physiology and Biochemistry of the Russian Academy of Sciences, Saint Petersburg, Russia
- Aleksey V. Zaitsev: Laboratory of Molecular Mechanisms of Neural Interactions, Sechenov Institute of Evolutionary Physiology and Biochemistry of the Russian Academy of Sciences, Saint Petersburg, Russia
14
Spatial propagation of interictal discharges along the cortex. Biochem Biophys Res Commun 2018;508:1245-1251. PMID: 30563766; DOI: 10.1016/j.bbrc.2018.12.070.
Abstract
Interictal discharges (IIDs) accompany epileptic seizures and highlight the mechanisms of pathological activity. The propagation of IIDs along the neural tissue is not well understood. To simulate IID propagation, this study proposes a new mathematical model that uses the conductance-based refractory density approach for glutamatergic and GABAergic neuronal populations. The mathematical model is found to be consistent with experimental double-patch recordings in the 4-aminopyridine in vitro model of epilepsy. In slices, the spontaneous activity of interneurons leads to their synchronization by means of the depolarizing GABA-mediated response, thus initiating IIDs. Modeling reveals a clustering of interneuronal synchronization followed by IIDs with activity fronts that propagate along the cortex. The GABA-mediated depolarization either remains subthreshold for the principal neurons, thus resulting in pure GABAergic IIDs (IID1s), or leads to glutamatergic excitation, thus resulting in another type of IID (IID2s). In both the model and the experiment, IIDs propagate as waves, with constant activity profiles and velocity. The speed of IIDs is of the order of tens of mm/s and is higher for IID2s than for IID1s (40 and 20 mm/s, respectively). The simulations, consistent with experimental observations, show that the wavelike propagation of IIDs initiated by interneurons is determined by local synaptic connectivity under conditions of depolarizing GABA.
15
Amakhin DV, Soboleva EB, Ergina JL, Malkin SL, Chizhov AV, Zaitsev AV. Seizure-Induced Potentiation of AMPA Receptor-Mediated Synaptic Transmission in the Entorhinal Cortex. Front Cell Neurosci 2018;12:486. PMID: 30618633; PMCID: PMC6297849; DOI: 10.3389/fncel.2018.00486.
Abstract
Excessive excitation is considered one of the key mechanisms underlying epileptic seizures. We investigated changes induced by seizure-like events (SLEs) in the evoked postsynaptic responses of medial entorhinal cortex (ERC) pyramidal neurons, using the modified 4-aminopyridine (4-AP) model of epileptiform activity. Rat brain slices were perfused with a pro-epileptic solution containing 4-AP, an elevated potassium concentration, and a reduced magnesium concentration. We demonstrated that 15 min of robust epileptiform activity in slices leads to an increase in the amplitude of the α-amino-3-hydroxy-5-methyl-4-isoxazolepropionic acid receptor (AMPAR)-mediated component of the evoked response, as well as an increase in the polysynaptic γ-aminobutyric acid (GABA) and N-methyl-D-aspartate (NMDA) receptor-mediated components. The increase in AMPA-mediated postsynaptic conductance depends on NMDA receptor activation. It persists for at least 15 min after the cessation of SLEs and is partly attributed to the inclusion of calcium-permeable AMPA receptors in the postsynaptic membrane. Mathematical modeling of the evoked responses using the conductance-based refractory density (CBRD) approach indicated that such augmentation of AMPA receptor function and depolarization by GABA receptors results in prolonged firing that explains the increase in the polysynaptic components, which contribute to overall network excitability. Taken together, our data suggest that AMPA receptor enhancement could be a critical determinant of sustained status epilepticus (SE).
Affiliation(s)
- Dmitry V Amakhin: Laboratory of Molecular Mechanisms of Neural Interactions, Sechenov Institute of Evolutionary Physiology and Biochemistry of the Russian Academy of Sciences, Saint Petersburg, Russia
- Elena B Soboleva: Laboratory of Molecular Mechanisms of Neural Interactions, Sechenov Institute of Evolutionary Physiology and Biochemistry of the Russian Academy of Sciences, Saint Petersburg, Russia
- Julia L Ergina: Laboratory of Molecular Mechanisms of Neural Interactions, Sechenov Institute of Evolutionary Physiology and Biochemistry of the Russian Academy of Sciences, Saint Petersburg, Russia
- Sergey L Malkin: Laboratory of Molecular Mechanisms of Neural Interactions, Sechenov Institute of Evolutionary Physiology and Biochemistry of the Russian Academy of Sciences, Saint Petersburg, Russia
- Anton V Chizhov: Laboratory of Molecular Mechanisms of Neural Interactions, Sechenov Institute of Evolutionary Physiology and Biochemistry of the Russian Academy of Sciences, Saint Petersburg, Russia; Ioffe Institute, Russian Academy of Sciences, Saint Petersburg, Russia
- Aleksey V Zaitsev: Laboratory of Molecular Mechanisms of Neural Interactions, Sechenov Institute of Evolutionary Physiology and Biochemistry of the Russian Academy of Sciences, Saint Petersburg, Russia; Institute of Experimental Medicine, Almazov National Medical Research Centre, Saint Petersburg, Russia
16
Buchin A, Kerr CC, Huberfeld G, Miles R, Gutkin B. Adaptation and Inhibition Control Pathological Synchronization in a Model of Focal Epileptic Seizure. eNeuro 2018;5:ENEURO.0019-18.2018. PMID: 30302390; PMCID: PMC6173584; DOI: 10.1523/ENEURO.0019-18.2018.
Abstract
Pharmacoresistant epilepsy is a common neurological disorder in which increased neuronal intrinsic excitability and synaptic excitation lead to pathologically synchronous behavior in the brain. In the majority of experimental and theoretical epilepsy models, epilepsy is associated with reduced inhibition in the pathological neural circuits, yet effects of intrinsic excitability are usually not explicitly analyzed. Here we present a novel neural mass model that includes intrinsic excitability in the form of spike-frequency adaptation in the excitatory population. We validated our model using local field potential (LFP) data recorded from human hippocampal/subicular slices. We found that synaptic conductances and slow adaptation in the excitatory population both play essential roles for generating seizures and pre-ictal oscillations. Using bifurcation analysis, we found that transitions towards seizure and back to the resting state take place via Andronov-Hopf bifurcations. These simulations therefore suggest that single neuron adaptation as well as synaptic inhibition are responsible for orchestrating seizure dynamics and transition towards the epileptic state.
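A generic rate-based sketch of the ingredients named in this abstract, an excitatory-inhibitory neural mass with slow adaptation acting on the excitatory population, is given below. The equations and parameters are illustrative placeholders, not the model fitted to the human slice recordings.

```python
import numpy as np

# Wilson-Cowan-type E-I rate model with a slow adaptation variable a subtracting from the
# excitatory drive. Illustrative parameters; not the published model.
def f(x):                                  # sigmoidal population transfer function
    return 1.0 / (1.0 + np.exp(-x))

dt, T = 1e-3, 10.0
steps = int(T / dt)
tau_e, tau_i, tau_a = 10e-3, 10e-3, 0.5    # population and adaptation time constants (s)
w_ee, w_ei, w_ie, w_ii = 12.0, 10.0, 10.0, 2.0
g_a, P, Q = 5.0, 2.0, 0.5                  # adaptation strength and external drives

E, I, a = 0.1, 0.1, 0.0
trace = np.zeros(steps)
for k in range(steps):
    E += dt / tau_e * (-E + f(w_ee * E - w_ei * I - g_a * a + P))
    I += dt / tau_i * (-I + f(w_ie * E - w_ii * I + Q))
    a += dt / tau_a * (-a + E)             # slow spike-frequency adaptation driven by E
    trace[k] = E

print("E-rate range over the last 5 s: %.2f - %.2f" % (trace[steps // 2:].min(), trace[steps // 2:].max()))
```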
Affiliation(s)
- Anatoly Buchin: Department of Physiology and Biophysics, University of Washington, 1959 NE Pacific St, Seattle, 98195, United States
- Cliff C. Kerr: School of Physics, University of Sydney, Physics Rd, Sydney NSW 2006, Australia
- Gilles Huberfeld: Neurophysiology Department, Sorbonne Université-UPMC, Pitié-Salpêtrière Hôpital, 47-83 Boulevard de l’Hôpital, 75013 Paris, France; Institut national de la santé et de la recherche médicale, Unit 1129 “Infantile Epilepsies and Brain Plasticity”, Paris Descartes University, Sorbonne Paris Cité University group, 149 rue de Sévres, 75015 Paris, France
- Richard Miles: Cortex and Epilepsie Group, Brain and Spine Institute, 47 Boulevard Hôpital, 75013 Paris, France
- Boris Gutkin: Group for Neural Theory, Laboratoire des Neurosciences Cognitives, Paris Sciences & Lettres Research University, 29 rue d'Ulm, 75005 Paris, France; Center for Cognition and Decision Making, National Research University Higher School of Economics, 20 Myasnitskaya, 109316 Moscow, Russia
17
Chizhov AV. Conductance-based refractory density approach: comparison with experimental data and generalization to lognormal distribution of input current. Biol Cybern 2017;111:353-364. PMID: 28819690; DOI: 10.1007/s00422-017-0727-9.
Abstract
The conductance-based refractory density (CBRD) approach is an efficient tool for modeling interacting neuronal populations. The model describes the firing activity of a statistical ensemble of uncoupled Hodgkin-Huxley-like neurons, each receiving individual Gaussian noise and a common time-varying deterministic input. However, the approach requires experimental validation and extension to cases of distributed input signals (or input weights) among different neurons of such an ensemble. Here the CBRD model is verified by comparison with experimental data and then generalized to a lognormal (LN) distribution of the input weights. The model with equal weights is shown to efficiently reproduce the post-spike time histograms and the membrane voltage of experimental multiple-trial responses of single neurons to a step-wise current injection. The responses reveal a more rapid reaction of the firing rate than of the voltage. Slow adaptive potassium channels strongly affected the shape of the responses. Next, a computationally efficient CBRD model is derived for a population with the LN input weight distribution and is compared with the original model with equal input weights. The analysis shows that the LN distribution: (1) provides a faster response, (2) eliminates oscillations, (3) leads to higher sensitivity to weak stimuli, and (4) increases the coefficient of variation of interspike intervals. In addition, a simplified firing-rate type model is tested, showing improved precision in the case of an LN distribution of weights. The CBRD approach is recommended for complex, biophysically detailed simulations of interacting neuronal populations, while the modified firing-rate type model is recommended for computationally reduced simulations.
Affiliation(s)
- Anton V Chizhov: Ioffe Institute, Politekhnicheskaya str. 26, 194021 St. Petersburg, Russia; Sechenov Institute of Evolutionary Physiology and Biochemistry of Russian Academy of Sciences, Torez pr. 44, 194223 St. Petersburg, Russia
18
Computational model of interictal discharges triggered by interneurons. PLoS One 2017;12:e0185752. PMID: 28977038; PMCID: PMC5627938; DOI: 10.1371/journal.pone.0185752.
Abstract
Interictal discharges (IIDs) are abnormal waveforms recorded in the periods before or between seizures. IIDs that are initiated by GABAergic interneurons have not yet been mathematically modeled. In the present study, a mathematical model that describes the mechanisms of these discharges is proposed. The model is based on experimental recordings of IIDs in pyramidal neurons of the rat entorhinal cortex and estimations of synaptic conductances during IIDs. IIDs were induced in cortico-hippocampal slices by applying an extracellular solution with 4-aminopyridine, high potassium, and low magnesium concentrations. Two different types of IIDs initiated by interneurons were observed. The first type of IID (IID1) was purely GABAergic. The second type of IID (IID2) was induced by GABAergic excitation and maintained by recurrent interactions of both GABA- and glutamatergic neuronal populations. The model employed the conductance-based refractory density (CBRD) approach, which accurately approximates the firing rate of a population of similar Hodgkin-Huxley-like neurons. The model of coupled excitatory and inhibitory populations includes AMPA, NMDA, and GABA-receptor-mediated synapses and gap junctions. These neurons receive both an arbitrary deterministic input and individual colored Gaussian noise. Both types of IIDs were successfully reproduced in the model by setting two different depolarized levels for the GABA-mediated current reversal potential. It was revealed that short-term synaptic depression is a crucial factor in terminating each of the discharges, and it also determines their durations and frequencies.
19
Towards a theory of cortical columns: From spiking neurons to interacting neural populations of finite size. PLoS Comput Biol 2017;13:e1005507. PMID: 28422957; PMCID: PMC5415267; DOI: 10.1371/journal.pcbi.1005507.
Abstract
Neural population equations such as neural mass or field models are widely used to study brain activity on a large scale. However, the relation of these models to the properties of single neurons is unclear. Here we derive an equation for several interacting populations at the mesoscopic scale starting from a microscopic model of randomly connected generalized integrate-and-fire neuron models. Each population consists of 50–2000 neurons of the same type but different populations account for different neuron types. The stochastic population equations that we find reveal how spike-history effects in single-neuron dynamics such as refractoriness and adaptation interact with finite-size fluctuations on the population level. Efficient integration of the stochastic mesoscopic equations reproduces the statistical behavior of the population activities obtained from microscopic simulations of a full spiking neural network model. The theory describes nonlinear emergent dynamics such as finite-size-induced stochastic transitions in multistable networks and synchronization in balanced networks of excitatory and inhibitory neurons. The mesoscopic equations are employed to rapidly integrate a model of a cortical microcircuit consisting of eight neuron types, which allows us to predict spontaneous population activities as well as evoked responses to thalamic input. Our theory establishes a general framework for modeling finite-size neural population dynamics based on single cell and synapse parameters and offers an efficient approach to analyzing cortical circuits and computations. Understanding the brain requires mathematical models on different spatial scales. On the “microscopic” level of nerve cells, neural spike trains can be well predicted by phenomenological spiking neuron models. On a coarse scale, neural activity can be modeled by phenomenological equations that summarize the total activity of many thousands of neurons. Such population models are widely used to model neuroimaging data such as EEG, MEG or fMRI data. However, it is largely unknown how large-scale models are connected to an underlying microscale model. Linking the scales is vital for a correct description of rapid changes and fluctuations of the population activity, and is crucial for multiscale brain models. The challenge is to treat realistic spiking dynamics as well as fluctuations arising from the finite number of neurons. We obtained such a link by deriving stochastic population equations on the mesoscopic scale of 100–1000 neurons from an underlying microscopic model. These equations can be efficiently integrated and reproduce results of a microscopic simulation while achieving a high speed-up factor. We expect that our novel population theory on the mesoscopic scale will be instrumental for understanding experimental data on information processing in the brain, and ultimately link microscopic and macroscopic activity patterns.
20
Chizhov AV, Sanchez-Aguilera A, Rodrigues S, de la Prida LM. Simplest relationship between local field potential and intracellular signals in layered neural tissue. Phys Rev E Stat Nonlin Soft Matter Phys 2015;92:062704. PMID: 26764724; DOI: 10.1103/PhysRevE.92.062704.
Abstract
The relationship between the extracellularly measured electric field potential resulting from synaptic activity in an ensemble of neurons and intracellular signals in these neurons is an important but still open question. Based on a model neuron with a cylindrical dendrite and lumped soma, we derive a formula that substantiates a proportionality between the local field potential and the total somatic transmembrane current that emerges from the difference between the somatic and dendritic membrane potentials. The formula is tested by intra- and extracellular recordings of evoked synaptic responses in hippocampal slices. Additionally, the contribution of different membrane currents to the field potential is demonstrated in a two-population mean-field model. Our formalism, which allows for a simple estimation of unknown dendritic currents directly from somatic measurements, provides an interpretation of the local field potential in terms of intracellularly measurable synaptic signals. It is also applicable to the study of cortical activity using two-compartment neuronal population models.
Affiliation(s)
- Anton V Chizhov: Ioffe Institute, RAS, Politekhnicheskaya str. 26, 194021 St.-Petersburg, Russia; Sechenov Institute of Evolutionary Physiology and Biochemistry of RAS, Torez pr. 44, 194223 St.-Petersburg, Russia
- Serafim Rodrigues: Centre for Robotics and Neural Systems, School of Computing and Mathematics, Plymouth University, Drake Circus, Plymouth, Devon PL4 8AA, United Kingdom
21
Chizhov AV, Smirnova EY, Kim KK, Zaitsev AV. A simple Markov model of sodium channels with a dynamic threshold. J Comput Neurosci 2014;37:181-191. PMID: 24469252; DOI: 10.1007/s10827-014-0496-6.
Abstract
Characteristics of action potential generation are important for understanding brain function and thus must be characterized and modeled. It is still an open question which model can concurrently describe the phenomena of sharp spike shape, spike-threshold variability, and the divisive effect of shunting on the gain of the frequency-current dependence. We reproduced these three effects experimentally by patch-clamp recordings in cortical slices, but we failed to simulate them with any of 11 known neuron models, including one- and multi-compartment models, those with Hodgkin-Huxley and Markov equation-based sodium channel approximations, and those taking into account sodium channel subtype heterogeneity. Based on our voltage-clamp data characterizing the dependence of the sodium channel activation threshold on the history of depolarization, we propose a 3-state Markov model with a closed-to-open state transition threshold dependent on slow inactivation. This model reproduces all three phenomena. As a reduction of this model, a leaky integrate-and-fire model with a dynamic threshold also shows the effect of gain reduction by shunting. These results argue for a mechanism of gain reduction through threshold dynamics determined by the slow inactivation of sodium channels.
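To make the structure of such a model concrete, here is a hedged Python sketch of a generic three-state (closed-open-inactivated) scheme in which the activation threshold shifts with the slowly inactivated fraction; the rate constants and the shift rule are placeholders, not the values fitted to the voltage-clamp data.

```python
import numpy as np

# Generic three-state Na-channel sketch: closed (C) -> open (O) -> inactivated (I) -> C,
# with a C->O activation threshold V_T that rises with the inactivated fraction ("dynamic
# threshold"). All rate constants are illustrative placeholders.
dt, T = 0.01e-3, 50e-3
steps = int(T / dt)
p = np.array([1.0, 0.0, 0.0])                  # probabilities of [C, O, I]
V_T0, k_shift = -50e-3, 15e-3                  # base threshold (V) and its shift per unit inactivation
open_prob = np.zeros(steps)

for k in range(steps):
    V = -70e-3 if k * dt < 10e-3 else -30e-3   # voltage-clamp step at t = 10 ms
    V_T = V_T0 + k_shift * p[2]                # threshold depends on slow inactivation
    a_co = 3000.0 / (1.0 + np.exp(-(V - V_T) / 4e-3))   # C -> O activation rate (1/s)
    a_oc, a_oi, a_ic = 500.0, 1000.0, 20.0              # O->C, O->I, I->C rates (1/s)
    dC = -a_co * p[0] + a_oc * p[1] + a_ic * p[2]
    dO =  a_co * p[0] - (a_oc + a_oi) * p[1]
    dI =  a_oi * p[1] - a_ic * p[2]
    p = p + dt * np.array([dC, dO, dI])
    open_prob[k] = p[1]

print("peak open probability: %.2f" % open_prob.max())
```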
Affiliation(s)
- A V Chizhov
- A.F. Ioffe Physical-Technical Institute of the Russian Academy of Sciences, Politekhnicheskaya str., 26, 194021, Saint-Petersburg, Russia
22
Chizhov AV. Conductance-based refractory density model of primary visual cortex. J Comput Neurosci 2013; 36:297-319. [PMID: 23888313 DOI: 10.1007/s10827-013-0473-5] [Citation(s) in RCA: 14] [Impact Index Per Article: 1.2] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/09/2013] [Revised: 06/17/2013] [Accepted: 06/27/2013] [Indexed: 10/26/2022]
Abstract
A layered continuum population model of primary visual cortex has been constructed that reproduces a set of experimental data, including postsynaptic responses of single neurons to extracellular electrical stimulation and spatially distributed activity patterns in response to visual stimulation. In the model, synaptically interacting excitatory and inhibitory neuronal populations are described by a conductance-based refractory density approach. Populations of two-compartment excitatory and inhibitory neurons in cortical layers 2/3 and 4 are distributed in two-dimensional cortical space and connected by AMPA-, NMDA- and GABA-type synapses. The external connections are arranged pinwheel-like according to stimulus orientation; intracortical connections combine an isotropic local component with patchy connections between neurons of similar orientation preference. The model offers better temporal resolution and a more detailed biophysical description than conventional mean-field models. Compared with large-scale network simulations, it avoids a posteriori statistical processing of spike data and provides better computational efficiency with minimal parametrization.
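Two architectural ingredients mentioned above, a pinwheel-like arrangement of orientation preferences and intracortical weights combining a local isotropic footprint with patchy coupling between similarly tuned sites, can be sketched as follows; this is not the conductance-based refractory density model itself, and the geometry and parameters are invented for illustration:

```python
import numpy as np

# Sketch of a pinwheel orientation-preference map on a small cortical grid and
# of a connection weight that mixes a local isotropic part with a "patchy"
# orientation-similarity part. All settings are illustrative.
N = 40                                     # grid of N x N cortical sites
x, y = np.meshgrid(np.linspace(-1, 1, N), np.linspace(-1, 1, N))
pref = 0.5 * np.arctan2(y, x)              # preferred orientation around a pinwheel center

def weight(i, j, k, l, sigma_local=0.15, sigma_ori=0.4):
    d = np.hypot(x[i, j] - x[k, l], y[i, j] - y[k, l])
    dori = pref[i, j] - pref[k, l]
    local = np.exp(-d ** 2 / (2 * sigma_local ** 2))            # isotropic local footprint
    patchy = np.exp(-np.sin(dori) ** 2 / (2 * sigma_ori ** 2))  # similar-orientation coupling
    return local + 0.3 * patchy

print(weight(20, 20, 20, 25), weight(20, 20, 35, 35))
```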
Affiliation(s)
- Anton V Chizhov
- A.F. Ioffe Physical-Technical Institute of RAS, Politekhnicheskaya str., 26, 194021, St.-Petersburg, Russia
23
Ly C. A principled dimension-reduction method for the population density approach to modeling networks of neurons with synaptic dynamics. Neural Comput 2013; 25:2682-708. [PMID: 23777517 DOI: 10.1162/neco_a_00489] [Citation(s) in RCA: 8] [Impact Index Per Article: 0.7] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/04/2022]
Abstract
The population density approach to neural network modeling has been utilized in a variety of contexts. The idea is to group many similar noisy neurons into populations and to track, for each population, the probability density function giving the proportion of neurons in a particular state, rather than simulating individual neurons (i.e., Monte Carlo). It is commonly used both for analytic insight and as a time-saving computational tool. The main shortcoming of this method is that when realistic attributes are incorporated into the underlying neuron model, the dimension of the probability density function increases, leading to intractable equations or, at best, computationally intensive simulations. Developing principled dimension-reduction methods is therefore essential for the robustness of these powerful methods and would make them a more pragmatic tool of great value to the larger theoretical neuroscience community. For exposition of the method, we consider a single uncoupled population of leaky integrate-and-fire neurons receiving only external excitatory synaptic input. We present a dimension-reduction method that reduces a two-dimensional partial integro-differential equation to a computationally efficient one-dimensional system and gives qualitatively accurate results in both the steady-state and nonequilibrium regimes. The method, termed the modified mean-field method, is based entirely on the governing equations, does not rely on auxiliary variables or parameters, and does not require fine-tuning. Its principles have potential applicability to more realistic (i.e., higher-dimensional) neural networks.
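As a point of reference for the population-density picture, the following Monte Carlo sketch samples the two-dimensional (v, g) state that the full density equation tracks for an uncoupled population of LIF neurons with filtered excitatory input; it does not implement the paper's modified mean-field reduction, and all parameters are illustrative:

```python
import numpy as np

# Monte Carlo baseline: uncoupled LIF neurons driven through an excitatory
# synaptic variable g with its own time constant, so the full population
# density lives in the two-dimensional (v, g) space. This only samples that
# density; it is not the dimension-reduction method of the paper.
rng = np.random.default_rng(0)
N, dt, steps = 5000, 1e-4, 10000
tau_m, tau_s, v_th, v_reset = 20e-3, 5e-3, 1.0, 0.0
rate_in, a = 2000.0, 0.1                    # input rate [Hz], jump of g per spike

v = np.zeros(N)
g = np.zeros(N)
for _ in range(steps):
    g += -dt / tau_s * g + a * rng.poisson(rate_in * dt, N)
    v += dt / tau_m * (-v + g)              # g acts as the (dimensionless) drive
    fired = v >= v_th
    v[fired] = v_reset

# Marginal density over v at the end of the run -- the quantity a 1-D reduction
# of the (v, g) density equation aims to reproduce.
hist, edges = np.histogram(v, bins=50, range=(v_reset, v_th), density=True)
print("marginal density near reset and near threshold:", hist[0], hist[-1])
```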
Affiliation(s)
- Cheng Ly
- Department of Statistical Sciences and Operations Research, Virginia Commonwealth University, Richmond, VA 23284-3083, USA.
24
Tetzlaff T, Helias M, Einevoll GT, Diesmann M. Decorrelation of neural-network activity by inhibitory feedback. PLoS Comput Biol 2012; 8:e1002596. [PMID: 23133368 PMCID: PMC3487539 DOI: 10.1371/journal.pcbi.1002596] [Citation(s) in RCA: 112] [Impact Index Per Article: 8.6] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/03/2011] [Accepted: 05/20/2012] [Indexed: 11/19/2022] Open
Abstract
Correlations in spike-train ensembles can seriously impair the encoding of information by their spatio-temporal structure. An inevitable source of correlation in finite neural networks is common presynaptic input to pairs of neurons. Recent studies demonstrate that spike correlations in recurrent neural networks are considerably smaller than expected based on the amount of shared presynaptic input. Here, we explain this observation by means of a linear network model and simulations of networks of leaky integrate-and-fire neurons. We show that inhibitory feedback efficiently suppresses pairwise correlations and, hence, population-rate fluctuations, thereby assigning inhibitory neurons the new role of active decorrelation. We quantify this decorrelation by comparing the responses of the intact recurrent network (feedback system) and systems where the statistics of the feedback channel are perturbed (feedforward system). Manipulations of the feedback statistics can lead to a significant increase in the power and coherence of the population response. In particular, neglecting correlations within the ensemble of feedback channels or between the external stimulus and the feedback amplifies population-rate fluctuations by orders of magnitude. The fluctuation suppression in homogeneous inhibitory networks is explained by a negative feedback loop in the one-dimensional dynamics of the compound activity. Similarly, a change of coordinates exposes an effective negative feedback loop in the compound dynamics of stable excitatory-inhibitory networks. The suppression of input correlations in finite networks is explained by the population-averaged correlations in the linear network model: in purely inhibitory networks, shared-input correlations are canceled by negative spike-train correlations. In excitatory-inhibitory networks, spike-train correlations are typically positive. Here, the suppression of input correlations is not a result of the mere existence of correlations between excitatory (E) and inhibitory (I) neurons, but a consequence of a particular structure of correlations among the three possible pairings (EE, EI, II).
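The one-dimensional negative-feedback argument for the compound activity can be caricatured as follows; the linear model and its parameters are illustrative, not the network simulations of the paper:

```python
import numpy as np

# Caricature of the compound (population) activity: fluctuations a(t) obey
# tau * da/dt = (w - 1) * a + noise, so inhibitory feedback (w < 0) tightens the
# effective leak and suppresses population-rate fluctuations relative to the
# open-loop case (w = 0). The expected variance ratio is 1 / (1 - w).
rng = np.random.default_rng(1)
dt, tau, sigma, steps = 1e-4, 10e-3, 1.0, 200000
xi = rng.normal(0.0, sigma * np.sqrt(dt), steps)   # shared noise for both runs

def simulate(w):
    a, trace = 0.0, np.empty(steps)
    for i in range(steps):
        a += dt / tau * (w - 1.0) * a + xi[i] / tau
        trace[i] = a
    return trace

var_fb = simulate(-10.0).var()     # strong inhibitory feedback
var_ff = simulate(0.0).var()       # feedback channel removed
print(f"variance ratio (feedback / open loop) ~ {var_fb / var_ff:.3f}")  # ~ 1/11
```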
Affiliation(s)
- Tom Tetzlaff
- Institute of Neuroscience and Medicine (INM-6), Computational and Systems Neuroscience, Research Center Jülich, Jülich, Germany.
25
Wong WK, Wang Z, Zhen B, Leung SYS. Relationship between applicability of current-based synapses and uniformity of firing patterns. Int J Neural Syst 2012; 22:1250017. [DOI: 10.1142/s0129065712500177] [Citation(s) in RCA: 17] [Impact Index Per Article: 1.3] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/18/2022]
Abstract
The purpose of this paper is to identify situations in neural network modeling where current-based synapses are applicable. The applicability of current-based synapse models for studying the post-transient behavior of neural networks is discussed in terms of the average synaptic current induced per spike during one firing cycle of a neuron (briefly, the per-spike synaptic current strength). Current-based synapse models are found to be applicable in two situations: when both the interspike intervals of the neurons and the distribution of their firing times are uniform, and when the firing of all neurons is synchronized. If neither the interspike intervals nor the distribution of firing times is uniform, or if the reversal potential lies between the resting and threshold potentials, current-based synapse models may be oversimplified.
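A minimal sketch of the comparison discussed above, a conductance-based synapse versus its current-based simplification in a single LIF neuron; the parameters and the fixed driving force are illustrative assumptions, not the paper's criterion:

```python
import numpy as np

# Conductance-based synapse, I = g(t) * (E_rev - V), versus its current-based
# simplification, I = g(t) * (E_rev - V_fix), in one LIF neuron with Poisson
# input. The comparison only shows where the two descriptions diverge.
rng = np.random.default_rng(2)
dt, T, tau_s = 1e-4, 2.0, 5e-3
C, g_L = 200e-12, 10e-9
E_L, E_rev, V_th, V_reset, V_fix = -70e-3, 0.0, -50e-3, -65e-3, -60e-3
rate_in, dg = 800.0, 2.0e-9

def run(current_based):
    V, g, n_spk = E_L, 0.0, 0
    for _ in range(int(T / dt)):
        g += -dt / tau_s * g + dg * rng.poisson(rate_in * dt)
        drive = V_fix if current_based else V      # fixed vs instantaneous voltage
        I_syn = g * (E_rev - drive)
        V += dt / C * (g_L * (E_L - V) + I_syn)
        if V >= V_th:
            V, n_spk = V_reset, n_spk + 1
    return n_spk / T

print("conductance-based rate:", run(False), "Hz")
print("current-based rate:    ", run(True), "Hz")
```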
Affiliation(s)
- W. K. Wong
- Institute of Textiles and Clothing, The Hong Kong Polytechnic University, Hunghom, Kowloon, Hong Kong, China
- Z. Wang
- College of Information Science and Technology, Donghua University, Shanghai, China
- B. Zhen
- Institute of Textiles and Clothing, The Hong Kong Polytechnic University, Hunghom, Kowloon, Hong Kong, China
- S. Y. S. Leung
- Institute of Textiles and Clothing, The Hong Kong Polytechnic University, Hunghom, Kowloon, Hong Kong, China
26
Accurate and fast simulation of channel noise in conductance-based model neurons by diffusion approximation. PLoS Comput Biol 2011; 7:e1001102. [PMID: 21423712 PMCID: PMC3053314 DOI: 10.1371/journal.pcbi.1001102] [Citation(s) in RCA: 66] [Impact Index Per Article: 4.7] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/16/2010] [Accepted: 01/28/2011] [Indexed: 11/19/2022] Open
Abstract
Stochastic channel gating is the major source of intrinsic neuronal noise, whose functional consequences at the microcircuit and network levels have been only partly explored. A systematic study of this channel noise in large ensembles of biophysically detailed model neurons calls for fast numerical methods. Exact techniques employ the microscopic simulation of the random opening and closing of individual ion channels, usually based on Markov models, whose computational load is prohibitive for next-generation massive computer models of the brain. In this work, we define an operative procedure for translating any Markov model describing voltage- or ligand-gated membrane ion conductances into an effective stochastic version whose computer simulation is efficient without compromising accuracy. Our approximation is based on an improved Langevin-like approach, which employs stochastic differential equations and no Monte Carlo methods. In contrast to an earlier proposal recently debated in the literature, our approximation accurately reproduces the statistical properties of exact microscopic simulations under a variety of conditions, from spontaneous to evoked response features. In addition, our method is not restricted to the Hodgkin-Huxley sodium and potassium currents but generalizes to a variety of voltage- and ligand-gated ion currents. As a by-product, analyzing the properties of exact Markov schemes by standard probability calculus enables us, for the first time, to analytically identify the sources of inaccuracy of the previous proposal, while providing solid ground for the modification and improvement we present here.

A possible approach to understanding the neuronal basis of the computational properties of the nervous system consists of modeling its basic building blocks, neurons and synapses, and then simulating their collective activity emerging in large networks. In developing such models, a satisfactory level of description must be chosen as a compromise between simplicity and faithfulness in reproducing experimental data. Deterministic neuron models, i.e., models that provide the same results upon repeated simulation with fixed parameter values, are usually made up of ordinary differential equations and allow for relatively fast simulation times. However, they do not accurately describe the underlying stochastic response properties arising from the microscopic correlates of neuronal excitability. Stochastic models are usually based on mathematical descriptions of individual ion channels, or on an effective macroscopic account of their random opening and closing. In this contribution we describe a general method to transform any deterministic neuron model into an effective stochastic version that accurately replicates the statistical properties of random ion-channel kinetics.
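For context, the generic Langevin (diffusion) approximation of subunit gating noise, which the paper improves upon, can be sketched as follows for a potassium-type gating variable under voltage clamp; this is the textbook scheme, not the improved method described above, and the settings are illustrative:

```python
import numpy as np

# Generic Langevin (diffusion) approximation of channel-gating noise for a
# potassium-type gating variable n with N_K identical two-state subunits, under
# voltage clamp. Rate functions are the standard Hodgkin-Huxley forms; this is
# only a sketch of the idea of replacing microscopic Markov simulation by a
# stochastic differential equation, not the improved scheme of the paper.
rng = np.random.default_rng(3)
dt, steps, N_K, V = 0.01, 100000, 1000, -50.0     # dt in ms, V clamped in mV

def alpha_n(V):
    return 0.01 * (V + 55.0) / (1.0 - np.exp(-(V + 55.0) / 10.0))   # [1/ms]

def beta_n(V):
    return 0.125 * np.exp(-(V + 65.0) / 80.0)                       # [1/ms]

a, b = alpha_n(V), beta_n(V)
n = a / (a + b)                                   # start at the steady state
trace = np.empty(steps)
for i in range(steps):
    drift = a * (1.0 - n) - b * n
    diff = np.sqrt(max(a * (1.0 - n) + b * n, 0.0) / N_K)
    n += drift * dt + diff * np.sqrt(dt) * rng.normal()
    n = min(max(n, 0.0), 1.0)                     # clip n to [0, 1]
    trace[i] = n

print("mean n:", trace.mean(), " std of n:", trace.std(), "(approx. sqrt(n(1-n)/N_K))")
```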
27
Helias M, Deger M, Rotter S, Diesmann M. Finite post synaptic potentials cause a fast neuronal response. Front Neurosci 2011; 5:19. [PMID: 21427776 PMCID: PMC3047297 DOI: 10.3389/fnins.2011.00019] [Citation(s) in RCA: 6] [Impact Index Per Article: 0.4] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/23/2010] [Accepted: 02/07/2011] [Indexed: 01/23/2023] Open
Abstract
A generic property of the communication between neurons is the exchange of pulses at discrete time points, the action potentials. However, the prevalent theory of spiking neuronal networks of integrate-and-fire model neurons relies on two assumptions: the superposition of many afferent synaptic impulses is approximated by Gaussian white noise, equivalent to a vanishing magnitude of the synaptic impulses, and the transfer of time-varying signals by neurons is assessable by linearization. Going beyond both approximations, we find that in the presence of synaptic impulses the response to transient inputs differs qualitatively from previous predictions. It is instantaneous rather than exhibiting low-pass characteristics, depends non-linearly on the amplitude of the impulse, is asymmetric for excitation and inhibition, and is promoted by a characteristic level of synaptic background noise. These findings resolve contradictions between the earlier theory and experimental observations. Here we review the recent theoretical progress that enabled these insights. We explain why the membrane potential near threshold is sensitive to properties of the afferent noise and show how this shapes the neural response. A further extension of the theory to time evolution in discrete steps quantifies simulation artifacts and yields improved methods to cross-check results.
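The fast-response phenomenon can be illustrated with a brute-force ensemble simulation; the network size, jump size, and background rate below are illustrative assumptions, not the configurations analyzed in the paper:

```python
import numpy as np

# Ensemble sketch of the fast response: LIF neurons driven by background
# Poisson input with finite jumps J receive one additional excitatory impulse
# at time bin t0. The population rate increases within that same bin rather
# than rising slowly on the membrane time scale. Illustrative parameters.
rng = np.random.default_rng(4)
N, dt, steps = 20000, 1e-4, 3000
tau_m, V_th, V_reset, J = 20e-3, 1.0, 0.0, 0.02
nu_bg, t0 = 2000.0, 1500                      # background rate [Hz], impulse bin

V = rng.uniform(0.0, V_th, N)
psth = np.zeros(steps)
for t in range(steps):
    V += -dt / tau_m * V + J * rng.poisson(nu_bg * dt, N)
    if t == t0:
        V += J                                # one extra finite-size EPSP to all neurons
    fired = V >= V_th
    psth[t] = fired.sum() / (N * dt)          # population rate in this bin [Hz]
    V[fired] = V_reset

print("rate just before the impulse:", psth[t0 - 20:t0].mean(), "Hz")
print("rate in the impulse bin:     ", psth[t0], "Hz")
```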
Affiliation(s)
- Moritz Deger
- Bernstein Center Freiburg, Albert-Ludwig University, Freiburg, Germany
- Stefan Rotter
- Bernstein Center Freiburg, Albert-Ludwig University, Freiburg, Germany
- Computational Neuroscience, Faculty of Biology, Albert-Ludwig University, Freiburg, Germany
- Markus Diesmann
- RIKEN Brain Science Institute, Wako City, Japan
- Bernstein Center Freiburg, Albert-Ludwig University, Freiburg, Germany
- Institute of Neuroscience and Medicine (INM-6), Computational and Systems Neuroscience, Research Center Jülich, Germany
- Brain and Neural Systems Team, Computational Science Research Program, RIKEN, Wako City, Japan
28
Helias M, Deger M, Rotter S, Diesmann M. Instantaneous non-linear processing by pulse-coupled threshold units. PLoS Comput Biol 2010; 6. [PMID: 20856583 PMCID: PMC2936519 DOI: 10.1371/journal.pcbi.1000929] [Citation(s) in RCA: 26] [Impact Index Per Article: 1.7] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/25/2010] [Accepted: 08/10/2010] [Indexed: 11/18/2022] Open
Abstract
Contemporary theory of spiking neuronal networks is based on the linear response of the integrate-and-fire neuron model derived in the diffusion limit. We find that for non-zero synaptic weights, the response to transient inputs differs qualitatively from this approximation. The response is instantaneous rather than exhibiting low-pass characteristics, non-linearly dependent on the input amplitude, asymmetric for excitation and inhibition, and is promoted by a characteristic level of synaptic background noise. We show that at threshold the probability density of the potential drops to zero within the range of one synaptic weight and explain how this shapes the response. The novel mechanism is exhibited on the network level and is a generic property of pulse-coupled networks of threshold units.

Our work demonstrates a fast firing response of nerve cells that has remained unconsidered in network analysis because it is inaccessible to the otherwise successful linear response theory. For the sake of analytic tractability, this theory assumes infinitesimally weak synaptic coupling. However, realistic synaptic impulses cause a measurable deflection of the membrane potential. Here we quantify the effect of this pulse coupling on the firing rate and the membrane-potential distribution. We demonstrate how the postsynaptic potentials give rise to a fast, non-linear rate transient present for excitatory, but not for inhibitory, inputs. It is particularly pronounced in the presence of a characteristic level of synaptic background noise. We show that feed-forward inhibition enhances the fast response on the network level. This enables a mode of information processing based on short-lived activity transients. Moreover, the non-linear neural response appears on a time scale that critically interacts with spike-timing-dependent synaptic plasticity rules. Our results are derived for biologically realistic synaptic amplitudes, but also extend earlier work based on Gaussian white noise. The novel theoretical framework is generically applicable to any threshold unit governed by a stochastic differential equation driven by finite jumps. Therefore, our results are relevant for a wide range of biological, physical, and technical systems.
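The boundary effect mentioned above (the membrane-potential density dropping off within one synaptic weight below threshold) can be checked with a simple Monte Carlo sketch; parameters are illustrative:

```python
import numpy as np

# With finite excitatory jumps of size J, the stationary membrane-potential
# density of an LIF neuron falls off within roughly one synaptic weight below
# threshold, in contrast to the diffusion-limit picture. Illustrative settings.
rng = np.random.default_rng(5)
N, dt, steps = 5000, 1e-4, 20000
tau_m, V_th, V_reset, J, nu = 20e-3, 1.0, 0.0, 0.05, 900.0

V = rng.uniform(0.0, V_th, N)
n_last = n_next = n_samples = 0
for t in range(steps):
    V += -dt / tau_m * V + J * rng.poisson(nu * dt, N)
    V[V >= V_th] = V_reset
    if t > steps // 2:                        # discard the initial transient
        n_last += np.count_nonzero((V > V_th - J) & (V < V_th))
        n_next += np.count_nonzero((V > V_th - 2 * J) & (V <= V_th - J))
        n_samples += N

print("probability mass in the last jump below threshold:", n_last / n_samples)
print("probability mass one jump further down:           ", n_next / n_samples)
```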
29
Buchin AY, Chizhov AV. Firing-rate model of a population of adaptive neurons. Biophysics (Nagoya-shi) 2010. [DOI: 10.1134/s0006350910040135] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.2] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/22/2022] Open
30
Moreno-Bote R, Parga N. Response of Integrate-and-Fire Neurons to Noisy Inputs Filtered by Synapses with Arbitrary Timescales: Firing Rate and Correlations. Neural Comput 2010; 22:1528-72. [DOI: 10.1162/neco.2010.06-09-1036] [Citation(s) in RCA: 34] [Impact Index Per Article: 2.3] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/04/2022]
Abstract
Neurotransmitter release produces at a synapse a current that flows through the membrane and is transmitted into the soma of the neuron, where it is integrated. The decay time of this current depends on the type of synaptic receptor and ranges from a few milliseconds (e.g., AMPA receptors) to a few hundred milliseconds (e.g., NMDA receptors). The role of this variety of synaptic timescales, several of which coexist in the same neuron, is at present not understood. A prime question is what effect temporal filtering of the incoming spike trains at different timescales has on the neuron's response. Here, building on our previous work on linear synaptic filtering, we develop a general theory for the stationary firing response of integrate-and-fire (IF) neurons receiving stochastic inputs filtered by one, two, or multiple synaptic channels, each characterized by an arbitrary timescale. The formalism applies to arbitrary IF model neurons and to arbitrary forms of input noise (i.e., not required to be Gaussian or of small amplitude), as well as to any form of synaptic filtering (linear or nonlinear). Using the adiabatic approach, the theory yields exact analytical expressions for the firing rate of an IF neuron in the limit of long synaptic time constants. The correlated spiking (cross-correlation function) of two neurons receiving common as well as independent sources of noise is also described. The theory is illustrated for leaky, quadratic, and noise-thresholded IF neurons. Although the adiabatic approach is exact only when at least one of the synaptic timescales is long, it provides a good prediction of the firing rate even when the synaptic timescales are comparable to the membrane time constant of the neuron; it is not required that the synaptic time constants be longer than the mean interspike interval or that the noise have small variance. The distribution of the potential for general IF neurons is also characterized. Our results provide powerful analytical tools for a quantitative description of the dynamics of neuronal networks with realistic synaptic dynamics.
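The adiabatic idea, in its simplest leading-order form, can be sketched as follows for a noise-free LIF neuron driven by a slow Gaussian input; this is only an illustration of the averaging step, not the general formalism of the paper, and all values are assumptions:

```python
import numpy as np

# Adiabatic sketch: when the synaptic filtering is much slower than the
# membrane, the output rate is the stationary firing rate evaluated at the
# instantaneous (slow) input, averaged over that input's stationary
# distribution. Here the slow input is Gaussian and the neuron is a noise-free
# LIF. All values are illustrative.
tau_m, V_th, V_reset, tau_ref = 20e-3, 20e-3, 0.0, 2e-3    # volts, seconds

def lif_rate(mu):
    """Deterministic LIF firing rate for a constant drive mu (in volts)."""
    if mu <= V_th:
        return 0.0
    return 1.0 / (tau_ref + tau_m * np.log((mu - V_reset) / (mu - V_th)))

mu_s, sigma_s = 22e-3, 4e-3                 # mean and SD of the slow input [V]
s = np.linspace(mu_s - 5 * sigma_s, mu_s + 5 * sigma_s, 2001)
p = np.exp(-(s - mu_s) ** 2 / (2 * sigma_s ** 2)) / (sigma_s * np.sqrt(2 * np.pi))
r = np.array([lif_rate(si) for si in s])
print("adiabatic rate estimate:", (r * p).sum() * (s[1] - s[0]), "Hz")
```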
Affiliation(s)
- Rubén Moreno-Bote
- Department of Brain and Cognitive Sciences, University of Rochester, Rochester, New York, 14627, U.S.A., and Departamento de Física Teórica, Universidad Autónoma de Madrid, Cantoblanco 28049, Madrid, Spain
- Néstor Parga
- Departamento de Física Teórica, Universidad Autónoma de Madrid, Cantoblanco 28049, Madrid, Spain
31
Fiasconaro A, Spagnolo B. Stability measures in metastable states with Gaussian colored noise. Phys Rev E Stat Nonlin Soft Matter Phys 2009; 80:041110. [PMID: 19905276 DOI: 10.1103/physreve.80.041110] [Citation(s) in RCA: 8] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Received: 06/28/2009] [Revised: 08/23/2009] [Indexed: 05/28/2023]
Abstract
We present a study of the escape time from a metastable state of an overdamped Brownian particle in the presence of colored noise generated by an Ornstein-Uhlenbeck process. We analyze the role of the correlation time in the enhancement of the mean first-passage time through a potential barrier and in the behavior of the mean growth-rate coefficient as a function of the noise intensity. We observe the noise-enhanced stability effect for all the initial unstable states used and for all values of the correlation time τ_c investigated. We can distinguish two dynamical regimes, characterized by weakly and strongly correlated noise, depending on the value of τ_c relative to the relaxation time of the system.
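A minimal simulation of the setup described above, an overdamped particle in a double-well potential driven by Ornstein-Uhlenbeck noise, might look as follows; the potential, noise intensity, and trial counts are illustrative choices, not those of the paper:

```python
import numpy as np

# Escape-time sketch: an overdamped particle in U(x) = x^4/4 - x^2/2 starts in
# the metastable well at x = -1 and is driven by Ornstein-Uhlenbeck noise of
# intensity D and correlation time tau_c. The mean first-passage time to the
# barrier top (x = 0) is estimated over many trials; trials that do not escape
# within t_max are counted at t_max, so long escape times are underestimated.
rng = np.random.default_rng(6)

def mean_escape_time(tau_c, D=0.1, dt=2e-3, n_trials=400, t_max=300.0):
    x = -np.ones(n_trials)                    # start in the left well
    eta = np.zeros(n_trials)                  # OU noise with variance D / tau_c
    t_escape = np.full(n_trials, t_max)
    alive = np.ones(n_trials, dtype=bool)
    t = 0.0
    while t < t_max and alive.any():
        eta += -dt / tau_c * eta + np.sqrt(2.0 * D * dt) / tau_c * rng.normal(size=n_trials)
        x += dt * (-(x ** 3 - x) + eta)       # overdamped dynamics, -U'(x) = -(x^3 - x)
        t += dt
        crossed = alive & (x >= 0.0)
        t_escape[crossed] = t
        alive &= ~crossed
    return t_escape.mean()

for tau_c in (0.1, 1.0, 3.0):
    print(f"tau_c = {tau_c}: mean escape time ~ {mean_escape_time(tau_c):.1f}")
```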
Affiliation(s)
- Alessandro Fiasconaro
- Dipartimento di Fisica e Tecnologie Relative, Group of Interdisciplinary Physics, Università di Palermo and CNISM-INFM, Viale delle Scienze, I-90128 Palermo, Italy.
32
Chizhov AV, Smirnova EY, Graham LJ. Mapping between V1 models of orientation selectivity: From a distributed multi-population conductance-based refractory density model to a firing-rate ring model. BMC Neurosci 2009. [DOI: 10.1186/1471-2202-10-s1-p181] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.1] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/10/2022] Open
33
Ly C, Tranchina D. Spike train statistics and dynamics with synaptic input from any renewal process: a population density approach. Neural Comput 2009; 21:360-96. [PMID: 19431264 DOI: 10.1162/neco.2008.03-08-743] [Citation(s) in RCA: 21] [Impact Index Per Article: 1.3] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/04/2022]
Abstract
In the probability density function (PDF) approach to neural network modeling, a common simplifying assumption is that the arrival times of elementary postsynaptic events are governed by a Poisson process. This assumption ignores temporal correlations in the input that sometimes have important physiological consequences. We extend PDF methods to models with synaptic event times governed by any modulated renewal process. We focus on the integrate-and-fire neuron with instantaneous synaptic kinetics and a random elementary excitatory postsynaptic potential (EPSP), A. Between presynaptic events, the membrane voltage, v, decays exponentially toward rest, while s, the time since the last synaptic input event, evolves with unit velocity. When a synaptic event arrives, v jumps by A, and s is reset to zero. If v crosses the threshold voltage, an action potential occurs, and v is reset to v_reset. The probability per unit time of a synaptic event at time t, given the elapsed time s since the last event, h(s, t), depends on specifics of the renewal process. We study how regularity of the train of synaptic input events affects output spike rate, PDF and coefficient of variation (CV) of the interspike interval, and the autocorrelation function of the output spike train. In the limit of a deterministic, clocklike train of input events, the PDF of the interspike interval converges to a sum of delta functions, with coefficients determined by the PDF for A. The limiting autocorrelation function of the output spike train is a sum of delta functions whose coefficients fall under a damped oscillatory envelope. When the EPSP CV, σ_A/μ_A, is equal to 0.45, a CV for the intersynaptic event interval, σ_T/μ_T = 0.35, is functionally equivalent to a deterministic periodic train of synaptic input events (CV = 0) with respect to spike statistics. We discuss the relevance to neural network simulations.
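A Monte Carlo companion to this setup, an LIF neuron driven by a gamma renewal input process with instantaneous jumps, can be sketched as follows; the shape parameter, jump size, and rates are illustrative assumptions:

```python
import numpy as np

# LIF neuron with instantaneous synaptic kinetics receiving excitatory events
# from a gamma renewal process (more regular than Poisson for shape k > 1);
# each event makes the voltage jump by A. The output interspike-interval CV is
# estimated from the resulting spike train. Illustrative parameters.
rng = np.random.default_rng(7)
tau_m, V_th, V_reset, A = 20e-3, 1.0, 0.0, 0.1
rate_in, k, T = 600.0, 4.0, 200.0             # mean input rate [Hz], gamma shape k

# Event times of a gamma renewal process with mean interval 1 / rate_in.
isi_in = rng.gamma(shape=k, scale=1.0 / (k * rate_in), size=int(2 * rate_in * T))
events = np.cumsum(isi_in)
events = events[events < T]

V, t_prev, out_spikes = 0.0, 0.0, []
for t in events:
    V *= np.exp(-(t - t_prev) / tau_m)        # exact decay between input events
    t_prev = t
    V += A                                    # instantaneous EPSP jump
    if V >= V_th:
        out_spikes.append(t)
        V = V_reset

isi_out = np.diff(out_spikes)
print("output rate:", len(out_spikes) / T, "Hz,  output ISI CV:", isi_out.std() / isi_out.mean())
```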
Affiliation(s)
- Cheng Ly
- Department of Mathematics, University of Pittsburgh, Pittsburgh, PA 15260, USA.
34
Ly C, Doiron B. Divisive gain modulation with dynamic stimuli in integrate-and-fire neurons. PLoS Comput Biol 2009; 5:e1000365. [PMID: 19390603 PMCID: PMC2667215 DOI: 10.1371/journal.pcbi.1000365] [Citation(s) in RCA: 18] [Impact Index Per Article: 1.1] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/14/2008] [Accepted: 03/18/2009] [Indexed: 11/18/2022] Open
Abstract
The modulation of the sensitivity, or gain, of neural responses to input is an important component of neural computation. It has been shown that divisive gain modulation of neural responses can result from stochastic shunting by balanced (mixed excitation and inhibition) background activity. This gain control scheme was developed and explored with static inputs, where the membrane and spike-train statistics were stationary in time. However, input statistics, such as the firing rates of presynaptic neurons, are often dynamic, varying on timescales comparable to typical membrane time constants. Using a population density approach for integrate-and-fire neurons with dynamic and temporally rich inputs, we find that the same fluctuation-induced divisive gain modulation is operative for dynamic inputs driving nonequilibrium responses. Moreover, the degree of divisive scaling of the dynamic response is quantitatively the same as for the steady-state responses; thus, gain modulation via balanced conductance fluctuations generalizes in a straightforward way to a dynamic setting.

Many neural computations, including sensory and motor processing, require neurons to control their sensitivity (often termed 'gain') to stimuli. One common form of gain manipulation is divisive gain control, where the neural response to a specific stimulus is simply scaled by a constant. Most previous theoretical and experimental work on divisive gain control has assumed input statistics to be constant in time. However, realistic inputs can be highly time-varying, often with time-varying statistics, and divisive gain control remains to be extended to these cases. A widespread mechanism for divisive gain control with static inputs is an increase in stimulus-independent membrane fluctuations. We address the question of whether this divisive gain control scheme is indeed operative for time-varying inputs. Using simplified spiking neuron models, we employ accurate theoretical methods to estimate the dynamic neural response. We find that gain control via membrane fluctuations does indeed extend to the time-varying regime, and moreover, the degree of divisive scaling does not depend on the timescales of the driving input. This significantly increases the relevance of this form of divisive gain control for neural computations where input statistics change in time, as expected during normal sensory and motor behavior.
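The static version of the balanced-background gain-control scheme can be sketched as follows; the conductance-based LIF and all parameters are illustrative, and this sketch does not reproduce the paper's population-density treatment of dynamic inputs:

```python
import numpy as np

# Gain control by balanced background activity with static inputs: a
# conductance-based LIF receives balanced excitatory/inhibitory Poisson
# background plus a constant driving current, and its f-I curve is compared for
# a low and a high background rate. Raising the background both shunts the
# membrane and increases fluctuations, lowering the slope (gain) of the f-I
# curve. Illustrative parameters only.
rng = np.random.default_rng(8)
dt, T = 1e-4, 5.0
C, g_L, E_L, E_e, E_i = 200e-12, 10e-9, -65e-3, 0.0, -80e-3
V_th, V_reset, tau_s, w_e, w_i = -50e-3, -65e-3, 5e-3, 1.0e-9, 4.0e-9

def rate(I_drive, nu_bg):
    V, g_e, g_i, n = E_L, 0.0, 0.0, 0
    for _ in range(int(T / dt)):
        g_e += -dt / tau_s * g_e + w_e * rng.poisson(nu_bg * dt)
        g_i += -dt / tau_s * g_i + w_i * rng.poisson(nu_bg * dt)
        V += dt / C * (g_L * (E_L - V) + g_e * (E_e - V) + g_i * (E_i - V) + I_drive)
        if V >= V_th:
            V, n = V_reset, n + 1
    return n / T

currents = np.arange(0.0, 1.3e-9, 0.4e-9)
for nu_bg in (500.0, 2000.0):
    fI = [round(rate(I, nu_bg), 1) for I in currents]
    print(f"background rate {nu_bg:.0f} Hz: f-I = {fI} Hz")
```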
Affiliation(s)
- Cheng Ly
- Department of Mathematics, University of Pittsburgh, Pittsburgh, Pennsylvania, United States of America
- Brent Doiron
- Department of Mathematics, University of Pittsburgh, Pittsburgh, Pennsylvania, United States of America