1
Schlungbaum M, Barayeu A, Grewe J, Benda J, Lindner B. Effect of burst spikes on linear and nonlinear signal transmission in spiking neurons. J Comput Neurosci 2025; 53:37-60. PMID: 39560916; PMCID: PMC11868171; DOI: 10.1007/s10827-024-00883-1.
Abstract
We study the impact of bursts on spike statistics and neural signal transmission. We propose a stochastic burst algorithm that is applied to a burst-free spike train and adds a random number of temporally jittered burst spikes to each spike. This simple algorithm ignores any possible stimulus dependence of bursting but allows us to relate the spectra and signal-transmission characteristics of burst-free and burst-endowed spike trains. By averaging over the various statistical ensembles, we find a frequency-dependent factor connecting the linear and also the second-order susceptibility of the spike trains with and without bursts. The relation between the spectra is more complicated: besides a frequency-dependent multiplicative factor, it also involves an additional frequency-dependent offset. We confirm these relations for the (burst-free) spike trains of a stochastic integrate-and-fire neuron and identify frequency ranges in which the transmission is boosted or diminished by bursting. We then consider bursty spike trains of electroreceptor afferents of weakly electric fish and approach the role of burst spikes as follows. We compare the spectral statistics of the bursty spike train to (i) those of a spike train with burst spikes removed and to (ii) those of the spike train in (i) endowed with bursts according to our algorithm. Significant spectral features are explained by our signal-independent burst algorithm, e.g., the burst-induced boosting of the nonlinear response. A difference is seen in the information transfer between the original bursty spike train and our burst-endowed spike train. Our algorithm is thus helpful for identifying different effects of bursting.
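The burst algorithm described in this abstract can be sketched as follows; the geometric distribution for the number of added spikes, the intra-burst interval, and the jitter amplitude below are illustrative assumptions, not the parameter choices of the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def add_burst_spikes(spike_times, p_extra=0.5, isi_burst=2.0, jitter=0.2):
    """Endow a burst-free spike train (times in ms) with burst spikes:
    after each spike, append a random number of extra spikes at multiples
    of the intra-burst interval, each with Gaussian temporal jitter."""
    out = []
    for t in spike_times:
        out.append(t)
        n_extra = rng.geometric(1.0 - p_extra) - 1   # 0, 1, 2, ... extra spikes
        for k in range(1, n_extra + 1):
            out.append(t + k * isi_burst + jitter * rng.standard_normal())
    return np.sort(np.asarray(out))

# Burst-free train: renewal process with 20 ms mean interspike interval
train = np.cumsum(rng.exponential(20.0, size=200))
bursty = add_burst_spikes(train)
```

Spectra of `train` and `bursty` could then be compared directly, which is the point of the signal-independent construction: the burst statistics are decoupled from the stimulus by design.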
Affiliation(s)
- Maria Schlungbaum
- Bernstein Center for Computational Neuroscience Berlin, Philippstr. 13, Haus 2, 10115, Berlin, Germany.
- Physics Department of Humboldt University Berlin, Newtonstr. 15, 12489, Berlin, Germany.
- Alexandra Barayeu
- Neuroethology, Institute for Neurobiology of Eberhard Karls University Tübingen, Auf der Morgenstelle 28, 72076, Tübingen, Germany
- Jan Grewe
- Neuroethology, Institute for Neurobiology of Eberhard Karls University Tübingen, Auf der Morgenstelle 28, 72076, Tübingen, Germany
- Bernstein Center for Computational Neuroscience Tübingen, Maria-von-Linden-Straße 6, 72076, Tübingen, Germany
- Jan Benda
- Neuroethology, Institute for Neurobiology of Eberhard Karls University Tübingen, Auf der Morgenstelle 28, 72076, Tübingen, Germany
- Bernstein Center for Computational Neuroscience Tübingen, Maria-von-Linden-Straße 6, 72076, Tübingen, Germany
- Benjamin Lindner
- Bernstein Center for Computational Neuroscience Berlin, Philippstr. 13, Haus 2, 10115, Berlin, Germany
- Physics Department of Humboldt University Berlin, Newtonstr. 15, 12489, Berlin, Germany
2
Kiessling L, Lindner B. Extraction of parameters of a stochastic integrate-and-fire model with adaptation from voltage recordings. Biol Cybern 2024; 119:2. PMID: 39738681; PMCID: PMC11685267; DOI: 10.1007/s00422-024-01000-2.
Abstract
Integrate-and-fire models are an important class of phenomenological neuronal models that are frequently used in computational studies of single-neuron activity, population activity, and recurrent neural networks. If these models are used to understand and interpret electrophysiological data, it is important to reliably estimate the values of the model's parameters. However, there are no standard methods for the parameter estimation of integrate-and-fire models. Here, we identify the model parameters of an adaptive integrate-and-fire neuron with temporally correlated noise by analyzing the membrane potential and spike trains in response to a current step. Explicit formulas for the parameters are analytically derived by stationary and time-dependent ensemble averaging of the model dynamics. Specifically, we give mathematical expressions for the adaptation time constant, the adaptation strength, the membrane time constant, and the mean constant input current. These theoretical predictions are validated by numerical simulations for a broad range of system parameters. Importantly, we demonstrate that the parameters can be extracted using only a modest number of trials. This is particularly encouraging, as the number of trials in experimental settings is often limited. Hence, our formulas may be useful for the extraction of effective parameters from neurophysiological data obtained from standard current-step experiments.
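The ensemble-averaging idea can be illustrated on the simplest possible case: a plain leaky integrator without adaptation and with made-up parameters (the paper's formulas for the adaptive, colored-noise case are more involved). The membrane time constant and steady-state voltage are recovered from trial-averaged step responses:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic ensemble of voltage traces after a current step at t = 0:
# V(t) = v_inf + (v_0 - v_inf) * exp(-t / tau_m), plus measurement noise.
tau_m, v_inf, v_0, dt = 10.0, -55.0, -70.0, 0.1        # ms, mV, mV, ms
t = np.arange(0.0, 60.0, dt)
clean = v_inf + (v_0 - v_inf) * np.exp(-t / tau_m)
trials = clean + 0.5 * rng.standard_normal((100, t.size))

# Ensemble average, then a log-linear fit of the early decay
v_bar = trials.mean(axis=0)
v_inf_hat = v_bar[-100:].mean()                         # plateau estimate
y = np.log(np.abs(v_bar[:300] - v_inf_hat))             # t < 30 ms
tau_hat = -1.0 / np.polyfit(t[:300], y, 1)[0]
```

With 100 trials the estimates land within a few percent of the true values, consistent with the paper's point that a modest number of trials suffices.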
Affiliation(s)
- Lilli Kiessling
- Bernstein Center for Computational Neuroscience Berlin, Philippstr. 13, Haus 2, 10115, Berlin, Germany.
- Physics Department, Technische Universität Berlin, Hardenbergstr. 36, 10623, Berlin, Germany.
- Benjamin Lindner
- Bernstein Center for Computational Neuroscience Berlin, Philippstr. 13, Haus 2, 10115, Berlin, Germany
- Physics Department, Humboldt University Berlin, Newtonstr. 15, 12489, Berlin, Germany
3
Goldobin DS, di Volo M, Torcini A. Discrete Synaptic Events Induce Global Oscillations in Balanced Neural Networks. Phys Rev Lett 2024; 133:238401. PMID: 39714685; DOI: 10.1103/physrevlett.133.238401.
Abstract
Despite the fact that neural dynamics is triggered by discrete synaptic events, the neural response is usually obtained within the diffusion approximation, which represents the synaptic inputs as Gaussian noise. We derive a mean-field formalism encompassing synaptic shot noise for sparse balanced neural networks. For low (high) excitatory drive (inhibitory feedback), global oscillations emerge via continuous or hysteretic transitions, correctly predicted by our approach but not by the diffusion approximation. At sufficiently low in-degrees, the nature of these global oscillations changes from drift-driven dynamics to cluster activation.
Affiliation(s)
- Alessandro Torcini
- Laboratoire de Physique Théorique et Modélisation, CY Cergy Paris Université, CNRS, UMR 8089, 95302 Cergy-Pontoise cedex, France
- CNR-Consiglio Nazionale delle Ricerche-Istituto dei Sistemi Complessi, via Madonna del Piano 10, 50019 Sesto Fiorentino, Italy
- INFN Sezione di Firenze, Via Sansone 1, 50019 Sesto Fiorentino, Italy
4
Vinci GV, Mattia M. Rosetta stone for the population dynamics of spiking neuron networks. Phys Rev E 2024; 110:034303. PMID: 39425388; DOI: 10.1103/physreve.110.034303.
Abstract
Populations of spiking neuron models have densities of their microscopic variables (e.g., single-cell membrane potentials) whose evolution fully captures the collective dynamics of biological networks, even outside equilibrium. Despite its general applicability, the Fokker-Planck equation governing such evolution is mainly studied within the borders of linear response theory, although alternative spectral-expansion approaches offer some advantages in the study of out-of-equilibrium dynamics. This is mainly due to the difficulty of computing the state-dependent coefficients of the expanded system of differential equations. Here, we address this issue by deriving analytic expressions for such coefficients by pairing perturbative solutions of the Fokker-Planck approach with their counterparts from the spectral expansion. A tight relationship emerges between several of these coefficients and the Laplace transform of the interspike-interval density (i.e., the distribution of first-passage times). "Coefficients" like the current-to-rate gain function, the eigenvalues of the Fokker-Planck operator, and its eigenfunctions at the boundaries are derived without resorting to integral expressions. For leaky integrate-and-fire neurons, the coupling terms between stationary and nonstationary modes are also worked out, paving the way to accurately characterizing the critical points and relaxation timescales in networks of interacting populations.
5
Spaeth A, Haussler D, Teodorescu M. Model-agnostic neural mean field with a data-driven transfer function. Neuromorph Comput Eng 2024; 4:034013. PMID: 39310743; PMCID: PMC11413991; DOI: 10.1088/2634-4386/ad787f.
Abstract
The brain is one of the most complex systems known to science, and modeling its behavior and function is both fascinating and extremely difficult. Empirical data are increasingly available from ex vivo human brain organoids and surgical samples, as well as from in vivo animal models, so the problem of modeling the behavior of large-scale neuronal systems is more relevant than ever. The statistical-physics concept of a mean-field model offers a tractable way to bridge the gap between single-neuron and population-level descriptions of neuronal activity, by modeling the behavior of a single representative neuron and extending this to the population. However, existing neural mean-field methods typically either take the limit of small interaction sizes or are applicable only to the specific neuron models for which they were derived. This paper derives a mean-field model by fitting a transfer function called Refractory SoftPlus, which is simple yet applicable to a broad variety of neuron types. The transfer function is fitted numerically to simulated spike-time data and is entirely agnostic to the underlying neuronal dynamics. The resulting mean-field model predicts the response of a network of randomly connected neurons to a time-varying external stimulus with a high degree of accuracy. Furthermore, it enables an accurate approximate bifurcation analysis as a function of the level of recurrent input. This model does not assume large presynaptic rates or small postsynaptic potential size, allowing mean-field models to be developed even for populations with large interaction terms.
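A plausible sketch of such a transfer function (the exact parameterization used by the authors may differ; the gain, scale, and t_ref values below are placeholders): a SoftPlus firing rate saturated by an absolute refractory period, which behaves like a plain SoftPlus for weak input and approaches 1/t_ref for strong input.

```python
import numpy as np

def softplus(x):
    # Numerically stable log(1 + exp(x))
    return np.log1p(np.exp(-np.abs(x))) + np.maximum(x, 0.0)

def refractory_softplus(x, gain=1.0, scale=20.0, t_ref=0.002):
    """Firing rate (Hz) as a SoftPlus of the input, saturated by an
    absolute refractory period t_ref (s): r -> 1/t_ref for large input."""
    f = scale * softplus(gain * x)        # unconstrained SoftPlus rate
    return f / (1.0 + t_ref * f)          # refractory saturation

x = np.linspace(-5.0, 50.0, 200)
r = refractory_softplus(x)
```

In a data-driven workflow, gain, scale, and t_ref would be adjusted by least squares against firing rates measured from simulated neurons, which is what makes the approach agnostic to the underlying neuron model.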
Affiliation(s)
- Alex Spaeth
- Electrical and Computer Engineering Department, University of California, Santa Cruz, Santa Cruz, CA, United States of America
- Genomics Institute, University of California, Santa Cruz, Santa Cruz, CA, United States of America
- David Haussler
- Genomics Institute, University of California, Santa Cruz, Santa Cruz, CA, United States of America
- Biomolecular Engineering Department, University of California, Santa Cruz, Santa Cruz, CA, United States of America
- Mircea Teodorescu
- Electrical and Computer Engineering Department, University of California, Santa Cruz, Santa Cruz, CA, United States of America
- Genomics Institute, University of California, Santa Cruz, Santa Cruz, CA, United States of America
- Biomolecular Engineering Department, University of California, Santa Cruz, Santa Cruz, CA, United States of America
6
Ramlow L, Lindner B. Noise intensity of a Markov chain. Phys Rev E 2024; 110:014139. PMID: 39161007; DOI: 10.1103/physreve.110.014139.
Abstract
Stochastic transitions between discrete microscopic states play an important role in many physical and biological systems. Often these transitions lead to fluctuations on a macroscopic scale. A classic example from neuroscience is the stochastic opening and closing of ion channels and the resulting fluctuations in membrane current. When the microscopic transitions are fast, the macroscopic fluctuations are nearly uncorrelated and can be fully characterized by their mean and noise intensity. We show how, for an arbitrary Markov chain, the noise intensity can be determined from an algebraic equation, based on the transition rate matrix; these results are in agreement with earlier results from the theory of zero-frequency noise in quantum mechanical and classical systems. We demonstrate the validity of the theory using an analytically tractable two-state Markovian dichotomous noise, an eight-state model for a calcium channel subunit (De Young-Keizer model), and Markov models of the voltage-gated sodium and potassium channels as they appear in a stochastic version of the Hodgkin-Huxley model.
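For the analytically tractable two-state case mentioned above, the noise intensity is the stationary variance times the correlation time. A sketch (with illustrative rates and amplitudes, not taken from the paper) comparing this formula against a direct Monte-Carlo estimate:

```python
import numpy as np

rng = np.random.default_rng(2)

# Two-state dichotomous (telegraph) noise: values a_p/a_m, switching
# rates k_p (out of the + state) and k_m (out of the - state).
a_p, a_m, k_p, k_m = 1.0, -1.0, 0.5, 0.5
p_plus = k_m / (k_p + k_m)                       # stationary probability of +
# Noise intensity = stationary variance * correlation time 1/(k_p + k_m)
D_theory = p_plus * (1 - p_plus) * (a_p - a_m) ** 2 / (k_p + k_m)

def integrated_noise(T):
    """Integral of one telegraph realization over [0, T]."""
    t = 0.0
    x = a_p if rng.random() < p_plus else a_m
    area = 0.0
    while t < T:
        rate = k_p if x == a_p else k_m
        dwell = min(rng.exponential(1.0 / rate), T - t)
        area += x * dwell
        t += dwell
        x = a_m if x == a_p else a_p
    return area

# Noise intensity from the variance of the integrated process:
# Var(int_0^T x dt) ~ 2 D T for T much longer than the correlation time
T = 200.0
samples = np.array([integrated_noise(T) for _ in range(400)])
D_sim = samples.var() / (2.0 * T)
```

The general result of the paper replaces this two-state bookkeeping by an algebraic equation built from the full transition rate matrix, which is what makes many-state channel models tractable.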
7
Puttkammer F, Lindner B. Fluctuation-response relations for integrate-and-fire models with an absolute refractory period. Biol Cybern 2024; 118:7-19. PMID: 38261004; PMCID: PMC11068698; DOI: 10.1007/s00422-023-00982-9.
Abstract
We study the problem of relating the spontaneous fluctuations of a stochastic integrate-and-fire (IF) model to the response of the instantaneous firing rate to time-dependent stimulation when the IF model is endowed with a non-vanishing refractory period and a finite (stereotypical) spike shape. This seemingly harmless addition to the model is shown to complicate the analysis put forward by Lindner, Phys. Rev. Lett. 129, 198101 (2022), i.e., the incorporation of the reset into the model equation, the Rice-like averaging of the stochastic differential equation, and the application of the Furutsu-Novikov theorem. We derive a still exact (although more complicated) fluctuation-response relation (FRR) for an IF model with a refractory state and white Gaussian background noise. We also briefly discuss an approximation for the case of colored Gaussian noise and conclude with a summary and an outlook on open problems.
Affiliation(s)
- Friedrich Puttkammer
- Bernstein Center for Computational Neuroscience Berlin, Philippstr. 13, Haus 2, 10115, Berlin, Germany
- Physics Department of Humboldt University Berlin, Newtonstr. 15, 12489, Berlin, Germany
- Benjamin Lindner
- Bernstein Center for Computational Neuroscience Berlin, Philippstr. 13, Haus 2, 10115, Berlin, Germany.
- Physics Department of Humboldt University Berlin, Newtonstr. 15, 12489, Berlin, Germany.
8
Spaeth A, Haussler D, Teodorescu M. Model-Agnostic Neural Mean Field With The Refractory SoftPlus Transfer Function. bioRxiv 2024:2024.02.05.579047. PMID: 38370695; PMCID: PMC10871173; DOI: 10.1101/2024.02.05.579047.
Abstract
Due to the complexity of neuronal networks and the nonlinear dynamics of individual neurons, it is challenging to develop a systems-level model which is accurate enough to be useful yet tractable enough to apply. Mean-field models which extrapolate from single-neuron descriptions to large-scale models can be derived from the neuron's transfer function, which gives its firing rate as a function of its synaptic input. However, analytically derived transfer functions are applicable only to the neurons and noise models from which they were originally derived. In recent work, approximate transfer functions have been empirically derived by fitting a sigmoidal curve, which imposes a maximum firing rate and applies only in the diffusion limit, restricting applications. In this paper, we propose an approximate transfer function called Refractory SoftPlus, which is simple yet applicable to a broad variety of neuron types. Refractory SoftPlus activation functions allow the derivation of simple empirically approximated mean-field models using simulation results, which enables prediction of the response of a network of randomly connected neurons to a time-varying external stimulus with a high degree of accuracy. These models also support an accurate approximate bifurcation analysis as a function of the level of recurrent input. Finally, the model works without assuming large presynaptic rates or small postsynaptic potential size, allowing mean-field models to be developed even for populations with large interaction terms.
Affiliation(s)
- Alex Spaeth
- Electrical and Computer Engineering Department, University of California, Santa Cruz, Santa Cruz, CA, United States
- Genomics Institute, University of California, Santa Cruz, Santa Cruz, CA, United States
- David Haussler
- Genomics Institute, University of California, Santa Cruz, Santa Cruz, CA, United States
- Biomolecular Engineering Department, University of California, Santa Cruz, Santa Cruz, CA, United States
- Mircea Teodorescu
- Electrical and Computer Engineering Department, University of California, Santa Cruz, Santa Cruz, CA, United States
- Genomics Institute, University of California, Santa Cruz, Santa Cruz, CA, United States
- Biomolecular Engineering Department, University of California, Santa Cruz, Santa Cruz, CA, United States
9
Richardson MJE. Linear and nonlinear integrate-and-fire neurons driven by synaptic shot noise with reversal potentials. Phys Rev E 2024; 109:024407. PMID: 38491664; DOI: 10.1103/physreve.109.024407.
Abstract
The steady-state firing rate and firing-rate response of the leaky and exponential integrate-and-fire models receiving synaptic shot noise with excitatory and inhibitory reversal potentials are examined. For the particular case where the underlying synaptic conductances are exponentially distributed, it is shown that the master equation for a population of such model neurons can be reduced from an integrodifferential form to a more tractable set of three differential equations. The system is nevertheless more challenging analytically than for current-based synapses: analytical results are provided where possible, together with an efficient numerical scheme and code for other quantities. The increased tractability of the framework developed here supports an ongoing critical comparison between models in which synapses are treated with and without reversal potentials, such as recently in the context of networks with balanced excitatory and inhibitory conductances.
Affiliation(s)
- Magnus J E Richardson
- Warwick Mathematics Institute, University of Warwick, Coventry CV4 7AL, United Kingdom
10
Becker LA, Li B, Priebe NJ, Seidemann E, Taillefumier T. Exact analysis of the subthreshold variability for conductance-based neuronal models with synchronous synaptic inputs. bioRxiv 2023:2023.04.17.536739. PMID: 37131647; PMCID: PMC10153111; DOI: 10.1101/2023.04.17.536739.
Abstract
The spiking activity of neocortical neurons exhibits a striking level of variability, even when these networks are driven by identical stimuli. The approximately Poisson firing of neurons has led to the hypothesis that these neural networks operate in the asynchronous state. In the asynchronous state, neurons fire independently from one another, so that the probability that a neuron experiences synchronous synaptic inputs is exceedingly low. While models of asynchronous neurons lead to the observed spiking variability, it is not clear whether the asynchronous state can also account for the level of subthreshold membrane-potential variability. We propose a new analytical framework to rigorously quantify the subthreshold variability of a single conductance-based neuron in response to synaptic inputs with prescribed degrees of synchrony. Technically, we leverage the theory of exchangeability to model input synchrony via jump-process-based synaptic drives; we then perform a moment analysis of the stationary response of a neuronal model with all-or-none conductances that neglects post-spiking reset. As a result, we produce exact, interpretable closed forms for the first two stationary moments of the membrane voltage, with explicit dependence on the input synaptic numbers, strengths, and synchrony. For biophysically relevant parameters, we find that the asynchronous regime only yields realistic subthreshold variability (voltage variance ≅ 4-9 mV²) when driven by a restricted number of large synapses, compatible with strong thalamic drive. By contrast, we find that achieving realistic subthreshold variability with dense cortico-cortical inputs requires including weak but nonzero input synchrony, consistent with measured pairwise spiking correlations. We also show that without synchrony, the neural variability averages out to zero for all scaling limits with vanishing synaptic weights, independent of any balanced-state hypothesis. This result challenges the theoretical basis for mean-field theories of the asynchronous state.
11
Lindner B. Fluctuation-Dissipation Relations for Spiking Neurons. Phys Rev Lett 2022; 129:198101. PMID: 36399734; DOI: 10.1103/physrevlett.129.198101.
Abstract
Spontaneous fluctuations and stimulus response are essential features of neural functioning, but how they are connected is poorly understood. I derive fluctuation-dissipation relations (FDRs) between the spontaneous spike and voltage correlations and the firing-rate susceptibility for (i) the leaky integrate-and-fire (IF) model with white noise and (ii) an IF model with arbitrary voltage dependence, an adaptation current, and correlated noise. The FDRs can be used to derive thus far unknown statistics analytically [model (i)] or the otherwise inaccessible intrinsic noise statistics [model (ii)].
Affiliation(s)
- Benjamin Lindner
- Bernstein Center for Computational Neuroscience Berlin, Philippstraße 13, Haus 2, 10115 Berlin, Germany and Physics Department of Humboldt University Berlin, Newtonstraße 15, 12489 Berlin, Germany
12
Wang B, Aljadeff J. Multiplicative Shot-Noise: A New Route to Stability of Plastic Networks. Phys Rev Lett 2022; 129:068101. PMID: 36018633; DOI: 10.1103/physrevlett.129.068101.
Abstract
Fluctuations of synaptic weights, among many other physical, biological, and ecological quantities, are driven by coincident events of two "parent" processes. We propose a multiplicative shot-noise model that can capture the behaviors of a broad range of such natural phenomena, and analytically derive an approximation that accurately predicts its statistics. We apply our results to study the effects of a multiplicative synaptic plasticity rule that was recently extracted from measurements in physiological conditions. Using mean-field theory analysis and network simulations, we investigate how this rule shapes the connectivity and dynamics of recurrent spiking neural networks. The multiplicative plasticity rule is shown to support efficient learning of input stimuli, and it gives a stable, unimodal synaptic-weight distribution with a large fraction of strong synapses. The strong synapses remain stable over long times but do not "run away." Our results suggest that the multiplicative shot-noise offers a new route to understand the tradeoff between flexibility and stability in neural circuits and other dynamic networks.
Affiliation(s)
- Bin Wang
- Department of Physics, University of California San Diego, La Jolla, California 92093, USA
- Johnatan Aljadeff
- Department of Neurobiology, University of California San Diego, La Jolla, California 92093, USA
13
Knoll G, Lindner B. Information transmission in recurrent networks: Consequences of network noise for synchronous and asynchronous signal encoding. Phys Rev E 2022; 105:044411. PMID: 35590546; DOI: 10.1103/physreve.105.044411.
Abstract
Information about natural time-dependent stimuli encoded by the sensory periphery or communication between cortical networks may span a large frequency range or be localized to a smaller frequency band. Biological systems have been shown to multiplex such disparate broadband and narrow-band signals and then discriminate them in later populations by employing either an integration (low-pass) or coincidence detection (bandpass) encoding strategy. Analytical expressions have been developed for both encoding methods in feedforward populations of uncoupled neurons and confirm that the integration of a population's output low-pass filters the information, whereas synchronous output encodes less information overall and retains signal information in a selected frequency band. The present study extends the theory to recurrent networks and shows that recurrence may sharpen the synchronous bandpass filter. The frequency of the pass band is significantly influenced by the synaptic strengths, especially for inhibition-dominated networks. Synchronous information transfer is also increased when network models take into account heterogeneity that arises from the stochastic distribution of the synaptic weights.
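The two readout strategies contrasted in this abstract can be illustrated on a toy feedforward population (all numbers below are arbitrary demo parameters, unrelated to the paper's networks): summing spikes per bin low-pass filters a shared slow signal, while a coincidence readout only reports bins with many synchronous spikes.

```python
import numpy as np

rng = np.random.default_rng(3)

# N Poisson-like neurons share a slow sinusoidal rate modulation.
N, dt, T = 50, 0.001, 20.0                                 # neurons, bin (s), s
t = np.arange(0.0, T, dt)
rate = 20.0 * (1.0 + 0.8 * np.sin(2 * np.pi * 2.0 * t))    # Hz, 2 Hz signal
spikes = rng.random((N, t.size)) < rate * dt               # Bernoulli bins

integrator = spikes.sum(axis=0)                 # population spike count (low-pass)
coincidence = (integrator >= 4).astype(float)   # bins with >= 4 synchronous spikes

# The integrated output tracks the slow signal
c = np.corrcoef(integrator, rate)[0, 1]
```

The coincidence channel fires only in a small fraction of bins, sketching how synchrony-based readouts discard overall rate information while remaining selective.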
Affiliation(s)
- Gregory Knoll
- Bernstein Center for Computational Neuroscience Berlin, Philippstr. 13, Haus 2, 10115 Berlin, Germany and Physics Department of Humboldt University Berlin, Newtonstr. 15, 12489 Berlin, Germany
- Benjamin Lindner
- Bernstein Center for Computational Neuroscience Berlin, Philippstr. 13, Haus 2, 10115 Berlin, Germany and Physics Department of Humboldt University Berlin, Newtonstr. 15, 12489 Berlin, Germany
14
di Volo M, Segneri M, Goldobin DS, Politi A, Torcini A. Coherent oscillations in balanced neural networks driven by endogenous fluctuations. Chaos 2022; 32:023120. PMID: 35232059; DOI: 10.1063/5.0075751.
Abstract
We present a detailed analysis of the dynamical regimes observed in a balanced network of identical quadratic integrate-and-fire neurons with sparse connectivity, for homogeneous and heterogeneous in-degree distributions. Depending on the parameter values, either an asynchronous regime or periodic oscillations spontaneously emerge. Numerical simulations are compared with a mean-field model based on a self-consistent Fokker-Planck equation (FPE). The FPE reproduces the asynchronous dynamics in the homogeneous case quite well by assuming either a Poissonian or a renewal distribution for the incoming spike trains. An exact self-consistent solution for the mean firing rate, obtained in the limit of infinite in-degree, allows us to identify balanced regimes that can be either mean- or fluctuation-driven. A low-dimensional reduction of the FPE in terms of circular cumulants is also considered. Two cumulants suffice to reproduce the transition scenario observed in the network. The emergence of periodic collective oscillations is well captured, both in the homogeneous and heterogeneous setups, by the mean-field models upon tuning either the connectivity or the input DC current. In the heterogeneous situation, we also analyze the role of structural heterogeneity.
Affiliation(s)
- Matteo di Volo
- Laboratoire de Physique Théorique et Modélisation, UMR 8089, CY Cergy Paris Université, CNRS, 95302 Cergy-Pontoise, France
- Marco Segneri
- Laboratoire de Physique Théorique et Modélisation, UMR 8089, CY Cergy Paris Université, CNRS, 95302 Cergy-Pontoise, France
- Denis S Goldobin
- Institute of Continuous Media Mechanics, Ural Branch of RAS, Acad. Korolev street 1, 614013 Perm, Russia
- Antonio Politi
- Institute for Pure and Applied Mathematics and Department of Physics (SUPA), Old Aberdeen, Aberdeen AB24 3UE, United Kingdom
- Alessandro Torcini
- Laboratoire de Physique Théorique et Modélisation, UMR 8089, CY Cergy Paris Université, CNRS, 95302 Cergy-Pontoise, France
15
Chakraborty B, Mukhopadhyay S. Characterization of Generalizability of Spike Timing Dependent Plasticity Trained Spiking Neural Networks. Front Neurosci 2021; 15:695357. PMID: 34776837; PMCID: PMC8589121; DOI: 10.3389/fnins.2021.695357.
Abstract
A Spiking Neural Network (SNN) can be trained with Spike Timing Dependent Plasticity (STDP), a neuro-inspired unsupervised learning method for various machine learning applications. This paper studies the generalizability properties of the STDP learning processes using the Hausdorff dimension of the trajectories of the learning algorithm. It analyzes the effects of STDP learning models and associated hyper-parameters on the generalizability properties of an SNN. The analysis is used to develop a Bayesian optimization approach to optimize the hyper-parameters of an STDP model to improve the generalizability properties of an SNN.
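A pair-based STDP update of the kind such models build on can be sketched as follows; the exponential-window form is the standard textbook rule, and the amplitudes and time constants here are illustrative, not the hyper-parameters studied in the paper.

```python
import numpy as np

def stdp_dw(dt_ms, a_plus=0.01, a_minus=0.012, tau_plus=20.0, tau_minus=20.0):
    """Weight change for a pre/post spike pair separated by
    dt_ms = t_post - t_pre: exponential potentiation when the presynaptic
    spike precedes the postsynaptic one, exponential depression otherwise."""
    dt_ms = np.asarray(dt_ms, dtype=float)
    return np.where(dt_ms >= 0.0,
                    a_plus * np.exp(-dt_ms / tau_plus),
                    -a_minus * np.exp(dt_ms / tau_minus))

dw_pot = float(stdp_dw(5.0))     # pre fires 5 ms before post -> potentiation
dw_dep = float(stdp_dw(-5.0))    # post fires before pre      -> depression
```

The hyper-parameters a_plus, a_minus, tau_plus, and tau_minus are exactly the kind of quantities a Bayesian optimization loop would tune when optimizing generalizability.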
Affiliation(s)
- Biswadeep Chakraborty
- Department of Electrical and Computer Engineering, Georgia Institute of Technology, Atlanta, GA, United States
16
Foo C, Lozada A, Aljadeff J, Li Y, Wang JW, Slesinger PA, Kleinfeld D. Reinforcement learning links spontaneous cortical dopamine impulses to reward. Curr Biol 2021; 31:4111-4119.e4. PMID: 34302743; DOI: 10.1016/j.cub.2021.06.069.
Abstract
In their pioneering study on dopamine release, Romo and Schultz speculated "...that the amount of dopamine released by unmodulated spontaneous impulse activity exerts a tonic, permissive influence on neuronal processes more actively engaged in preparation of self-initiated movements...."1 Motivated by the suggestion of "spontaneous impulses," as well as by the "ramp up" of dopaminergic neuronal activity that occurs when rodents navigate to a reward,2-5 we asked two questions. First, are there spontaneous impulses of dopamine that are released in cortex? Using cell-based optical sensors of extrasynaptic dopamine, [DA]ex,6 we found that spontaneous dopamine impulses in cortex of naive mice occur at a rate of ∼0.01 per second. Next, can mice be trained to change the amplitude and/or timing of dopamine events triggered by internal brain dynamics, much as they can change the amplitude and timing of dopamine impulses based on an external cue?7-9 Using a reinforcement learning paradigm based solely on rewards that were gated by feedback from real-time measurements of [DA]ex, we found that mice can volitionally modulate their spontaneous [DA]ex. In particular, by only the second session of daily, hour-long training, mice increased the rate of impulses of [DA]ex, increased the amplitude of the impulses, and increased their tonic level of [DA]ex for a reward. Critically, mice learned to reliably elicit [DA]ex impulses prior to receiving a reward. These effects reversed when the reward was removed. We posit that spontaneous dopamine impulses may serve as a salient cognitive event in behavioral planning.
Affiliation(s)
- Conrad Foo
- Department of Physics, University of California at San Diego, La Jolla, CA 92093, USA
- Adrian Lozada
- Department of Physics, University of California at San Diego, La Jolla, CA 92093, USA
- Johnatan Aljadeff
- Section of Neurobiology, University of California at San Diego, La Jolla, CA 92093, USA
- Yulong Li
- School of Life Sciences, Peking University, Beijing 100871, P.R. China
- Jing W Wang
- Section of Neurobiology, University of California at San Diego, La Jolla, CA 92093, USA
- Paul A Slesinger
- Department of Neuroscience, Icahn School of Medicine at Mount Sinai, New York, NY, USA
- David Kleinfeld
- Department of Physics, University of California at San Diego, La Jolla, CA 92093, USA
- Section of Neurobiology, University of California at San Diego, La Jolla, CA 92093, USA

17
Knoll G, Lindner B. Recurrence-mediated suprathreshold stochastic resonance. J Comput Neurosci 2021; 49:407-418. [PMID: 34003421 PMCID: PMC8556192 DOI: 10.1007/s10827-021-00788-3] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/09/2021] [Revised: 04/21/2021] [Accepted: 04/26/2021] [Indexed: 11/29/2022]
Abstract
It has previously been shown that the encoding of time-dependent signals by feedforward networks (FFNs) of processing units exhibits suprathreshold stochastic resonance (SSR), i.e., optimal signal transmission at a finite level of independent, individual stochasticity in the single units. In this study, a recurrent spiking network is simulated to demonstrate that SSR can also be caused by network noise in place of intrinsic noise. The level of autonomously generated fluctuations in the network can be controlled by the strength of synapses, and hence the coding fraction (our measure of information transmission) exhibits a maximum as a function of the synaptic coupling strength. The presence of a coding peak at an optimal coupling strength is robust over a wide range of individual, network, and signal parameters, although the optimal strength and peak magnitude depend on the parameter being varied. We also perform control experiments with an FFN illustrating that the optimized coding fraction is due to the change in noise level and not to other effects entailed by changing the coupling strength. These results also indicate that the non-white (temporally correlated) network noise in general provides an extra boost to encoding performance compared to the FFN driven by intrinsic white noise fluctuations.
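The SSR effect itself is easy to reproduce in a toy model: a population of binary threshold units driven by a common signal plus independent noise transmits the signal best at an intermediate noise level. The sketch below uses intrinsic white noise (the paper's point is that recurrent network noise can play the same role), and all parameter values are arbitrary illustrative choices:

```python
import math
import random

def coding_fraction_proxy(sigma, n_units=32, n_samples=4000, seed=0):
    """Correlation between a common sinusoidal signal and the population
    rate of n_units independent threshold units with noise level sigma."""
    rng = random.Random(seed)
    signal = [math.sin(2 * math.pi * t / 200.0) for t in range(n_samples)]
    rate = []
    for s in signal:
        # each unit "spikes" if signal + private noise crosses threshold 0
        active = sum(1 for _ in range(n_units) if s + rng.gauss(0.0, sigma) > 0)
        rate.append(active / n_units)
    mean_s = sum(signal) / n_samples
    mean_r = sum(rate) / n_samples
    cov = sum((s - mean_s) * (r - mean_r) for s, r in zip(signal, rate))
    var_s = sum((s - mean_s) ** 2 for s in signal)
    var_r = sum((r - mean_r) ** 2 for r in rate)
    return cov / math.sqrt(var_s * var_r)

# Transmission is best at intermediate noise: with almost no noise all
# units act as one hard threshold, with too much noise the signal drowns.
low, mid, high = (coding_fraction_proxy(s) for s in (0.01, 0.5, 5.0))
print(mid > low and mid > high)  # True
```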
Affiliation(s)
- Gregory Knoll
- Bernstein Center for Computational Neuroscience Berlin, Philippstr. 13, Haus 2, 10115 Berlin, Germany
- Physics Department of Humboldt University Berlin, Newtonstr. 15, 12489 Berlin, Germany
- Benjamin Lindner
- Bernstein Center for Computational Neuroscience Berlin, Philippstr. 13, Haus 2, 10115 Berlin, Germany
- Physics Department of Humboldt University Berlin, Newtonstr. 15, 12489 Berlin, Germany

18
Platkiewicz J, Saccomano Z, McKenzie S, English D, Amarasingham A. Monosynaptic inference via finely-timed spikes. J Comput Neurosci 2021; 49:131-157. [PMID: 33507429 DOI: 10.1007/s10827-020-00770-5] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/12/2019] [Revised: 09/04/2020] [Accepted: 10/19/2020] [Indexed: 10/22/2022]
Abstract
Observations of finely-timed spike relationships in population recordings have been used to support partial reconstruction of neural microcircuit diagrams. In this approach, fine-timescale components of paired spike train interactions are isolated and subsequently attributed to synaptic parameters. Recent perturbation studies strengthen the case for such an inference, yet the complete set of measurements needed to calibrate statistical models is unavailable. To address this gap, we study features of pairwise spiking in a large-scale in vivo dataset where presynaptic neurons were explicitly decoupled from network activity by juxtacellular stimulation. We then construct biophysical models of paired spike trains to reproduce the observed phenomenology of in vivo monosynaptic interactions, including both fine-timescale spike-spike correlations and firing irregularity. A key characteristic of these models is that the paired neurons are coupled by rapidly-fluctuating background inputs. We quantify a monosynapse's causal effect by comparing the postsynaptic train with its counterfactual, when the monosynapse is removed. Subsequently, we develop statistical techniques for estimating this causal effect from the pre- and post-synaptic spike trains. A particular focus is the justification and application of a nonparametric separation of timescale principle to implement synaptic inference. Using simulated data generated from the biophysical models, we characterize the regimes in which the estimators accurately identify the monosynaptic effect. A secondary goal is to initiate a critical exploration of neurostatistical assumptions in terms of biophysical mechanisms, particularly with regards to the challenging but arguably fundamental issue of fast, unobservable nonstationarities in background dynamics.
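The "fine-timescale components of paired spike train interactions" are conventionally read off a cross-correlogram, a histogram of spike-time lags between a putative pre- and postsynaptic train, in which a monosynaptic connection shows up as a sharp, short-latency peak. A minimal sketch (bin width and lag range are arbitrary choices, not the paper's settings):

```python
def cross_correlogram(pre, post, max_lag=10.0, bin_width=1.0):
    """Histogram of post-minus-pre spike-time lags (ms) in [-max_lag, max_lag).

    A narrow peak a few ms to the right of zero is the classic signature
    attributed to a monosynaptic excitatory connection.
    """
    n_bins = int(2 * max_lag / bin_width)
    counts = [0] * n_bins
    for t_pre in pre:
        for t_post in post:
            lag = t_post - t_pre
            if -max_lag <= lag < max_lag:
                counts[int((lag + max_lag) / bin_width)] += 1
    return counts

# Two presynaptic spikes, each followed ~2 ms later by a postsynaptic spike:
ccg = cross_correlogram([0.0, 100.0], [2.0, 102.0, 500.0])
print(ccg[12])  # 2 pairs fall in the +2 ms bin
```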
Affiliation(s)
- Jonathan Platkiewicz
- Department of Mathematics, The City College of New York, The City University of New York, New York, NY, 10031, USA
- Zachary Saccomano
- Department of Biology, The Graduate Center, The City University of New York, New York, NY, 10016, USA
- Sam McKenzie
- Neuroscience Institute, New York University, New York, NY, 10016, USA
- Daniel English
- School of Neuroscience, Virginia Tech, Blacksburg, VA, 24060, USA
- Asohan Amarasingham
- Department of Mathematics, The City College of New York, The City University of New York, New York, NY, 10031, USA
- Department of Biology, The Graduate Center, The City University of New York, New York, NY, 10016, USA
- Departments of Computer Science and Psychology, The Graduate Center, The City University of New York, New York, NY, 10016, USA

19
Bernardi D, Doron G, Brecht M, Lindner B. A network model of the barrel cortex combined with a differentiator detector reproduces features of the behavioral response to single-neuron stimulation. PLoS Comput Biol 2021; 17:e1007831. [PMID: 33556070 PMCID: PMC7895413 DOI: 10.1371/journal.pcbi.1007831] [Citation(s) in RCA: 9] [Impact Index Per Article: 2.3] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/25/2020] [Revised: 02/19/2021] [Accepted: 01/17/2021] [Indexed: 11/23/2022] Open
Abstract
The stimulation of a single neuron in the rat somatosensory cortex can elicit a behavioral response. The probability of a behavioral response does not depend appreciably on the duration or intensity of a constant stimulation, whereas the response probability increases significantly upon injection of an irregular current. Biological mechanisms that can potentially suppress a constant input signal are present in the dynamics of both neurons and synapses and seem ideal candidates to explain these experimental findings. Here, we study a large network of integrate-and-fire neurons with several salient features of neuronal populations in the rat barrel cortex. The model includes cellular spike-frequency adaptation, experimentally constrained numbers and types of chemical synapses endowed with short-term plasticity, and gap junctions. Numerical simulations of this model indicate that cellular and synaptic adaptation mechanisms alone may not suffice to account for the experimental results if the local network activity is read out by an integrator. However, a circuit that approximates a differentiator can detect the single-cell stimulation with a reliability that barely depends on the length or intensity of the stimulus, but that increases when an irregular signal is used. This finding is in accordance with the experimental results obtained for the stimulation of a regularly-spiking excitatory cell.
It is widely assumed that only a large group of neurons can encode a stimulus or control behavior. This tenet of neuroscience has been challenged by experiments in which stimulating a single cortical neuron has had a measurable effect on an animal’s behavior. Recently, theoretical studies have explored how a single-neuron stimulation could be detected in a large recurrent network. However, these studies missed essential biological mechanisms of cortical networks and are unable to explain more recent experiments in the barrel cortex. Here, to describe the stimulated brain area, we propose and study a network model endowed with many important biological features of the barrel cortex. Importantly, we also investigate different readout mechanisms, i.e. ways in which the stimulation effects can propagate to other brain areas. We show that a readout network which tracks rapid variations in the local network activity is in agreement with the experiments. Our model demonstrates a possible mechanism for how the stimulation of a single neuron translates into a signal at the population level, which is taken as a proxy of the animal’s response. Our results illustrate the power of spiking neural networks to properly describe the effects of a single neuron’s activity.
Affiliation(s)
- Davide Bernardi
- Bernstein Center for Computational Neuroscience Berlin, Berlin, Germany
- Institut für Physik, Humboldt-Universität zu Berlin, Berlin, Germany
- Center for Translational Neurophysiology of Speech and Communication, Fondazione Istituto Italiano di Tecnologia, Ferrara, Italy
- Guy Doron
- Bernstein Center for Computational Neuroscience Berlin, Berlin, Germany
- Michael Brecht
- Bernstein Center for Computational Neuroscience Berlin, Berlin, Germany
- Benjamin Lindner
- Bernstein Center for Computational Neuroscience Berlin, Berlin, Germany
- Institut für Physik, Humboldt-Universität zu Berlin, Berlin, Germany

20
Zerlaut Y, Zucca S, Panzeri S, Fellin T. The Spectrum of Asynchronous Dynamics in Spiking Networks as a Model for the Diversity of Non-rhythmic Waking States in the Neocortex. Cell Rep 2020; 27:1119-1132.e7. [PMID: 31018128 PMCID: PMC6486483 DOI: 10.1016/j.celrep.2019.03.102] [Citation(s) in RCA: 9] [Impact Index Per Article: 1.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/07/2018] [Revised: 03/02/2019] [Accepted: 03/27/2019] [Indexed: 11/15/2022] Open
Abstract
The awake cortex exhibits diverse non-rhythmic network states. However, how these states emerge and how each state impacts network function is unclear. Here, we demonstrate that model networks of spiking neurons with moderate recurrent interactions display a spectrum of non-rhythmic asynchronous dynamics based on the level of afferent excitation, from afferent input-dominated (AD) regimes, characterized by unbalanced synaptic currents and sparse firing, to recurrent input-dominated (RD) regimes, characterized by balanced synaptic currents and dense firing. The model predicted regime-specific relationships between different neural biophysical properties, which were all experimentally validated in the somatosensory cortex (S1) of awake mice. Moreover, AD regimes more precisely encoded spatiotemporal patterns of presynaptic activity, while RD regimes better encoded the strength of afferent inputs. These results provide a theoretical foundation for how recurrent neocortical circuits generate non-rhythmic waking states and how these different states modulate the processing of incoming information.
Affiliation(s)
- Yann Zerlaut
- Neural Coding Laboratory, Istituto Italiano di Tecnologia, Genova, Italy
- Neural Computation Laboratory, Center for Neuroscience and Cognitive Systems @UniTn, Istituto Italiano di Tecnologia, Rovereto, Italy
- Stefano Zucca
- Neural Coding Laboratory, Istituto Italiano di Tecnologia, Genova, Italy
- Optical Approaches to Brain Function Laboratory, Istituto Italiano di Tecnologia, Genova, Italy
- Stefano Panzeri
- Neural Coding Laboratory, Istituto Italiano di Tecnologia, Genova, Italy
- Neural Computation Laboratory, Center for Neuroscience and Cognitive Systems @UniTn, Istituto Italiano di Tecnologia, Rovereto, Italy
- Tommaso Fellin
- Neural Coding Laboratory, Istituto Italiano di Tecnologia, Genova, Italy
- Optical Approaches to Brain Function Laboratory, Istituto Italiano di Tecnologia, Genova, Italy

21
Bostner Ž, Knoll G, Lindner B. Information filtering by coincidence detection of synchronous population output: analytical approaches to the coherence function of a two-stage neural system. BIOLOGICAL CYBERNETICS 2020; 114:403-418. [PMID: 32583370 PMCID: PMC7326833 DOI: 10.1007/s00422-020-00838-6] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.4] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Figures] [Subscribe] [Scholar Register] [Received: 04/25/2020] [Accepted: 05/18/2020] [Indexed: 06/11/2023]
Abstract
Information about time-dependent sensory stimuli is encoded in the activity of neural populations; distinct aspects of the stimulus are read out by different types of neurons: while overall information is perceived by integrator cells, so-called coincidence detector cells are driven mainly by the synchronous activity in the population that encodes predominantly high-frequency content of the input signal (high-pass information filtering). Previously, an analytically accessible statistic called the partial synchronous output was introduced as a proxy for the coincidence detector cell's output in order to approximate its information transmission. In the first part of the current paper, we compare the information filtering properties (specifically, the coherence function) of this proxy to those of a simple coincidence detector neuron. We show that the latter's coherence function can indeed be well-approximated by the partial synchronous output with a time scale and threshold criterion that are related approximately linearly to the membrane time constant and firing threshold of the coincidence detector cell. In the second part of the paper, we propose an alternative theory for the spectral measures (including the coherence) of the coincidence detector cell that combines linear-response theory for shot-noise driven integrate-and-fire neurons with a novel perturbation ansatz for the spectra of spike-trains driven by colored noise. We demonstrate how the variability of the synaptic weights for connections from the population to the coincidence detector can shape the information transmission of the entire two-stage system.
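The readout distinction above can be made concrete with a toy coincidence detector that fires only when enough population spikes arrive within a short window; the window length and threshold below are illustrative stand-ins for the membrane time constant and firing threshold discussed in the paper:

```python
def coincidence_detector(input_trains, window=2.0, threshold=3):
    """Times at which at least `threshold` input spikes arrive within
    `window` ms, approximating a coincidence-detector readout cell."""
    events = sorted(t for train in input_trains for t in train)
    out = []
    for i, t in enumerate(events):
        # count input spikes falling in [t, t + window)
        j = i
        while j < len(events) and events[j] < t + window:
            j += 1
        # emit a spike, with a simple refractory rule to avoid duplicates
        if j - i >= threshold and (not out or t - out[-1] >= window):
            out.append(t)
    return out

# Three afferents spiking near t = 10 ms drive an output spike; their
# asynchronous spikes at 50, 80 and 120 ms are ignored.
trains = [[10.0, 50.0], [10.5, 80.0], [11.0, 120.0]]
print(coincidence_detector(trains))  # [10.0]
```

Because only synchronous input survives this readout, high-frequency signal components that modulate population synchrony are transmitted preferentially, which is the high-pass information filtering the abstract refers to.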
Affiliation(s)
- Žiga Bostner
- Physics Department, Humboldt University Berlin, Newtonstr. 15, 12489 Berlin, Germany
- Gregory Knoll
- Physics Department, Humboldt University Berlin, Newtonstr. 15, 12489 Berlin, Germany
- Bernstein Center for Computational Neuroscience Berlin, Philippstr. 13, Haus 2, 10115 Berlin, Germany
- Benjamin Lindner
- Physics Department, Humboldt University Berlin, Newtonstr. 15, 12489 Berlin, Germany
- Bernstein Center for Computational Neuroscience Berlin, Philippstr. 13, Haus 2, 10115 Berlin, Germany

22
Manz P, Goedeke S, Memmesheimer RM. Dynamics and computation in mixed networks containing neurons that accelerate towards spiking. Phys Rev E 2019; 100:042404. [PMID: 31770941 DOI: 10.1103/physreve.100.042404] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.2] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/21/2018] [Indexed: 11/07/2022]
Abstract
Networks in the brain consist of different types of neurons. Here we investigate the influence of neuron diversity on the dynamics, phase space structure, and computational capabilities of spiking neural networks. We find that even a single neuron of a different type can qualitatively change the network dynamics and that mixed networks may combine the computational capabilities of networks with a single neuron type. We study inhibitory networks of concave leaky (LIF) and convex "antileaky" (XIF) integrate-and-fire neurons that generalize irregularly spiking nonchaotic LIF neuron networks. Endowed with simple conductance-based synapses for XIF neurons, our networks can generate a balanced state of irregular asynchronous spiking as well. We determine the voltage probability distributions and self-consistent firing rates assuming Poisson input with finite-size spike impacts. Further, we compute the full spectrum of Lyapunov exponents (LEs) and the covariant Lyapunov vectors (CLVs) specifying the corresponding perturbation directions. We find that there is approximately one positive LE for each XIF neuron. This indicates in particular that a single XIF neuron renders the network dynamics chaotic. A simple mean-field approach, which can be justified by properties of the CLVs, explains the finding. As an application, we propose a spike-based computing scheme where our networks serve as computational reservoirs and their different stability properties yield different computational capabilities.
Affiliation(s)
- Paul Manz
- Neural Network Dynamics and Computation, Institute for Genetics, University of Bonn, 53115 Bonn, Germany
- Sven Goedeke
- Neural Network Dynamics and Computation, Institute for Genetics, University of Bonn, 53115 Bonn, Germany
- Raoul-Martin Memmesheimer
- Neural Network Dynamics and Computation, Institute for Genetics, University of Bonn, 53115 Bonn, Germany

23
Baker C, Ebsch C, Lampl I, Rosenbaum R. Correlated states in balanced neuronal networks. Phys Rev E 2019; 99:052414. [PMID: 31212573 DOI: 10.1103/physreve.99.052414] [Citation(s) in RCA: 10] [Impact Index Per Article: 1.7] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/18/2018] [Indexed: 06/09/2023]
Abstract
Understanding the magnitude and structure of interneuronal correlations and their relationship to synaptic connectivity structure is an important and difficult problem in computational neuroscience. Early studies show that neuronal network models with excitatory-inhibitory balance naturally create very weak spike train correlations, defining the "asynchronous state." Later work showed that, under some connectivity structures, balanced networks can produce larger correlations between some neuron pairs, even when the average correlation is very small. All of these previous studies assume that the local network receives feedforward synaptic input from a population of uncorrelated spike trains. We show that when spike trains providing feedforward input are correlated, the downstream recurrent network produces much larger correlations. We provide an in-depth analysis of the resulting "correlated state" in balanced networks and show that, unlike the asynchronous state, it produces a tight excitatory-inhibitory balance consistent with in vivo cortical recordings.
Affiliation(s)
- Cody Baker
- Department of Applied and Computational Mathematics and Statistics, University of Notre Dame, Notre Dame, Indiana 46556, USA
- Christopher Ebsch
- Department of Applied and Computational Mathematics and Statistics, University of Notre Dame, Notre Dame, Indiana 46556, USA
- Ilan Lampl
- Department of Neurobiology, Weizmann Institute of Science, Rehovot, 7610001, Israel
- Robert Rosenbaum
- Department of Applied and Computational Mathematics and Statistics, University of Notre Dame, Notre Dame, Indiana 46556, USA
- Interdisciplinary Center for Network Science and Applications, University of Notre Dame, Notre Dame, Indiana 46556, USA

24
Bernardi D, Lindner B. Detecting single-cell stimulation in a large network of integrate-and-fire neurons. Phys Rev E 2019; 99:032304. [PMID: 30999410 DOI: 10.1103/physreve.99.032304] [Citation(s) in RCA: 4] [Impact Index Per Article: 0.7] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/30/2018] [Indexed: 06/09/2023]
Abstract
Several experiments have shown that the stimulation of a single neuron in the cortex can influence the local network activity and even the behavior of an animal. From the theoretical point of view, it is not clear how stimulating a single cell in a cortical network can evoke a statistically significant change in the activity of a large population. Our previous study considered a random network of integrate-and-fire neurons and proposed a way of detecting the stimulation of a single neuron in the activity of a local network: a threshold detector biased toward a specific subset of neurons. Here, we revisit this model and extend it by introducing a second network acting as a readout. In the simplest scenario, the readout consists of a collection of integrate-and-fire neurons with no recurrent connections. In this case, the ability to detect the stimulus does not improve. However, a readout network with both feed-forward and local recurrent inhibition permits detection with a very small bias, if compared to the readout scheme introduced previously. The crucial role of inhibition is to reduce global input cross correlations, the main factor limiting detectability. Finally, we show that this result is robust if recurrent excitatory connections are included or if a different kind of readout bias (in the synaptic amplitudes instead of connection probability) is used.
Affiliation(s)
- Davide Bernardi
- Bernstein Center for Computational Neuroscience Berlin, Philippstraße 13, Haus 2, 10115 Berlin, Germany
- Physics Department of Humboldt University Berlin, Newtonstraße 15, 12489 Berlin, Germany
- Benjamin Lindner
- Bernstein Center for Computational Neuroscience Berlin, Philippstraße 13, Haus 2, 10115 Berlin, Germany
- Physics Department of Humboldt University Berlin, Newtonstraße 15, 12489 Berlin, Germany

25
Heiberg T, Kriener B, Tetzlaff T, Einevoll GT, Plesser HE. Firing-rate models for neurons with a broad repertoire of spiking behaviors. J Comput Neurosci 2018; 45:103-132. [PMID: 30146661 PMCID: PMC6208914 DOI: 10.1007/s10827-018-0693-9] [Citation(s) in RCA: 7] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/09/2018] [Revised: 08/01/2018] [Accepted: 08/02/2018] [Indexed: 11/29/2022]
Abstract
Capturing the response behavior of spiking neuron models with rate-based models facilitates the investigation of neuronal networks using powerful methods for rate-based network dynamics. To this end, we investigate the responses of two widely used neuron model types, the Izhikevich and augmented multi-adaptive threshold (AMAT) models, to spiking inputs ranging from step responses to natural spike data. We find (i) that linear-nonlinear firing rate models fitted to test data can be used to describe the firing-rate responses of AMAT and Izhikevich spiking neuron models in many cases; (ii) that firing-rate responses are generally too complex to be captured by first-order low-pass filters but require bandpass filters instead; (iii) that linear-nonlinear models capture the response of AMAT models better than that of Izhikevich models; (iv) that the wide range of response types evoked by current-injection experiments collapses to few response types when neurons are driven by stationary or sinusoidally modulated Poisson input; and (v) that AMAT and Izhikevich models show different responses to spike input despite identical responses to current injections. Together, these findings suggest that rate-based models of network dynamics may capture a wider range of neuronal response properties by incorporating second-order bandpass filters fitted to responses of spiking model neurons. These models may contribute to bringing rate-based network modeling closer to the reality of biological neuronal networks.
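The linear-nonlinear cascade used in such fits can be sketched generically: convolve the input with a linear filter, then apply a static nonlinearity. Here a rectified-linear nonlinearity and a toy two-tap kernel stand in for the fitted bandpass filters and nonlinearities of the paper:

```python
def ln_rate(stimulus, kernel, gain=5.0, threshold=0.0):
    """Linear-nonlinear cascade: filter the stimulus, then rectify.

    stimulus and kernel are sampled on the same time grid; the output is
    a non-negative firing rate for each time step.
    """
    rates = []
    for t in range(len(stimulus)):
        # causal convolution of the stimulus with the linear kernel
        drive = sum(kernel[k] * stimulus[t - k]
                    for k in range(min(len(kernel), t + 1)))
        rates.append(gain * max(drive - threshold, 0.0))
    return rates

# Impulse response of the cascade: the kernel shape, scaled by the gain.
print(ln_rate([1.0, 0.0, 0.0, 0.0], [1.0, 0.5]))  # [5.0, 2.5, 0.0, 0.0]
```

A bandpass version of this model, as the findings above suggest, would simply use a biphasic kernel (positive lobe followed by a negative one) in place of the all-positive toy kernel.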
Affiliation(s)
- Thomas Heiberg
- Faculty of Science and Technology, Norwegian University of Life Sciences, Ås, Norway
- Birgit Kriener
- Faculty of Science and Technology, Norwegian University of Life Sciences, Ås, Norway
- Institute of Basic Medical Sciences, University of Oslo, Oslo, Norway
- Tom Tetzlaff
- Institute of Neuroscience and Medicine (INM-6), Jülich Research Centre, Jülich, Germany
- Institute for Advanced Simulation (IAS-6), Jülich Research Centre, Jülich, Germany
- JARA Institute Brain Structure-Function Relationships (INM-10), Jülich Research Centre, Jülich, Germany
- Gaute T Einevoll
- Faculty of Science and Technology, Norwegian University of Life Sciences, Ås, Norway
- Department of Physics, University of Oslo, Oslo, Norway
- Hans E Plesser
- Faculty of Science and Technology, Norwegian University of Life Sciences, Ås, Norway
- Institute of Neuroscience and Medicine (INM-6), Jülich Research Centre, Jülich, Germany

26
Bird AD, Richardson MJE. Transmission of temporally correlated spike trains through synapses with short-term depression. PLoS Comput Biol 2018; 14:e1006232. [PMID: 29933363 PMCID: PMC6039054 DOI: 10.1371/journal.pcbi.1006232] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.4] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/15/2017] [Revised: 07/10/2018] [Accepted: 05/24/2018] [Indexed: 11/18/2022] Open
Abstract
Short-term synaptic depression, caused by depletion of releasable neurotransmitter, modulates the strength of neuronal connections in a history-dependent manner. Quantifying the statistics of synaptic transmission requires stochastic models that link probabilistic neurotransmitter release with presynaptic spike-train statistics. Common approaches are to model the presynaptic spike train as either regular or a memory-less Poisson process: few analytical results are available that describe depressing synapses when the afferent spike train has more complex, temporally correlated statistics such as bursts. Here we present a series of analytical results—from vesicle release-site occupancy statistics, via neurotransmitter release, to the post-synaptic voltage mean and variance—for depressing synapses driven by correlated presynaptic spike trains. The class of presynaptic drive considered is that fully characterised by the inter-spike-interval distribution and encompasses a broad range of models used for neuronal circuit and network analyses, such as integrate-and-fire models with a complete post-spike reset and receiving sufficiently short-time correlated drive. We further demonstrate that the derived post-synaptic voltage mean and variance allow for a simple and accurate approximation of the firing rate of the post-synaptic neuron, using the exponential integrate-and-fire model as an example. These results extend the level of biological detail included in models of synaptic transmission and will allow for the incorporation of more complex and physiologically relevant firing patterns into future studies of neuronal networks.
Synapses between neurons transmit signals with strengths that vary with the history of their activity, over scales from milliseconds to decades. Short-term changes in synaptic strength modulate and sculpt ongoing neuronal activity, whereas long-term changes underpin memory formation. Here we focus on changes of strength over timescales of less than a second caused by transitory depletion of the neurotransmitters that carry signals across the synapse. Neurotransmitters are stored in small vesicles that release their contents, with a certain probability, when the presynaptic neuron is active. Once a vesicle has been used it is replenished after a variable delay. There is therefore a complex interaction between the pattern of incoming signals to the synapse and the probabilistic release and restock of packaged neurotransmitter. Here we extend existing models to examine how correlated synaptic activity is transmitted through synapses and affects the voltage fluctuations and firing rate of the target neuron. Our results provide a framework that will allow for the inclusion of biophysically realistic synaptic behaviour in studies of neuronal circuits.
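The depletion-and-recovery mechanism described above is often caricatured by deterministic Tsodyks-Markram-style dynamics. The sketch below (release fraction and recovery time constant are arbitrary illustrative values, and the stochastic, finite-vesicle aspects of the paper's model are omitted) shows why temporally correlated, bursty input releases less transmitter per spike:

```python
import math

def depressing_release(spike_times, u=0.5, tau_d=300.0):
    """Relative synaptic efficacy at each spike time (ms).

    x tracks the fraction of releasable resources: it recovers toward 1
    with time constant tau_d and loses a fraction u at every spike.
    """
    x, t_last, releases = 1.0, None, []
    for t in spike_times:
        if t_last is not None:
            # exponential recovery of resources since the previous spike
            x = 1.0 - (1.0 - x) * math.exp(-(t - t_last) / tau_d)
        releases.append(u * x)   # transmitter released by this spike
        x -= u * x               # depletion
        t_last = t
    return releases

# A tight burst depresses the synapse strongly; sparse spikes barely do.
burst = depressing_release([0.0, 10.0, 20.0])
sparse = depressing_release([0.0, 1000.0, 2000.0])
print(burst[2] < sparse[2])  # True
```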
Affiliation(s)
- Alex D. Bird
- Warwick Systems Biology Centre, University of Warwick, Coventry, United Kingdom
- Ernst Strüngmann Institute for Neuroscience, Max Planck Society, Frankfurt, Germany
- Frankfurt Institute for Advanced Studies, Frankfurt, Germany
- Magnus J. E. Richardson
- Warwick Mathematics Institute, University of Warwick, Coventry, United Kingdom

27
Pena RFO, Vellmer S, Bernardi D, Roque AC, Lindner B. Self-Consistent Scheme for Spike-Train Power Spectra in Heterogeneous Sparse Networks. Front Comput Neurosci 2018; 12:9. [PMID: 29551968 PMCID: PMC5840464 DOI: 10.3389/fncom.2018.00009] [Citation(s) in RCA: 17] [Impact Index Per Article: 2.4] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/29/2017] [Accepted: 02/07/2018] [Indexed: 11/13/2022] Open
Abstract
Recurrent networks of spiking neurons can be in an asynchronous state characterized by low or absent cross-correlations and spike statistics which resemble those of cortical neurons. Although spatial correlations are negligible in this state, neurons can show pronounced temporal correlations in their spike trains that can be quantified by the autocorrelation function or the spike-train power spectrum. Depending on cellular and network parameters, correlations display diverse patterns (ranging from simple refractory-period effects and stochastic oscillations to slow fluctuations) and it is generally not well-understood how these dependencies come about. Previous work has explored how the single-cell correlations in a homogeneous network (excitatory and inhibitory integrate-and-fire neurons with nearly balanced mean recurrent input) can be determined numerically from an iterative single-neuron simulation. Such a scheme is based on the fact that every neuron is driven by the network noise (i.e., the input currents from all its presynaptic partners) but also contributes to the network noise, leading to a self-consistency condition for the input and output spectra. Here we first extend this scheme to homogeneous networks with strong recurrent inhibition and a synaptic filter, in which instabilities of the previous scheme are avoided by an averaging procedure. We then extend the scheme to heterogeneous networks in which (i) different neural subpopulations (e.g., excitatory and inhibitory neurons) have different cellular or connectivity parameters; (ii) the number and strength of the input connections are random (Erdős-Rényi topology) and thus different among neurons. In all heterogeneous cases, neurons are lumped in different classes each of which is represented by a single neuron in the iterative scheme; in addition, we make a Gaussian approximation of the input current to the neuron. 
These approximations seem to be justified over a broad range of parameters as indicated by comparison with simulation results of large recurrent networks. Our method can help to elucidate how network heterogeneity shapes the asynchronous state in recurrent neural networks.
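The iterative logic of such a self-consistent scheme can be conveyed by a deliberately reduced sketch (not the authors' code): instead of iterating the full power spectrum, the toy below iterates a single scalar, the stationary firing rate, to self-consistency for a sparse excitatory-inhibitory LIF network in the diffusion approximation. All parameter values are made up for illustration.

```python
import numpy as np

def lif_rate(mu, D, dt=1e-3, T=50.0, tau=1.0, vth=1.0, vr=0.0, seed=0):
    """Firing rate of a leaky integrate-and-fire neuron,
    dv = (mu - v)/tau dt + sqrt(2 D) dW, with threshold vth and reset vr."""
    rng = np.random.default_rng(seed)
    steps = int(T / dt)
    noise = rng.standard_normal(steps) * np.sqrt(2.0 * D * dt)
    v, n_spikes = vr, 0
    for i in range(steps):
        v += dt * (mu - v) / tau + noise[i]
        if v >= vth:
            n_spikes += 1
            v = vr
    return n_spikes / T

def self_consistent_rate(mu_ext=1.2, J=0.02, K=400, g=1.2, n_iter=10):
    """Damped fixed-point iteration for the rate of a sparse E-I network:
    every neuron receives K excitatory and g-scaled inhibitory Poissonian
    inputs firing at the network's own rate r (diffusion approximation)."""
    r = 1.0  # initial guess
    for _ in range(n_iter):
        mu = mu_ext + J * K * (1.0 - g) * r       # mean recurrent input
        D = 0.5 * J**2 * K * (1.0 + g**2) * r     # effective noise intensity
        r = 0.5 * r + 0.5 * lif_rate(mu, D)       # damped update for stability
    return r
```

The paper's scheme iterates the whole spectrum (and, for heterogeneous networks, one representative neuron per class) rather than this single number, but the fixed-point structure is the same.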
Affiliation(s)
- Rodrigo F O Pena
- Laboratório de Sistemas Neurais, Department of Physics, School of Philosophy, Sciences and Letters of Ribeirão Preto, University of São Paulo, São Paulo, Brazil
- Sebastian Vellmer
- Theory of Complex Systems and Neurophysics, Bernstein Center for Computational Neuroscience, Berlin, Germany
- Department of Physics, Humboldt Universität zu Berlin, Berlin, Germany
- Davide Bernardi
- Theory of Complex Systems and Neurophysics, Bernstein Center for Computational Neuroscience, Berlin, Germany
- Department of Physics, Humboldt Universität zu Berlin, Berlin, Germany
- Antonio C Roque
- Laboratório de Sistemas Neurais, Department of Physics, School of Philosophy, Sciences and Letters of Ribeirão Preto, University of São Paulo, São Paulo, Brazil
- Benjamin Lindner
- Theory of Complex Systems and Neurophysics, Bernstein Center for Computational Neuroscience, Berlin, Germany
- Department of Physics, Humboldt Universität zu Berlin, Berlin, Germany
28
Up-Down-Like Background Spiking Can Enhance Neural Information Transmission. eNeuro 2018; 4:eN-TNC-0282-17. [PMID: 29354678 PMCID: PMC5773284 DOI: 10.1523/eneuro.0282-17.2017] [Citation(s) in RCA: 7] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/10/2017] [Revised: 11/15/2017] [Accepted: 11/20/2017] [Indexed: 11/23/2022] Open
Abstract
How neurons transmit information about sensory or internal signals is strongly influenced by ongoing internal activity. Depending on brain state, this background spiking can occur asynchronously or clustered in up states, periods of collective firing that are interspersed by silent down states. Here, we study what effect such up-down (UD) transitions have on signal transmission. In a simple model, we obtain numerical and analytical results for information-theoretic measures. We find that, surprisingly, a UD background can benefit information transmission: when background activity is sparse, it is advantageous to distribute spikes into up states rather than uniformly in time. We reproduce the same effect in a more realistic recurrent network and show that signal transmission is further improved by incorporating the fact that up states propagate across the cortex as traveling waves. We propose that traveling UD activity might represent a compromise between reducing metabolic strain and maintaining information transmission capabilities.
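A minimal way to generate such an up-down background (an illustrative sketch with assumed parameters, not the paper's model) is to gate a Poisson spike train by a two-state telegraph process, so that the same average rate is concentrated into up states:

```python
import numpy as np

def ud_background(rate_avg, p_up, tau_switch, T, dt=1e-3, seed=0):
    """Background spike train with overall rate rate_avg whose spikes are
    confined to 'up' states of a two-state Markov (telegraph) process with
    stationary up-probability p_up and relaxation time tau_switch."""
    rng = np.random.default_rng(seed)
    steps = int(T / dt)
    up = np.empty(steps, dtype=bool)
    state = rng.random() < p_up
    for i in range(steps):
        if state:   # up -> down with rate (1 - p_up) / tau_switch
            state = rng.random() >= dt * (1.0 - p_up) / tau_switch
        else:       # down -> up with rate p_up / tau_switch
            state = rng.random() < dt * p_up / tau_switch
        up[i] = state
    # spikes only during up states, at rate rate_avg / p_up,
    # so the time-averaged rate stays at rate_avg
    spikes = up & (rng.random(steps) < dt * rate_avg / p_up)
    return np.flatnonzero(spikes) * dt, up
```

Comparing signal detection on top of this background with p_up = 1 (uniform background) versus p_up < 1 (clustered background) is the kind of numerical experiment the abstract describes.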
29
Shomali SR, Ahmadabadi MN, Shimazaki H, Rasuli SN. How does transient signaling input affect the spike timing of postsynaptic neuron near the threshold regime: an analytical study. J Comput Neurosci 2017; 44:147-171. [PMID: 29192377 PMCID: PMC5851711 DOI: 10.1007/s10827-017-0664-6] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/02/2016] [Revised: 07/14/2017] [Accepted: 09/11/2017] [Indexed: 11/05/2022]
Abstract
The noisy threshold regime, where even a small set of presynaptic neurons can significantly affect postsynaptic spike timing, is suggested as a key requisite for computation in neurons with high variability. It has also been proposed that signals under noisy conditions are successfully transferred by a few strong synapses and/or by an assembly of nearly synchronous synaptic activities. We analytically investigate the impact of a transient signaling input on a leaky integrate-and-fire postsynaptic neuron that receives background noise near the threshold regime. The signaling input models a single strong synapse or a set of synchronous synapses, while the background noise represents a large number of weak synapses. We find an analytic solution that explains how the first-passage-time (ISI) density is changed by the transient signaling input. The analysis allows us to connect properties of the signaling input, such as spike timing and amplitude, with the postsynaptic first-passage-time density in a noisy environment. Based on the analytic solution, we calculate the Fisher information with respect to the signaling input's amplitude. For a wide range of amplitudes, we observe non-monotonic behavior of the Fisher information as a function of background noise. Moreover, the Fisher information depends non-trivially on the signaling input's amplitude: as the amplitude is varied, we observe a single maximum at high levels of background noise, which splits into two maxima in the low-noise regime. This finding demonstrates the benefit of the analytic solution in investigating signal transfer by neurons.
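The setting can be probed numerically with a simple Monte Carlo sketch (parameters are illustrative; the paper's contribution is the analytical solution, not this simulation): a leaky IF neuron just below threshold receives white background noise plus one transient kick, and the first-passage times with and without the kick are compared.

```python
import numpy as np

def first_passage_times(amp, t_pulse=0.5, n_trials=300, mu=0.9, D=0.05,
                        tau=1.0, vth=1.0, vr=0.0, dt=1e-3, t_max=20.0, seed=1):
    """Monte Carlo first-passage (ISI) times of a leaky IF neuron just below
    threshold (mu < vth), receiving background white noise plus one transient
    input of amplitude amp (a single strong PSP) delivered at t_pulse."""
    rng = np.random.default_rng(seed)
    steps = int(t_max / dt)
    kick_step = int(t_pulse / dt)
    fpt = []
    for _ in range(n_trials):
        noise = rng.standard_normal(steps) * np.sqrt(2.0 * D * dt)
        v = vr
        for i in range(steps):
            v += dt * (mu - v) / tau + noise[i]
            if i == kick_step:
                v += amp          # the transient signaling input
            if v >= vth:
                fpt.append((i + 1) * dt)
                break
    return np.array(fpt)
```

A histogram of the returned times approximates the first-passage-time density; the excitatory kick concentrates probability mass just after t_pulse and shortens the mean passage time.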
Affiliation(s)
- Safura Rashid Shomali
- School of Cognitive Sciences, Institute for Research in Fundamental Sciences (IPM), P.O. Box 19395-5746 (1954851167), Tehran, Iran.
- Majid Nili Ahmadabadi
- Control and Intelligent Processing Center of Excellence, School of Electrical and Computer Engineering, College of Engineering, University of Tehran, Tehran, 14395-515, Iran
- Hideaki Shimazaki
- Graduate School of Informatics, Kyoto University, Yoshida-honmachi, Sakyo-ku, Kyoto, 606-8501, Japan
- Honda Research Institute Japan, Honcho 8-1, Wako-shi, Saitama, 351-0188, Japan
- Seyyed Nader Rasuli
- Department of Physics, University of Guilan, Rasht, 41335-1914, Iran
- School of Physics, Institute for Research in Fundamental Sciences (IPM), P.O. Box 19395-5531, Tehran, Iran
30
Bistability and up/down state alternations in inhibition-dominated randomly connected networks of LIF neurons. Sci Rep 2017; 7:11916. [PMID: 28931930 PMCID: PMC5607291 DOI: 10.1038/s41598-017-12033-y] [Citation(s) in RCA: 21] [Impact Index Per Article: 2.6] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/27/2017] [Accepted: 08/30/2017] [Indexed: 11/09/2022] Open
Abstract
Electrophysiological recordings in cortex in vivo have revealed a rich variety of dynamical regimes ranging from irregular asynchronous states to a diversity of synchronized states, depending on species, anesthesia, and external stimulation. The average population firing rate in these states is typically low. We study analytically and numerically a network of sparsely connected excitatory and inhibitory integrate-and-fire neurons in the inhibition-dominated, low firing rate regime. For sufficiently high values of the external input, the network exhibits an asynchronous low firing frequency state (L). Depending on synaptic time constants, we show that two scenarios may occur when external inputs are decreased: (1) the L state can destabilize through a Hopf bifurcation, leading to synchronized oscillations spanning the δ to β frequency range; (2) the network can reach a bistable region between the low firing frequency network state (L) and a quiescent one (Q). Adding an adaptation current to excitatory neurons leads to spontaneous alternations between L and Q states, similar to experimental observations of UP and DOWN state alternations.
31
Bernardi D, Lindner B. Optimal Detection of a Localized Perturbation in Random Networks of Integrate-and-Fire Neurons. PHYSICAL REVIEW LETTERS 2017; 118:268301. [PMID: 28707933 DOI: 10.1103/physrevlett.118.268301] [Citation(s) in RCA: 5] [Impact Index Per Article: 0.6] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 12/19/2016] [Indexed: 06/07/2023]
Abstract
Experimental and theoretical studies suggest that cortical networks are chaotic and coding relies on averages over large populations. However, there is evidence that rats can respond to the short stimulation of a single cortical cell, a theoretically unexplained fact. We study effects of single-cell stimulation on a large recurrent network of integrate-and-fire neurons and propose a simple way to detect the perturbation. Detection rates obtained from simulations and analytical estimates are similar to experimental response rates if the readout is slightly biased towards specific neurons. Near-optimal detection is attained for a broad range of intermediate values of the mean coupling between neurons.
Affiliation(s)
- Davide Bernardi
- Bernstein Center for Computational Neuroscience Berlin, Philippstraße 13, Haus 2, 10115 Berlin, Germany
- Physics Department of Humboldt University Berlin, Newtonstraße 15, 12489 Berlin, Germany
- Benjamin Lindner
- Bernstein Center for Computational Neuroscience Berlin, Philippstraße 13, Haus 2, 10115 Berlin, Germany
- Physics Department of Humboldt University Berlin, Newtonstraße 15, 12489 Berlin, Germany
32
Augustin M, Ladenbauer J, Baumann F, Obermayer K. Low-dimensional spike rate models derived from networks of adaptive integrate-and-fire neurons: Comparison and implementation. PLoS Comput Biol 2017. [PMID: 28644841 PMCID: PMC5507472 DOI: 10.1371/journal.pcbi.1005545] [Citation(s) in RCA: 36] [Impact Index Per Article: 4.5] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 01/12/2023] Open
Abstract
The spiking activity of single neurons can be well described by a nonlinear integrate-and-fire model that includes somatic adaptation. When exposed to fluctuating inputs, sparsely coupled populations of these model neurons exhibit stochastic collective dynamics that can be effectively characterized using the Fokker-Planck equation. This approach, however, leads to a model with an infinite-dimensional state space and non-standard boundary conditions. Here we derive from that description four simple models for the spike rate dynamics in terms of low-dimensional ordinary differential equations, using two different reduction techniques: one uses the spectral decomposition of the Fokker-Planck operator; the other is based on a cascade of two linear filters and a nonlinearity, which are determined from the Fokker-Planck equation and semi-analytically approximated. We evaluate the reduced models for a wide range of biologically plausible input statistics and find that both approximation approaches lead to spike rate models that accurately reproduce the spiking behavior of the underlying adaptive integrate-and-fire population. The cascade-based models are overall the most accurate and robust, especially in the sensitive region of rapidly changing input. In the mean-driven regime, however, when input fluctuations are not too strong and fast, the best-performing model is based on the spectral decomposition. The low-dimensional models also reproduce well the stable oscillatory spike rate dynamics that are generated either by recurrent synaptic excitation and neuronal adaptation or through delayed inhibitory synaptic feedback. The computational demands of the reduced models are very low, but the implementation complexity differs between the model variants.
Therefore, we have made available implementations that allow one to numerically integrate the low-dimensional spike rate models as well as the Fokker-Planck partial differential equation in efficient ways for arbitrary model parametrizations, released as open source software. The derived spike rate descriptions retain a direct link to the properties of single neurons, allow for convenient mathematical analyses of network states, and are well suited for application in neural mass/mean-field based brain network models. Characterizing the dynamics of biophysically modeled, large neuronal networks usually involves extensive numerical simulations. As an alternative to this expensive procedure, we propose efficient models that describe the network activity in terms of a few ordinary differential equations. These systems are simple to solve and allow for convenient investigations of asynchronous, oscillatory, or chaotic network states, because linear stability analyses and powerful related methods are readily applicable. We build upon two research lines on which substantial efforts have been exerted in the last two decades: (i) the development of single-neuron models of reduced complexity that can accurately reproduce a large repertoire of observed neuronal behavior, and (ii) different approaches to approximate the Fokker-Planck equation that represents the collective dynamics of large neuronal networks. We combine these advances and extend recent approximation methods of the latter kind to obtain spike rate models that reproduce the macroscopic dynamics of the underlying neuronal network surprisingly well. At the same time, the microscopic properties are retained through the single-neuron model parameters. To enable fast adoption, we have released an efficient Python implementation as open source software under a free license.
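The flavor of such low-dimensional descriptions can be conveyed by a toy two-variable rate model with adaptation (a generic sketch, unrelated to the specific models derived in the paper; the f-I curve and all parameters are assumptions):

```python
import numpy as np

def toy_rate_model(mu_ext, T=50.0, dt=1e-3, tau_r=0.5, tau_a=5.0, b=1.0):
    """Two-variable rate model: the rate r relaxes (time constant tau_r) to a
    threshold-linear function of the drive minus an adaptation variable a,
    which itself integrates the rate on the slower time scale tau_a."""
    steps = int(T / dt)
    r, a = 0.0, 0.0
    trace = np.empty(steps)
    for i in range(steps):
        r_inf = max(mu_ext - a, 0.0)      # toy threshold-linear f-I curve
        r += dt * (r_inf - r) / tau_r
        a += dt * (b * r - a) / tau_a
        trace[i] = r
    return trace
```

A step input produces the classic adapting response: a fast rise followed by relaxation to the fixed point r* = mu_ext / (1 + b). Two ODEs replace the infinite-dimensional Fokker-Planck description, which is the point of the reduction.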
Affiliation(s)
- Moritz Augustin
- Department of Software Engineering and Theoretical Computer Science, Technische Universität Berlin, Berlin, Germany
- Bernstein Center for Computational Neuroscience Berlin, Berlin, Germany
- Josef Ladenbauer
- Department of Software Engineering and Theoretical Computer Science, Technische Universität Berlin, Berlin, Germany
- Bernstein Center for Computational Neuroscience Berlin, Berlin, Germany
- Group for Neural Theory, Laboratoire de Neurosciences Cognitives, École Normale Supérieure, Paris, France
- Fabian Baumann
- Department of Software Engineering and Theoretical Computer Science, Technische Universität Berlin, Berlin, Germany
- Bernstein Center for Computational Neuroscience Berlin, Berlin, Germany
- Klaus Obermayer
- Department of Software Engineering and Theoretical Computer Science, Technische Universität Berlin, Berlin, Germany
- Bernstein Center for Computational Neuroscience Berlin, Berlin, Germany
33
Droste F, Lindner B. Exact analytical results for integrate-and-fire neurons driven by excitatory shot noise. J Comput Neurosci 2017; 43:81-91. [PMID: 28585050 DOI: 10.1007/s10827-017-0649-5] [Citation(s) in RCA: 15] [Impact Index Per Article: 1.9] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/24/2017] [Revised: 04/14/2017] [Accepted: 05/04/2017] [Indexed: 11/24/2022]
Abstract
A neuron receives input from other neurons via electrical pulses, so-called spikes. The pulse-like nature of the input is frequently neglected in analytical studies; instead, the input is usually approximated as Gaussian. Recent experimental studies have shown, however, that an assumption underlying this approximation is often not met: individual presynaptic spikes can have a significant effect on a neuron's dynamics. It is thus desirable to explicitly account for the pulse-like nature of neural input, i.e., to consider neurons driven by shot noise, a long-standing problem that is mathematically challenging. In this work, we exploit the fact that excitatory shot noise with exponentially distributed weights can be obtained as a limit case of dichotomous noise, a Markovian two-state process. This allows us to obtain novel exact expressions for the stationary voltage density and the moments of the interspike-interval density of general integrate-and-fire neurons driven by such input. For the special case of leaky integrate-and-fire neurons, we also give expressions for the power spectrum and the linear response to a signal. We verify and illustrate our expressions by comparison with simulations of leaky, quadratic, and exponential integrate-and-fire neurons.
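A Monte Carlo counterpart of this setup is easy to sketch (illustrative parameters; the paper derives exact expressions for exactly this kind of input): a LIF neuron receives Poisson-timed excitatory jumps with exponentially distributed amplitudes.

```python
import numpy as np

def shot_noise_lif(rate_in=10.0, a_mean=0.05, mu=0.8, tau=1.0, vth=1.0,
                   vr=0.0, dt=1e-3, T=200.0, seed=2):
    """LIF neuron driven by excitatory shot noise: Poissonian arrivals of
    input spikes (rate rate_in), each causing an upward voltage jump drawn
    from an exponential distribution with mean a_mean. Returns the firing
    rate and the coefficient of variation of the interspike intervals."""
    rng = np.random.default_rng(seed)
    steps = int(T / dt)
    counts = rng.poisson(rate_in * dt, steps)   # input arrivals per time bin
    v, spike_times = vr, []
    for i in range(steps):
        v += dt * (mu - v) / tau
        if counts[i]:
            v += rng.exponential(a_mean, counts[i]).sum()
        if v >= vth:
            spike_times.append(i * dt)
            v = vr
    isi = np.diff(spike_times)
    return len(spike_times) / T, isi.std() / isi.mean()
```

Such simulations provide the reference statistics (rate, ISI moments) against which the exact analytical expressions of the paper can be checked.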
Affiliation(s)
- Felix Droste
- Bernstein Center for Computational Neuroscience, Haus 2, Philippstrasse 13, 10115, Berlin, Germany
- Department of Physics, Humboldt Universität zu Berlin, Newtonstr 15, 12489, Berlin, Germany
- Benjamin Lindner
- Bernstein Center for Computational Neuroscience, Haus 2, Philippstrasse 13, 10115, Berlin, Germany
- Department of Physics, Humboldt Universität zu Berlin, Newtonstr 15, 12489, Berlin, Germany
34
Lai YM, de Kamps M. Population density equations for stochastic processes with memory kernels. Phys Rev E 2017; 95:062125. [PMID: 28709222 DOI: 10.1103/physreve.95.062125] [Citation(s) in RCA: 5] [Impact Index Per Article: 0.6] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/27/2016] [Indexed: 06/07/2023]
Abstract
We present a method for solving population density equations (PDEs), a mean-field technique describing homogeneous populations of uncoupled neurons, where the populations can be subject to non-Markov noise for arbitrary distributions of jump sizes. The method combines recent developments in two different disciplines that traditionally have had limited interaction: computational neuroscience and the theory of random networks. The method uses a geometric binning scheme, based on the method of characteristics, to capture the deterministic neurodynamics of the population, separating the deterministic and stochastic processes cleanly. We can independently vary the choice of the deterministic model and the model for the stochastic process, leading to a highly modular numerical solution strategy. We demonstrate this by replacing the master equation implicit in many formulations of the PDE formalism by a generalization, the generalized Montroll-Weiss equation (a recent result from random network theory), which describes a random walker subject to transitions realized by a non-Markovian process. We demonstrate the method for leaky and quadratic integrate-and-fire neurons subject to spike trains with Poisson and gamma-distributed interspike intervals. We are able to accurately model jump responses of both neuron models to both excitatory and inhibitory input under the assumption that all inputs are generated by one renewal process.
Affiliation(s)
- Yi Ming Lai
- Institute for Artificial and Biological Computation, School of Computing, University of Leeds, LS2 9JT Leeds, United Kingdom
- Marc de Kamps
- Institute for Artificial and Biological Computation, School of Computing, University of Leeds, LS2 9JT Leeds, United Kingdom
35
Exact firing time statistics of neurons driven by discrete inhibitory noise. Sci Rep 2017; 7:1577. [PMID: 28484244 PMCID: PMC5431561 DOI: 10.1038/s41598-017-01658-8] [Citation(s) in RCA: 5] [Impact Index Per Article: 0.6] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/22/2017] [Accepted: 03/29/2017] [Indexed: 12/15/2022] Open
Abstract
Neurons in the intact brain receive a continuous and irregular synaptic bombardment from excitatory and inhibitory presynaptic neurons, which determines the firing activity of the stimulated neuron. In order to investigate the influence of inhibitory stimulation on the firing time statistics, we consider leaky integrate-and-fire neurons subject to inhibitory instantaneous postsynaptic potentials. In particular, we report exact results for the firing rate, the coefficient of variation, and the spike train spectrum for various synaptic weight distributions. Our results are not limited to stimulations of infinitesimal amplitude; they apply as well to finite-amplitude postsynaptic potentials, thus capturing the effect of rare and large spikes. The developed methods are also able to reproduce the average firing properties of heterogeneous neuronal populations.
36
Droste F, Lindner B. Exact results for power spectrum and susceptibility of a leaky integrate-and-fire neuron with two-state noise. Phys Rev E 2017; 95:012411. [PMID: 28208429 DOI: 10.1103/physreve.95.012411] [Citation(s) in RCA: 14] [Impact Index Per Article: 1.8] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/04/2016] [Indexed: 11/07/2022]
Abstract
The response properties of excitable systems driven by colored noise are of great interest but are usually mathematically accessible only via approximations. For this reason, dichotomous noise, a rare example of a colored noise that often leads to analytically tractable problems, has been extensively used in the study of stochastic systems. Here, we calculate exact expressions for the power spectrum and the susceptibility of a leaky integrate-and-fire neuron driven by asymmetric dichotomous noise. While our results are in excellent agreement with simulations, they also highlight a limitation of using dichotomous noise as a simple model for more complex fluctuations: both the power spectrum and the susceptibility exhibit an undamped periodic structure, the origin of which we discuss in detail.
Affiliation(s)
- Felix Droste
- Bernstein Center for Computational Neuroscience, Haus 2, Philippstrasse 13, 10115 Berlin, Germany
- Department of Physics, Humboldt Universität zu Berlin, Newtonstrasse 15, 12489 Berlin, Germany
- Benjamin Lindner
- Bernstein Center for Computational Neuroscience, Haus 2, Philippstrasse 13, 10115 Berlin, Germany
- Department of Physics, Humboldt Universität zu Berlin, Newtonstrasse 15, 12489 Berlin, Germany
37
Cain N, Iyer R, Koch C, Mihalas S. The Computational Properties of a Simplified Cortical Column Model. PLoS Comput Biol 2016; 12:e1005045. [PMID: 27617444 PMCID: PMC5019422 DOI: 10.1371/journal.pcbi.1005045] [Citation(s) in RCA: 33] [Impact Index Per Article: 3.7] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/07/2015] [Accepted: 07/01/2016] [Indexed: 01/09/2023] Open
Abstract
The mammalian neocortex has a repetitious, laminar structure and performs functions integral to higher cognitive processes, including sensory perception, memory, and coordinated motor output. What computations does this circuitry subserve that link these unique structural elements to their function? Potjans and Diesmann (2014) parameterized a four-layer, two cell type (i.e. excitatory and inhibitory) model of a cortical column with homogeneous populations and cell type dependent connection probabilities. We implement a version of their model using a displacement integro-partial differential equation (DiPDE) population density model. This approach, exact in the limit of large homogeneous populations, provides a fast numerical method to solve equations describing the full probability density distribution of neuronal membrane potentials. It lends itself to quickly analyzing the mean response properties of population-scale firing rate dynamics. We use this strategy to examine the input-output relationship of the Potjans and Diesmann cortical column model to understand its computational properties. When inputs are constrained to jointly and equally target excitatory and inhibitory neurons, we find a large linear regime where the effect of a multi-layer input signal can be reduced to a linear combination of component signals. One of these, a simple subtractive operation, can act as an error signal passed between hierarchical processing stages. What computations do existing biophysically-plausible models of cortex perform on their inputs, and how do these computations relate to theories of cortical processing? We begin with a computational model of cortical tissue and seek to understand its input/output transformations. Our approach limits confirmation bias, and differs from a more constructionist approach of starting with a computational theory and then creating a model that can implement its necessary features. 
Here we choose a population-level modeling technique that does not sacrifice accuracy, as it closely approximates the mean firing rate of a population of leaky integrate-and-fire neurons. We extend this approach to simulate recurrently coupled neural populations and characterize the computational properties of the Potjans and Diesmann cortical column model. We find that this model is capable of computing linear operations and naturally generates a subtraction operation implicated in theories of predictive coding. Although our quantitative findings are restricted to this particular model, we demonstrate that these conclusions are not highly sensitive to the model parameterization.
Affiliation(s)
- Nicholas Cain
- Allen Institute for Brain Science, Seattle, Washington, United States of America
- Ramakrishnan Iyer
- Allen Institute for Brain Science, Seattle, Washington, United States of America
- Christof Koch
- Allen Institute for Brain Science, Seattle, Washington, United States of America
- Stefan Mihalas
- Allen Institute for Brain Science, Seattle, Washington, United States of America
38
Roles for Coincidence Detection in Coding Amplitude-Modulated Sounds. PLoS Comput Biol 2016; 12:e1004997. [PMID: 27322612 PMCID: PMC4920552 DOI: 10.1371/journal.pcbi.1004997] [Citation(s) in RCA: 23] [Impact Index Per Article: 2.6] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/22/2016] [Accepted: 05/25/2016] [Indexed: 12/30/2022] Open
Abstract
Many sensory neurons encode temporal information by detecting coincident arrivals of synaptic inputs. In the mammalian auditory brainstem, binaural neurons of the medial superior olive (MSO) are known to act as coincidence detectors, whereas in the lateral superior olive (LSO) roles of coincidence detection have remained unclear. LSO neurons receive excitatory and inhibitory inputs driven by ipsilateral and contralateral acoustic stimuli, respectively, and vary their output spike rates according to interaural level differences. In addition, LSO neurons are also sensitive to binaural phase differences of low-frequency tones and envelopes of amplitude-modulated (AM) sounds. Previous physiological recordings in vivo found considerable variations in monaural AM-tuning across neurons. To investigate the underlying mechanisms of the observed temporal tuning properties of LSO and their sources of variability, we used a simple coincidence counting model and examined how specific parameters of coincidence detection affect monaural and binaural AM coding. Spike rates and phase-locking of evoked excitatory and spontaneous inhibitory inputs had only minor effects on LSO output to monaural AM inputs. In contrast, the coincidence threshold of the model neuron affected both the overall spike rates and the half-peak positions of the AM-tuning curve, whereas the width of the coincidence window merely influenced the output spike rates. The duration of the refractory period affected only the low-frequency portion of the monaural AM-tuning curve. Unlike monaural AM coding, temporal factors, such as the coincidence window and the effective duration of inhibition, played a major role in determining the trough positions of simulated binaural phase-response curves. In addition, empirically-observed level-dependence of binaural phase-coding was reproduced in the framework of our minimalistic coincidence counting model. 
These modeling results suggest that coincidence detection of excitatory and inhibitory synaptic inputs is essential for LSO neurons to encode both monaural and binaural AM sounds. Detecting coincident arrivals of synaptic inputs is a shared fundamental property of many sensory neurons. Such 'coincidence detection' usually refers to the detection of synchronized excitatory inputs only. Experimental evidence, however, indicated that some auditory neurons are also sensitive to the relative timing of excitatory and inhibitory synaptic inputs. This type of sensitivity is suggested to be important for coding temporal information of amplitude-modulated sounds, such as speech and other naturalistic sounds. In this study, we used a minimal model of coincidence detection to identify the key elements for temporal information processing. Our series of simulations demonstrated that (1) the threshold and time window for coincidence detection were the major factors for determining the response properties to excitatory inputs, and that (2) timed interactions between excitatory and inhibitory synaptic inputs are responsible for determining the temporal tuning properties of the neuron. These results suggest that coincidence detection is an essential function of neurons that detect the 'anti-coincidence' of excitatory and inhibitory inputs to encode temporal information of sounds.
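The core of such a coincidence counting model fits in a few lines (a generic sketch; the window, threshold, and input spike times below are assumptions for illustration, not the paper's parameters): merge the input trains and emit an output spike whenever enough input spikes fall within a sliding window.

```python
import numpy as np

def coincidence_detector(input_trains, window, threshold):
    """Emit an output spike whenever at least `threshold` spikes from the
    merged input trains fall within a sliding time window of length
    `window`; an output suppresses further outputs for one window length
    (crude refractoriness)."""
    times = np.sort(np.concatenate([np.asarray(tr, float) for tr in input_trains]))
    out, i, last_out = [], 0, -np.inf
    for j in range(len(times)):
        while times[j] - times[i] > window:   # drop spikes older than the window
            i += 1
        if j - i + 1 >= threshold and times[j] - last_out > window:
            out.append(times[j])
            last_out = times[j]
    return np.array(out)
```

Raising the coincidence threshold or narrowing the window reduces the output rate, which is precisely the dependence the abstract identifies as shaping the AM-tuning curves.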
39
Rosenbaum R. A Diffusion Approximation and Numerical Methods for Adaptive Neuron Models with Stochastic Inputs. Front Comput Neurosci 2016; 10:39. [PMID: 27148036 PMCID: PMC4840919 DOI: 10.3389/fncom.2016.00039] [Citation(s) in RCA: 5] [Impact Index Per Article: 0.6] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/19/2015] [Accepted: 04/04/2016] [Indexed: 11/16/2022] Open
Abstract
Characterizing the spiking statistics of neurons receiving noisy synaptic input is a central problem in computational neuroscience. Monte Carlo approaches to this problem are computationally expensive and often fail to provide mechanistic insight. Thus, the field has seen the development of mathematical and numerical approaches, often relying on a Fokker-Planck formalism. These approaches force a compromise between biological realism, accuracy, and computational efficiency. In this article we develop an extension of existing diffusion approximations to more accurately approximate the response of neurons with adaptation currents and noisy synaptic currents. The implementation refines existing numerical schemes for solving the associated Fokker-Planck equations to improve computational efficiency and accuracy. Computer code implementing the developed algorithms is made available to the public.
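A Monte Carlo simulation of an adaptive IF neuron, the kind of ground truth such diffusion approximations are validated against, can be sketched as follows (illustrative parameters, not the paper's model details):

```python
import numpy as np

def adaptive_lif_rate(b, mu=2.0, D=0.1, tau=1.0, tau_a=5.0, vth=1.0, vr=0.0,
                      dt=1e-3, T=100.0, seed=4):
    """Monte Carlo firing rate of a leaky IF neuron with a spike-triggered
    adaptation current: each spike increments a by b; a decays with time
    constant tau_a and is subtracted from the input drive."""
    rng = np.random.default_rng(seed)
    steps = int(T / dt)
    noise = rng.standard_normal(steps) * np.sqrt(2.0 * D * dt)
    v, a, n = vr, 0.0, 0
    for i in range(steps):
        v += dt * (mu - a - v) / tau + noise[i]
        a -= dt * a / tau_a
        if v >= vth:
            n += 1
            a += b
            v = vr
    return n / T
```

Increasing the adaptation strength b lowers the stationary rate, the basic effect an accurate diffusion approximation must reproduce.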
Affiliation(s)
- Robert Rosenbaum
- Applied and Computational Mathematics and Statistics, University of Notre Dame, Notre Dame, IN, USA
40
Rajdl K, Lansky P. Stein's neuronal model with pooled renewal input. BIOLOGICAL CYBERNETICS 2015; 109:389-399. [PMID: 25910437 DOI: 10.1007/s00422-015-0650-x] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.2] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 11/14/2014] [Accepted: 04/08/2015] [Indexed: 06/04/2023]
Abstract
The input of Stein's model of a single neuron is usually described by using a Poisson process, which is assumed to represent the behaviour of spikes pooled from a large number of presynaptic spike trains. However, such a description of the input is not always appropriate as the variability cannot be separated from the intensity. Therefore, we create and study Stein's model with a more general input, a sum of equilibrium renewal processes. The mean and variance of the membrane potential are derived for this model. Using these formulas and numerical simulations, the model is analyzed to study the influence of the input variability on the properties of the membrane potential and the output spike trains. The generalized Stein's model is compared with the original Stein's model with Poissonian input using the relative difference of variances of membrane potential at steady state and the integral square error of output interspike intervals. Both of the criteria show large differences between the models for input with high variability.
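The generalized input can be sketched by pooling gamma-renewal trains and filtering them through Stein's (threshold-free) membrane equation; this illustrates the paper's point that input variability, not just intensity, shapes the membrane potential statistics. Parameters below are made up for illustration.

```python
import numpy as np

def renewal_train(rate, k, T, rng):
    """Renewal spike train with gamma-distributed ISIs (shape k, mean 1/rate);
    k = 1 is Poisson, larger k means more regular firing at the same rate."""
    isis = rng.gamma(k, 1.0 / (rate * k), size=int(2 * rate * T) + 100)
    t = np.cumsum(isis)
    return t[t < T]

def stein_voltage(pooled_times, T, dt=1e-3, a=0.1, tau=1.0):
    """Stein's model without threshold: each pooled input spike adds a jump a
    to the membrane potential, which decays exponentially with tau."""
    steps = int(T / dt)
    idx = np.minimum((np.asarray(pooled_times) / dt).astype(int), steps - 1)
    jumps = np.bincount(idx, minlength=steps) * a
    v = np.zeros(steps)
    for i in range(1, steps):
        v[i] = v[i - 1] * (1.0 - dt / tau) + jumps[i]
    return v
```

Pooling a few regular (large-k) trains yields the same mean voltage as Poisson input of equal rate but a smaller stationary variance, the kind of difference the paper quantifies analytically.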
Affiliation(s)
- Kamil Rajdl
- Department of Mathematics and Statistics, Faculty of Science, Masaryk University, Kotlarska 2, 611 37, Brno, Czech Republic
41
Müller-Hansen F, Droste F, Lindner B. Statistics of a neuron model driven by asymmetric colored noise. PHYSICAL REVIEW. E, STATISTICAL, NONLINEAR, AND SOFT MATTER PHYSICS 2015; 91:022718. [PMID: 25768542 DOI: 10.1103/physreve.91.022718] [Citation(s) in RCA: 4] [Impact Index Per Article: 0.4] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 09/25/2014] [Indexed: 06/04/2023]
Abstract
Irregular firing of neurons can be modeled as a stochastic process. Here we study the perfect integrate-and-fire neuron driven by dichotomous noise, a Markovian process that jumps between two states (i.e., possesses a non-Gaussian statistics) and exhibits nonvanishing temporal correlations (i.e., represents a colored noise). Specifically, we consider asymmetric dichotomous noise with two different transition rates. Using a first-passage-time formulation, we derive exact expressions for the probability density and the serial correlation coefficient of the interspike interval (time interval between two subsequent neural action potentials) and the power spectrum of the spike train. Furthermore, we extend the model by including additional Gaussian white noise, and we give approximations for the interspike interval (ISI) statistics in this case. Numerical simulations are used to validate the exact analytical results for pure dichotomous noise, and to test the approximations of the ISI statistics when Gaussian white noise is included. The results may help to understand how correlations and asymmetry of noise and signals in nerve cells shape neuronal firing statistics.
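The setup can be mimicked numerically. The sketch below is our simulation with arbitrary parameter values, whereas the paper derives exact expressions: a perfect integrate-and-fire neuron is driven by asymmetric two-state (telegraph) noise, and the ISI mean, coefficient of variation, and lag-one serial correlation coefficient are estimated from the spike train.

```python
import numpy as np

B_PLUS, B_MINUS = 2.0, 0.2   # drift values in the two noise states
K_PLUS, K_MINUS = 1.0, 0.5   # switching rates out of the +/- state
VTH, DT, T = 1.0, 0.005, 1000.0

def simulate_pif(seed=3):
    """Perfect integrate-and-fire neuron v' = eta(t), where eta is an
    asymmetric dichotomous Markov process; returns the ISI sequence."""
    rng = np.random.default_rng(seed)
    v, plus, t_last, isis = 0.0, True, 0.0, []
    for i in range(int(T / DT)):
        if rng.random() < (K_PLUS if plus else K_MINUS) * DT:
            plus = not plus  # Markov switching, Euler discretization
        v += DT * (B_PLUS if plus else B_MINUS)
        if v >= VTH:
            v -= VTH
            t = (i + 1) * DT
            isis.append(t - t_last)
            t_last = t
    return np.asarray(isis[1:])  # drop the first, ill-defined interval

isis = simulate_pif()
mean_isi = isis.mean()
cv = isis.std() / mean_isi
rho1 = np.corrcoef(isis[:-1], isis[1:])[0, 1]  # lag-one ISI correlation
```

With these (illustrative) rates the noise occupies the slow state one third of the time, giving a mean input of 0.8 and hence a mean ISI around 1.25; because the noise correlation time is comparable to the ISI, the intervals are strongly variable and serially correlated, which is the regime the paper's exact formulas address.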
Affiliation(s)
- Finn Müller-Hansen
- Bernstein Center for Computational Neuroscience, Haus 2, Philippstraße 13, 10115 Berlin, Germany
- Department of Physics, Freie Universität Berlin, Arnimallee 14, 14195 Berlin, Germany
- Felix Droste
- Bernstein Center for Computational Neuroscience, Haus 2, Philippstraße 13, 10115 Berlin, Germany
- Department of Physics, Humboldt Universität zu Berlin, Newtonstraße 15, 12489 Berlin, Germany
- Benjamin Lindner
- Bernstein Center for Computational Neuroscience, Haus 2, Philippstraße 13, 10115 Berlin, Germany
- Department of Physics, Humboldt Universität zu Berlin, Newtonstraße 15, 12489 Berlin, Germany
42
Droste F, Lindner B. Integrate-and-fire neurons driven by asymmetric dichotomous noise. BIOLOGICAL CYBERNETICS 2014; 108:825-843. [PMID: 25037240 DOI: 10.1007/s00422-014-0621-7] [Citation(s) in RCA: 17] [Impact Index Per Article: 1.5] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 01/31/2014] [Accepted: 07/08/2014] [Indexed: 06/03/2023]
Abstract
We consider a general integrate-and-fire (IF) neuron driven by asymmetric dichotomous noise. In contrast to the Gaussian white noise usually used in the so-called diffusion approximation, this noise is colored, i.e., it exhibits temporal correlations. We give an analytical expression for the stationary voltage distribution of a neuron receiving such noise and derive recursive relations for the moments of the first passage time density, which allow us to calculate the firing rate and the coefficient of variation of interspike intervals. We study how correlations in the input affect the rate and regularity of firing under variation of the model's parameters for leaky and quadratic IF neurons. Further, we consider the limit of small correlation times and find lowest order corrections to the first passage time moments to be proportional to the square root of the correlation time. We show analytically that to this lowest order, correlations always lead to a decrease in firing rate for a leaky IF neuron. All theoretical expressions are compared to simulations of leaky and quadratic IF neurons.
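One property referred to in this abstract, the stationary voltage distribution under dichotomous noise, is easy to probe numerically. The sketch below is our illustration (parameters arbitrary; threshold and reset omitted for simplicity): a leaky integrator driven by asymmetric dichotomous noise relaxes into a bounded voltage band, the support of the stationary distribution the paper computes analytically.

```python
import numpy as np

MU, A, B = 0.0, 1.0, 0.6   # baseline drive; noise takes the values +A or -B
K_UP, K_DOWN = 2.0, 1.0    # switching rates out of the up/down state
DT, T, BURN = 0.001, 200.0, 20.0

def voltage_trace(seed=4):
    """Leaky integrator v' = -v + MU + eta(t) driven by asymmetric
    dichotomous noise; without a threshold, v is attracted alternately
    to MU + A and MU - B and so stays inside [MU - B, MU + A]."""
    rng = np.random.default_rng(seed)
    v, up = 0.0, True
    trace = np.empty(int(T / DT))
    for i in range(trace.size):
        if rng.random() < (K_UP if up else K_DOWN) * DT:
            up = not up
        v += DT * (MU - v + (A if up else -B))
        trace[i] = v
    return trace[int(BURN / DT):]  # discard the initial transient

v = voltage_trace()
# Empirical stationary density on the bounded support [MU - B, MU + A]:
hist, edges = np.histogram(v, bins=60, range=(MU - B, MU + A), density=True)
```

The histogram approximates the stationary density whose closed form the paper provides; the bounded support is a hallmark of dichotomous (as opposed to Gaussian) driving.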
Affiliation(s)
- Felix Droste
- Bernstein Center for Computational Neuroscience, Haus 2, Philippstrasse 13, 10115, Berlin, Germany
43
Dummer B, Wieland S, Lindner B. Self-consistent determination of the spike-train power spectrum in a neural network with sparse connectivity. Front Comput Neurosci 2014; 8:104. [PMID: 25278869 PMCID: PMC4166962 DOI: 10.3389/fncom.2014.00104] [Citation(s) in RCA: 39] [Impact Index Per Article: 3.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/25/2014] [Accepted: 08/13/2014] [Indexed: 11/13/2022] Open
Abstract
A major source of random variability in cortical networks is the quasi-random arrival of presynaptic action potentials from many other cells. In network studies as well as in the study of the response properties of single cells embedded in a network, synaptic background input is often approximated by Poissonian spike trains. However, the output statistics of the cells are in most cases far from Poisson. This is inconsistent with the assumption of similar spike-train statistics for pre- and postsynaptic cells in a recurrent network. Here we tackle this problem for the popular class of integrate-and-fire neurons and study a self-consistent statistics of input and output spectra of neural spike trains. Instead of actually using a large network, we use an iterative scheme, in which we simulate a single neuron over several generations. In each of these generations, the neuron is stimulated with surrogate stochastic input that has statistics similar to the output of the previous generation. For the surrogate input, we employ two distinct approximations: (i) a superposition of renewal spike trains with the same interspike interval density as observed in the previous generation and (ii) a Gaussian current with a power spectrum proportional to that observed in the previous generation. For input parameters that correspond to balanced input in the network, both the renewal and the Gaussian iteration procedure converge quickly and yield comparable results for the self-consistent spike-train power spectrum. We compare our results to large-scale simulations of a random sparsely connected network of leaky integrate-and-fire neurons (Brunel, 2000) and show that in the asynchronous regime close to a state of balanced synaptic input from the network, our iterative schemes provide an excellent approximation to the autocorrelation of spike trains in the recurrent network.
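The Gaussian variant of the iteration (approximation (ii)) can be sketched in simplified form. This is our illustration, not the paper's implementation: parameter values are arbitrary, and the surrogate's overall standard deviation is simply pinned to a fixed value while only its spectral shape is inherited from the previous generation, rather than matching the intensity self-consistently.

```python
import numpy as np

def simulate_lif(eta, dt, mu=1.2, vth=1.0, vr=0.0):
    """LIF neuron driven by externally supplied input increments eta
    (one increment per time bin); returns the binned spike train."""
    v = 0.0
    x = np.zeros(eta.size)
    for i in range(eta.size):
        v += dt * (mu - v) + eta[i]
        if v >= vth:
            v = vr
            x[i] = 1.0 / dt
    return x

def power_spectrum(x, dt, nseg=32):
    """Averaged periodogram over nseg segments (mean removed per segment)."""
    n = x.size // nseg
    segs = x[: n * nseg].reshape(nseg, n)
    segs = segs - segs.mean(axis=1, keepdims=True)
    return (np.abs(np.fft.rfft(segs, axis=1)) ** 2).mean(axis=0) * dt / n

def gaussian_surrogate(spectrum, nbins, strength, rng):
    """Gaussian increments with power spectrum proportional to `spectrum`;
    overall standard deviation fixed to `strength` (a simplification of
    the paper's self-consistent intensity matching)."""
    n = 2 * (spectrum.size - 1)
    amp = np.sqrt(np.maximum(spectrum, 0.0))
    out = np.empty(nbins)
    filled = 0
    while filled < nbins:
        z = rng.standard_normal(spectrum.size) + 1j * rng.standard_normal(spectrum.size)
        seg = np.fft.irfft(amp * z, n=n)     # real noise with the target shape
        take = min(n, nbins - filled)
        out[filled : filled + take] = seg[:take]
        filled += take
    return out * (strength / out.std())

rng = np.random.default_rng(5)
DT, NBINS, SIGMA = 0.005, 64000, 0.25
eta = SIGMA * np.sqrt(DT) * rng.standard_normal(NBINS)  # generation 0: white noise
for generation in range(4):
    x = simulate_lif(eta, DT)
    spec = power_spectrum(x, DT)
    eta = gaussian_surrogate(spec, NBINS, SIGMA * np.sqrt(DT), rng)
rate = x.mean()  # firing rate of the last generation
```

Each generation feeds the neuron noise shaped like its own previous output spectrum; in the paper's balanced regime this loop converges quickly to the self-consistent spike-train power spectrum.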
Affiliation(s)
- Benjamin Dummer
- Theory of Complex Systems and Neurophysics, Bernstein Center for Computational Neuroscience, Berlin, Germany; Department of Physics, Humboldt Universität zu Berlin, Berlin, Germany
- Stefan Wieland
- Theory of Complex Systems and Neurophysics, Bernstein Center for Computational Neuroscience, Berlin, Germany; Department of Physics, Humboldt Universität zu Berlin, Berlin, Germany
- Benjamin Lindner
- Theory of Complex Systems and Neurophysics, Bernstein Center for Computational Neuroscience, Berlin, Germany; Department of Physics, Humboldt Universität zu Berlin, Berlin, Germany
44
Sensory stimulation shifts visual cortex from synchronous to asynchronous states. Nature 2014; 509:226-9. [PMID: 24695217 PMCID: PMC4067243 DOI: 10.1038/nature13159] [Citation(s) in RCA: 148] [Impact Index Per Article: 13.5] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/27/2013] [Accepted: 02/17/2014] [Indexed: 11/08/2022]
Abstract
In the mammalian cerebral cortex, neural responses are highly variable during spontaneous activity and sensory stimulation. To explain this variability, the cortex of alert animals has been proposed to be in an asynchronous high-conductance state in which irregular spiking arises from the convergence of large numbers of uncorrelated excitatory and inhibitory inputs onto individual neurons. Signatures of this state are that a neuron's membrane potential (Vm) hovers just below spike threshold, and its aggregate synaptic input is nearly Gaussian, arising from many uncorrelated inputs. Alternatively, irregular spiking could arise from infrequent correlated input events that elicit large fluctuations in Vm (refs 5, 6). To distinguish between these hypotheses, we developed a technique to perform whole-cell Vm measurements from the cortex of behaving monkeys, focusing on primary visual cortex (V1) of monkeys performing a visual fixation task. Here we show that, contrary to the predictions of an asynchronous state, mean Vm during fixation was far from threshold (14 mV) and spiking was triggered by occasional large spontaneous fluctuations. Distributions of Vm values were skewed beyond that expected for a range of Gaussian input, but were consistent with synaptic input arising from infrequent correlated events. Furthermore, spontaneous fluctuations in Vm were correlated with the surrounding network activity, as reflected in simultaneously recorded nearby local field potential. Visual stimulation, however, led to responses more consistent with an asynchronous state: mean Vm approached threshold, fluctuations became more Gaussian, and correlations between single neurons and the surrounding network were disrupted. These observations show that sensory drive can shift a common cortical circuitry from a synchronous to an asynchronous state.
45
Brunel N, Hakim V, Richardson MJE. Single neuron dynamics and computation. Curr Opin Neurobiol 2014; 25:149-55. [PMID: 24492069 DOI: 10.1016/j.conb.2014.01.005] [Citation(s) in RCA: 57] [Impact Index Per Article: 5.2] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/01/2013] [Revised: 12/18/2013] [Accepted: 01/05/2014] [Indexed: 12/14/2022]
Abstract
At the single neuron level, information processing involves the transformation of input spike trains into an appropriate output spike train. Building upon the classical view of a neuron as a threshold device, models have been developed in recent years that take into account the diverse electrophysiological make-up of neurons and accurately describe their input-output relations. Here, we review these recent advances and survey the computational roles that they have uncovered for various electrophysiological properties, for dendritic arbor anatomy as well as for short-term synaptic plasticity.
Affiliation(s)
- Nicolas Brunel
- Departments of Statistics and Neurobiology, University of Chicago, Chicago, USA.
- Vincent Hakim
- Laboratoire de Physique Statistique, CNRS, University Pierre et Marie Curie, Ecole Normale Supérieure, Paris, France
46
Iyer R, Menon V, Buice M, Koch C, Mihalas S. The influence of synaptic weight distribution on neuronal population dynamics. PLoS Comput Biol 2013; 9:e1003248. [PMID: 24204219 PMCID: PMC3808453 DOI: 10.1371/journal.pcbi.1003248] [Citation(s) in RCA: 38] [Impact Index Per Article: 3.2] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/01/2013] [Accepted: 07/24/2013] [Indexed: 11/19/2022] Open
Abstract
The manner in which different distributions of synaptic weights onto cortical neurons shape their spiking activity remains an open question. To characterize a homogeneous neuronal population, we use the master equation for generalized leaky integrate-and-fire neurons with shot-noise synapses. We develop fast semi-analytic numerical methods to solve this equation for either current or conductance synapses, with and without synaptic depression. We show that its solutions match simulations of equivalent neuronal networks better than those of the Fokker-Planck equation and we compute bounds on the network response to non-instantaneous synapses. We apply these methods to study different synaptic weight distributions in feed-forward networks. We characterize the synaptic amplitude distributions using a set of measures, called tail weight numbers, designed to quantify the preponderance of very strong synapses. Even when synaptic amplitude distributions are matched in both total current and average synaptic weight, distributions with sparse but strong synapses produce higher responses for small inputs, leading to a larger operating range. Furthermore, despite their small number, such synapses enable the network to respond faster and with more stability in the face of external fluctuations.
Affiliation(s)
- Ramakrishnan Iyer
- Allen Institute for Brain Science, Seattle, Washington, United States of America
- Vilas Menon
- Allen Institute for Brain Science, Seattle, Washington, United States of America
- Michael Buice
- Allen Institute for Brain Science, Seattle, Washington, United States of America
- Christof Koch
- Allen Institute for Brain Science, Seattle, Washington, United States of America
- Stefan Mihalas
- Allen Institute for Brain Science, Seattle, Washington, United States of America
47
Three generic bistable scenarios of the interplay of voltage pulses and gene expression in neurons. Neural Netw 2013; 44:51-63. [DOI: 10.1016/j.neunet.2013.02.004] [Citation(s) in RCA: 4] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/30/2011] [Revised: 01/05/2013] [Accepted: 02/25/2013] [Indexed: 12/28/2022]
48
Droste F, Lindner B. Analytical results for integrate-and-fire neurons driven by dichotomous noise. BMC Neurosci 2013. [PMCID: PMC3704408 DOI: 10.1186/1471-2202-14-s1-p243] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/12/2022] Open
49
Ly C. A principled dimension-reduction method for the population density approach to modeling networks of neurons with synaptic dynamics. Neural Comput 2013; 25:2682-708. [PMID: 23777517 DOI: 10.1162/neco_a_00489] [Citation(s) in RCA: 8] [Impact Index Per Article: 0.7] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/04/2022]
Abstract
The population density approach to neural network modeling has been utilized in a variety of contexts. The idea is to group many similar noisy neurons into populations and track the probability density function for each population that encompasses the proportion of neurons with a particular state rather than simulating individual neurons (i.e., Monte Carlo). It is commonly used for both analytic insight and as a time-saving computational tool. The main shortcoming of this method is that when realistic attributes are incorporated in the underlying neuron model, the dimension of the probability density function increases, leading to intractable equations or, at best, computationally intensive simulations. Thus, developing principled dimension-reduction methods is essential for the robustness of these powerful methods. As a more pragmatic tool, it would be of great value for the larger theoretical neuroscience community. For exposition of this method, we consider a single uncoupled population of leaky integrate-and-fire neurons receiving external excitatory synaptic input only. We present a dimension-reduction method that reduces a two-dimensional partial differential-integral equation to a computationally efficient one-dimensional system and gives qualitatively accurate results in both the steady-state and nonequilibrium regimes. The method, termed modified mean-field method, is based entirely on the governing equations and not on any auxiliary variables or parameters, and it does not require fine-tuning. The principles of the modified mean-field method have potential applicability to more realistic (i.e., higher-dimensional) neural networks.
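The quantities a population density method tracks can be made concrete with the Monte Carlo baseline the abstract contrasts it with. The sketch below is our illustration with arbitrary parameters (instantaneous synapses, so only a one-dimensional density arises; the paper's point is precisely the reduction needed when synaptic dynamics add a second dimension): an uncoupled population of LIF neurons is simulated, and the empirical membrane-potential density and population firing rate are extracted.

```python
import numpy as np

M = 2000                     # neurons in the (uncoupled) population
TAU, VTH, VR = 1.0, 1.0, 0.0
A_SYN, RATE_IN = 0.1, 12.0   # synaptic jump size and external Poisson rate
DT, T = 0.002, 10.0

def population_density(seed=7):
    """Monte Carlo estimate of the membrane-potential density and the
    instantaneous population firing rate — the objects a population
    density method would evolve directly instead of simulating M neurons."""
    rng = np.random.default_rng(seed)
    v = np.full(M, VR)
    pop_rate = np.empty(int(T / DT))
    for i in range(pop_rate.size):
        kicks = rng.poisson(RATE_IN * DT, size=M) * A_SYN
        v += DT * (-v) / TAU + kicks
        fired = v >= VTH
        v[fired] = VR
        pop_rate[i] = fired.mean() / DT   # fraction of the population firing
    hist, edges = np.histogram(v, bins=50, range=(VR, VTH), density=True)
    return hist, edges, pop_rate

hist, edges, pop_rate = population_density()
```

After the transient, `pop_rate` settles to the steady-state rate and `hist` approximates the stationary density; replacing this ensemble simulation by a direct evolution of the density is what makes the population density approach a time-saving tool.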
Affiliation(s)
- Cheng Ly
- Department of Statistical Sciences and Operations Research, Virginia Commonwealth University, Richmond, VA 23284-3083, USA.
50
Schultze-Kraft M, Diesmann M, Grün S, Helias M. Noise suppression and surplus synchrony by coincidence detection. PLoS Comput Biol 2013; 9:e1002904. [PMID: 23592953 PMCID: PMC3617020 DOI: 10.1371/journal.pcbi.1002904] [Citation(s) in RCA: 12] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/15/2012] [Accepted: 11/25/2012] [Indexed: 12/04/2022] Open
Abstract
The functional significance of correlations between action potentials of neurons is still a matter of vivid debate. In particular, it is presently unclear how much synchrony is caused by afferent synchronized events and how much is intrinsic due to the connectivity structure of cortex. The available analytical approaches based on the diffusion approximation do not allow one to model spike synchrony, preventing a thorough analysis. Here we theoretically investigate to what extent common synaptic afferents and synchronized inputs each contribute to correlated spiking on a fine temporal scale between pairs of neurons. We employ direct simulation and extend earlier analytical methods based on the diffusion approximation to pulse-coupling, allowing us to introduce precisely timed correlations in the spiking activity of the synaptic afferents. We investigate the transmission of correlated synaptic input currents by pairs of integrate-and-fire model neurons, so that the same input covariance can be realized by common inputs or by spiking synchrony. We identify two distinct regimes: In the limit of low correlation, linear perturbation theory accurately determines the correlation transmission coefficient, which is typically smaller than unity, but increases sensitively even for weakly synchronous inputs. In the limit of high input correlation, in the presence of synchrony, a qualitatively new picture arises. As the non-linear neuronal response becomes dominant, the output correlation becomes higher than the total correlation in the input. This transmission coefficient larger than unity is a direct consequence of non-linear neural processing in the presence of noise, elucidating how synchrony-coded signals benefit from these generic properties present in cortical networks.
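The common-input half of this comparison is straightforward to reproduce. The sketch below is our illustration with arbitrary parameters (Gaussian common input only, i.e., the diffusion case without spiking synchrony): two uncoupled LIF neurons share a fraction of their input noise, and the output spike-count correlation over counting windows is compared with the input correlation.

```python
import numpy as np

C_IN = 0.5                 # input correlation from a shared noise component
MU, D = 0.9, 0.05          # subthreshold mean drive and noise intensity
VTH, VR = 1.0, 0.0
DT, T, WIN = 0.005, 1000.0, 5.0   # time step, duration, counting window

def correlated_pair(seed=8):
    """Two uncoupled LIF neurons receiving Gaussian white noise with a
    common component; returns spike counts per counting window."""
    rng = np.random.default_rng(seed)
    n_steps, per_win = int(T / DT), int(WIN / DT)
    amp = np.sqrt(2.0 * D * DT)
    common = rng.standard_normal(n_steps)
    private = rng.standard_normal((n_steps, 2))
    noise = amp * (np.sqrt(C_IN) * common[:, None]
                   + np.sqrt(1.0 - C_IN) * private)
    v = np.zeros(2)
    counts = np.zeros((n_steps // per_win, 2))
    for i in range(n_steps):
        v += DT * (MU - v) + noise[i]
        fired = v >= VTH
        v[fired] = VR
        counts[i // per_win] += fired
    return counts

counts = correlated_pair()
rho_out = np.corrcoef(counts[:, 0], counts[:, 1])[0, 1]
```

As in the low-correlation regime described above, the output count correlation stays below the input correlation C_IN — the lossy transmission against which the abstract contrasts the supralinear response to genuinely synchronous inputs.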
Whether spike timing conveys information in cortical networks or whether the firing rate alone is sufficient is a matter of controversial debate, touching the fundamental question of how the brain processes, stores, and conveys information. If the firing rate alone is the decisive signal used in the brain, correlations between action potentials are just an epiphenomenon of cortical connectivity, where pairs of neurons share a considerable fraction of common afferents. Due to membrane leakage, small synaptic amplitudes and the non-linear threshold, nerve cells exhibit lossy transmission of correlation originating from shared synaptic inputs. However, the membrane potential of cortical neurons often displays non-Gaussian fluctuations, caused by synchronized synaptic inputs. Moreover, synchronously active neurons have been found to reflect behavior in primates. In this work we therefore contrast the transmission of correlation due to shared afferents and due to synchronously arriving synaptic impulses for leaky neuron models. We not only find that neurons are highly sensitive to synchronous afferents, but that they can suppress noise on signals transmitted by synchrony, a computational advantage over rate signals.