1. Franzen J, Ramlow L, Lindner B. The steady state and response to a periodic stimulation of the firing rate for a theta neuron with correlated noise. J Comput Neurosci 2023; 51:107-128. PMID: 36273087; PMCID: PMC9840600; DOI: 10.1007/s10827-022-00836-6.
Abstract
The stochastic activity of neurons is caused by various sources of correlated fluctuations and can be described in terms of simplified, yet biophysically grounded, integrate-and-fire models. One paradigmatic model is the quadratic integrate-and-fire model and its equivalent phase description by the theta neuron. Here we study the theta neuron model driven by a correlated Ornstein-Uhlenbeck noise and by periodic stimuli. We apply the matrix-continued-fraction method to the associated Fokker-Planck equation to develop an efficient numerical scheme to determine the stationary firing rate as well as the stimulus-induced modulation of the instantaneous firing rate. For the stationary case, we identify the conditions under which the firing rate decreases or increases by the effect of the colored noise and compare our results to existing analytical approximations for limit cases. For an additional periodic signal we demonstrate how the linear and nonlinear response terms can be computed and report resonant behavior for some of them. We extend the method to the case of two periodic signals, generally with incommensurable frequencies, and present a particular case for which a strong mixed response to both signals is observed, i.e., where the response to the sum of signals differs significantly from the sum of responses to the single signals. We provide Python code for our computational method: https://github.com/jannikfranzen/theta_neuron.
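The linked repository contains the authors' implementation of the matrix-continued-fraction scheme. As an illustration of the underlying model only (not of that scheme), a direct Euler-Maruyama simulation of an OU-driven theta neuron might look as follows; all parameter values are arbitrary placeholders, not taken from the paper:

```python
import numpy as np

def simulate_theta_neuron(mu=0.5, tau=1.0, D=0.1, dt=1e-3, T=200.0, seed=0):
    """Euler-Maruyama simulation of a theta neuron,
    d(theta)/dt = (1 - cos theta) + (1 + cos theta) * (mu + eta(t)),
    driven by Ornstein-Uhlenbeck noise eta with correlation time tau
    and intensity D. A spike is registered when theta crosses pi."""
    rng = np.random.default_rng(seed)
    theta, eta, spikes = 0.0, 0.0, []
    for i in range(int(T / dt)):
        # OU update: relaxation plus white-noise increment
        eta += -(eta / tau) * dt + np.sqrt(2.0 * D * dt) / tau * rng.standard_normal()
        theta += ((1.0 - np.cos(theta)) + (1.0 + np.cos(theta)) * (mu + eta)) * dt
        if theta > np.pi:            # phase passed pi: spike and wrap around
            spikes.append(i * dt)
            theta -= 2.0 * np.pi
    return np.array(spikes)

spikes = simulate_theta_neuron()
rate = len(spikes) / 200.0           # crude stationary firing rate estimate
```

For mu > 0 and weak noise the estimated rate should lie near the deterministic value sqrt(mu)/pi; the paper's method computes such rates and their stimulus-induced modulations without stochastic simulation.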
Affiliation(s)
- Jannik Franzen
- Department of Physics, Humboldt-Universität zu Berlin, Newtonstr. 15, Berlin, 12489 Germany
- Lukas Ramlow
- Department of Physics, Humboldt-Universität zu Berlin, Newtonstr. 15, Berlin, 12489 Germany
- Bernstein Center for Computational Neuroscience Berlin, Philippstr. 13, Haus 2, Berlin, 10115 Germany
- Benjamin Lindner
- Department of Physics, Humboldt-Universität zu Berlin, Newtonstr. 15, Berlin, 12489 Germany
- Bernstein Center for Computational Neuroscience Berlin, Philippstr. 13, Haus 2, Berlin, 10115 Germany
2. Lindner B. Fluctuation-Dissipation Relations for Spiking Neurons. Phys Rev Lett 2022; 129:198101. PMID: 36399734; DOI: 10.1103/physrevlett.129.198101.
Abstract
Spontaneous fluctuations and stimulus response are essential features of neural functioning, but how they are connected is poorly understood. I derive fluctuation-dissipation relations (FDR) between the spontaneous spike and voltage correlations and the firing rate susceptibility for (i) the leaky integrate-and-fire (IF) model with white noise and (ii) an IF model with arbitrary voltage dependence, an adaptation current, and correlated noise. The FDRs can be used to derive thus far unknown statistics analytically [model (i)] or the otherwise inaccessible intrinsic noise statistics [model (ii)].
Affiliation(s)
- Benjamin Lindner
- Bernstein Center for Computational Neuroscience Berlin, Philippstraße 13, Haus 2, 10115 Berlin, Germany and Physics Department of Humboldt University Berlin, Newtonstraße 15, 12489 Berlin, Germany
3. Knoll G, Lindner B. Information transmission in recurrent networks: Consequences of network noise for synchronous and asynchronous signal encoding. Phys Rev E 2022; 105:044411. PMID: 35590546; DOI: 10.1103/physreve.105.044411.
Abstract
Information about natural time-dependent stimuli encoded by the sensory periphery or communication between cortical networks may span a large frequency range or be localized to a smaller frequency band. Biological systems have been shown to multiplex such disparate broadband and narrow-band signals and then discriminate them in later populations by employing either an integration (low-pass) or coincidence detection (bandpass) encoding strategy. Analytical expressions have been developed for both encoding methods in feedforward populations of uncoupled neurons and confirm that the integration of a population's output low-pass filters the information, whereas synchronous output encodes less information overall and retains signal information in a selected frequency band. The present study extends the theory to recurrent networks and shows that recurrence may sharpen the synchronous bandpass filter. The frequency of the pass band is significantly influenced by the synaptic strengths, especially for inhibition-dominated networks. Synchronous information transfer is also increased when network models take into account heterogeneity that arises from the stochastic distribution of the synaptic weights.
Affiliation(s)
- Gregory Knoll
- Bernstein Center for Computational Neuroscience Berlin, Philippstr. 13, Haus 2, 10115 Berlin, Germany and Physics Department of Humboldt University Berlin, Newtonstr. 15, 12489 Berlin, Germany
- Benjamin Lindner
- Bernstein Center for Computational Neuroscience Berlin, Philippstr. 13, Haus 2, 10115 Berlin, Germany and Physics Department of Humboldt University Berlin, Newtonstr. 15, 12489 Berlin, Germany
4. Platkiewicz J, Saccomano Z, McKenzie S, English D, Amarasingham A. Monosynaptic inference via finely-timed spikes. J Comput Neurosci 2021; 49:131-157. PMID: 33507429; DOI: 10.1007/s10827-020-00770-5.
Abstract
Observations of finely-timed spike relationships in population recordings have been used to support partial reconstruction of neural microcircuit diagrams. In this approach, fine-timescale components of paired spike train interactions are isolated and subsequently attributed to synaptic parameters. Recent perturbation studies strengthen the case for such an inference, yet the complete set of measurements needed to calibrate statistical models is unavailable. To address this gap, we study features of pairwise spiking in a large-scale in vivo dataset where presynaptic neurons were explicitly decoupled from network activity by juxtacellular stimulation. We then construct biophysical models of paired spike trains to reproduce the observed phenomenology of in vivo monosynaptic interactions, including both fine-timescale spike-spike correlations and firing irregularity. A key characteristic of these models is that the paired neurons are coupled by rapidly-fluctuating background inputs. We quantify a monosynapse's causal effect by comparing the postsynaptic train with its counterfactual, when the monosynapse is removed. Subsequently, we develop statistical techniques for estimating this causal effect from the pre- and post-synaptic spike trains. A particular focus is the justification and application of a nonparametric separation of timescale principle to implement synaptic inference. Using simulated data generated from the biophysical models, we characterize the regimes in which the estimators accurately identify the monosynaptic effect. A secondary goal is to initiate a critical exploration of neurostatistical assumptions in terms of biophysical mechanisms, particularly with regard to the challenging but arguably fundamental issue of fast, unobservable nonstationarities in background dynamics.
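The fine-timescale spike-spike correlations discussed here are conventionally read off a cross-correlogram of the pre- and postsynaptic trains. A minimal sketch of that statistic (not the authors' estimator; bin width and window are illustrative choices):

```python
import numpy as np

def cross_correlogram(pre, post, binsize=0.5e-3, window=10e-3):
    """Histogram of postsynaptic spike times relative to each presynaptic
    spike (times in seconds). A sharp peak at a short positive lag
    (roughly 1-4 ms) is the classic signature used for monosynaptic
    inference. Returns bin edges and counts."""
    edges = np.arange(-window, window + binsize, binsize)
    counts = np.zeros(len(edges) - 1)
    post = np.sort(np.asarray(post))
    for t in np.asarray(pre):
        # postsynaptic spikes within +/- window of this presynaptic spike
        lags = post[(post >= t - window) & (post <= t + window)] - t
        counts += np.histogram(lags, bins=edges)[0]
    return edges, counts
```

Attributing such a short-latency peak to a synaptic parameter, rather than to fast common background input, is exactly the inference problem the paper formalizes.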
Affiliation(s)
- Jonathan Platkiewicz
- Department of Mathematics, The City College of New York, The City University of New York, New York, NY, 10031, USA
- Zachary Saccomano
- Department of Biology, The Graduate Center, The City University of New York, New York, NY, 10016, USA
- Sam McKenzie
- Neuroscience Institute, New York University, New York, NY, 10016, USA
- Daniel English
- School of Neuroscience, Virginia Tech, Blacksburg, VA, 24060, USA
- Asohan Amarasingham
- Department of Mathematics, The City College of New York, The City University of New York, New York, NY, 10031, USA
- Department of Biology, The Graduate Center, The City University of New York, New York, NY, 10016, USA
- Departments of Computer Science and Psychology, The Graduate Center, The City University of New York, New York, NY, 10016, USA
5.
Abstract
Power spectra of spike trains reveal important properties of neuronal behavior. They exhibit several peaks, whose shape and position depend on applied stimuli and intrinsic biophysical properties, such as input current density and channel noise. The position of the spectral peaks in the frequency domain is not straightforwardly predictable from statistical averages of the interspike intervals, especially when stochastic behavior prevails. In this work, we provide a model for the neuronal power spectrum, obtained from Discrete Fourier Transform and expressed as a series of expected value of sinusoidal terms. The first term of the series allows us to estimate the frequencies of the spectral peaks to a maximum error of a few Hz, and to interpret why they are not harmonics of the first peak frequency. Thus, the simple expression of the proposed power spectral density (PSD) model makes it a powerful interpretative tool of PSD shape, and also useful for neurophysiological studies aimed at extracting information on neuronal behavior from spike train spectra.
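As a numerical point of comparison for such a model, a direct DFT-based estimate of a spike-train PSD can be sketched as follows; the binning resolution and normalization are illustrative assumptions, not taken from the work itself:

```python
import numpy as np

def spike_train_psd(spike_times, T, dt=1e-3):
    """Bin a spike train (times in seconds, observation window T) into a
    delta-like signal and estimate its power spectral density from the
    discrete Fourier transform. Returns frequencies (Hz) and PSD."""
    n = int(round(T / dt))
    x = np.zeros(n)
    idx = (np.asarray(spike_times) / dt).astype(int)
    idx = idx[(idx >= 0) & (idx < n)]
    np.add.at(x, idx, 1.0 / dt)          # each spike contributes unit area
    X = np.fft.rfft(x) * dt              # approximate Fourier integral
    psd = np.abs(X) ** 2 / T
    return np.fft.rfftfreq(n, dt), psd
```

For a jittered periodic train the largest nonzero-frequency peak sits at the firing frequency; the abstract's point is that for intrinsically stochastic neurons the positions of the higher peaks are not simply harmonics of the first one.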
6. Bostner Ž, Knoll G, Lindner B. Information filtering by coincidence detection of synchronous population output: analytical approaches to the coherence function of a two-stage neural system. Biol Cybern 2020; 114:403-418. PMID: 32583370; PMCID: PMC7326833; DOI: 10.1007/s00422-020-00838-6.
Abstract
Information about time-dependent sensory stimuli is encoded in the activity of neural populations; distinct aspects of the stimulus are read out by different types of neurons: while overall information is perceived by integrator cells, so-called coincidence detector cells are driven mainly by the synchronous activity in the population that encodes predominantly high-frequency content of the input signal (high-pass information filtering). Previously, an analytically accessible statistic called the partial synchronous output was introduced as a proxy for the coincidence detector cell's output in order to approximate its information transmission. In the first part of the current paper, we compare the information filtering properties (specifically, the coherence function) of this proxy to those of a simple coincidence detector neuron. We show that the latter's coherence function can indeed be well-approximated by the partial synchronous output with a time scale and threshold criterion that are related approximately linearly to the membrane time constant and firing threshold of the coincidence detector cell. In the second part of the paper, we propose an alternative theory for the spectral measures (including the coherence) of the coincidence detector cell that combines linear-response theory for shot-noise driven integrate-and-fire neurons with a novel perturbation ansatz for the spectra of spike-trains driven by colored noise. We demonstrate how the variability of the synaptic weights for connections from the population to the coincidence detector can shape the information transmission of the entire two-stage system.
Affiliation(s)
- Žiga Bostner
- Physics Department, Humboldt University Berlin, Newtonstr. 15, 12489 Berlin, Germany
- Gregory Knoll
- Physics Department, Humboldt University Berlin, Newtonstr. 15, 12489 Berlin, Germany
- Bernstein Center for Computational Neuroscience Berlin, Philippstr. 13, Haus 2, 10115 Berlin, Germany
- Benjamin Lindner
- Physics Department, Humboldt University Berlin, Newtonstr. 15, 12489 Berlin, Germany
- Bernstein Center for Computational Neuroscience Berlin, Philippstr. 13, Haus 2, 10115 Berlin, Germany
7. Barendregt NW, Josić K, Kilpatrick ZP. Analyzing dynamic decision-making models using Chapman-Kolmogorov equations. J Comput Neurosci 2019; 47:205-222. PMID: 31734803; PMCID: PMC7137388; DOI: 10.1007/s10827-019-00733-5.
Abstract
Decision-making in dynamic environments typically requires adaptive evidence accumulation that weights new evidence more heavily than old observations. Recent experimental studies of dynamic decision tasks require subjects to make decisions for which the correct choice switches stochastically throughout a single trial. In such cases, an ideal observer's belief is described by an evolution equation that is doubly stochastic, reflecting stochasticity in both the observations and environmental changes. In these contexts, we show that the probability density of the belief can be represented using differential Chapman-Kolmogorov equations, allowing efficient computation of ensemble statistics. This allows us to reliably compare normative models to near-normative approximations using, as model performance metrics, decision response accuracy and Kullback-Leibler divergence of the belief distributions. Such belief distributions could be obtained empirically from subjects by asking them to report their decision confidence. We also study how response accuracy is affected by additional internal noise, showing optimality requires longer integration timescales as more noise is added. Lastly, we demonstrate that our method can be applied to tasks in which evidence arrives in a discrete, pulsatile fashion, rather than continuously.
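A Monte Carlo sketch of such a doubly stochastic task can serve as a reference point: the hidden state switches with hazard rate h, and the belief (log-likelihood ratio) y evolves with the standard nonlinear discounting term -2h*sinh(y) used for this model class. The parameter values are illustrative, and the trial ensemble stands in for the Chapman-Kolmogorov computation developed in the paper:

```python
import numpy as np

def dynamic_decision_accuracy(h=0.5, kappa=2.0, dt=1e-3, T=2.0,
                              n_trials=500, seed=0):
    """Fraction of trials in which sign(y) matches the hidden state x at
    the end of the trial. x = +/-1 switches with hazard rate h; the belief
    y integrates noisy evidence of strength kappa and discounts old
    evidence through the -2h*sinh(y) term."""
    rng = np.random.default_rng(seed)
    x = np.where(rng.random(n_trials) < 0.5, 1.0, -1.0)   # hidden state
    y = np.zeros(n_trials)                                 # belief (LLR)
    for _ in range(int(T / dt)):
        x = np.where(rng.random(n_trials) < h * dt, -x, x)  # environment switch
        dW = np.sqrt(dt) * rng.standard_normal(n_trials)
        y += kappa * x * dt + np.sqrt(kappa) * dW - 2.0 * h * np.sinh(y) * dt
    return np.mean(np.sign(y) == x)

accuracy = dynamic_decision_accuracy()
```

Accuracy saturates well below 1 here because the environment can switch shortly before the report; computing the full belief density, as the paper does, makes such effects quantitative without trial-by-trial simulation.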
Affiliation(s)
- Nicholas W Barendregt
- Department of Applied Mathematics, University of Colorado Boulder, Boulder, CO, 80309, USA
- Krešimir Josić
- Department of Mathematics, University of Houston, Houston, TX, 77204, USA
- Zachary P Kilpatrick
- Department of Applied Mathematics, University of Colorado Boulder, Boulder, CO, 80309, USA
8. Up-Down-Like Background Spiking Can Enhance Neural Information Transmission. eNeuro 2018; 4:eN-TNC-0282-17. PMID: 29354678; PMCID: PMC5773284; DOI: 10.1523/eneuro.0282-17.2017.
Abstract
How neurons transmit information about sensory or internal signals is strongly influenced by ongoing internal activity. Depending on brain state, this background spiking can occur asynchronously or clustered in up states, periods of collective firing that are interspersed by silent down states. Here, we study which effect such up-down (UD) transitions have on signal transmission. In a simple model, we obtain numerical and analytical results for information theoretic measures. We find that, surprisingly, an UD background can benefit information transmission: when background activity is sparse, it is advantageous to distribute spikes into up states rather than uniformly in time. We reproduce the same effect in a more realistic recurrent network and show that signal transmission is further improved by incorporating that up states propagate across cortex as traveling waves. We propose that traveling UD activity might represent a compromise between reducing metabolic strain and maintaining information transmission capabilities.
9. Droste F, Lindner B. Exact analytical results for integrate-and-fire neurons driven by excitatory shot noise. J Comput Neurosci 2017; 43:81-91. PMID: 28585050; DOI: 10.1007/s10827-017-0649-5.
Abstract
A neuron receives input from other neurons via electrical pulses, so-called spikes. The pulse-like nature of the input is frequently neglected in analytical studies; instead, the input is usually approximated to be Gaussian. Recent experimental studies have shown, however, that an assumption underlying this approximation is often not met: Individual presynaptic spikes can have a significant effect on a neuron's dynamics. It is thus desirable to explicitly account for the pulse-like nature of neural input, i.e. consider neurons driven by a shot noise - a long-standing problem that is mathematically challenging. In this work, we exploit the fact that excitatory shot noise with exponentially distributed weights can be obtained as a limit case of dichotomous noise, a Markovian two-state process. This allows us to obtain novel exact expressions for the stationary voltage density and the moments of the interspike-interval density of general integrate-and-fire neurons driven by such an input. For the special case of leaky integrate-and-fire neurons, we also give expressions for the power spectrum and the linear response to a signal. We verify and illustrate our expressions by comparison to simulations of leaky-, quadratic- and exponential integrate-and-fire neurons.
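The limit case treated here, an IF neuron whose excitatory input arrives as Poisson spikes with exponentially distributed amplitudes, is straightforward to simulate for comparison with the exact formulas. A minimal sketch for the leaky IF case, with illustrative parameters and time measured in units of the membrane time constant:

```python
import numpy as np

def lif_shot_noise_isis(rate_in=5.0, a_mean=0.2, tau_m=1.0, vth=1.0,
                        vr=0.0, dt=1e-3, T=200.0, seed=0):
    """Leaky integrate-and-fire neuron driven by excitatory shot noise:
    Poisson input spikes of rate rate_in, each with an exponentially
    distributed amplitude of mean a_mean. Returns the output
    interspike intervals."""
    rng = np.random.default_rng(seed)
    v, spike_times = 0.0, []
    for i in range(int(T / dt)):
        v += -(v / tau_m) * dt                 # leak toward zero
        if rng.random() < rate_in * dt:        # Poisson input spike
            v += rng.exponential(a_mean)       # exponentially distributed weight
        if v >= vth:                           # threshold crossing: output spike
            spike_times.append(i * dt)
            v = vr
    return np.diff(spike_times)

isis = lif_shot_noise_isis()
```

Moments of these simulated ISIs, as well as the stationary voltage histogram, are the quantities the paper's dichotomous-noise limit yields in closed form.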
Affiliation(s)
- Felix Droste
- Bernstein Center for Computational Neuroscience, Haus 2, Philippstrasse 13, 10115, Berlin, Germany
- Department of Physics, Humboldt Universität zu Berlin, Newtonstr 15, 12489, Berlin, Germany
- Benjamin Lindner
- Bernstein Center for Computational Neuroscience, Haus 2, Philippstrasse 13, 10115, Berlin, Germany
- Department of Physics, Humboldt Universität zu Berlin, Newtonstr 15, 12489, Berlin, Germany