1. Koren V, Blanco Malerba S, Schwalger T, Panzeri S. Efficient coding in biophysically realistic excitatory-inhibitory spiking networks. eLife 2025;13:RP99545. PMID: 40053385; PMCID: PMC11888603; DOI: 10.7554/eLife.99545
Abstract
The principle of efficient coding posits that sensory cortical networks are designed to encode maximal sensory information with minimal metabolic cost. Despite the major influence of efficient coding in neuroscience, it has remained unclear whether fundamental empirical properties of neural network activity can be explained solely based on this normative principle. Here, we derive the structural, coding, and biophysical properties of excitatory-inhibitory recurrent networks of spiking neurons that emerge directly from imposing that the network minimizes an instantaneous loss function and a time-averaged performance measure enacting efficient coding. We assumed that the network encodes a number of independent stimulus features varying with a time scale equal to the membrane time constant of excitatory and inhibitory neurons. The optimal network has biologically plausible biophysical features, including realistic integrate-and-fire spiking dynamics, spike-triggered adaptation, and a non-specific excitatory external input. The excitatory-inhibitory recurrent connectivity between neurons with similar stimulus tuning implements feature-specific competition, similar to that recently found in visual cortex. Networks with unstructured connectivity cannot reach comparable levels of coding efficiency. The optimal ratio of excitatory vs inhibitory neurons and the ratio of mean inhibitory-to-inhibitory vs excitatory-to-inhibitory connectivity are comparable to those of cortical sensory networks. The efficient network solution exhibits an instantaneous balance between excitation and inhibition. The network can perform efficient coding even when external stimuli vary over multiple time scales. Together, these results suggest that key properties of biological neural networks may be accounted for by efficient coding.
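The optimization at the heart of this approach can be illustrated with a minimal spike-coding sketch: a single homogeneous population rather than the paper's derived E-I network, in which each neuron fires only when its spike reduces the instantaneous squared coding error plus a firing cost. The weights, time constants, and two-feature stimulus below are illustrative assumptions, not the paper's model.

```python
import numpy as np

rng = np.random.default_rng(0)
dt, tau = 1e-3, 0.02              # time step and readout time constant (s)
n = 2000                          # 2 s of simulated time
N, mu = 20, 1e-3                  # number of neurons, metabolic cost per spike

w = 0.1 * rng.standard_normal((N, 2))            # decoding weights, 2 features
t = np.arange(n) * dt
x = np.stack([np.sin(2 * np.pi * 1.5 * t),       # two slowly varying stimulus features
              np.cos(2 * np.pi * 2.5 * t)], axis=1)

xhat = np.zeros(2)                               # leaky readout of the spike train
thresh = 0.5 * (np.sum(w**2, axis=1) + mu)       # per-neuron firing threshold
sq_err, n_spikes = [], 0
for k in range(n):
    xhat -= dt / tau * xhat                      # readout decays between spikes
    drive = w @ (x[k] - xhat)                    # projected coding error
    i = int(np.argmax(drive - thresh))
    if drive[i] > thresh[i]:                     # spike only if it lowers the loss
        xhat += w[i]                             # each spike adds its decoding kernel
        n_spikes += 1
    sq_err.append(float(np.sum((x[k] - xhat) ** 2)))

mse = float(np.mean(sq_err[n // 2:]))
power = float(np.mean(np.sum(x**2, axis=1)))
print(f"{n_spikes} spikes, tracking MSE {mse:.3f} vs. signal power {power:.3f}")
```

The greedy condition `w_i · (x - xhat) > (||w_i||^2 + mu)/2` is what turns the loss minimization into an integrate-and-fire rule: the left-hand side plays the role of a membrane potential and the right-hand side of a threshold.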
Affiliation(s)
- Veronika Koren: Institute of Neural Information Processing, Center for Molecular Neurobiology (ZMNH), University Medical Center Hamburg-Eppendorf, Hamburg, Germany; Institute of Mathematics, Technische Universität Berlin, Berlin, Germany; Bernstein Center for Computational Neuroscience Berlin, Berlin, Germany
- Simone Blanco Malerba: Institute of Neural Information Processing, Center for Molecular Neurobiology (ZMNH), University Medical Center Hamburg-Eppendorf, Hamburg, Germany
- Tilo Schwalger: Institute of Mathematics, Technische Universität Berlin, Berlin, Germany; Bernstein Center for Computational Neuroscience Berlin, Berlin, Germany
- Stefano Panzeri: Institute of Neural Information Processing, Center for Molecular Neurobiology (ZMNH), University Medical Center Hamburg-Eppendorf, Hamburg, Germany
2. Koren V, Malerba SB, Schwalger T, Panzeri S. Efficient coding in biophysically realistic excitatory-inhibitory spiking networks. bioRxiv [Preprint] 2025:2024.04.24.590955. PMID: 38712237; PMCID: PMC11071478; DOI: 10.1101/2024.04.24.590955
3. Pietras B, Clusella P, Montbrió E. Low-dimensional model for adaptive networks of spiking neurons. Phys Rev E 2025;111:014422. PMID: 39972912; DOI: 10.1103/PhysRevE.111.014422
Abstract
We investigate a large ensemble of quadratic integrate-and-fire neurons with heterogeneous input currents and adaptation variables. Our analysis reveals that, for a specific class of adaptation, termed quadratic spike-frequency adaptation, the high-dimensional system can be exactly reduced to a low-dimensional system of ordinary differential equations, which describes the dynamics of three mean-field variables: the population's firing rate, the mean membrane potential, and a mean adaptation variable. The resulting low-dimensional firing rate equations (FREs) uncover a key generic feature of heterogeneous networks with spike-frequency adaptation: Both the center and width of the distribution of the neurons' firing frequencies are reduced, and this largely promotes the emergence of collective synchronization in the network. Our findings are further supported by the bifurcation analysis of the FREs, which accurately captures the collective dynamics of the spiking neuron network, including phenomena such as collective oscillations, bursting, and macroscopic chaos.
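For the adaptation-free limit, the exact low-dimensional reduction is the well-known pair of firing-rate equations of Montbrió, Pazó, and Roxin (2015) for the population rate and mean membrane potential; a minimal sketch integrating those two mean-field variables is below. The parameters are illustrative, and the paper's third (mean adaptation) variable is omitted.

```python
import numpy as np

# Exact firing-rate equations for an infinite population of quadratic
# integrate-and-fire neurons with Lorentzian-distributed drive
# (Montbrio, Pazo & Roxin 2015) -- the adaptation-free limit of the model
# above.  Parameters are illustrative, not taken from the paper.
delta, eta, J, tau = 1.0, 1.0, 0.0, 1.0   # drive half-width, mean drive, coupling, time constant
dt, n = 1e-3, 50_000

r, v = 0.1, 0.0                           # population rate and mean membrane potential
for _ in range(n):
    dr = (delta / (np.pi * tau) + 2.0 * r * v) / tau
    dv = (v**2 + eta + J * tau * r - (np.pi * tau * r) ** 2) / tau
    r, v = r + dt * dr, v + dt * dv

# analytic fixed point for J = 0:  (pi*tau*r)^2 = (eta + hypot(eta, delta)) / 2
r_theory = np.sqrt((eta + np.hypot(eta, delta)) / 2.0) / (np.pi * tau)
v_theory = -delta / (2.0 * np.pi * tau * r_theory)
print(f"simulated (r, v) = ({r:.4f}, {v:.4f}), fixed point = ({r_theory:.4f}, {v_theory:.4f})")
```

With coupling switched off (J = 0) the stationary rate can be checked against the closed-form fixed point, which is a convenient sanity test before adding recurrence or an adaptation variable.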
Affiliation(s)
- Bastian Pietras: Universitat Pompeu Fabra, Neuronal Dynamics Group, Department of Engineering, 08018 Barcelona, Spain
- Pau Clusella: Universitat Politècnica de Catalunya, EPSEM, Departament de Matemàtiques, 08242 Manresa, Spain
- Ernest Montbrió: Universitat Pompeu Fabra, Neuronal Dynamics Group, Department of Engineering, 08018 Barcelona, Spain
4. Ramlow L, Falcke M, Lindner B. An integrate-and-fire approach to Ca2+ signaling. Part II: Cumulative refractoriness. Biophys J 2023;122:4710-4729. PMID: 37981761; PMCID: PMC10754692; DOI: 10.1016/j.bpj.2023.11.015
Abstract
Inositol 1,4,5-trisphosphate-induced Ca2+ signaling is a second messenger system used by almost all eukaryotic cells. The agonist concentration stimulating Ca2+ signals is encoded in the frequency of a Ca2+ concentration spike sequence. When a cell is stimulated, the interspike intervals (ISIs) often show a distinct transient during which they gradually increase, a system property we refer to as cumulative refractoriness. We extend a previously published stochastic model to include the Ca2+ concentration in the intracellular Ca2+ store as a slow adaptation variable. This model can reproduce both stationary and transient statistics of experimentally observed ISI sequences. We derive approximate expressions for the mean and coefficient of variation of the stationary ISIs. We also consider the response to the onset of a constant stimulus and estimate the length of the transient and the strength of the adaptation of the ISI. We show that the adaptation sets the coefficient of variation in agreement with current ideas derived from experiments. Moreover, we explain why, despite a pronounced transient behavior, ISI correlations can be weak, as often observed in experiments. Finally, we fit our model to reproduce the transient statistics of experimentally observed ISI sequences in stimulated HEK cells. The fitted model is able to qualitatively reproduce the relationship between the stationary interval correlations and the number of transient intervals, as well as the strength of the ISI adaptation. We also find positive correlations in the experimental sequence that cannot be explained by our model.
Affiliation(s)
- Lukas Ramlow: Bernstein Center for Computational Neuroscience Berlin, Berlin, Germany; Department of Physics, Humboldt University Berlin, Berlin, Germany; Max Delbrück Center for Molecular Medicine, Berlin, Germany
- Martin Falcke: Department of Physics, Humboldt University Berlin, Berlin, Germany; Max Delbrück Center for Molecular Medicine, Berlin, Germany
- Benjamin Lindner: Bernstein Center for Computational Neuroscience Berlin, Berlin, Germany; Department of Physics, Humboldt University Berlin, Berlin, Germany
5. Sidhu RS, Johnson EC, Jones DL, Ratnam R. A dynamic spike threshold with correlated noise predicts observed patterns of negative interval correlations in neuronal spike trains. Biol Cybern 2022;116:611-633. PMID: 36244004; PMCID: PMC9691502; DOI: 10.1007/s00422-022-00946-5
Abstract
Negative correlations in the sequential evolution of interspike intervals (ISIs) are a signature of memory in neuronal spike trains. They provide coding benefits, including firing-rate stabilization, improved detectability of weak sensory signals, and enhanced information transmission through a higher signal-to-noise ratio. Primary electrosensory afferent spike trains in weakly electric fish fall into two categories based on the pattern of ISI correlations: non-bursting units have negative correlations which remain negative but decay to zero with increasing lag (Type I ISI correlations), and bursting units have oscillatory (alternating-sign) correlations which damp to zero with increasing lag (Type II ISI correlations). Here, we predict and match the observed ISI correlations in these afferents using a stochastic dynamic threshold model. We determine the ISI correlation function as a function of an arbitrary discrete noise correlation function [Formula: see text], where k is a multiple of the mean ISI. The function permits forward and inverse calculations of the correlation function. Both types of correlation functions can be generated by adding colored noise to the spike threshold, with Type I correlations generated by slow noise and Type II correlations by fast noise. A first-order autoregressive (AR) process with a single parameter suffices to predict and accurately match both types of afferent ISI correlation functions, the type being determined by the sign of the AR parameter. The predicted and experimentally observed correlations are in geometric progression. The theory predicts that the limiting sum of the ISI correlations is -1/2, yielding a perfect DC block in the power spectrum of the spike train. Observed ISI correlations from afferents have a limiting sum that is slightly larger ([Formula: see text]). We conclude that the underlying process generating ISIs may be a simple combination of low-order AR and moving-average processes, and we discuss the results from the perspective of optimal coding.
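A stripped-down version of the mechanism already produces negative, geometrically decaying ISI correlations: a perfect integrate-and-fire neuron with a spike-triggered, exponentially decaying threshold plus independent per-spike threshold jitter. This is a simplified stand-in, not the paper's full AR-noise model, and all parameters are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)
mu, theta0 = 1.0, 1.0         # drift of the integrator and baseline threshold
A, tau_a = 0.3, 5.0           # spike-triggered threshold jump and its decay time
sigma = 0.05                  # per-spike ("fast") threshold jitter

h = 0.0                       # adaptive threshold component, just after a spike
isis = []
for _ in range(20_000):
    eta = sigma * rng.standard_normal()
    # solve  mu*T = theta0 + h*exp(-T/tau_a) + eta  for the next interval T
    T = (theta0 + eta) / mu
    for _ in range(40):                       # fixed-point iteration, |g'| < 1 here
        T = (theta0 + h * np.exp(-T / tau_a) + eta) / mu
    h = h * np.exp(-T / tau_a) + A            # decay over the interval, then jump
    isis.append(T)

intervals = np.asarray(isis[100:])            # discard transient
scc = [float(np.corrcoef(intervals[:-k], intervals[k:])[0, 1]) for k in (1, 2, 3)]
print("mean ISI:", round(float(intervals.mean()), 3), " SCC at lags 1-3:", np.round(scc, 3))
```

A long interval lets the adaptive threshold decay further, so the next interval tends to be short, which is the origin of the negative serial correlation; in the paper, the sign pattern across lags is additionally controlled by the AR coefficient of the threshold noise.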
Affiliation(s)
- Robin S Sidhu: Department of Electrical and Computer Engineering, University of Illinois at Urbana-Champaign, Urbana, IL, USA
- Erik C Johnson: The Johns Hopkins University Applied Physics Laboratory, Laurel, MD, USA
- Douglas L Jones: Department of Electrical and Computer Engineering, University of Illinois at Urbana-Champaign, Urbana, IL, USA
- Rama Ratnam: Division of Biological and Life Sciences, School of Arts and Sciences, Ahmedabad University, Ahmedabad, Gujarat, India
6. Kromer JA, Tass PA. Synaptic reshaping of plastic neuronal networks by periodic multichannel stimulation with single-pulse and burst stimuli. PLoS Comput Biol 2022;18:e1010568. PMID: 36327232; PMCID: PMC9632832; DOI: 10.1371/journal.pcbi.1010568
Abstract
Synaptic dysfunction is associated with several brain disorders, including Alzheimer's disease, Parkinson's disease (PD) and obsessive compulsive disorder (OCD). Utilizing synaptic plasticity, brain stimulation is capable of reshaping synaptic connectivity. This may pave the way for novel therapies that specifically counteract pathological synaptic connectivity. For instance, in PD, novel multichannel coordinated reset stimulation (CRS) was designed to counteract neuronal synchrony and down-regulate pathological synaptic connectivity. CRS was shown to entail long-lasting therapeutic aftereffects in PD patients and related animal models. This is in marked contrast to conventional deep brain stimulation (DBS) therapy, where PD symptoms return shortly after stimulation ceases. In the present paper, we study synaptic reshaping by periodic multichannel stimulation (PMCS) in networks of leaky integrate-and-fire (LIF) neurons with spike-timing-dependent plasticity (STDP). During PMCS, phase-shifted periodic stimulus trains are delivered to segregated neuronal subpopulations. Harnessing STDP, PMCS leads to changes of the synaptic network structure. We found that the PMCS-induced changes of the network structure depend on both the phase lags between stimuli and the shape of individual stimuli. Single-pulse stimuli and burst stimuli with low intraburst frequency down-regulate synapses between neurons receiving stimuli simultaneously. In contrast, burst stimuli with high intraburst frequency up-regulate these synapses. We derive theoretical approximations of the stimulation-induced network structure. This enables us to formulate stimulation strategies for inducing a variety of network structures. Our results provide testable hypotheses for future pre-clinical and clinical studies and suggest that periodic multichannel stimulation may be suitable for reshaping plastic neuronal networks to counteract pathological synaptic connectivity. Furthermore, we provide novel insight on how the stimulus type may affect the long-lasting outcome of conventional DBS. This may strongly impact parameter adjustment procedures for clinical DBS, which, so far, primarily focused on acute effects of stimulation.
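The core bookkeeping, summing an additive STDP window over all pre-post spike pairs of phase-shifted periodic trains, can be sketched as follows. The window amplitudes and time constants are illustrative, not the paper's values.

```python
import numpy as np

# Pairwise additive STDP between two neurons driven by periodic stimulus
# trains with a phase lag -- a toy version of the mechanism studied above.
a_plus, a_minus = 1.0, 1.0            # potentiation / depression amplitudes
tau_plus, tau_minus = 10e-3, 10e-3    # window time constants (s)

def stdp(dt_pair):
    """Weight change for a post-minus-pre spike-time difference."""
    return np.where(dt_pair > 0,
                    a_plus * np.exp(-dt_pair / tau_plus),
                    -a_minus * np.exp(dt_pair / tau_minus))

def net_dw(period, lag, n_spikes=200):
    """Mean weight change per period for periodic pre/post trains with a lag."""
    pre = np.arange(n_spikes) * period
    post = pre + lag
    dt_pair = post[None, :] - pre[:, None]    # all pre-post pairs
    return float(stdp(dt_pair).sum() / n_spikes)

for lag in (2e-3, -2e-3, 25e-3):
    print(f"lag {lag * 1e3:+5.1f} ms -> mean dw per period {net_dw(50e-3, lag):+.3f}")
```

A small positive lag (post fires just after pre) yields net potentiation, a small negative lag net depression, and a lag of half the period nearly cancels, which is the basic reason the induced network structure depends on the phase lags between stimulation channels.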
Affiliation(s)
- Justus A Kromer: Department of Neurosurgery, Stanford University, Stanford, California, United States of America
- Peter A Tass: Department of Neurosurgery, Stanford University, Stanford, California, United States of America
7. Holzhausen K, Ramlow L, Pu S, Thomas PJ, Lindner B. Mean-return-time phase of a stochastic oscillator provides an approximate renewal description for the associated point process. Biol Cybern 2022;116:235-251. PMID: 35166932; PMCID: PMC9068687; DOI: 10.1007/s00422-022-00920-1
Abstract
Stochastic oscillations can be characterized by a corresponding point process; this is a common practice in computational neuroscience, where oscillations of the membrane voltage under the influence of noise are often analyzed in terms of the interspike interval statistics, specifically the distribution and correlation of intervals between subsequent threshold-crossing times. More generally, crossing times and the corresponding interval sequences can be introduced for different kinds of stochastic oscillators that have been used to model variability of rhythmic activity in biological systems. In this paper we show that if we use the so-called mean-return-time (MRT) phase isochrons (introduced by Schwabedal and Pikovsky) to count the cycles of a stochastic oscillator with Markovian dynamics, the interphase interval sequence does not show any linear correlations, i.e., the corresponding sequence of passage times forms approximately a renewal point process. We first outline the general mathematical argument for this finding and illustrate it numerically for three models of increasing complexity: (i) the isotropic Guckenheimer-Schwabedal-Pikovsky oscillator that displays positive interspike interval (ISI) correlations if rotations are counted by passing the spoke of a wheel; (ii) the adaptive leaky integrate-and-fire model with white Gaussian noise that shows negative interspike interval correlations when spikes are counted in the usual way by the passage of a voltage threshold; (iii) a Hodgkin-Huxley model with channel noise (in the diffusion approximation represented by Gaussian noise) that exhibits weak but statistically significant interspike interval correlations, again for spikes counted when passing a voltage threshold. For all these models, linear correlations between intervals vanish when we count rotations by the passage of an MRT isochron. We finally discuss that the removal of interval correlations does not change the long-term variability and its effect on information transmission, especially in the neural context.
Affiliation(s)
- Konstantin Holzhausen: Bernstein Center for Computational Neuroscience Berlin, Philippstr. 13, Haus 2, 10115 Berlin, Germany; Physics Department, Humboldt University Berlin, Newtonstr. 15, 12489 Berlin, Germany
- Lukas Ramlow: Bernstein Center for Computational Neuroscience Berlin, Philippstr. 13, Haus 2, 10115 Berlin, Germany; Physics Department, Humboldt University Berlin, Newtonstr. 15, 12489 Berlin, Germany
- Shusen Pu: Department of Biomedical Engineering, 5814 Stevenson Center, Vanderbilt University, Nashville, TN 37215, USA
- Peter J Thomas: Department of Mathematics, Applied Mathematics, and Statistics, 212 Yost Hall, Case Western Reserve University, 10900 Euclid Avenue, Cleveland, Ohio, USA
- Benjamin Lindner: Bernstein Center for Computational Neuroscience Berlin, Philippstr. 13, Haus 2, 10115 Berlin, Germany; Physics Department, Humboldt University Berlin, Newtonstr. 15, 12489 Berlin, Germany
8. Kullmann R, Knoll G, Bernardi D, Lindner B. Critical current for giant Fano factor in neural models with bistable firing dynamics and implications for signal transmission. Phys Rev E 2022;105:014416. PMID: 35193262; DOI: 10.1103/PhysRevE.105.014416
Abstract
Bistability of the firing rate is a prominent feature of different types of neurons as well as of neural networks. We show that, for a constant input below a critical value, such bistability can lead to giant spike-count diffusion. We study the transmission of a periodic signal and demonstrate that, close to the critical bias current, the signal-to-noise ratio undergoes a sharp increase, an effect that can be traced back to the giant diffusion and the associated large Fano factor.
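The connection between slow switching of the firing rate and a giant Fano factor can be illustrated with a toy doubly stochastic process: a Poisson process whose rate follows a two-state telegraph signal. The rates and switching times below are illustrative, not the paper's bistable neuron model.

```python
import numpy as np

rng = np.random.default_rng(3)
dt, t_total = 0.02, 100_000.0            # time step (s) and total simulated time
n = int(t_total / dt)
r_lo, r_hi = 2.0, 20.0                   # the two firing-rate states (Hz)

def fano(switch_rate, window=100.0):
    """Fano factor of spike counts in `window` for a telegraph-modulated Poisson process."""
    flips = rng.random(n) < switch_rate * dt
    state = np.cumsum(flips) % 2                     # two-state (telegraph) rate signal
    rate = np.where(state == 0, r_lo, r_hi)
    counts = rng.poisson(rate * dt)                  # spikes per time step
    m = int(window / dt)
    per_win = counts[: (n // m) * m].reshape(-1, m).sum(axis=1)
    return float(per_win.var() / per_win.mean())

print(f"slow switching (0.1/s): F = {fano(0.1):.1f}")
print(f"fast switching (5/s):   F = {fano(5.0):.1f}")
```

The slower the switching between the two rate states, the longer the spike count stays correlated within a window and the larger the count variance relative to the mean; near-critical, very slow switching is what produces the "giant" Fano factor.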
Affiliation(s)
- Richard Kullmann: Bernstein Center for Computational Neuroscience Berlin, Philippstrasse 13, Haus 2, 10115 Berlin, Germany; Physics Department, Humboldt University Berlin, Newtonstrasse 15, 12489 Berlin, Germany
- Gregory Knoll: Bernstein Center for Computational Neuroscience Berlin, Philippstrasse 13, Haus 2, 10115 Berlin, Germany; Physics Department, Humboldt University Berlin, Newtonstrasse 15, 12489 Berlin, Germany
- Davide Bernardi: Center for Translational Neurophysiology of Speech and Communication, Fondazione Istituto Italiano di Tecnologia, via Fossato di Mortara 19, 44121 Ferrara, Italy
- Benjamin Lindner: Bernstein Center for Computational Neuroscience Berlin, Philippstrasse 13, Haus 2, 10115 Berlin, Germany; Physics Department, Humboldt University Berlin, Newtonstrasse 15, 12489 Berlin, Germany
9. Ramlow L, Lindner B. Interspike interval correlations in neuron models with adaptation and correlated noise. PLoS Comput Biol 2021;17:e1009261. PMID: 34449771; PMCID: PMC8428727; DOI: 10.1371/journal.pcbi.1009261
Abstract
The generation of neural action potentials (spikes) is random but nevertheless may result in a rich statistical structure of the spike sequence. In particular, contrary to the popular renewal assumption of theoreticians, the intervals between adjacent spikes are often correlated. Experimentally, different patterns of interspike-interval correlations have been observed and computational studies have identified spike-frequency adaptation and correlated noise as the two main mechanisms that can lead to such correlations. Analytical studies have focused on the single cases of either correlated (colored) noise or adaptation currents in combination with uncorrelated (white) noise. For low-pass filtered noise or adaptation, the serial correlation coefficient can be approximated as a single geometric sequence of the lag between the intervals, providing an explanation for some of the experimentally observed patterns. Here we address the problem of interval correlations for a widely used class of models, multidimensional integrate-and-fire neurons subject to a combination of colored and white noise sources and a spike-triggered adaptation current. Assuming weak noise, we derive a simple formula for the serial correlation coefficient, a sum of two geometric sequences, which accounts for a large class of correlation patterns. The theory is confirmed by means of numerical simulations in a number of special cases including the leaky, quadratic, and generalized integrate-and-fire models with colored noise and spike-frequency adaptation. Furthermore we study the case in which the adaptation current and the colored noise share the same time scale, corresponding to a slow stochastic population of adaptation channels; we demonstrate that our theory can account for a nonmonotonic dependence of the correlation coefficient on the channel's time scale. Another application of the theory is a neuron driven by network-noise-like fluctuations (green noise). We also discuss the range of validity of our weak-noise theory and show that by changing the relative strength of white and colored noise sources, we can change the sign of the correlation coefficient. Finally, we apply our theory to a conductance-based model which demonstrates its broad applicability.
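The qualitative payoff of the derived formula, a sum of two geometric sequences with one branch from adaptation (negative coefficient) and one from the colored noise (positive coefficient), can be illustrated numerically. The coefficients below are made up for illustration, not values from the paper.

```python
import numpy as np

def scc(k, A, a, B, b):
    """Serial correlation coefficient of the form rho_k = A*a**k + B*b**k (|a|, |b| < 1)."""
    return A * a**k + B * b**k

k = np.arange(1, 8)
adapt_only = scc(k, -0.3, 0.5, 0.0, 0.9)   # adaptation branch: negative, decaying
noise_only = scc(k, 0.0, 0.5, 0.2, 0.8)    # colored-noise branch: positive, decaying
mixed      = scc(k, -0.4, 0.4, 0.15, 0.9)  # both branches: sign change with lag
for name, rho in (("adaptation", adapt_only), ("colored noise", noise_only), ("both", mixed)):
    print(f"{name:>13}: {np.round(rho, 3)}")
```

With both branches present and different decay factors, the correlation can be negative at short lags and positive at longer lags, which is the kind of mixed pattern a single geometric sequence cannot capture.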
Affiliation(s)
- Lukas Ramlow: Bernstein Center for Computational Neuroscience Berlin, Berlin, Germany; Physics Department, Humboldt-Universität zu Berlin, Berlin, Germany
- Benjamin Lindner: Bernstein Center for Computational Neuroscience Berlin, Berlin, Germany; Physics Department, Humboldt-Universität zu Berlin, Berlin, Germany
10. Pietras B, Gallice N, Schwalger T. Low-dimensional firing-rate dynamics for populations of renewal-type spiking neurons. Phys Rev E 2020;102:022407. PMID: 32942450; DOI: 10.1103/PhysRevE.102.022407
Abstract
The macroscopic dynamics of large populations of neurons can be mathematically analyzed using low-dimensional firing-rate or neural-mass models. However, these models fail to capture spike synchronization effects and nonstationary responses of the population activity to rapidly changing stimuli. Here we derive low-dimensional firing-rate models for homogeneous populations of neurons modeled as time-dependent renewal processes. The class of renewal neurons includes integrate-and-fire models driven by white noise and has been frequently used to model neuronal refractoriness and spike synchronization dynamics. The derivation is based on an eigenmode expansion of the associated refractory density equation, which generalizes previous spectral methods for Fokker-Planck equations to arbitrary renewal models. We find a simple relation between the eigenvalues characterizing the timescales of the firing rate dynamics and the Laplace transform of the interspike interval density, for which explicit expressions are available for many renewal models. Retaining only the first eigenmode already yields a reliable low-dimensional approximation of the firing-rate dynamics that captures spike synchronization effects and fast transient dynamics at stimulus onset. We explicitly demonstrate the validity of our model for a large homogeneous population of Poisson neurons with absolute refractoriness and other renewal models that admit an explicit analytical calculation of the eigenvalues. The eigenmode expansion presented here provides a systematic framework for alternative firing-rate models in computational neuroscience based on spiking neuron dynamics with refractoriness.
Affiliation(s)
- Bastian Pietras: Institute of Mathematics, Technical University Berlin, 10623 Berlin, Germany; Bernstein Center for Computational Neuroscience Berlin, 10115 Berlin, Germany
- Noé Gallice: Brain Mind Institute, École polytechnique fédérale de Lausanne (EPFL), Station 15, CH-1015 Lausanne, Switzerland
- Tilo Schwalger: Institute of Mathematics, Technical University Berlin, 10623 Berlin, Germany; Bernstein Center for Computational Neuroscience Berlin, 10115 Berlin, Germany
11. Budzinski R, Lopes S, Masoller C. Symbolic analysis of bursting dynamical regimes of Rulkov neural networks. Neurocomputing 2021. DOI: 10.1016/j.neucom.2020.05.122
12. Bernardi D, Doron G, Brecht M, Lindner B. A network model of the barrel cortex combined with a differentiator detector reproduces features of the behavioral response to single-neuron stimulation. PLoS Comput Biol 2021;17:e1007831. PMID: 33556070; PMCID: PMC7895413; DOI: 10.1371/journal.pcbi.1007831
Abstract
The stimulation of a single neuron in the rat somatosensory cortex can elicit a behavioral response. The probability of a behavioral response does not depend appreciably on the duration or intensity of a constant stimulation, whereas the response probability increases significantly upon injection of an irregular current. Biological mechanisms that can potentially suppress a constant input signal are present in the dynamics of both neurons and synapses and seem ideal candidates to explain these experimental findings. Here, we study a large network of integrate-and-fire neurons with several salient features of neuronal populations in the rat barrel cortex. The model includes cellular spike-frequency adaptation, experimentally constrained numbers and types of chemical synapses endowed with short-term plasticity, and gap junctions. Numerical simulations of this model indicate that cellular and synaptic adaptation mechanisms alone may not suffice to account for the experimental results if the local network activity is read out by an integrator. However, a circuit that approximates a differentiator can detect the single-cell stimulation with a reliability that barely depends on the length or intensity of the stimulus, but that increases when an irregular signal is used. This finding is in accordance with the experimental results obtained for the stimulation of a regularly-spiking excitatory cell.

It is widely assumed that only a large group of neurons can encode a stimulus or control behavior. This tenet of neuroscience has been challenged by experiments in which stimulating a single cortical neuron has had a measurable effect on an animal's behavior. Recently, theoretical studies have explored how a single-neuron stimulation could be detected in a large recurrent network. However, these studies missed essential biological mechanisms of cortical networks and are unable to explain more recent experiments in the barrel cortex. Here, to describe the stimulated brain area, we propose and study a network model endowed with many important biological features of the barrel cortex. Importantly, we also investigate different readout mechanisms, i.e. ways in which the stimulation effects can propagate to other brain areas. We show that a readout network which tracks rapid variations in the local network activity is in agreement with the experiments. Our model demonstrates a possible mechanism for how the stimulation of a single neuron translates into a signal at the population level, which is taken as a proxy of the animal's response. Our results illustrate the power of spiking neural networks to properly describe the effects of a single neuron's activity.
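The distinction between the two readouts can be sketched with a toy rate signal: an integrator's output grows with stimulus duration, whereas a differentiator responds to the onset transient and is largely duration-independent. The step amplitude, noise level, and window sizes below are illustrative, not the paper's network readout.

```python
import numpy as np

rng = np.random.default_rng(4)
dt = 1e-3
t = np.arange(0.0, 1.0, dt)
base = 10.0                     # baseline population rate

def population_rate(duration, amp=2.0):
    """Noisy population rate with a weak step perturbation starting at t = 0.5 s."""
    r = base + 0.3 * rng.standard_normal(t.size)
    r[(t >= 0.5) & (t < 0.5 + duration)] += amp
    return r

def integrator(r):
    """Extra activity accumulated over the whole trial."""
    return float(np.sum((r - base) * dt))

def differentiator(r, w=10e-3):
    """Largest contrast between two adjacent sliding windows of width w."""
    m = int(w / dt)
    box = np.convolve(r, np.ones(m) / m, mode="valid")
    return float(np.max(np.abs(box[m:] - box[:-m])))

for d in (0.01, 0.1):
    r = population_rate(d)
    print(f"duration {d * 1e3:5.1f} ms: integrator {integrator(r):+.3f}, "
          f"differentiator {differentiator(r):.2f}")
```

Lengthening the step tenfold scales the integrator output roughly tenfold, while the differentiator output, set by the onset transient, stays essentially unchanged.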
Affiliation(s)
- Davide Bernardi
- Bernstein Center for Computational Neuroscience Berlin, Berlin, Germany
- Institut für Physik, Humboldt-Universität zu Berlin, Berlin, Germany
- Center for Translational Neurophysiology of Speech and Communication, Fondazione Istituto Italiano di Tecnologia, Ferrara, Italy
- Guy Doron
- Bernstein Center for Computational Neuroscience Berlin, Berlin, Germany
- Michael Brecht
- Bernstein Center for Computational Neuroscience Berlin, Berlin, Germany
- Benjamin Lindner
- Bernstein Center for Computational Neuroscience Berlin, Berlin, Germany
- Institut für Physik, Humboldt-Universität zu Berlin, Berlin, Germany
13
Nesse WH, Maler L, Longtin A. Enhanced Signal Detection by Adaptive Decorrelation of Interspike Intervals. Neural Comput 2020; 33:341-375. [PMID: 33253034 DOI: 10.1162/neco_a_01347] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.4] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 12/27/2022]
Abstract
Spike trains with negative interspike interval (ISI) correlations, in which long/short ISIs are more likely followed by short/long ISIs, are common in many neurons. They can be described by stochastic models with a spike-triggered adaptation variable. We analyze a phenomenon in these models where such statistically dependent ISI sequences arise in tandem with quasi-statistically independent and identically distributed (quasi-IID) adaptation variable sequences. The sequences of adaptation states and resulting ISIs are linked by a nonlinear decorrelating transformation. We establish general conditions on a family of stochastic spiking models that guarantee this quasi-IID property and establish bounds on the resulting baseline ISI correlations. Inputs that elicit weak firing rate changes in samples with many spikes are known to be more detectable when negative ISI correlations are present because they reduce spike count variance; this defines a variance-reduced firing rate coding benchmark. We performed a Fisher information analysis on these adapting models exhibiting ISI correlations to show that a spike pattern code based on the quasi-IID property achieves the upper bound of detection performance, surpassing rate codes with the same mean rate, including the variance-reduced rate code benchmark, by 20% to 30%. The information loss in rate codes arises because the benefits of reduced spike count variance cannot compensate for the lower firing rate gain due to adaptation. Since adaptation states have similar dynamics to synaptic responses, the quasi-IID decorrelation transformation of the spike train is plausibly implemented by downstream neurons through matched postsynaptic kinetics. This provides an explanation for observed coding performance in sensory systems that cannot be accounted for by rate coding, for example, at the detection threshold where rate changes can be insignificant.
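The core idea, that a correlated ISI sequence can be an invertible transform of an i.i.d. state sequence, can be caricatured with a linear moving-average model. This is purely illustrative: the paper treats nonlinear transforms and biophysical adaptation models, whereas here the "state" is Gaussian and the transform is linear with illustrative parameters.

```python
import random
import statistics

random.seed(7)

def scc(x, lag=1):
    """Serial correlation coefficient of a sequence at a given lag."""
    n = len(x) - lag
    m = statistics.fmean(x)
    var = statistics.fmean([(v - m) ** 2 for v in x])
    cov = sum((x[i] - m) * (x[i + lag] - m) for i in range(n)) / n
    return cov / var

# i.i.d. "state" sequence (a stand-in for the quasi-IID adaptation variable)
xi = [random.gauss(0.0, 1.0) for _ in range(50_000)]

# ISIs as a moving-average transform of the states: long/short intervals
# tend to be followed by short/long ones; the lag-1 SCC is analytically
# -theta / (1 + theta**2) = -0.4 for theta = 0.5.
mu, theta = 10.0, 0.5
isi = [mu + xi[k] - theta * xi[k - 1] for k in range(1, len(xi))]

# Inverting the transform recovers (quasi-)uncorrelated states: the
# "decorrelating transformation" that downstream neurons could implement
# through matched postsynaptic kinetics.
rec = [isi[0] - mu]
for t in isi[1:]:
    rec.append(t - mu + theta * rec[-1])
```

Here `scc(isi)` is close to the analytic value of -0.4, while `scc(rec)` is close to zero, mirroring the correlated-ISI / quasi-IID-state pairing described in the abstract.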
Affiliation(s)
- William H Nesse
- Department of Mathematics, University of Utah, Salt Lake City, UT 84112, U.S.A.
- Leonard Maler
- Department of Cellular and Molecular Medicine, University of Ottawa, Ottawa, ON K1H 8M5, Canada
- André Longtin
- Department of Physics, University of Ottawa, Ottawa, ON K1N 6N5, Canada
14
Muscinelli SP, Gerstner W, Schwalger T. How single neuron properties shape chaotic dynamics and signal transmission in random neural networks. PLoS Comput Biol 2019; 15:e1007122. [PMID: 31181063 PMCID: PMC6586367 DOI: 10.1371/journal.pcbi.1007122] [Citation(s) in RCA: 20] [Impact Index Per Article: 3.3] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/11/2019] [Revised: 06/20/2019] [Accepted: 05/22/2019] [Indexed: 02/07/2023] Open
Abstract
While most models of randomly connected neural networks assume single-neuron models with simple dynamics, neurons in the brain exhibit complex intrinsic dynamics over multiple timescales. We analyze how the dynamical properties of single neurons and recurrent connections interact to shape the effective dynamics in large randomly connected networks. A novel dynamical mean-field theory for strongly connected networks of multi-dimensional rate neurons shows that the power spectrum of the network activity in the chaotic phase emerges from a nonlinear sharpening of the frequency response function of single neurons. For the case of two-dimensional rate neurons with strong adaptation, we find that the network exhibits a state of "resonant chaos", characterized by robust, narrow-band stochastic oscillations. The coherence of stochastic oscillations is maximal at the onset of chaos and their correlation time scales with the adaptation timescale of single units. Surprisingly, the resonance frequency can be predicted from the properties of isolated neurons, even in the presence of heterogeneity in the adaptation parameters. In the presence of these internally-generated chaotic fluctuations, the transmission of weak, low-frequency signals is strongly enhanced by adaptation, whereas signal transmission is not influenced by adaptation in the non-chaotic regime. Our theoretical framework can be applied to other mechanisms at the level of single neurons, such as synaptic filtering, refractoriness or spike synchronization. These results advance our understanding of the interaction between the dynamics of single units and recurrent connectivity, which is a fundamental step toward the description of biologically realistic neural networks.
Affiliation(s)
- Samuel P. Muscinelli
- School of Computer and Communication Sciences and School of Life Sciences, École polytechnique fédérale de Lausanne, Station 15, CH-1015 Lausanne EPFL, Switzerland
- Wulfram Gerstner
- School of Computer and Communication Sciences and School of Life Sciences, École polytechnique fédérale de Lausanne, Station 15, CH-1015 Lausanne EPFL, Switzerland
- Tilo Schwalger
- Bernstein Center for Computational Neuroscience, 10115 Berlin, Germany
- Institut für Mathematik, Technische Universität Berlin, 10623 Berlin, Germany
15
Braun W, Longtin A. Interspike interval correlations in networks of inhibitory integrate-and-fire neurons. Phys Rev E 2019; 99:032402. [PMID: 30999498 DOI: 10.1103/physreve.99.032402] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/20/2018] [Indexed: 11/07/2022]
Abstract
We study temporal correlations of interspike intervals, quantified by the network-averaged serial correlation coefficient (SCC), in networks of both current- and conductance-based purely inhibitory integrate-and-fire neurons. Numerical simulations reveal transitions to negative SCCs at intermediate values of bias current drive and network size. As bias drive and network size are increased past these values, the SCC returns to zero. The SCC is maximally negative at an intermediate value of the network oscillation strength. The dependence of the SCC on two canonical schemes for synaptic connectivity is studied, and it is shown that the results occur robustly in both schemes. For conductance-based synapses, the SCC becomes negative at the onset of both a fast and slow coherent network oscillation. We then show by means of offline simulations using prerecorded network activity that a neuron's SCC is highly sensitive to its number of presynaptic inputs. Finally, we devise a noise-reduced diffusion approximation for current-based networks that accounts for the observed temporal correlation transitions.
Affiliation(s)
- Wilhelm Braun
- Neural Network Dynamics and Computation, Institut für Genetik, Universität Bonn, Kirschallee 1, 53115 Bonn, Germany
- Department of Physics and Centre for Neural Dynamics, University of Ottawa, 598 King Edward, Ottawa K1N 6N5, Canada
- André Longtin
- Department of Physics and Centre for Neural Dynamics, University of Ottawa, 598 King Edward, Ottawa K1N 6N5, Canada
16
Beiran M, Ostojic S. Contrasting the effects of adaptation and synaptic filtering on the timescales of dynamics in recurrent networks. PLoS Comput Biol 2019; 15:e1006893. [PMID: 30897092 PMCID: PMC6445477 DOI: 10.1371/journal.pcbi.1006893] [Citation(s) in RCA: 11] [Impact Index Per Article: 1.8] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/19/2018] [Revised: 04/02/2019] [Accepted: 02/19/2019] [Indexed: 11/19/2022] Open
Abstract
Neural activity in awake behaving animals exhibits a vast range of timescales that can be several fold larger than the membrane time constant of individual neurons. Two types of mechanisms have been proposed to explain this conundrum. One possibility is that large timescales are generated by a network mechanism based on positive feedback, but this hypothesis requires fine-tuning of the strength or structure of the synaptic connections. A second possibility is that large timescales in the neural dynamics are inherited from large timescales of underlying biophysical processes, two prominent candidates being intrinsic adaptive ionic currents and synaptic transmission. How the timescales of adaptation or synaptic transmission influence the timescale of the network dynamics has however not been fully explored. To address this question, here we analyze large networks of randomly connected excitatory and inhibitory units with additional degrees of freedom that correspond to adaptation or synaptic filtering. We determine the fixed points of the systems, their stability to perturbations and the corresponding dynamical timescales. Furthermore, we apply dynamical mean field theory to study the temporal statistics of the activity in the fluctuating regime, and examine how the adaptation and synaptic timescales transfer from individual units to the whole population. Our overarching finding is that synaptic filtering and adaptation in single neurons have very different effects at the network level. Unexpectedly, the macroscopic network dynamics do not inherit the large timescale present in adaptive currents. In contrast, the timescales of network activity increase proportionally to the time constant of the synaptic filter. 
Altogether, our study demonstrates that the timescales of different biophysical processes have different effects on the network level, so that the slow processes within individual neurons do not necessarily induce slow activity in large recurrent neural networks.
Affiliation(s)
- Manuel Beiran
- Group for Neural Theory, Laboratoire de Neurosciences Cognitives Computationnelles, Département d’Études Cognitives, École Normale Supérieure, INSERM U960, PSL University, Paris, France
- Srdjan Ostojic
- Group for Neural Theory, Laboratoire de Neurosciences Cognitives Computationnelles, Département d’Études Cognitives, École Normale Supérieure, INSERM U960, PSL University, Paris, France
17
Bird AD, Richardson MJE. Transmission of temporally correlated spike trains through synapses with short-term depression. PLoS Comput Biol 2018; 14:e1006232. [PMID: 29933363 PMCID: PMC6039054 DOI: 10.1371/journal.pcbi.1006232] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.4] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/15/2017] [Revised: 07/10/2018] [Accepted: 05/24/2018] [Indexed: 11/18/2022] Open
Abstract
Short-term synaptic depression, caused by depletion of releasable neurotransmitter, modulates the strength of neuronal connections in a history-dependent manner. Quantifying the statistics of synaptic transmission requires stochastic models that link probabilistic neurotransmitter release with presynaptic spike-train statistics. Common approaches are to model the presynaptic spike train as either regular or a memory-less Poisson process: few analytical results are available that describe depressing synapses when the afferent spike train has more complex, temporally correlated statistics such as bursts. Here we present a series of analytical results—from vesicle release-site occupancy statistics, via neurotransmitter release, to the post-synaptic voltage mean and variance—for depressing synapses driven by correlated presynaptic spike trains. The class of presynaptic drive considered is that fully characterised by the inter-spike-interval distribution and encompasses a broad range of models used for neuronal circuit and network analyses, such as integrate-and-fire models with a complete post-spike reset and receiving sufficiently short-time correlated drive. We further demonstrate that the derived post-synaptic voltage mean and variance allow for a simple and accurate approximation of the firing rate of the post-synaptic neuron, using the exponential integrate-and-fire model as an example. These results extend the level of biological detail included in models of synaptic transmission and will allow for the incorporation of more complex and physiologically relevant firing patterns into future studies of neuronal networks.

Synapses between neurons transmit signals with strengths that vary with the history of their activity, over scales from milliseconds to decades. Short-term changes in synaptic strength modulate and sculpt ongoing neuronal activity, whereas long-term changes underpin memory formation. Here we focus on changes of strength over timescales of less than a second caused by transitory depletion of the neurotransmitters that carry signals across the synapse. Neurotransmitters are stored in small vesicles that release their contents, with a certain probability, when the presynaptic neuron is active. Once a vesicle has been used it is replenished after a variable delay. There is therefore a complex interaction between the pattern of incoming signals to the synapse and the probabilistic release and restocking of packaged neurotransmitter. Here we extend existing models to examine how correlated synaptic activity is transmitted through synapses and affects the voltage fluctuations and firing rate of the target neuron. Our results provide a framework that will allow for the inclusion of biophysically realistic synaptic behaviour in studies of neuronal circuits.
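The release-restock cycle described above can be sketched with a minimal stochastic simulation. The site count, release probability, and recovery time constant below are illustrative, and Poisson drive is used for simplicity, whereas the paper derives analytical statistics for general renewal drive.

```python
import math
import random

random.seed(3)

def binom(n, p):
    """Number of successes in n Bernoulli(p) trials (stdlib-only helper)."""
    return sum(1 for _ in range(n) if random.random() < p)

def mean_release(rate, n_sites=10, p_release=0.5, tau_rec=0.5, n_spikes=20_000):
    """Mean vesicles released per presynaptic spike under Poisson drive.

    Each release site is full or empty; a presynaptic spike releases each
    full site's vesicle with probability p_release; between spikes, empty
    sites refill with time constant tau_rec.
    """
    n_full, released = n_sites, 0
    for _ in range(n_spikes):
        isi = random.expovariate(rate)
        p_refill = 1.0 - math.exp(-isi / tau_rec)
        n_full += binom(n_sites - n_full, p_refill)  # recovery between spikes
        r = binom(n_full, p_release)                 # release on this spike
        n_full -= r
        released += r
    return released / n_spikes

low = mean_release(1.0)    # slow drive: sites mostly recovered between spikes
high = mean_release(20.0)  # fast drive: depleted pool, fewer vesicles per spike
```

The drop in mean release per spike at high input rates is the history-dependent depression whose full statistics (occupancy, release, and post-synaptic voltage moments) the paper treats analytically.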
Affiliation(s)
- Alex D. Bird
- Warwick Systems Biology Centre, University of Warwick, Coventry, United Kingdom
- Ernst Strüngmann Institute for Neuroscience, Max Planck Society, Frankfurt, Germany
- Frankfurt Institute for Advanced Studies, Frankfurt, Germany
- Magnus J. E. Richardson
- Warwick Mathematics Institute, University of Warwick, Coventry, United Kingdom
18
Braun W, Thul R, Longtin A. Evolution of moments and correlations in nonrenewal escape-time processes. Phys Rev E 2017; 95:052127. [PMID: 28618562 DOI: 10.1103/physreve.95.052127] [Citation(s) in RCA: 5] [Impact Index Per Article: 0.6] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/05/2016] [Indexed: 06/07/2023]
Abstract
The theoretical description of nonrenewal stochastic systems is a challenge. Analytical results are often not available or can be obtained only under strong conditions, limiting their applicability. Also, numerical results have mostly been obtained by ad hoc Monte Carlo simulations, which are usually computationally expensive when a high degree of accuracy is needed. To gain quantitative insight into these systems under general conditions, we here introduce a numerical iterated first-passage time approach based on solving the time-dependent Fokker-Planck equation (FPE) to describe the statistics of nonrenewal stochastic systems. We illustrate the approach using spike-triggered neuronal adaptation in the leaky and perfect integrate-and-fire model, respectively. The transition to stationarity of first-passage time moments and their sequential correlations occur on a nontrivial time scale that depends on all system parameters. Surprisingly this is so for both single exponential and scale-free power-law adaptation. The method works beyond the small noise and time-scale separation approximations. It shows excellent agreement with direct Monte Carlo simulations, which allow for the computation of transient and stationary distributions. We compare different methods to compute the evolution of the moments and serial correlation coefficients (SCCs) and discuss the challenge of reliably computing the SCCs, which we find to be very sensitive to numerical inaccuracies for both the leaky and perfect integrate-and-fire models. In conclusion, our methods provide a general picture of nonrenewal dynamics in a wide range of stochastic systems exhibiting short- and long-range correlations.
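The model class can be sampled directly by Monte Carlo, the method against which the paper benchmarks its Fokker-Planck approach. Below is a stdlib-only sketch of a leaky integrate-and-fire neuron with spike-triggered adaptation; all parameters are illustrative and not taken from the paper.

```python
import math
import random
import statistics

random.seed(11)

def isi_sequence(n_isi, mu=1.5, tau_a=2.0, delta_a=0.2, D=0.1, dt=0.005):
    """ISIs of a leaky IF neuron with spike-triggered adaptation.

    Euler-Maruyama integration of
        dv/dt = mu - v - a + sqrt(2 D) xi(t),   da/dt = -a / tau_a,
    with v -> 0 and a -> a + delta_a at each threshold crossing (v >= 1).
    Time is measured in units of the membrane time constant.
    """
    v, a, t, t_last = 0.0, 0.0, 0.0, 0.0
    sq = math.sqrt(2.0 * D * dt)
    isis = []
    while len(isis) < n_isi:
        v += (mu - v - a) * dt + sq * random.gauss(0.0, 1.0)
        a -= a / tau_a * dt
        t += dt
        if v >= 1.0:
            isis.append(t - t_last)
            t_last, v, a = t, 0.0, a + delta_a
    return isis

def scc1(x):
    """Lag-1 serial correlation coefficient."""
    m = statistics.fmean(x)
    var = statistics.fmean([(v - m) ** 2 for v in x])
    cov = sum((x[i] - m) * (x[i + 1] - m) for i in range(len(x) - 1)) / (len(x) - 1)
    return cov / var

isis = isi_sequence(5000)
# Spike-triggered adaptation induces negative serial ISI correlations,
# the nonrenewal structure whose transient build-up the paper analyzes.
```

Direct simulation like this is what becomes expensive when high accuracy is needed, motivating the iterated first-passage-time approach of the paper.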
Affiliation(s)
- Wilhelm Braun
- Department of Physics and Centre for Neural Dynamics, University of Ottawa, 598 King Edward, Ottawa K1N 6N5, Canada
- University of Ottawa Brain and Mind Research Institute, University of Ottawa, 451 Smyth Road, Ottawa, ON K1H 8M5, Canada
- Rüdiger Thul
- Centre for Mathematical Medicine and Biology, School of Mathematical Sciences, University of Nottingham, Nottingham NG7 2RD, United Kingdom
- André Longtin
- Department of Physics and Centre for Neural Dynamics, University of Ottawa, 598 King Edward, Ottawa K1N 6N5, Canada
- University of Ottawa Brain and Mind Research Institute, University of Ottawa, 451 Smyth Road, Ottawa, ON K1H 8M5, Canada
19
Schwalger T, Deger M, Gerstner W. Towards a theory of cortical columns: From spiking neurons to interacting neural populations of finite size. PLoS Comput Biol 2017; 13:e1005507. [PMID: 28422957 PMCID: PMC5415267 DOI: 10.1371/journal.pcbi.1005507] [Citation(s) in RCA: 72] [Impact Index Per Article: 9.0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/04/2016] [Revised: 05/03/2017] [Accepted: 04/07/2017] [Indexed: 11/22/2022] Open
Abstract
Neural population equations such as neural mass or field models are widely used to study brain activity on a large scale. However, the relation of these models to the properties of single neurons is unclear. Here we derive an equation for several interacting populations at the mesoscopic scale starting from a microscopic model of randomly connected generalized integrate-and-fire neuron models. Each population consists of 50–2000 neurons of the same type but different populations account for different neuron types. The stochastic population equations that we find reveal how spike-history effects in single-neuron dynamics such as refractoriness and adaptation interact with finite-size fluctuations on the population level. Efficient integration of the stochastic mesoscopic equations reproduces the statistical behavior of the population activities obtained from microscopic simulations of a full spiking neural network model. The theory describes nonlinear emergent dynamics such as finite-size-induced stochastic transitions in multistable networks and synchronization in balanced networks of excitatory and inhibitory neurons. The mesoscopic equations are employed to rapidly integrate a model of a cortical microcircuit consisting of eight neuron types, which allows us to predict spontaneous population activities as well as evoked responses to thalamic input. Our theory establishes a general framework for modeling finite-size neural population dynamics based on single cell and synapse parameters and offers an efficient approach to analyzing cortical circuits and computations.

Understanding the brain requires mathematical models on different spatial scales. On the “microscopic” level of nerve cells, neural spike trains can be well predicted by phenomenological spiking neuron models. On a coarse scale, neural activity can be modeled by phenomenological equations that summarize the total activity of many thousands of neurons. Such population models are widely used to model neuroimaging data such as EEG, MEG or fMRI data. However, it is largely unknown how large-scale models are connected to an underlying microscale model. Linking the scales is vital for a correct description of rapid changes and fluctuations of the population activity, and is crucial for multiscale brain models. The challenge is to treat realistic spiking dynamics as well as fluctuations arising from the finite number of neurons. We obtained such a link by deriving stochastic population equations on the mesoscopic scale of 100–1000 neurons from an underlying microscopic model. These equations can be efficiently integrated and reproduce results of a microscopic simulation while achieving a high speed-up factor. We expect that our novel population theory on the mesoscopic scale will be instrumental for understanding experimental data on information processing in the brain, and ultimately link microscopic and macroscopic activity patterns.
20
Relationship in Pacemaker Neurons Between the Long-Term Correlations of Membrane Voltage Fluctuations and the Corresponding Duration of the Inter-Spike Interval. J Membr Biol 2017; 250:249-257. [PMID: 28417145 DOI: 10.1007/s00232-017-9956-z] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/06/2016] [Accepted: 04/07/2017] [Indexed: 10/19/2022]
Abstract
Several studies have examined the behavior of voltage and frequency fluctuations in neural electrical activity. Here, we explored the association between the behavior of the voltage fluctuations in the inter-spike segment (VFIS) and the inter-spike intervals (ISI) of F1 pacemaker neurons from H. aspersa, by disturbing intracellular calcium handling with cadmium and caffeine. The scaling exponent α of the VFIS, as provided by detrended fluctuation analysis, was evaluated in conjunction with the corresponding duration of the ISI to estimate the determination coefficient R² (48-50 intervals per neuron, N = 5). The time-varying scaling exponent α(t) of the VFIS was also studied (20 segments per neuron, N = 11). The R² obtained in control conditions was 0.683 ([0.647 0.776] lower and upper quartiles), 0.405 [0.381 0.495] with cadmium, and 0.151 [0.118 0.222] with caffeine (P < 0.05). A non-uniform scaling exponent α(t) showing a profile throughout the duration of the VFIS was further identified. A significant reduction of long-term correlations by cadmium was confirmed in the first part of this profile (P = 0.0001), but no significant reduction was detected with caffeine. Our findings support the idea that the behavior of the VFIS is associated with the activation of different populations of ionic channels, which establish the neural membrane potential and are mediated by intracellular calcium handling. Thus, the behavior of the VFIS, as quantified by the scaling exponent α, conveys insight into mechanisms regulating the excitability of pacemaker neurons.
21
Messer M, Costa KM, Roeper J, Schneider G. Multi-scale detection of rate changes in spike trains with weak dependencies. J Comput Neurosci 2016; 42:187-201. [PMID: 28025784 DOI: 10.1007/s10827-016-0635-3] [Citation(s) in RCA: 6] [Impact Index Per Article: 0.7] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/17/2016] [Revised: 11/23/2016] [Accepted: 12/07/2016] [Indexed: 11/28/2022]
Abstract
The statistical analysis of neuronal spike trains by models of point processes often relies on the assumption of constant process parameters. However, it is a well-known problem that the parameters of empirical spike trains can be highly variable, such as for example the firing rate. In order to test the null hypothesis of a constant rate and to estimate the change points, a Multiple Filter Test (MFT) and a corresponding algorithm (MFA) have been proposed that can be applied under the assumption of independent inter spike intervals (ISIs). As empirical spike trains often show weak dependencies in the correlation structure of ISIs, we extend the MFT here to point processes associated with short range dependencies. By specifically estimating serial dependencies in the test statistic, we show that the new MFT can be applied to a variety of empirical firing patterns, including positive and negative serial correlations as well as tonic and bursty firing. The new MFT is applied to a data set of empirical spike trains with serial correlations, and simulations show improved performance against methods that assume independence. In case of positive correlations, our new MFT is necessary to reduce the number of false positives, which can be highly enhanced when falsely assuming independence. For the frequent case of negative correlations, the new MFT shows an improved detection probability of change points and thus, also a higher potential of signal extraction from noisy spike trains.
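The filtered-derivative ingredient at the heart of such rate-change tests can be sketched with a single window size and a naive normalization. This is only a caricature: the actual MFT combines multiple window sizes and, in the extension above, corrects the variance estimate for serial ISI dependence; all rates and windows below are illustrative.

```python
import random

random.seed(5)

def poisson_train(rate, t0, t1):
    """Homogeneous Poisson spike times on [t0, t1)."""
    t, spikes = t0, []
    while True:
        t += random.expovariate(rate)
        if t >= t1:
            return spikes
        spikes.append(t)

# Spike train with a rate change at t = 50: 5 Hz before, 10 Hz after
spikes = poisson_train(5.0, 0.0, 50.0) + poisson_train(10.0, 50.0, 100.0)

def g_statistic(spikes, t, h):
    """Normalized count difference between adjacent windows of width h."""
    left = sum(1 for s in spikes if t - h <= s < t)
    right = sum(1 for s in spikes if t <= s < t + h)
    return (right - left) / max(left + right, 1) ** 0.5

h = 10.0
scan = [(g_statistic(spikes, float(t), h), t) for t in range(10, 91)]
g_max, t_hat = max(scan)
# The argmax of the scanned statistic estimates the change point near t = 50;
# a threshold on g_max gives the test of the constant-rate null hypothesis.
```

Under positive ISI correlations, the Poisson-style normalization used here underestimates the count variance, which is exactly the false-positive inflation the extended MFT corrects.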
Affiliation(s)
- Michael Messer
- Institute of Mathematics, Johann Wolfgang Goethe University Frankfurt, Frankfurt, Germany
- Kauê M Costa
- Institute of Neurophysiology, Johann Wolfgang Goethe University Frankfurt, Frankfurt, Germany
- Jochen Roeper
- Institute of Neurophysiology, Johann Wolfgang Goethe University Frankfurt, Frankfurt, Germany
- Gaby Schneider
- Institute of Mathematics, Johann Wolfgang Goethe University Frankfurt, Frankfurt, Germany
22
Meier SR, Lancaster JL, Fetterhoff D, Kraft RA, Hampson RE, Starobin JM. The relationship between Nernst equilibrium variability and the multifractality of interspike intervals in the hippocampus. J Comput Neurosci 2016; 42:167-175. [PMID: 27909842 DOI: 10.1007/s10827-016-0633-5] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/18/2016] [Revised: 11/14/2016] [Accepted: 11/21/2016] [Indexed: 11/26/2022]
Abstract
Spatiotemporal patterns of action potentials are considered to be closely related to information processing in the brain. Auto-generating neurons contributing to these processing tasks are known to cause multifractal behavior in the inter-spike intervals of the output action potentials. In this paper we define a novel relationship between this multifractality and the adaptive Nernst equilibrium in hippocampal neurons. Using this relationship we are able to differentiate between various drugs at varying dosages. Conventional methods limit their ability to account for cellular charge depletion by not including these adaptive Nernst equilibria. Our results provide a new theoretical approach for measuring the effects which drugs have on single-cell dynamics.
Affiliation(s)
- Stephen R Meier
- Department of Applied Mathematics and Statistics, State University of New York, Stony Brook, NY, 11794, USA.
- Dustin Fetterhoff
- Department of Biology II, Ludwig Maximilian University of Munich, Munich, Germany
- Robert A Kraft
- Department of Biomedical Engineering, Wake Forest School of Medicine, Winston-Salem, NC, 27109, USA
- Robert E Hampson
- Department of Physiology & Pharmacology, Wake Forest School of Medicine, Winston-Salem, NC, 27109, USA
- Joseph M Starobin
- Department of Nanoscience, The University of North Carolina, Greensboro, NC, 27401, USA
23
Norman SE, Butera RJ, Canavier CC. Stochastic slowly adapting ionic currents may provide a decorrelation mechanism for neural oscillators by causing wander in the intrinsic period. J Neurophysiol 2016; 116:1189-98. [PMID: 27281746 DOI: 10.1152/jn.00193.2016] [Citation(s) in RCA: 8] [Impact Index Per Article: 0.9] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/03/2016] [Accepted: 06/01/2016] [Indexed: 11/22/2022] Open
Abstract
Oscillatory neurons integrate their synaptic inputs in fundamentally different ways than normally quiescent neurons. We show that the oscillation period of invertebrate endogenous pacemaker neurons wanders, producing random fluctuations in the interspike intervals (ISI) on a time scale of seconds to minutes, which decorrelates pairs of neurons in hybrid circuits constructed using the dynamic clamp. The autocorrelation of the ISI sequence remained high for many ISIs, but the autocorrelation of the ΔISI series had on average a single nonzero value, which was negative at a lag of one interval. We reproduced these results using a simple integrate and fire (IF) model with a stochastic population of channels carrying an adaptation current with a stochastic component that was integrated with a slow time scale, suggesting that a similar population of channels underlies the observed wander in the period. Using autoregressive integrated moving average (ARIMA) models, we found that a single integrator and a single moving average with a negative coefficient could simulate both the experimental data and the IF model. Feeding white noise into an integrator with a slow time constant is sufficient to produce the autocorrelation structure of the ISI series. Moreover, the moving average clearly accounted for the autocorrelation structure of the ΔISI series and is biophysically implemented in the IF model using slow stochastic adaptation. The observed autocorrelation structure may be a neural signature of slow stochastic adaptation, and wander generated in this manner may be a general mechanism for limiting episodes of synchronized activity in the nervous system.
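The reported autocorrelation signature — slowly decaying for the ISI series, a single negative lag-1 value for the ΔISI series — can be reproduced by a toy linear caricature of slow stochastic adaptation. All parameters are illustrative; the paper's integrate-and-fire and ARIMA models are richer than this AR(1)-plus-jitter sketch.

```python
import random
import statistics

random.seed(2)

def scc(x, lag):
    """Serial correlation coefficient of a sequence at a given lag."""
    n = len(x) - lag
    m = statistics.fmean(x)
    var = statistics.fmean([(v - m) ** 2 for v in x])
    return sum((x[i] - m) * (x[i + lag] - m) for i in range(n)) / n / var

# A slow AR(1) "adaptation" state makes the intrinsic period wander;
# fast noise adds independent jitter to every interval.
n, rho = 50_000, 0.99
s, isi = 0.0, []
for _ in range(n):
    s = rho * s + random.gauss(0.0, 0.5)           # slowly wandering state
    isi.append(10.0 + s + random.gauss(0.0, 1.0))  # ISI = period + jitter

d_isi = [b - a for a, b in zip(isi, isi[1:])]
# ISI autocorrelation stays high over many lags, while the differenced
# series behaves like a moving average: one negative value at lag 1,
# essentially zero beyond.
```

Differencing removes the slow wander and exposes the jitter, which is why the ΔISI series carries the moving-average signature that the paper links to slow stochastic adaptation.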
Affiliation(s)
- Sharon E Norman
- School of Electrical and Computer Engineering, Georgia Institute of Technology, Atlanta, Georgia; Bioengineering Graduate Program, Georgia Institute of Technology, Atlanta, Georgia
- Robert J Butera
- School of Electrical and Computer Engineering, Georgia Institute of Technology, Atlanta, Georgia; Wallace H. Coulter Department of Biomedical Engineering at Georgia Tech and Emory University, Atlanta, Georgia
- Carmen C Canavier
- Neuroscience Center of Excellence, Louisiana State University Health Sciences Center, New Orleans, Louisiana; Department of Cell Biology and Anatomy, Louisiana State University Health Sciences Center, New Orleans, Louisiana
24
Schwalger T, Lindner B. Analytical approach to an integrate-and-fire model with spike-triggered adaptation. Phys Rev E 2015; 92:062703. [PMID: 26764723 DOI: 10.1103/physreve.92.062703] [Citation(s) in RCA: 4] [Impact Index Per Article: 0.4] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Received: 05/31/2015] [Indexed: 06/05/2023]
Abstract
The calculation of the steady-state probability density for multidimensional stochastic systems that do not obey detailed balance is a difficult problem. Here we present the analytical derivation of the stationary joint and various marginal probability densities for a stochastic neuron model with adaptation current. Our approach assumes weak noise but is valid for arbitrary adaptation strength and time scale. The theory predicts several effects of adaptation on the statistics of the membrane potential of a tonically firing neuron: (i) a membrane potential distribution with a convex shape, (ii) a strongly increased probability of hyperpolarized membrane potentials induced by strong and fast adaptation, and (iii) a maximized variability associated with the adaptation current at a finite adaptation time scale.
Affiliation(s)
- Tilo Schwalger
- Brain Mind Institute, École Polytechnique Fédérale de Lausanne (EPFL), Station 15, CH-1015 Lausanne, Switzerland
- Bernstein Center for Computational Neuroscience, Haus 2, Philippstraße 13, 10115 Berlin, Germany
- Benjamin Lindner
- Bernstein Center for Computational Neuroscience, Haus 2, Philippstraße 13, 10115 Berlin, Germany
- Department of Physics, Humboldt Universität zu Berlin, Newtonstraße 15, 12489 Berlin, Germany
25
Statistical structure of neural spiking under non-Poissonian or other non-white stimulation. J Comput Neurosci 2015; 39:29-51. [PMID: 25936628] [DOI: 10.1007/s10827-015-0560-x]
Abstract
Nerve cells in the brain generate sequences of action potentials with complex statistics. Theoretical attempts to understand these statistics have largely been limited to the case of temporally uncorrelated input (Poissonian shot noise) from the neurons in the surrounding network. However, the stimulation from thousands of other neurons has various sorts of temporal structure. First, input spike trains are temporally correlated because their firing rates can carry complex signals and because of cell-intrinsic properties like neural refractoriness, bursting, or adaptation. Second, at the connections between neurons, the synapses, usage-dependent changes in synaptic weight (short-term plasticity) further shape the correlation structure of the effective input to the cell. From the theoretical side, it is poorly understood how these correlated stimuli, so-called colored noise, affect spike train statistics. In particular, no standard method exists to solve the associated first-passage-time problem for the interspike-interval statistics with arbitrarily colored noise. Assuming that input fluctuations are weaker than the mean neuronal drive, we derive simple formulas for the essential interspike-interval statistics for a canonical model of a tonically firing neuron subjected to arbitrarily correlated input from the network. We verify our theory by numerical simulations for three paradigmatic situations that lead to input correlations: (i) rate-coded naturalistic stimuli in presynaptic spike trains; (ii) presynaptic refractoriness or bursting; (iii) synaptic short-term plasticity. In all cases, we find severe effects on interval statistics. Our results provide a framework for the interpretation of firing statistics measured in vivo in the brain.
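A toy version of the colored-noise problem discussed in this abstract: the sketch below (my construction, with hypothetical parameters) drives a perfect integrate-and-fire neuron with weak Ornstein-Uhlenbeck input whose correlation time far exceeds the mean ISI. The slowly varying drive induces strong positive serial correlations between interspike intervals, which would be absent for white-noise input.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical parameters: weak, slow OU input to a perfect IF neuron
dt = 0.05     # time step (ms)
tau_c = 50.0  # input correlation time (ms), >> mean ISI
mu = 0.2      # mean drive; threshold 1, reset 0 -> mean ISI ~ 5 ms
sigma = 0.05  # stationary std of the OU input (weaker than mu)

v = eta = 0.0
isis, t, last = [], 0.0, 0.0
while len(isis) < 10000:
    eta += -dt * eta / tau_c + sigma * np.sqrt(2 * dt / tau_c) * rng.normal()
    v += dt * (mu + eta)          # perfect integrator
    t += dt
    if v >= 1.0:
        v -= 1.0
        isis.append(t - last)
        last = t

x = np.array(isis) - np.mean(isis)
rho1 = float((x[:-1] * x[1:]).mean() / x.var())  # lag-1 serial correlation
print(rho1)   # positive: the slow input correlates successive intervals
```

Because the input barely changes over one interval (`tau_c` spans roughly ten mean ISIs here), successive intervals sample nearly the same drive and are strongly positively correlated, one of the paradigmatic effects the paper's theory captures.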
26
Shiau L, Schwalger T, Lindner B. Interspike interval correlation in a stochastic exponential integrate-and-fire model with subthreshold and spike-triggered adaptation. J Comput Neurosci 2015; 38:589-600. [DOI: 10.1007/s10827-015-0558-4]
27
Phase-resetting as a tool of information transmission. Curr Opin Neurobiol 2014; 31:206-13. [PMID: 25529003] [DOI: 10.1016/j.conb.2014.12.003]
Abstract
Models of information transmission in the brain largely rely on firing rate codes. The abundance of oscillatory activity in the brain suggests that information may be also encoded using the phases of ongoing oscillations. Sensory perception, working memory and spatial navigation have been hypothesized to use phase codes, and cross-frequency coordination and phase synchronization between brain areas have been proposed to gate the flow of information. Phase codes generally require the phase of the oscillations to be reset at specific reference points for consistent coding, and coordination between oscillators requires favorable phase resetting characteristics. Recent evidence supports a role for neural oscillations in providing temporal reference windows that allow for correct parsing of phase-coded information.
28
Deger M, Schwalger T, Naud R, Gerstner W. Fluctuations and information filtering in coupled populations of spiking neurons with adaptation. Phys Rev E Stat Nonlin Soft Matter Phys 2014; 90:062704. [PMID: 25615126] [DOI: 10.1103/physreve.90.062704]
Abstract
Finite-sized populations of spiking elements are fundamental to brain function and are also studied in many areas of physics. Here we present a theory of the dynamics of finite-sized populations of spiking units, based on a quasi-renewal description of neurons with adaptation. We derive an integral equation with colored noise that governs the stochastic dynamics of the population activity in response to time-dependent stimulation and calculate the spectral density in the asynchronous state. We show that systems of coupled populations with adaptation can generate a frequency band in which sensory information is preferentially encoded. The theory applies to fully as well as randomly connected networks, and to leaky integrate-and-fire as well as generalized spiking neurons with adaptation on multiple time scales.
Affiliation(s)
- Moritz Deger
- School of Computer and Communication Sciences and School of Life Sciences, Brain Mind Institute, École polytechnique fédérale de Lausanne, Station 15, 1015 Lausanne EPFL, Switzerland
- Tilo Schwalger
- School of Computer and Communication Sciences and School of Life Sciences, Brain Mind Institute, École polytechnique fédérale de Lausanne, Station 15, 1015 Lausanne EPFL, Switzerland
- Richard Naud
- Department of Physics, University of Ottawa, 150 Louis Pasteur, Ottawa, Ontario, K1N 6N5 Canada
- Wulfram Gerstner
- School of Computer and Communication Sciences and School of Life Sciences, Brain Mind Institute, École polytechnique fédérale de Lausanne, Station 15, 1015 Lausanne EPFL, Switzerland
29
Rosenbaum R, Tchumatchenko T, Moreno-Bote R. Correlated neuronal activity and its relationship to coding, dynamics and network architecture. Front Comput Neurosci 2014; 8:102. [PMID: 25221504] [PMCID: PMC4145255] [DOI: 10.3389/fncom.2014.00102]
Affiliation(s)
- Robert Rosenbaum
- Department of Applied and Computational Mathematics and Statistics, University of Notre Dame, Notre Dame, IN, USA; Center for the Neural Basis of Cognition, Pittsburgh, PA, USA
- Tatjana Tchumatchenko
- Department Theory of Neural Dynamics, Max Planck Institute for Brain Research, Frankfurt am Main, Germany
- Rubén Moreno-Bote
- Research Unit, Parc Sanitari Sant Joan de Déu and Universitat de Barcelona, Barcelona, Spain; Centro de Investigación Biomédica en Red de Salud Mental (CIBERSAM), Barcelona, Spain
30
Kromer JA, Lindner B, Schimansky-Geier L. Event-triggered feedback in noise-driven phase oscillators. Phys Rev E Stat Nonlin Soft Matter Phys 2014; 89:032138. [PMID: 24730820] [DOI: 10.1103/physreve.89.032138]
Abstract
Using a stochastic nonlinear phase oscillator model, we study the effect of event-triggered feedback on the statistics of interevent intervals. Events are associated with the entering of a new cycle. The feedback is modeled by an instantaneous increase (positive feedback) or decrease (negative feedback) of the oscillator frequency whenever an event occurs, followed by an exponential decay on a slow time scale. In addition to the known excitable and oscillatory regimes, which are separated by a saddle-node-on-invariant-circle bifurcation, positive feedback can lead to bistable dynamics and a change of the system's excitability. The feedback also has a strong effect on noise-induced phenomena like coherence resonance or anticoherence resonance: both positive and negative feedback can lead to more regular output for particular noise strengths. Finally, we investigate serial correlations in the sequence of interevent intervals that arise from the additional slow dynamics. We derive approximations for the serial correlation coefficient and show that positive feedback results in extended positive interval correlations, whereas negative feedback yields short-ranging negative correlations. Investigating the interplay of feedback and the nonlinear phase dynamics close to the bifurcation, we find that correlations are most pronounced for optimal feedback strengths.
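The sign dependence this abstract reports can be reproduced in a few lines. The sketch below (my own minimal construction, with made-up parameters) implements a noisy phase oscillator whose frequency receives a kick of strength `eps` at each cycle completion, decaying with time constant `tau_f`.

```python
import numpy as np

def interval_corr(eps, n=3000, seed=0):
    """Lag-1 serial correlation of interevent intervals for a noisy phase
    oscillator with event-triggered frequency feedback of strength eps."""
    rng = np.random.default_rng(seed)
    dt, omega, tau_f, D = 0.02, 1.0, 50.0, 0.05
    theta = a = 0.0
    intervals, t, last = [], 0.0, 0.0
    while len(intervals) < n:
        theta += dt * (omega + a) + np.sqrt(2 * D * dt) * rng.normal()
        a += -dt * a / tau_f          # slow exponential decay of the feedback
        t += dt
        if theta >= 2 * np.pi:        # event: a new cycle begins
            theta -= 2 * np.pi
            a += eps                  # instantaneous frequency kick
            intervals.append(t - last)
            last = t
    x = np.array(intervals) - np.mean(intervals)
    return float((x[:-1] * x[1:]).mean() / x.var())

rho_pos = interval_corr(+0.04)   # positive feedback
rho_neg = interval_corr(-0.04)   # negative feedback
print(rho_pos, rho_neg)
```

After a long interval the feedback variable has decayed further: with `eps > 0` this slows the next cycle as well (positive interval correlation), while with `eps < 0` it speeds the next cycle up (negative correlation), matching the paper's qualitative result.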
Affiliation(s)
- Justus A Kromer
- Department of Physics, Humboldt-Universität zu Berlin, Newtonstrasse 15, 12489 Berlin, Germany
- Benjamin Lindner
- Department of Physics, Humboldt-Universität zu Berlin, Newtonstrasse 15, 12489 Berlin, Germany and Bernstein Center for Computational Neuroscience Berlin, Berlin, Germany
- Lutz Schimansky-Geier
- Department of Physics, Humboldt-Universität zu Berlin, Newtonstrasse 15, 12489 Berlin, Germany and Bernstein Center for Computational Neuroscience Berlin, Berlin, Germany