1.
Ranft J, Lindner B. Theory of the asynchronous state of structured rotator networks and its application to recurrent networks of excitatory and inhibitory units. Phys Rev E 2023; 107:044306. PMID: 37198857. DOI: 10.1103/physreve.107.044306.
Abstract
Recurrently coupled oscillators that are sufficiently heterogeneous and/or randomly coupled can show asynchronous activity in which there are no significant correlations among the units of the network. The asynchronous state can nevertheless exhibit rich temporal correlation statistics that are generally difficult to capture theoretically. For randomly coupled rotator networks, it is possible to derive differential equations that determine the autocorrelation functions of the network noise and of the single elements in the network. So far, the theory has been restricted to statistically homogeneous networks, making it difficult to apply this framework to real-world networks, which are structured with respect to the properties of the single units and their connectivity. A particularly striking case is that of neural networks, for which one has to distinguish between excitatory and inhibitory neurons, which drive their target neurons towards or away from the firing threshold. To take such network structures into account, here we extend the theory for rotator networks to the case of multiple populations. Specifically, we derive a system of differential equations that govern the self-consistent autocorrelation functions of the network fluctuations in the respective populations. We then apply this general theory to the special but important case of recurrent networks of excitatory and inhibitory units in the balanced case and compare our theory to numerical simulations. We inspect the effect of the network structure on the noise statistics by comparing our results to the case of an equivalent homogeneous network devoid of internal structure. Our results show that structured connectivity and heterogeneity of the oscillator type can either enhance or reduce the overall strength of the generated network noise and can shape its temporal correlations.
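As a rough illustration of the kind of system this theory addresses, the following sketch simulates a small network of randomly coupled rotators split into excitatory and inhibitory populations and estimates the autocorrelation of the recurrent drive (the "network noise"). All parameters, and the Kuramoto-type coupling via sin(θ), are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters (not taken from the paper)
N_E, N_I = 160, 40          # excitatory / inhibitory population sizes
N = N_E + N_I
p = 0.2                     # connection probability
g = 4.0                     # relative inhibitory strength
J = 1.0 / np.sqrt(p * N)    # synaptic scale

# Random E/I connectivity: columns 0..N_E-1 excitatory, the rest inhibitory
C = (rng.random((N, N)) < p).astype(float)
np.fill_diagonal(C, 0.0)
W = J * C
W[:, N_E:] *= -g            # inhibitory columns are negative

omega = rng.normal(1.0, 0.2, N)        # heterogeneous natural frequencies
theta = rng.uniform(0, 2 * np.pi, N)

dt, steps = 0.01, 4000
inputs = np.empty((steps, N))
for t in range(steps):
    drive = W @ np.sin(theta)          # recurrent "network noise" per unit
    inputs[t] = drive
    theta = theta + dt * (omega + drive)

# population-averaged autocorrelation function of the network noise
x = inputs - inputs.mean(axis=0)
lags = 200
acf = np.array([(x[: steps - k] * x[k:]).mean() for k in range(lags)])
print(acf[0] > 0, acf.shape)
```

The paper derives such autocorrelation functions self-consistently per population; the sketch only produces the empirical quantity that the theory is meant to predict.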
Affiliation(s)
- Jonas Ranft
- Institut de Biologie de l'ENS, Ecole Normale Supérieure, CNRS, Inserm, Université PSL, 46 rue d'Ulm, 75005 Paris, France
- Benjamin Lindner
- Bernstein Center for Computational Neuroscience, Berlin, Philippstraße 13, Haus 2, 10115 Berlin, Germany and Physics Department of Humboldt University Berlin, Newtonstraße 15, 12489 Berlin, Germany
2.
Kraemer KH, Hellmann F, Anvari M, Kurths J, Marwan N. Spike Spectra for Recurrences. Entropy (Basel) 2022; 24:1689. PMID: 36421545. PMCID: PMC9689348. DOI: 10.3390/e24111689.
Abstract
In recurrence analysis, the τ-recurrence rate encodes the periods of the cycles of the underlying high-dimensional time series. It thus plays a role similar to that of the autocorrelation function for scalar time series in encoding temporal correlations. However, its Fourier decomposition does not have a clean interpretation, so there is no satisfactory analogue to the power spectrum in recurrence analysis. We introduce a novel method to decompose the τ-recurrence rate using an over-complete basis of Dirac combs together with sparsity regularization. We show that this decomposition, the inter-spike spectrum, naturally provides an analogue to the power spectrum for recurrence analysis in the sense that it reveals the dominant periodicities of the underlying time series. We show that the inter-spike spectrum correctly identifies patterns and transitions in the underlying system in a wide variety of examples and is robust to measurement noise.
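The decomposition idea can be sketched in a few lines: build an over-complete dictionary of Dirac combs and find the comb that best matches a spiky curve. The single matching-pursuit step below is a crude stand-in for the sparsity-regularized fit used in the paper; the synthetic signal and all parameters are invented for illustration.

```python
import numpy as np

def dirac_comb(period, length):
    """Unit spikes at 0, period, 2*period, ..."""
    c = np.zeros(length)
    c[::period] = 1.0
    return c

# Synthetic stand-in for a tau-recurrence-rate curve with period 7
L = 210
rng = np.random.default_rng(1)
signal = dirac_comb(7, L) + 0.1 * rng.random(L)

# Over-complete dictionary: one comb per candidate period
periods = list(range(2, 31))
basis = np.array([dirac_comb(p, L) for p in periods])

# One matching-pursuit step as a cheap stand-in for the regularized fit:
# the best-matching (normalized) comb gives the dominant inter-spike period.
scores = basis @ signal / np.linalg.norm(basis, axis=1)
dominant_period = periods[int(np.argmax(scores))]
print(dominant_period)
```

The normalization by the comb norms is what prevents short periods (denser combs) from winning simply by covering more samples.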
Affiliation(s)
- K. Hauke Kraemer
- Potsdam Institute for Climate Impact Research (PIK), Member of the Leibniz Association, 14473 Potsdam, Germany
- Frank Hellmann
- Potsdam Institute for Climate Impact Research (PIK), Member of the Leibniz Association, 14473 Potsdam, Germany
- Mehrnaz Anvari
- Potsdam Institute for Climate Impact Research (PIK), Member of the Leibniz Association, 14473 Potsdam, Germany
- Jürgen Kurths
- Potsdam Institute for Climate Impact Research (PIK), Member of the Leibniz Association, 14473 Potsdam, Germany
- Institute of Physics and Astronomy, University of Potsdam, 14476 Potsdam, Germany
- Institute of Physics, Humboldt Universität zu Berlin, 12489 Berlin, Germany
- Norbert Marwan
- Potsdam Institute for Climate Impact Research (PIK), Member of the Leibniz Association, 14473 Potsdam, Germany
- Institute of Physics and Astronomy, University of Potsdam, 14476 Potsdam, Germany
- Institute of Geosciences, University of Potsdam, 14476 Potsdam, Germany
3.
Lindner B. Fluctuation-Dissipation Relations for Spiking Neurons. Phys Rev Lett 2022; 129:198101. PMID: 36399734. DOI: 10.1103/physrevlett.129.198101.
Abstract
Spontaneous fluctuations and stimulus response are essential features of neural functioning, but how they are connected is poorly understood. I derive fluctuation-dissipation relations (FDRs) between the spontaneous spike and voltage correlations and the firing-rate susceptibility for (i) the leaky integrate-and-fire (IF) model with white noise and (ii) an IF model with arbitrary voltage dependence, an adaptation current, and correlated noise. The FDRs can be used to derive thus far unknown statistics analytically [model (i)] or the otherwise inaccessible intrinsic noise statistics [model (ii)].
Affiliation(s)
- Benjamin Lindner
- Bernstein Center for Computational Neuroscience Berlin, Philippstraße 13, Haus 2, 10115 Berlin, Germany and Physics Department of Humboldt University Berlin, Newtonstraße 15, 12489 Berlin, Germany
4.
Ranft J, Lindner B. A self-consistent analytical theory for rotator networks under stochastic forcing: Effects of intrinsic noise and common input. Chaos 2022; 32:063131. PMID: 35778158. DOI: 10.1063/5.0096000.
Abstract
Despite the incredible complexity of our brains' neural networks, theoretical descriptions of neural dynamics have led to profound insights into possible network states and dynamics. It remains challenging to develop theories that apply to spiking networks and thus allow one to characterize the dynamic properties of biologically more realistic networks. Here, we build on recent work by van Meegen and Lindner, who have shown that "rotator networks," while considerably simpler than real spiking networks and, therefore, more amenable to mathematical analysis, still allow one to capture dynamical properties of networks of spiking neurons. This framework can be easily extended to the case where individual units receive uncorrelated stochastic input, which can be interpreted as intrinsic noise. However, the assumptions of the theory no longer apply when the input received by the single rotators is strongly correlated among units. As we show, in this case the network fluctuations become significantly non-Gaussian, which calls for a reworking of the theory. Using a cumulant expansion, we develop a self-consistent analytical theory that accounts for the observed non-Gaussian statistics. Our theory provides a starting point for further studies of more general network setups and information transmission properties of these networks.
Affiliation(s)
- Jonas Ranft
- Institut de Biologie de l'ENS, Ecole Normale Supérieure, CNRS, Inserm, Université PSL, 46 rue d'Ulm, 75005 Paris, France
- Benjamin Lindner
- Bernstein Center for Computational Neuroscience Berlin, Philippstraße 13, Haus 2, 10115 Berlin, Germany and Department of Physics, Humboldt University Berlin, Newtonstraße 15, 12489 Berlin, Germany
5.
Knoll G, Lindner B. Information transmission in recurrent networks: Consequences of network noise for synchronous and asynchronous signal encoding. Phys Rev E 2022; 105:044411. PMID: 35590546. DOI: 10.1103/physreve.105.044411.
Abstract
Information about natural time-dependent stimuli encoded by the sensory periphery or communication between cortical networks may span a large frequency range or be localized to a smaller frequency band. Biological systems have been shown to multiplex such disparate broadband and narrow-band signals and then discriminate them in later populations by employing either an integration (low-pass) or coincidence detection (bandpass) encoding strategy. Analytical expressions have been developed for both encoding methods in feedforward populations of uncoupled neurons and confirm that the integration of a population's output low-pass filters the information, whereas synchronous output encodes less information overall and retains signal information in a selected frequency band. The present study extends the theory to recurrent networks and shows that recurrence may sharpen the synchronous bandpass filter. The frequency of the pass band is significantly influenced by the synaptic strengths, especially for inhibition-dominated networks. Synchronous information transfer is also increased when network models take into account heterogeneity that arises from the stochastic distribution of the synaptic weights.
Affiliation(s)
- Gregory Knoll
- Bernstein Center for Computational Neuroscience Berlin, Philippstr. 13, Haus 2, 10115 Berlin, Germany and Physics Department of Humboldt University Berlin, Newtonstr. 15, 12489 Berlin, Germany
- Benjamin Lindner
- Bernstein Center for Computational Neuroscience Berlin, Philippstr. 13, Haus 2, 10115 Berlin, Germany and Physics Department of Humboldt University Berlin, Newtonstr. 15, 12489 Berlin, Germany
6.
Ramlow L, Lindner B. Interspike interval correlations in neuron models with adaptation and correlated noise. PLoS Comput Biol 2021; 17:e1009261. PMID: 34449771. PMCID: PMC8428727. DOI: 10.1371/journal.pcbi.1009261.
Abstract
The generation of neural action potentials (spikes) is random but may nevertheless result in a rich statistical structure of the spike sequence. In particular, contrary to the popular renewal assumption of theoreticians, the intervals between adjacent spikes are often correlated. Experimentally, different patterns of interspike-interval correlations have been observed, and computational studies have identified spike-frequency adaptation and correlated noise as the two main mechanisms that can lead to such correlations. Analytical studies have focused on the single cases of either correlated (colored) noise or adaptation currents in combination with uncorrelated (white) noise. For low-pass filtered noise or adaptation, the serial correlation coefficient can be approximated as a single geometric sequence of the lag between the intervals, providing an explanation for some of the experimentally observed patterns. Here we address the problem of interval correlations for a widely used class of models, multidimensional integrate-and-fire neurons subject to a combination of colored and white noise sources and a spike-triggered adaptation current. Assuming weak noise, we derive a simple formula for the serial correlation coefficient, a sum of two geometric sequences, which accounts for a large class of correlation patterns. The theory is confirmed by means of numerical simulations in a number of special cases, including the leaky, quadratic, and generalized integrate-and-fire models with colored noise and spike-frequency adaptation. Furthermore, we study the case in which the adaptation current and the colored noise share the same time scale, corresponding to a slow stochastic population of adaptation channels; we demonstrate that our theory can account for a nonmonotonic dependence of the correlation coefficient on the channels' time scale. Another application of the theory is a neuron driven by network-noise-like fluctuations (green noise). We also discuss the range of validity of our weak-noise theory and show that by changing the relative strength of white and colored noise sources, we can change the sign of the correlation coefficient. Finally, we apply our theory to a conductance-based model, which demonstrates its broad applicability.

The elementary processing units in the central nervous system are neurons that transmit information by short electrical pulses, so-called action potentials or spikes. The generation of the action potential is a random process that can be shaped by correlated fluctuations (colored noise) and by adaptation. A consequence of these two ubiquitous features is that the successive time intervals between spikes, the interspike intervals, are not independent but correlated. As these correlations can significantly improve information transmission and weak-signal detection, it is an important task to develop analytical approaches to these statistics for well-established computational models. Here we present a theory of interval correlations for a widely used class of integrate-and-fire models endowed with an adaptation mechanism and subject to correlated fluctuations. We demonstrate which patterns of interval correlations can be expected from the interplay of colored noise, adaptation, and intrinsic nonlinear dynamics.
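The paper's analytical formula is not reproduced here, but the quantity it predicts, the serial correlation coefficient (SCC) at lag k, is straightforward to estimate numerically. The sketch below generates surrogate interspike intervals with a slow AR(1) component (a crude stand-in for colored noise or adaptation) and shows the roughly geometric decay of the SCC; all parameters are illustrative.

```python
import numpy as np

rng = np.random.default_rng(2)

# Surrogate ISI sequence: a slow AR(1) component mimics colored noise;
# the SCC rho_k then decays roughly geometrically as r**k.
n, r = 100000, 0.8
slow = np.empty(n)
slow[0] = 0.0
for i in range(1, n):
    slow[i] = r * slow[i - 1] + rng.normal()
isis = 1.0 + 0.2 * slow + 0.2 * rng.normal(size=n)   # mean ISI ~ 1

def scc(x, k):
    """Serial correlation coefficient of the sequence x at lag k."""
    x = x - x.mean()
    return (x[:-k] * x[k:]).mean() / x.var()

rho = np.array([scc(isis, k) for k in range(1, 6)])
ratios = rho[1:] / rho[:-1]   # ~ r for a single geometric sequence
print(np.round(rho, 2))
```

For a single noise source the ratio of successive coefficients is close to the AR parameter r; the paper's result, a sum of two geometric sequences, would show up here as a lag-dependent ratio when a second correlated source is added.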
Affiliation(s)
- Lukas Ramlow
- Bernstein Center for Computational Neuroscience Berlin, Berlin, Germany
- Physics Department, Humboldt University zu Berlin, Berlin, Germany
- Benjamin Lindner
- Bernstein Center for Computational Neuroscience Berlin, Berlin, Germany
- Physics Department, Humboldt University zu Berlin, Berlin, Germany
7.
Knoll G, Lindner B. Recurrence-mediated suprathreshold stochastic resonance. J Comput Neurosci 2021; 49:407-418. PMID: 34003421. PMCID: PMC8556192. DOI: 10.1007/s10827-021-00788-3.
Abstract
It has previously been shown that the encoding of time-dependent signals by feedforward networks (FFNs) of processing units exhibits suprathreshold stochastic resonance (SSR), i.e., optimal signal transmission at a finite level of independent, individual stochasticity in the single units. In this study, a recurrent spiking network is simulated to demonstrate that SSR can also be caused by network noise in place of intrinsic noise. The level of autonomously generated fluctuations in the network can be controlled by the strength of synapses, and hence the coding fraction (our measure of information transmission) exhibits a maximum as a function of the synaptic coupling strength. The presence of a coding peak at an optimal coupling strength is robust over a wide range of individual, network, and signal parameters, although the optimal strength and peak magnitude depend on the parameter being varied. We also perform control experiments with an FFN, illustrating that the optimized coding fraction is due to the change in noise level and not to other effects entailed by changing the coupling strength. These results also indicate that the non-white (temporally correlated) network noise in general provides an extra boost to encoding performance compared to the FFN driven by intrinsic white-noise fluctuations.
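The coding fraction used above compares a signal with its best linear reconstruction from the population output. A minimal feedforward sketch, with simple noisy threshold units rather than the spiking model of the paper and entirely illustrative parameters:

```python
import numpy as np

rng = np.random.default_rng(3)

# Slow Gaussian signal encoded by M noisy threshold ("spiking") units
T, M, noise_std, threshold = 5000, 50, 1.0, 0.0
signal = np.convolve(rng.normal(size=T + 200), np.ones(200) / 200, "valid")[:T]
signal = (signal - signal.mean()) / signal.std()     # zero mean, unit std

# Each unit emits 1 when signal + private noise crosses its threshold
spikes = signal[None, :] + noise_std * rng.normal(size=(M, T)) > threshold
pop = spikes.mean(axis=0)                            # population activity

# Best affine readout of the signal from the population activity
a, b = np.polyfit(pop, signal, 1)
recon = a * pop + b

# Coding fraction: 1 - rms reconstruction error / signal std
cf = 1.0 - np.sqrt(np.mean((signal - recon) ** 2)) / signal.std()
print(round(cf, 2))
```

In the paper the fluctuations are generated by the recurrence itself rather than injected per unit; here the per-unit noise level would play the role of the quantity tuned by the synaptic coupling strength.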
Affiliation(s)
- Gregory Knoll
- Bernstein Center for Computational Neuroscience Berlin, Philippstr. 13, Haus 2, 10115 Berlin, Germany
- Physics Department of Humboldt University Berlin, Newtonstr. 15, 12489 Berlin, Germany
- Benjamin Lindner
- Bernstein Center for Computational Neuroscience Berlin, Philippstr. 13, Haus 2, 10115 Berlin, Germany
- Physics Department of Humboldt University Berlin, Newtonstr. 15, 12489 Berlin, Germany
8.
On the structural connectivity of large-scale models of brain networks at cellular level. Sci Rep 2021; 11:4345. PMID: 33623053. PMCID: PMC7902637. DOI: 10.1038/s41598-021-83759-z.
Abstract
The brain’s structural connectivity plays a fundamental role in determining how neuron networks generate, process, and transfer information within and between brain regions. The underlying mechanisms are extremely difficult to study experimentally and, in many cases, large-scale model networks are of great help. However, the implementation of these models relies on experimental findings that are often sparse and limited. Their predicting power ultimately depends on how closely a model’s connectivity represents the real system. Here we argue that the data-driven probabilistic rules, widely used to build neuronal network models, may not be appropriate to represent the dynamics of the corresponding biological system. To solve this problem, we propose to use a new mathematical framework able to use sparse and limited experimental data to quantitatively reproduce the structural connectivity of biological brain networks at cellular level.
9.
Abstract
Power spectra of spike trains reveal important properties of neuronal behavior. They exhibit several peaks, whose shape and position depend on applied stimuli and intrinsic biophysical properties, such as input current density and channel noise. The position of the spectral peaks in the frequency domain is not straightforwardly predictable from statistical averages of the interspike intervals, especially when stochastic behavior prevails. In this work, we provide a model for the neuronal power spectrum, obtained from Discrete Fourier Transform and expressed as a series of expected value of sinusoidal terms. The first term of the series allows us to estimate the frequencies of the spectral peaks to a maximum error of a few Hz, and to interpret why they are not harmonics of the first peak frequency. Thus, the simple expression of the proposed power spectral density (PSD) model makes it a powerful interpretative tool of PSD shape, and also useful for neurophysiological studies aimed at extracting information on neuronal behavior from spike train spectra.
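A minimal numerical companion: estimate the power spectrum of a jittered periodic spike train from its DFT and read off the first spectral peak. Sampling rate, firing rate, and jitter are invented for illustration and are not the model of the paper.

```python
import numpy as np

rng = np.random.default_rng(4)

# Binary spike train with jittered periodic firing at ~50 Hz
fs, T, rate = 1000.0, 20.0, 50.0            # sample rate (Hz), duration (s)
n = int(fs * T)
isis = rng.normal(1.0 / rate, 0.002, int(rate * T) + 100)   # jittered ISIs
spike_times = np.cumsum(isis)
spike_times = spike_times[spike_times < T]
train = np.zeros(n)
train[(spike_times * fs).astype(int)] = fs  # delta spikes as 1/dt pulses

# Power spectral density from the DFT of the zero-mean spike train
x = train - train.mean()
X = np.fft.rfft(x)
psd = np.abs(X) ** 2 / (fs * n)
freqs = np.fft.rfftfreq(n, d=1.0 / fs)

peak = freqs[1 + np.argmax(psd[1:])]        # skip the DC bin
print(peak)                                  # first spectral peak (Hz)
```

For this nearly regular train the first peak sits close to the inverse mean interspike interval; the paper's point is precisely that for strongly stochastic firing the peak positions deviate from such simple interval averages.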
10.
Bernardi D, Lindner B. Receiver operating characteristic curves for a simple stochastic process that carries a static signal. Phys Rev E 2020; 101:062132. PMID: 32688497. DOI: 10.1103/physreve.101.062132.
Abstract
The detection of a weak signal in the presence of noise is an important problem, often studied in terms of the receiver operating characteristic (ROC) curve, in which the probability of correct detection is plotted against the probability of a false positive. This kind of analysis is typically applied to situations in which signal and noise are stochastic variables; the detection problem, however, also often arises in contexts in which both signal and noise are stochastic processes, and (correct or false) detection is said to take place when the process crosses a threshold within a given time window. Here we consider the problem for a static signal that has to be detected against a dynamic noise process, the well-known Ornstein-Uhlenbeck process. We give exact (but difficult to evaluate) quadrature expressions for the rates of false positives and correct detections, systematically investigate a simple sampling approximation suggested earlier, compare it to an approximation by Stratonovich for the limit of a high threshold, and briefly explore the case of a multiplicative signal; all theoretical results are compared to extensive numerical simulations of the corresponding Langevin equation. Our results demonstrate that the sampling approximation provides a reasonable description of the ROC curve for this system, and they clarify limiting cases of the ROC curve.
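The detection setting can be sketched numerically: simulate an Ornstein-Uhlenbeck process with and without a static mean shift, call a "detection" any threshold crossing within the observation window, and trace out points of the ROC curve. Parameter values are illustrative, not those of the paper.

```python
import numpy as np

rng = np.random.default_rng(5)

def crossing_prob(mu, theta, trials=2000, tau=1.0, D=0.5, dt=0.01, T=2.0):
    """Fraction of trials in which an OU process with mean mu crosses
    the threshold theta at least once within the window [0, T]."""
    n = int(T / dt)
    x = np.full(trials, mu)               # start in the mean state
    crossed = np.zeros(trials, dtype=bool)
    for _ in range(n):
        x += dt * (mu - x) / tau + np.sqrt(2 * D * dt) * rng.normal(size=trials)
        crossed |= x >= theta
    return crossed.mean()

# One ROC point per threshold: (false-positive rate, correct-detection rate);
# mu = 0.5 is the static signal riding on the OU noise.
roc = [(crossing_prob(0.0, th), crossing_prob(0.5, th)) for th in (1.0, 1.5, 2.0)]
for fp, tp in roc:
    print(round(fp, 2), round(tp, 2))
```

Sweeping the threshold densely would trace the full ROC curve; the paper's quadrature expressions and the sampling approximation target exactly these two crossing probabilities.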
Affiliation(s)
- Davide Bernardi
- Bernstein Center for Computational Neuroscience Berlin, 10115 Berlin, Germany and Physics Department of Humboldt University Berlin, 12489 Berlin, Germany
- Benjamin Lindner
- Bernstein Center for Computational Neuroscience Berlin, 10115 Berlin, Germany and Physics Department of Humboldt University Berlin, 12489 Berlin, Germany
11.
Ponzi A, Barton SJ, Bunner KD, Rangel-Barajas C, Zhang ES, Miller BR, Rebec GV, Kozloski J. Striatal network modeling in Huntington's Disease. PLoS Comput Biol 2020; 16:e1007648. PMID: 32302302. PMCID: PMC7197869. DOI: 10.1371/journal.pcbi.1007648.
Abstract
Medium spiny neurons (MSNs) comprise over 90% of cells in the striatum. In vivo MSNs display coherent burst firing cell assembly activity patterns, even though isolated MSNs do not burst fire intrinsically. This activity is important for the learning and execution of action sequences and is characteristically dysregulated in Huntington's Disease (HD). However, how dysregulation is caused by the various neural pathologies affecting MSNs in HD is unknown. Previous modeling work using simple cell models has shown that cell assembly activity patterns can emerge as a result of MSN inhibitory network interactions. Here, by directly estimating MSN network model parameters from single unit spiking data, we show that a network composed of much more physiologically detailed MSNs provides an excellent quantitative fit to wild type (WT) mouse spiking data, but only when network parameters are appropriate for the striatum. We find the WT MSN network is situated in a regime close to a transition from stable to strongly fluctuating network dynamics. This regime facilitates the generation of low-dimensional slowly varying coherent activity patterns and confers high sensitivity to variations in cortical driving. By re-estimating the model on HD spiking data we discover network parameter modifications are consistent across three very different types of HD mutant mouse models (YAC128, Q175, R6/2). In striking agreement with the known pathophysiology we find feedforward excitatory drive is reduced in HD compared to WT mice, while recurrent inhibition also shows phenotype dependency. We show that these modifications shift the HD MSN network to a sub-optimal regime where higher dimensional incoherent rapidly fluctuating activity predominates. Our results provide insight into a diverse range of experimental findings in HD, including cognitive and motor symptoms, and may suggest new avenues for treatment.
Affiliation(s)
- Adam Ponzi
- IBM Research, Computational Biology Center, Thomas J. Watson Research Laboratories, Yorktown Heights, New York, United States of America
- Scott J. Barton
- Program in Neuroscience, Department of Psychological and Brain Sciences, Indiana University, Bloomington, Indiana, United States of America
- Kendra D. Bunner
- Program in Neuroscience, Department of Psychological and Brain Sciences, Indiana University, Bloomington, Indiana, United States of America
- Claudia Rangel-Barajas
- Program in Neuroscience, Department of Psychological and Brain Sciences, Indiana University, Bloomington, Indiana, United States of America
- Emily S. Zhang
- Program in Neuroscience, Department of Psychological and Brain Sciences, Indiana University, Bloomington, Indiana, United States of America
- Benjamin R. Miller
- Program in Neuroscience, Department of Psychological and Brain Sciences, Indiana University, Bloomington, Indiana, United States of America
- George V. Rebec
- Program in Neuroscience, Department of Psychological and Brain Sciences, Indiana University, Bloomington, Indiana, United States of America
- James Kozloski
- IBM Research, Computational Biology Center, Thomas J. Watson Research Laboratories, Yorktown Heights, New York, United States of America
12.
Muscinelli SP, Gerstner W, Schwalger T. How single neuron properties shape chaotic dynamics and signal transmission in random neural networks. PLoS Comput Biol 2019; 15:e1007122. PMID: 31181063. PMCID: PMC6586367. DOI: 10.1371/journal.pcbi.1007122.
Abstract
While most models of randomly connected neural networks assume single-neuron models with simple dynamics, neurons in the brain exhibit complex intrinsic dynamics over multiple timescales. We analyze how the dynamical properties of single neurons and recurrent connections interact to shape the effective dynamics in large randomly connected networks. A novel dynamical mean-field theory for strongly connected networks of multi-dimensional rate neurons shows that the power spectrum of the network activity in the chaotic phase emerges from a nonlinear sharpening of the frequency response function of single neurons. For the case of two-dimensional rate neurons with strong adaptation, we find that the network exhibits a state of "resonant chaos", characterized by robust, narrow-band stochastic oscillations. The coherence of stochastic oscillations is maximal at the onset of chaos and their correlation time scales with the adaptation timescale of single units. Surprisingly, the resonance frequency can be predicted from the properties of isolated neurons, even in the presence of heterogeneity in the adaptation parameters. In the presence of these internally-generated chaotic fluctuations, the transmission of weak, low-frequency signals is strongly enhanced by adaptation, whereas signal transmission is not influenced by adaptation in the non-chaotic regime. Our theoretical framework can be applied to other mechanisms at the level of single neurons, such as synaptic filtering, refractoriness or spike synchronization. These results advance our understanding of the interaction between the dynamics of single units and recurrent connectivity, which is a fundamental step toward the description of biologically realistic neural networks.
Affiliation(s)
- Samuel P. Muscinelli
- School of Computer and Communication Sciences and School of Life Sciences, École polytechnique fédérale de Lausanne, Station 15, CH-1015 Lausanne EPFL, Switzerland
- Wulfram Gerstner
- School of Computer and Communication Sciences and School of Life Sciences, École polytechnique fédérale de Lausanne, Station 15, CH-1015 Lausanne EPFL, Switzerland
- Tilo Schwalger
- Bernstein Center for Computational Neuroscience, 10115 Berlin, Germany
- Institut für Mathematik, Technische Universität Berlin, 10623 Berlin, Germany
13.
Baker C, Ebsch C, Lampl I, Rosenbaum R. Correlated states in balanced neuronal networks. Phys Rev E 2019; 99:052414. PMID: 31212573. DOI: 10.1103/physreve.99.052414.
Abstract
Understanding the magnitude and structure of interneuronal correlations and their relationship to synaptic connectivity structure is an important and difficult problem in computational neuroscience. Early studies showed that neuronal network models with excitatory-inhibitory balance naturally create very weak spike train correlations, defining the "asynchronous state." Later work showed that, under some connectivity structures, balanced networks can produce larger correlations between some neuron pairs, even when the average correlation is very small. All of these previous studies assume that the local network receives feedforward synaptic input from a population of uncorrelated spike trains. We show that when the spike trains providing feedforward input are correlated, the downstream recurrent network produces much larger correlations. We provide an in-depth analysis of the resulting "correlated state" in balanced networks and show that, unlike the asynchronous state, it produces a tight excitatory-inhibitory balance consistent with in vivo cortical recordings.
Affiliation(s)
- Cody Baker
- Department of Applied and Computational Mathematics and Statistics, University of Notre Dame, Notre Dame, Indiana 46556, USA
- Christopher Ebsch
- Department of Applied and Computational Mathematics and Statistics, University of Notre Dame, Notre Dame, Indiana 46556, USA
- Ilan Lampl
- Department of Neurobiology, Weizmann Institute of Science, Rehovot 7610001, Israel
- Robert Rosenbaum
- Department of Applied and Computational Mathematics and Statistics, University of Notre Dame, Notre Dame, Indiana 46556, USA
- Interdisciplinary Center for Network Science and Applications, University of Notre Dame, Notre Dame, Indiana 46556, USA
14.
Braun W, Longtin A. Interspike interval correlations in networks of inhibitory integrate-and-fire neurons. Phys Rev E 2019; 99:032402. PMID: 30999498. DOI: 10.1103/physreve.99.032402.
Abstract
We study temporal correlations of interspike intervals, quantified by the network-averaged serial correlation coefficient (SCC), in networks of both current- and conductance-based purely inhibitory integrate-and-fire neurons. Numerical simulations reveal transitions to negative SCCs at intermediate values of bias current drive and network size. As bias drive and network size are increased past these values, the SCC returns to zero. The SCC is maximally negative at an intermediate value of the network oscillation strength. The dependence of the SCC on two canonical schemes for synaptic connectivity is studied, and it is shown that the results occur robustly in both schemes. For conductance-based synapses, the SCC becomes negative at the onset of both a fast and slow coherent network oscillation. We then show by means of offline simulations using prerecorded network activity that a neuron's SCC is highly sensitive to its number of presynaptic inputs. Finally, we devise a noise-reduced diffusion approximation for current-based networks that accounts for the observed temporal correlation transitions.
Affiliation(s)
- Wilhelm Braun
- Neural Network Dynamics and Computation, Institut für Genetik, Universität Bonn, Kirschallee 1, 53115 Bonn, Germany
- Department of Physics and Centre for Neural Dynamics, University of Ottawa, 598 King Edward, Ottawa K1N 6N5, Canada
- André Longtin
- Department of Physics and Centre for Neural Dynamics, University of Ottawa, 598 King Edward, Ottawa K1N 6N5, Canada
15
Beiran M, Ostojic S. Contrasting the effects of adaptation and synaptic filtering on the timescales of dynamics in recurrent networks. PLoS Comput Biol 2019; 15:e1006893. [PMID: 30897092 PMCID: PMC6445477 DOI: 10.1371/journal.pcbi.1006893] [Citation(s) in RCA: 11] [Impact Index Per Article: 2.2] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/19/2018] [Revised: 04/02/2019] [Accepted: 02/19/2019] [Indexed: 11/19/2022] Open
Abstract
Neural activity in awake behaving animals exhibits a vast range of timescales that can be several fold larger than the membrane time constant of individual neurons. Two types of mechanisms have been proposed to explain this conundrum. One possibility is that large timescales are generated by a network mechanism based on positive feedback, but this hypothesis requires fine-tuning of the strength or structure of the synaptic connections. A second possibility is that large timescales in the neural dynamics are inherited from large timescales of underlying biophysical processes, two prominent candidates being intrinsic adaptive ionic currents and synaptic transmission. How the timescales of adaptation or synaptic transmission influence the timescale of the network dynamics has however not been fully explored. To address this question, here we analyze large networks of randomly connected excitatory and inhibitory units with additional degrees of freedom that correspond to adaptation or synaptic filtering. We determine the fixed points of the systems, their stability to perturbations and the corresponding dynamical timescales. Furthermore, we apply dynamical mean field theory to study the temporal statistics of the activity in the fluctuating regime, and examine how the adaptation and synaptic timescales transfer from individual units to the whole population. Our overarching finding is that synaptic filtering and adaptation in single neurons have very different effects at the network level. Unexpectedly, the macroscopic network dynamics do not inherit the large timescale present in adaptive currents. In contrast, the timescales of network activity increase proportionally to the time constant of the synaptic filter. 
Altogether, our study demonstrates that the timescales of different biophysical processes have different effects on the network level, so that the slow processes within individual neurons do not necessarily induce slow activity in large recurrent neural networks.
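The paper's central contrast can be caricatured with two linear stochastic units: one driven through a synaptic filter, one with a negative-feedback adaptation variable. This sketch (illustrative linear dynamics and parameters, not the paper's full network model) estimates the autocorrelation decay time of each and shows that only the synaptic time constant is inherited:

```python
import numpy as np

def acf_decay_time(x, dt):
    """Lag at which the autocorrelation of x first drops below 1/e."""
    x = x - x.mean()
    c0 = np.dot(x, x)
    for lag in range(1, len(x)):
        if np.dot(x[:-lag], x[lag:]) / c0 < np.exp(-1):
            return lag * dt
    return len(x) * dt

rng = np.random.default_rng(8)
dt, n = 1e-3, 200000
tau_m, tau_s, tau_a, b = 0.02, 0.2, 0.2, 2.0
xi = np.sqrt(dt) * rng.standard_normal(n)

# (i) unit driven by noise passed through a synaptic filter (time constant tau_s)
x, s = 0.0, 0.0
xs = np.empty(n)
for i in range(n):
    s += -dt / tau_s * s + xi[i] / tau_s
    x += dt / tau_m * (-x + s)
    xs[i] = x

# (ii) unit with white-noise input and a slow negative adaptation feedback
x, a = 0.0, 0.0
xa = np.empty(n)
for i in range(n):
    a += dt / tau_a * (-a + b * x)
    x += dt / tau_m * (-x - a) + xi[i] / tau_m
    xa[i] = x

t_syn = acf_decay_time(xs, dt)
t_adapt = acf_decay_time(xa, dt)
print(t_syn, t_adapt)
```

Despite tau_s and tau_a being equal, the filtered unit's activity decorrelates on the slow synaptic timescale while the adaptive unit's activity decorrelates on roughly the membrane timescale, mirroring the paper's conclusion.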
Affiliation(s)
- Manuel Beiran
- Group for Neural Theory, Laboratoire de Neurosciences Cognitives Computationnelles, Département d’Études Cognitives, École Normale Supérieure, INSERM U960, PSL University, Paris, France
- Srdjan Ostojic
- Group for Neural Theory, Laboratoire de Neurosciences Cognitives Computationnelles, Département d’Études Cognitives, École Normale Supérieure, INSERM U960, PSL University, Paris, France
16
van Meegen A, Lindner B. Self-Consistent Correlations of Randomly Coupled Rotators in the Asynchronous State. Phys Rev Lett 2018; 121:258302. [PMID: 30608814 DOI: 10.1103/physrevlett.121.258302] [Citation(s) in RCA: 4] [Impact Index Per Article: 0.7] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 11/14/2017] [Revised: 10/09/2018] [Indexed: 06/09/2023]
Abstract
We study a network of unidirectionally coupled rotators with independent identically distributed (i.i.d.) frequencies and i.i.d. coupling coefficients. Similar to biological networks, this system can attain an asynchronous state with pronounced temporal autocorrelations of the rotators. We derive differential equations for the self-consistent autocorrelation function that can be solved analytically in limit cases. For more involved scenarios, its numerical solution is confirmed by simulations of networks with Gaussian or sparsely distributed coupling coefficients. The theory is finally generalized for pulse-coupled units and tested on a standard model of computational neuroscience, a recurrent network of sparsely coupled exponential integrate-and-fire neurons.
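A minimal Euler simulation in this spirit, with i.i.d. frequencies and i.i.d. Gaussian coupling coefficients; the sinusoidal coupling function and all parameters are illustrative assumptions rather than the paper's exact model:

```python
import numpy as np

# N rotators with i.i.d. natural frequencies and i.i.d. Gaussian couplings.
rng = np.random.default_rng(1)
N, dt, steps = 200, 0.01, 4000
omega = rng.normal(0.0, 1.0, N)                # i.i.d. frequencies
K = rng.normal(0.0, 1.0, (N, N)) / np.sqrt(N)  # i.i.d. coupling coefficients

theta = rng.uniform(0, 2 * np.pi, N)
traj = np.empty((steps, N))
for t in range(steps):
    theta = theta + dt * (omega + K @ np.sin(theta))  # Euler step
    traj[t] = np.sin(theta)

# Network-averaged autocorrelation of sin(theta), discarding a transient.
x = traj[1000:] - traj[1000:].mean(axis=0)
c0 = np.mean(np.sum(x * x, axis=0))
lags = [0, 50, 100]
acf = [np.mean(np.sum(x[:len(x) - l] * x[l:], axis=0)) / c0 for l in lags]
print(acf)
```

In the self-consistent theory such an empirical autocorrelation is what the derived differential equations are required to reproduce.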
Affiliation(s)
- Alexander van Meegen
- Bernstein Center for Computational Neuroscience Berlin, Philippstraße 13, Haus 2, 10115 Berlin, Germany and Physics Department of Humboldt University Berlin, Newtonstraße 15, 12489 Berlin, Germany
- Benjamin Lindner
- Bernstein Center for Computational Neuroscience Berlin, Philippstraße 13, Haus 2, 10115 Berlin, Germany and Physics Department of Humboldt University Berlin, Newtonstraße 15, 12489 Berlin, Germany
17
Ullner E, Politi A, Torcini A. Ubiquity of collective irregular dynamics in balanced networks of spiking neurons. Chaos 2018; 28:081106. [PMID: 30180628 DOI: 10.1063/1.5049902] [Citation(s) in RCA: 6] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 07/26/2018] [Accepted: 08/09/2018] [Indexed: 06/08/2023]
Abstract
We revisit the dynamics of a prototypical model of balanced activity in networks of spiking neurons. A detailed investigation of the thermodynamic limit for fixed density of connections (massive coupling) shows that, when inhibition prevails, the asymptotic regime is not asynchronous but rather characterized by a self-sustained irregular, macroscopic (collective) dynamics. So long as the connectivity is massive, this regime is found in many different setups: leaky as well as quadratic integrate-and-fire neurons; large and small coupling strength; and weak and strong external currents.
Affiliation(s)
- Ekkehard Ullner
- Institute for Complex Systems and Mathematical Biology and Department of Physics (SUPA), Old Aberdeen, Aberdeen AB24 3UE, United Kingdom
- Antonio Politi
- Institute for Complex Systems and Mathematical Biology and Department of Physics (SUPA), Old Aberdeen, Aberdeen AB24 3UE, United Kingdom
- Alessandro Torcini
- Max Planck Institut für Physik komplexer Systeme, Nöthnitzer Str. 38, 01187 Dresden, Germany
18
Bird AD, Richardson MJE. Transmission of temporally correlated spike trains through synapses with short-term depression. PLoS Comput Biol 2018; 14:e1006232. [PMID: 29933363 PMCID: PMC6039054 DOI: 10.1371/journal.pcbi.1006232] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/15/2017] [Revised: 07/10/2018] [Accepted: 05/24/2018] [Indexed: 11/18/2022] Open
Abstract
Short-term synaptic depression, caused by depletion of releasable neurotransmitter, modulates the strength of neuronal connections in a history-dependent manner. Quantifying the statistics of synaptic transmission requires stochastic models that link probabilistic neurotransmitter release with presynaptic spike-train statistics. Common approaches are to model the presynaptic spike train as either regular or a memory-less Poisson process: few analytical results are available that describe depressing synapses when the afferent spike train has more complex, temporally correlated statistics such as bursts. Here we present a series of analytical results—from vesicle release-site occupancy statistics, via neurotransmitter release, to the post-synaptic voltage mean and variance—for depressing synapses driven by correlated presynaptic spike trains. The class of presynaptic drive considered is that fully characterised by the inter-spike-interval distribution and encompasses a broad range of models used for neuronal circuit and network analyses, such as integrate-and-fire models with a complete post-spike reset and receiving sufficiently short-time correlated drive. We further demonstrate that the derived post-synaptic voltage mean and variance allow for a simple and accurate approximation of the firing rate of the post-synaptic neuron, using the exponential integrate-and-fire model as an example. These results extend the level of biological detail included in models of synaptic transmission and will allow for the incorporation of more complex and physiologically relevant firing patterns into future studies of neuronal networks. Synapses between neurons transmit signals with strengths that vary with the history of their activity, over scales from milliseconds to decades. Short-term changes in synaptic strength modulate and sculpt ongoing neuronal activity, whereas long-term changes underpin memory formation. 
Here we focus on changes of strength over timescales of less than a second caused by transitory depletion of the neurotransmitters that carry signals across the synapse. Neurotransmitters are stored in small vesicles that release their contents, with a certain probability, when the presynaptic neuron is active. Once a vesicle has been used, it is replenished after a variable delay. There is therefore a complex interaction between the pattern of incoming signals to the synapse and the probabilistic release and restock of packaged neurotransmitter. Here we extend existing models to examine how correlated synaptic activity is transmitted through synapses and affects the voltage fluctuations and firing rate of the target neuron. Our results provide a framework that will allow for the inclusion of biophysically realistic synaptic behaviour in studies of neuronal circuits.
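The depletion-and-recovery interaction can be sketched with a mean-field Tsodyks-Markram-style depressing synapse, a standard stand-in for the vesicle model (not the paper's exact stochastic formulation; parameters are hypothetical):

```python
import numpy as np

def depressing_synapse(spike_times, tau_d=0.5, p_release=0.4):
    """Mean-field short-term depression: the resource x recovers toward 1
    with time constant tau_d, and a fraction p_release of it is consumed
    at each presynaptic spike. Returns the effective amplitude per spike."""
    x, t_last, amps = 1.0, 0.0, []
    for t in spike_times:
        x = 1.0 - (1.0 - x) * np.exp(-(t - t_last) / tau_d)  # recovery
        amps.append(p_release * x)                           # released fraction
        x -= p_release * x                                   # depletion
        t_last = t
    return np.array(amps)

# A burst of closely spaced spikes depresses the synapse much more
# strongly than widely spaced spikes.
burst = depressing_synapse(np.arange(0, 1.0, 0.02))
sparse = depressing_synapse(np.arange(0, 50.0, 1.0))
print(burst[-1], sparse[-1])
```

This history dependence is exactly what makes the transmission of temporally correlated (e.g. bursty) spike trains non-trivial.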
Affiliation(s)
- Alex D. Bird
- Warwick Systems Biology Centre, University of Warwick, Coventry, United Kingdom
- Ernst Strüngmann Institute for Neuroscience, Max Planck Society, Frankfurt, Germany
- Frankfurt Institute for Advanced Studies, Frankfurt, Germany
- Magnus J. E. Richardson
- Warwick Mathematics Institute, University of Warwick, Coventry, United Kingdom
19
Antunes G, Faria da Silva SF, Simoes de Souza FM. Mirror Neurons Modeled Through Spike-Timing-Dependent Plasticity are Affected by Channelopathies Associated with Autism Spectrum Disorder. Int J Neural Syst 2018; 28:1750058. [DOI: 10.1142/s0129065717500587] [Citation(s) in RCA: 18] [Impact Index Per Article: 3.0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/18/2022]
Abstract
Mirror neurons fire action potentials both when the agent performs a certain behavior and watches someone performing a similar action. Here, we present an original mirror neuron model based on the spike-timing-dependent plasticity (STDP) between two morpho-electrical models of neocortical pyramidal neurons. Both neurons fired spontaneously with basal firing rate that follows a Poisson distribution, and the STDP between them was modeled by the triplet algorithm. Our simulation results demonstrated that STDP is sufficient for the rise of mirror neuron function between the pairs of neocortical neurons. This is a proof of concept that pairs of neocortical neurons associating sensory inputs to motor outputs could operate like mirror neurons. In addition, we used the mirror neuron model to investigate whether channelopathies associated with autism spectrum disorder could impair the modeled mirror function. Our simulation results showed that impaired hyperpolarization-activated cationic currents (Ih) affected the mirror function between the pairs of neocortical neurons coupled by STDP.
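For orientation, a pair-based additive STDP rule can be sketched as follows; note that the paper uses the more elaborate triplet rule, and all parameters here are hypothetical:

```python
import numpy as np

def stdp_weight_change(pre_times, post_times, a_plus=0.01, a_minus=0.012,
                       tau_plus=0.02, tau_minus=0.02):
    """Total weight change under additive pair-based STDP: potentiation
    when a presynaptic spike precedes a postsynaptic one, depression
    otherwise. (Illustrative stand-in for the triplet rule.)"""
    dw = 0.0
    for t_pre in pre_times:
        for t_post in post_times:
            dt = t_post - t_pre
            if dt > 0:
                dw += a_plus * np.exp(-dt / tau_plus)    # pre-before-post
            elif dt < 0:
                dw -= a_minus * np.exp(dt / tau_minus)   # post-before-pre
    return dw

# Pre-before-post pairings potentiate; post-before-pre pairings depress.
dw_ltp = stdp_weight_change(pre_times=[0.0, 0.1], post_times=[0.005, 0.105])
dw_ltd = stdp_weight_change(pre_times=[0.005, 0.105], post_times=[0.0, 0.1])
print(dw_ltp, dw_ltd)
```

Repeated sensory-motor co-activation with this timing dependence is the mechanism by which the modeled synapses acquire mirror-like responses.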
Affiliation(s)
- Gabriela Antunes
- Department of Physics, Faculdade de Filosofia, Ciencias e Letras de Ribeirao Preto, Universidade de Sao Paulo, Ribeirao Preto, SP, Brazil
- Fabio M. Simoes de Souza
- Center for Mathematics, Computation and Cognition, Federal University of ABC, Sao Bernardo do Campo, SP, Brazil
20
Pena RFO, Vellmer S, Bernardi D, Roque AC, Lindner B. Self-Consistent Scheme for Spike-Train Power Spectra in Heterogeneous Sparse Networks. Front Comput Neurosci 2018; 12:9. [PMID: 29551968 PMCID: PMC5840464 DOI: 10.3389/fncom.2018.00009] [Citation(s) in RCA: 17] [Impact Index Per Article: 2.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/29/2017] [Accepted: 02/07/2018] [Indexed: 11/13/2022] Open
Abstract
Recurrent networks of spiking neurons can be in an asynchronous state characterized by low or absent cross-correlations and spike statistics which resemble those of cortical neurons. Although spatial correlations are negligible in this state, neurons can show pronounced temporal correlations in their spike trains that can be quantified by the autocorrelation function or the spike-train power spectrum. Depending on cellular and network parameters, correlations display diverse patterns (ranging from simple refractory-period effects and stochastic oscillations to slow fluctuations) and it is generally not well-understood how these dependencies come about. Previous work has explored how the single-cell correlations in a homogeneous network (excitatory and inhibitory integrate-and-fire neurons with nearly balanced mean recurrent input) can be determined numerically from an iterative single-neuron simulation. Such a scheme is based on the fact that every neuron is driven by the network noise (i.e., the input currents from all its presynaptic partners) but also contributes to the network noise, leading to a self-consistency condition for the input and output spectra. Here we first extend this scheme to homogeneous networks with strong recurrent inhibition and a synaptic filter, in which instabilities of the previous scheme are avoided by an averaging procedure. We then extend the scheme to heterogeneous networks in which (i) different neural subpopulations (e.g., excitatory and inhibitory neurons) have different cellular or connectivity parameters; (ii) the number and strength of the input connections are random (Erdős-Rényi topology) and thus different among neurons. In all heterogeneous cases, neurons are lumped in different classes each of which is represented by a single neuron in the iterative scheme; in addition, we make a Gaussian approximation of the input current to the neuron. 
These approximations seem to be justified over a broad range of parameters as indicated by comparison with simulation results of large recurrent networks. Our method can help to elucidate how network heterogeneity shapes the asynchronous state in recurrent neural networks.
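The self-consistency idea can be caricatured at the level of a single scalar, the firing rate: the rate a noise-driven neuron emits must equal the rate assumed when constructing its recurrent input statistics. A damped fixed-point iteration of that condition is sketched below (the full scheme iterates entire power spectra, not just rates; all parameters are hypothetical):

```python
import numpy as np

def lif_rate(mu, sigma, tau_m=0.02, v_th=1.0, v_r=0.0, dt=2e-4, T=20.0, seed=3):
    """Output rate of a leaky integrate-and-fire neuron driven by white
    noise of mean mu and intensity sigma (Euler-Maruyama, fixed seed)."""
    rng = np.random.default_rng(seed)
    xi = rng.standard_normal(int(T / dt))
    v, n_spikes = v_r, 0
    for z in xi:
        v += dt / tau_m * (mu - v) + sigma * np.sqrt(2 * dt / tau_m) * z
        if v >= v_th:
            v = v_r
            n_spikes += 1
    return n_spikes / T

# Hypothetical recurrence: K inhibitory inputs of weight J per neuron;
# diffusion approximation for the recurrent input mean and variance.
J, K, tau_m, mu_ext = -0.01, 100, 0.02, 1.5
r, history = 10.0, [10.0]
for _ in range(8):
    mu = mu_ext + J * K * tau_m * r          # mean recurrent input
    sigma = np.sqrt(J**2 * K * tau_m * r)    # recurrent noise intensity
    r = 0.5 * r + 0.5 * lif_rate(mu, sigma)  # damped self-consistent update
    history.append(r)
print(history)
```

The damping factor plays the same stabilizing role as the averaging procedure that the paper introduces for strongly inhibitory networks.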
Affiliation(s)
- Rodrigo F O Pena
- Laboratório de Sistemas Neurais, Department of Physics, School of Philosophy, Sciences and Letters of Ribeirão Preto, University of São Paulo, São Paulo, Brazil
- Sebastian Vellmer
- Theory of Complex Systems and Neurophysics, Bernstein Center for Computational Neuroscience, Berlin, Germany
- Department of Physics, Humboldt Universität zu Berlin, Berlin, Germany
- Davide Bernardi
- Theory of Complex Systems and Neurophysics, Bernstein Center for Computational Neuroscience, Berlin, Germany
- Department of Physics, Humboldt Universität zu Berlin, Berlin, Germany
- Antonio C Roque
- Laboratório de Sistemas Neurais, Department of Physics, School of Philosophy, Sciences and Letters of Ribeirão Preto, University of São Paulo, São Paulo, Brazil
- Benjamin Lindner
- Theory of Complex Systems and Neurophysics, Bernstein Center for Computational Neuroscience, Berlin, Germany
- Department of Physics, Humboldt Universität zu Berlin, Berlin, Germany
21
Beiran M, Kruscha A, Benda J, Lindner B. Coding of time-dependent stimuli in homogeneous and heterogeneous neural populations. J Comput Neurosci 2017; 44:189-202. [PMID: 29222729 DOI: 10.1007/s10827-017-0674-4] [Citation(s) in RCA: 11] [Impact Index Per Article: 1.6] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/28/2017] [Revised: 11/08/2017] [Accepted: 11/12/2017] [Indexed: 11/29/2022]
Abstract
We compare the information transmission of a time-dependent signal by two types of uncoupled neuron populations that differ in their sources of variability: i) a homogeneous population whose units receive independent noise and ii) a deterministic heterogeneous population, where each unit exhibits a different baseline firing rate ('disorder'). Our criterion for making both sources of variability quantitatively comparable is that the interspike-interval distributions are identical for both systems. Numerical simulations using leaky integrate-and-fire neurons unveil that a non-zero amount of both noise or disorder maximizes the encoding efficiency of the homogeneous and heterogeneous system, respectively, as a particular case of suprathreshold stochastic resonance. Our findings thus illustrate that heterogeneity can render similarly profitable effects for neuronal populations as dynamic noise. The optimal noise/disorder depends on the system size and the properties of the stimulus such as its intensity or cutoff frequency. We find that weak stimuli are better encoded by a noiseless heterogeneous population, whereas for strong stimuli a homogeneous population outperforms an equivalent heterogeneous system up to a moderate noise level. Furthermore, we derive analytical expressions of the coherence function for the cases of very strong noise and of vanishing intrinsic noise or heterogeneity, which predict the existence of an optimal noise intensity. Our results show that, depending on the type of signal, noise as well as heterogeneity can enhance the encoding performance of neuronal populations.
Affiliation(s)
- Manuel Beiran
- Bernstein Center for Computational Neuroscience Berlin, Berlin, Germany
- Group for Neural Theory, Laboratoire de Neurosciences Cognitives, Département Études Cognitives, École Normale Supérieure, INSERM, PSL Research University, Paris, France
- Alexandra Kruscha
- Bernstein Center for Computational Neuroscience Berlin, Berlin, Germany
- Physics Department, Humboldt-Universität zu Berlin, Berlin, Germany
- Jan Benda
- Institute for Neurobiology, Eberhard Karls Universität, Tübingen, Germany
- Benjamin Lindner
- Bernstein Center for Computational Neuroscience Berlin, Berlin, Germany
- Physics Department, Humboldt-Universität zu Berlin, Berlin, Germany
22
Shomali SR, Ahmadabadi MN, Shimazaki H, Rasuli SN. How does transient signaling input affect the spike timing of postsynaptic neuron near the threshold regime: an analytical study. J Comput Neurosci 2017; 44:147-171. [PMID: 29192377 PMCID: PMC5851711 DOI: 10.1007/s10827-017-0664-6] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/02/2016] [Revised: 07/14/2017] [Accepted: 09/11/2017] [Indexed: 11/05/2022]
Abstract
The noisy threshold regime, where even a small set of presynaptic neurons can significantly affect postsynaptic spike timing, is suggested as a key requisite for computation in neurons with high variability. It has also been proposed that signals under noisy conditions are successfully transferred by a few strong synapses and/or by an assembly of nearly synchronous synaptic activities. We analytically investigate the impact of a transient signaling input on a leaky integrate-and-fire postsynaptic neuron that receives background noise near the threshold regime. The signaling input models a single strong synapse or a set of synchronous synapses, while the background noise represents many weak synapses. We find an analytic solution that explains how the first-passage-time (interspike-interval) density is changed by the transient signaling input. The analysis allows us to connect properties of the signaling input, such as spike timing and amplitude, with the postsynaptic first-passage-time density in a noisy environment. Based on the analytic solution, we calculate the Fisher information with respect to the signaling input’s amplitude. For a wide range of amplitudes, the Fisher information behaves non-monotonically as a function of the background noise. Moreover, the Fisher information depends non-trivially on the signaling input’s amplitude: as the amplitude is varied, we observe a single maximum at high levels of background noise, which splits into two maxima in the low-noise regime. This finding demonstrates the benefit of the analytic solution in investigating signal transfer by neurons.
Affiliation(s)
- Safura Rashid Shomali
- School of Cognitive Sciences, Institute for Research in Fundamental Sciences (IPM), P.O. Box 19395-5746 (1954851167), Tehran, Iran.
- Majid Nili Ahmadabadi
- Control and Intelligent Processing Center of Excellence, School of Electrical and Computer Engineering, College of Engineering, University of Tehran, Tehran, 14395-515, Iran
- Hideaki Shimazaki
- Graduate School of Informatics, Kyoto University, Yoshida-honmachi, Sakyo-ku, Kyoto, 606-8501, Japan
- Honda Research Institute Japan, Honcho 8-1, Wako-shi, Saitama, 351-0188, Japan
- Seyyed Nader Rasuli
- Department of Physics, University of Guilan, Rasht, 41335-1914, Iran
- School of Physics, Institute for Research in Fundamental Sciences (IPM), P.O. Box 19395-5531, Tehran, Iran
23
Ocker GK, Hu Y, Buice MA, Doiron B, Josić K, Rosenbaum R, Shea-Brown E. From the statistics of connectivity to the statistics of spike times in neuronal networks. Curr Opin Neurobiol 2017; 46:109-119. [PMID: 28863386 DOI: 10.1016/j.conb.2017.07.011] [Citation(s) in RCA: 28] [Impact Index Per Article: 4.0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/08/2017] [Revised: 07/21/2017] [Accepted: 07/27/2017] [Indexed: 10/19/2022]
Abstract
An essential step toward understanding neural circuits is linking their structure and their dynamics. In general, this relationship can be almost arbitrarily complex. Recent theoretical work has, however, begun to identify some broad principles underlying collective spiking activity in neural circuits. The first is that local features of network connectivity can be surprisingly effective in predicting global statistics of activity across a network. The second is that, for the important case of large networks with excitatory-inhibitory balance, correlated spiking persists or vanishes depending on the spatial scales of recurrent and feedforward connectivity. We close by showing how these ideas, together with plasticity rules, can help to close the loop between network structure and activity statistics.
Affiliation(s)
- Yu Hu
- Center for Brain Science, Harvard University, United States
- Michael A Buice
- Allen Institute for Brain Science, United States; Department of Applied Mathematics, University of Washington, United States
- Brent Doiron
- Department of Mathematics, University of Pittsburgh, United States; Center for the Neural Basis of Cognition, Pittsburgh, United States
- Krešimir Josić
- Department of Mathematics, University of Houston, United States; Department of Biology and Biochemistry, University of Houston, United States; Department of BioSciences, Rice University, United States
- Robert Rosenbaum
- Department of Mathematics, University of Notre Dame, United States
- Eric Shea-Brown
- Allen Institute for Brain Science, United States; Department of Applied Mathematics, University of Washington, United States; Department of Physiology and Biophysics, and University of Washington Institute for Neuroengineering, United States.
24
Exact firing time statistics of neurons driven by discrete inhibitory noise. Sci Rep 2017; 7:1577. [PMID: 28484244 PMCID: PMC5431561 DOI: 10.1038/s41598-017-01658-8] [Citation(s) in RCA: 8] [Impact Index Per Article: 1.1] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/22/2017] [Accepted: 03/29/2017] [Indexed: 12/15/2022] Open
Abstract
Neurons in the intact brain receive a continuous and irregular synaptic bombardment from excitatory and inhibitory presynaptic neurons, which determines the firing activity of the stimulated neuron. In order to investigate the influence of inhibitory stimulation on the firing time statistics, we consider leaky integrate-and-fire neurons subject to inhibitory instantaneous postsynaptic potentials. In particular, we report exact results for the firing rate, the coefficient of variation and the spike train spectrum for various synaptic weight distributions. Our results are not limited to stimulations of infinitesimal amplitude, but apply as well to finite-amplitude postsynaptic potentials, thus capturing the effect of rare and large spikes. The developed methods also reproduce the average firing properties of heterogeneous neuronal populations.
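A Monte Carlo counterpart of this setup, a leaky integrate-and-fire neuron with constant suprathreshold drive plus Poissonian inhibitory kicks, can be sketched as follows (the paper's results are exact and analytical; the parameters here are illustrative):

```python
import numpy as np

rng = np.random.default_rng(4)
tau_m, mu, v_th, v_r = 0.02, 1.6, 1.0, 0.0   # membrane params, suprathreshold drive
rate_in, a_inh = 500.0, 0.05                 # inhibitory Poisson rate (Hz), kick size
dt, T = 1e-4, 30.0
n = int(T / dt)

kicks = rng.random(n) < rate_in * dt         # Poisson arrival in each time bin
v, spike_times = 0.0, []
for i in range(n):
    v += dt / tau_m * (mu - v)               # deterministic leak + drive
    if kicks[i]:
        v -= a_inh                           # instantaneous inhibitory PSP
    if v >= v_th:
        v = v_r
        spike_times.append(i * dt)

isis = np.diff(spike_times)
rate_out = len(spike_times) / T
cv = isis.std() / isis.mean()
print(rate_out, cv)
```

The firing rate and coefficient of variation printed here are the two quantities for which the paper provides closed-form expressions.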
25
Mastrogiuseppe F, Ostojic S. Intrinsically-generated fluctuating activity in excitatory-inhibitory networks. PLoS Comput Biol 2017; 13:e1005498. [PMID: 28437436 PMCID: PMC5421821 DOI: 10.1371/journal.pcbi.1005498] [Citation(s) in RCA: 43] [Impact Index Per Article: 6.1] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/13/2016] [Revised: 05/08/2017] [Accepted: 04/04/2017] [Indexed: 12/05/2022] Open
Abstract
Recurrent networks of non-linear units display a variety of dynamical regimes depending on the structure of their synaptic connectivity. A particularly remarkable phenomenon is the appearance of strongly fluctuating, chaotic activity in networks of deterministic, but randomly connected rate units. How such intrinsically generated fluctuations appear in more realistic networks of spiking neurons has been a long-standing question. To ease the comparison between rate and spiking networks, recent works investigated the dynamical regimes of randomly-connected rate networks with segregated excitatory and inhibitory populations, and firing rates constrained to be positive. These works derived general dynamical mean field (DMF) equations describing the fluctuating dynamics, but solved these equations only in the case of purely inhibitory networks. Using a simplified excitatory-inhibitory architecture in which DMF equations are more easily tractable, here we show that the presence of excitation qualitatively modifies the fluctuating activity compared to purely inhibitory networks. In the presence of excitation, intrinsically generated fluctuations induce a strong increase in mean firing rates, a phenomenon that is much weaker in purely inhibitory networks. Excitation moreover induces two different fluctuating regimes: for moderate overall coupling, recurrent inhibition is sufficient to stabilize fluctuations; for strong coupling, firing rates are stabilized solely by the upper bound imposed on activity, even if inhibition is stronger than excitation. These results extend to more general network architectures, and to rate networks receiving noisy inputs mimicking spiking activity. Finally, we show that signatures of the second dynamical regime appear in networks of integrate-and-fire neurons.
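A minimal simulation of a random rate network with segregated excitatory and inhibitory populations and positive firing rates; the architecture, transfer function, and parameters are illustrative, not the paper's:

```python
import numpy as np

rng = np.random.default_rng(5)
Ne = Ni = 200
N = Ne + Ni
J, g = 1.0, 5.0                                        # E weight scale, I/E ratio
W = np.zeros((N, N))
W[:, :Ne] = J * rng.random((N, Ne)) / np.sqrt(N)       # excitatory columns (>= 0)
W[:, Ne:] = -g * J * rng.random((N, Ni)) / np.sqrt(N)  # inhibitory columns (<= 0)

def phi(x):
    return np.maximum(np.tanh(x), 0.0)                 # positive, bounded rates

dt, steps = 0.05, 4000
x = rng.normal(0.0, 1.0, N)
mean_rate = np.empty(steps)
for t in range(steps):
    x = x + dt * (-x + W @ phi(x) + 0.5)               # constant external drive
    mean_rate[t] = phi(x).mean()

late = mean_rate[2000:]                                # discard the transient
print(late.mean(), late.std())
```

Depending on the overall coupling strength such a network settles into a fixed point or a fluctuating regime; the DMF theory in the paper characterizes the latter analytically.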
Affiliation(s)
- Francesca Mastrogiuseppe
- Laboratoire de Neurosciences Cognitives, INSERM U960, École Normale Supérieure - PSL Research University, Paris, France
- Laboratoire de Physique Statistique, CNRS UMR 8550, École Normale Supérieure - PSL Research University, Paris, France
- Srdjan Ostojic
- Laboratoire de Neurosciences Cognitives, INSERM U960, École Normale Supérieure - PSL Research University, Paris, France
26
Rosenbaum R. A Diffusion Approximation and Numerical Methods for Adaptive Neuron Models with Stochastic Inputs. Front Comput Neurosci 2016; 10:39. [PMID: 27148036 PMCID: PMC4840919 DOI: 10.3389/fncom.2016.00039] [Citation(s) in RCA: 5] [Impact Index Per Article: 0.6] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/19/2015] [Accepted: 04/04/2016] [Indexed: 11/16/2022] Open
Abstract
Characterizing the spiking statistics of neurons receiving noisy synaptic input is a central problem in computational neuroscience. Monte Carlo approaches to this problem are computationally expensive and often fail to provide mechanistic insight. Thus, the field has seen the development of mathematical and numerical approaches, often relying on a Fokker-Planck formalism. These approaches force a compromise between biological realism, accuracy and computational efficiency. In this article we develop an extension of existing diffusion approximations to more accurately approximate the response of neurons with adaptation currents and noisy synaptic currents. The implementation refines existing numerical schemes for solving the associated Fokker-Planck equations to improve computational efficiency and accuracy. Computer code implementing the developed algorithms is made available to the public.
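A plain Euler-Maruyama reference simulation of the model class treated there, a leaky integrate-and-fire neuron with a spike-triggered adaptation current and white-noise input, against which such Fokker-Planck methods are typically validated (all parameters hypothetical):

```python
import numpy as np

rng = np.random.default_rng(6)
tau_m, tau_a = 0.02, 0.2      # membrane and adaptation time constants (s)
mu, sigma, b = 1.4, 0.2, 0.2  # input mean, noise intensity, adaptation jump
v_th, v_r = 1.0, 0.0
dt, T = 1e-4, 30.0
n = int(T / dt)

xi = rng.standard_normal(n)
v, a, spikes = 0.0, 0.0, []
for i in range(n):
    v += dt / tau_m * (mu - v - a) + sigma * np.sqrt(2 * dt / tau_m) * xi[i]
    a -= dt / tau_a * a       # adaptation current decays between spikes
    if v >= v_th:             # threshold crossing: reset and adapt
        v = v_r
        a += b
        spikes.append(i * dt)

isis = np.diff(spikes)
rate_out = len(spikes) / T
cv = isis.std() / isis.mean()
print(rate_out, cv)
```

The slow adaptation variable is exactly the extra dimension that makes the one-dimensional Fokker-Planck treatment insufficient and motivates the extended diffusion approximation.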
Affiliation(s)
- Robert Rosenbaum
- Applied and Computational Mathematics and Statistics, University of Notre Dame, Notre Dame, IN, USA
27
Goldental A, Sabo P, Sardi S, Vardi R, Kanter I. Mimicking Collective Firing Patterns of Hundreds of Connected Neurons using a Single-Neuron Experiment. Front Neurosci 2016; 9:508. [PMID: 26834538 DOI: 10.3389/fnins.2015.00508] [Citation(s) in RCA: 4] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/29/2015] [Accepted: 12/21/2015] [Indexed: 11/13/2022] Open
Abstract
The experimental study of neural networks requires simultaneous measurements of a massive number of neurons, while monitoring properties of the connectivity, synaptic strengths and delays. Current technological barriers make such a mission unachievable. In addition, as a result of the enormous number of required measurements, the estimated network parameters would differ from the original ones. Here we present a versatile experimental technique, which enables the study of recurrent neural networks activity while being capable of dictating the network connectivity and synaptic strengths. This method is based on the observation that the response of neurons depends solely on their recent stimulations, a short-term memory. It allows a long-term scheme of stimulation and recording of a single neuron, to mimic simultaneous activity measurements of neurons in a recurrent network. Utilization of this technique demonstrates the spontaneous emergence of cooperative synchronous oscillations, in particular the coexistence of fast γ and slow δ oscillations, and opens the horizon for the experimental study of other cooperative phenomena within large-scale neural networks.
Affiliation(s)
- Amir Goldental
- Department of Physics, Bar-Ilan University, Ramat-Gan, Israel
- Pinhas Sabo
- Department of Physics, Bar-Ilan University, Ramat-Gan, Israel
- Shira Sardi
- Department of Physics, Bar-Ilan University, Ramat-Gan, Israel; Gonda Interdisciplinary Brain Research Center and The Goodman Faculty of Life Sciences, Bar-Ilan University, Ramat-Gan, Israel
- Roni Vardi
- Gonda Interdisciplinary Brain Research Center and The Goodman Faculty of Life Sciences, Bar-Ilan University, Ramat-Gan, Israel
- Ido Kanter
- Department of Physics, Bar-Ilan University, Ramat-Gan, Israel; Gonda Interdisciplinary Brain Research Center and The Goodman Faculty of Life Sciences, Bar-Ilan University, Ramat-Gan, Israel
| |
|
28
|
Schuecker J, Diesmann M, Helias M. Modulated escape from a metastable state driven by colored noise. PHYSICAL REVIEW. E, STATISTICAL, NONLINEAR, AND SOFT MATTER PHYSICS 2015; 92:052119. [PMID: 26651659 DOI: 10.1103/physreve.92.052119] [Citation(s) in RCA: 16] [Impact Index Per Article: 1.8] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 11/06/2014] [Indexed: 06/05/2023]
Abstract
Many phenomena in nature are described by excitable systems driven by colored noise. The temporal correlations in the fluctuations hinder an analytical treatment. Here we present a general method of reduction to a white-noise system, capturing the color of the noise by effective, time-dependent boundary conditions. We apply the formalism to a model of the excitability of neuronal membranes, the leaky integrate-and-fire neuron model, revealing an analytical expression for the linear response of the system valid up to moderate frequencies. The closed-form analytical expression enables the characterization of the response properties of such excitable units and the assessment of oscillations emerging in networks thereof.
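As a minimal illustration of the kind of system treated in this entry (not the paper's reduction method itself), one can simulate a leaky integrate-and-fire neuron driven by colored noise, modeled here as an Ornstein-Uhlenbeck process; all parameter values below are arbitrary choices for demonstration:

```python
import numpy as np

# Sketch: leaky integrate-and-fire (LIF) neuron with colored input noise.
# The noise is an Ornstein-Uhlenbeck (OU) process with correlation time tau_c,
# i.e. exponentially correlated fluctuations rather than white noise.
rng = np.random.default_rng(0)

dt = 1e-4            # integration time step (s)
T = 5.0              # total simulated time (s)
tau_m = 0.02         # membrane time constant (s)
mu = 1.2             # mean drive (suprathreshold, threshold = 1)
tau_c = 0.05         # correlation time of the colored noise (s)
sigma = 0.2          # stationary standard deviation of the OU noise
v_th, v_reset = 1.0, 0.0

v, eta = 0.0, 0.0
spike_times = []
for step in range(int(T / dt)):
    # Euler-Maruyama update of the OU process (colored noise)
    eta += -eta * dt / tau_c + sigma * np.sqrt(2 * dt / tau_c) * rng.standard_normal()
    # leaky integration of mean drive plus colored noise
    v += (mu - v + eta) * dt / tau_m
    if v >= v_th:                      # threshold crossing: spike and reset
        spike_times.append(step * dt)
        v = v_reset

isi = np.diff(spike_times)
print(f"{len(spike_times)} spikes, ISI CV = {isi.std() / isi.mean():.2f}")
```

With suprathreshold mean drive and weak noise, the neuron fires tonically and the coefficient of variation of the interspike intervals stays well below one; the paper's contribution is an analytical linear-response theory for such a unit, which this brute-force simulation does not reproduce.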
Affiliation(s)
- Jannis Schuecker
- Institute of Neuroscience and Medicine (INM-6) and Institute for Advanced Simulation (IAS-6) and JARA BRAIN Institute I, Jülich Research Centre, Jülich, Germany
- Markus Diesmann
- Institute of Neuroscience and Medicine (INM-6) and Institute for Advanced Simulation (IAS-6) and JARA BRAIN Institute I, Jülich Research Centre, Jülich, Germany
- Department of Psychiatry, Psychotherapy and Psychosomatics, Medical Faculty, RWTH Aachen University, Aachen, Germany
- Department of Physics, Faculty 1, RWTH Aachen University, Aachen, Germany
- Moritz Helias
- Institute of Neuroscience and Medicine (INM-6) and Institute for Advanced Simulation (IAS-6) and JARA BRAIN Institute I, Jülich Research Centre, Jülich, Germany
- Department of Physics, Faculty 1, RWTH Aachen University, Aachen, Germany
|
29
|
Wieland S, Bernardi D, Schwalger T, Lindner B. Slow fluctuations in recurrent networks of spiking neurons. PHYSICAL REVIEW. E, STATISTICAL, NONLINEAR, AND SOFT MATTER PHYSICS 2015; 92:040901. [PMID: 26565154 DOI: 10.1103/physreve.92.040901] [Citation(s) in RCA: 35] [Impact Index Per Article: 3.9] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Received: 07/24/2015] [Indexed: 06/05/2023]
Abstract
Networks of fast nonlinear elements may display slow fluctuations if interactions are strong. We find a transition in the long-term variability of a sparse recurrent network of perfect integrate-and-fire neurons at which the Fano factor switches from zero to infinity and the correlation time is minimized. This corresponds to a bifurcation in a linear map arising from the self-consistency of temporal input and output statistics. More realistic neural dynamics with a leak current and refractory period lead to smoothed transitions and modified critical couplings that can be theoretically predicted.
Affiliation(s)
- Stefan Wieland
- Bernstein Center for Computational Neuroscience Berlin, Philippstrasse 13, Haus 2, 10115 Berlin, Germany
- Physics Department of Humboldt University Berlin, Newtonstrasse 15, 12489 Berlin, Germany
- Davide Bernardi
- Bernstein Center for Computational Neuroscience Berlin, Philippstrasse 13, Haus 2, 10115 Berlin, Germany
- Physics Department of Humboldt University Berlin, Newtonstrasse 15, 12489 Berlin, Germany
- Tilo Schwalger
- School of Computer and Communication Sciences and School of Life Sciences, Brain Mind Institute, École Polytechnique Fédérale de Lausanne, Station 15, 1015 Lausanne EPFL, Switzerland
- Benjamin Lindner
- Bernstein Center for Computational Neuroscience Berlin, Philippstrasse 13, Haus 2, 10115 Berlin, Germany
- Physics Department of Humboldt University Berlin, Newtonstrasse 15, 12489 Berlin, Germany
|
30
|
Statistical structure of neural spiking under non-Poissonian or other non-white stimulation. J Comput Neurosci 2015; 39:29-51. [PMID: 25936628 DOI: 10.1007/s10827-015-0560-x] [Citation(s) in RCA: 43] [Impact Index Per Article: 4.8] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/19/2014] [Revised: 02/24/2015] [Accepted: 03/27/2015] [Indexed: 10/23/2022]
Abstract
Nerve cells in the brain generate sequences of action potentials with complex statistics. Theoretical attempts to understand these statistics were largely limited to the case of a temporally uncorrelated input (Poissonian shot noise) from the neurons in the surrounding network. However, the stimulation from thousands of other neurons has various sorts of temporal structure. Firstly, input spike trains are temporally correlated because their firing rates can carry complex signals and because of cell-intrinsic properties like neural refractoriness, bursting, or adaptation. Secondly, at the connections between neurons, the synapses, usage-dependent changes in the synaptic weight (short-term plasticity) further shape the correlation structure of the effective input to the cell. From the theoretical side, it is poorly understood how these correlated stimuli, so-called colored noise, affect the spike train statistics. In particular, no standard method exists to solve the associated first-passage-time problem for the interspike-interval statistics with arbitrarily colored noise. Assuming that input fluctuations are weaker than the mean neuronal drive, we derive simple formulas for the essential interspike-interval statistics for a canonical model of a tonically firing neuron subjected to arbitrarily correlated input from the network. We verify our theory by numerical simulations for three paradigmatic situations that lead to input correlations: (i) rate-coded naturalistic stimuli in presynaptic spike trains; (ii) presynaptic refractoriness or bursting; (iii) synaptic short-term plasticity. In all cases, we find severe effects on interval statistics. Our results provide a framework for the interpretation of firing statistics measured in vivo in the brain.
|