1. Hwang SH, Park D, Lee JW, Lee SH, Kim HF. Convergent representation of values from tactile and visual inputs for efficient goal-directed behavior in the primate putamen. Nat Commun 2024; 15:8954. PMID: 39448643; PMCID: PMC11502908; DOI: 10.1038/s41467-024-53342-x
Abstract
Animals can discriminate diverse sensory values with a limited number of neurons, raising questions about how the brain utilizes neural resources to efficiently process multi-dimensional inputs for decision-making. Here, we demonstrate that this efficiency is achieved by reducing sensory dimensions and converging towards the value dimension essential for goal-directed behavior in the putamen. Humans and monkeys performed tactile and visual value discrimination tasks while their neural responses were examined. fMRI revealed that value information, whether originating from tactile or visual stimuli, is processed within the human putamen. Notably, at the single-neuron level in the macaque putamen, half of the individual neurons encode values independently of sensory inputs, while the other half selectively encode tactile or visual value. The responses of bimodal value neurons correlate with value-guided finger insertion behavior in both tasks, whereas modality-selective value neurons show task-specific correlations. Simulation using these neurons reveals that the presence of bimodal value neurons enables value discrimination with a significantly reduced number of neurons compared to simulations without them. Our data indicate that individual neurons in the primate putamen process different values in a convergent manner, thereby facilitating the efficient use of constrained neural resources for value-guided behavior.
Affiliation(s)
- Seong-Hwan Hwang
- School of Biological Sciences, College of Natural Sciences, Seoul National University (SNU), Seoul, 08826, Republic of Korea
- Institute for Data Innovation in Science, Seoul National University (SNU), Seoul, 08826, Republic of Korea
- Doyoung Park
- Institute for Data Innovation in Science, Seoul National University (SNU), Seoul, 08826, Republic of Korea
- Institute of Psychological Sciences, Institute of Social Sciences, Seoul National University (SNU), Seoul, 08826, Republic of Korea
- Department of Psychology, College of Social Sciences, Seoul National University (SNU), Seoul, 08826, Republic of Korea
- Ji-Woo Lee
- School of Biological Sciences, College of Natural Sciences, Seoul National University (SNU), Seoul, 08826, Republic of Korea
- Sue-Hyun Lee
- Department of Psychology, College of Social Sciences, Seoul National University (SNU), Seoul, 08826, Republic of Korea
- Hyoung F Kim
- School of Biological Sciences, College of Natural Sciences, Seoul National University (SNU), Seoul, 08826, Republic of Korea
- Institute for Data Innovation in Science, Seoul National University (SNU), Seoul, 08826, Republic of Korea
2. Franzen J, Ramlow L, Lindner B. The steady state and response to a periodic stimulation of the firing rate for a theta neuron with correlated noise. J Comput Neurosci 2023; 51:107-128. PMID: 36273087; PMCID: PMC9840600; DOI: 10.1007/s10827-022-00836-6
Abstract
The stochastic activity of neurons is caused by various sources of correlated fluctuations and can be described in terms of simplified, yet biophysically grounded, integrate-and-fire models. One paradigmatic model is the quadratic integrate-and-fire model and its equivalent phase description by the theta neuron. Here we study the theta neuron model driven by a correlated Ornstein-Uhlenbeck noise and by periodic stimuli. We apply the matrix-continued-fraction method to the associated Fokker-Planck equation to develop an efficient numerical scheme to determine the stationary firing rate as well as the stimulus-induced modulation of the instantaneous firing rate. For the stationary case, we identify the conditions under which the firing rate decreases or increases by the effect of the colored noise and compare our results to existing analytical approximations for limit cases. For an additional periodic signal we demonstrate how the linear and nonlinear response terms can be computed and report resonant behavior for some of them. We extend the method to the case of two periodic signals, generally with incommensurable frequencies, and present a particular case for which a strong mixed response to both signals is observed, i.e. where the response to the sum of signals differs significantly from the sum of responses to the single signals. We provide Python code for our computational method: https://github.com/jannikfranzen/theta_neuron .
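The setup lends itself to direct stochastic simulation. The authors' matrix-continued-fraction code is at the repository above; the sketch below is instead a minimal Euler-Maruyama simulation of the theta neuron driven by Ornstein-Uhlenbeck noise, with illustrative parameter values and function names that are not taken from the paper:

```python
import numpy as np

def theta_neuron_rate(mu=0.5, tau=1.0, sigma=0.5, dt=1e-3, T=100.0, seed=0):
    """Euler-Maruyama estimate of the stationary firing rate of a theta
    neuron  dtheta/dt = (1 - cos theta) + (1 + cos theta) * (mu + eta(t)),
    where eta is Ornstein-Uhlenbeck noise with correlation time tau and
    stationary variance sigma**2. A spike is a phase crossing of pi."""
    rng = np.random.default_rng(seed)
    theta, eta, spikes = -np.pi, 0.0, 0
    kick = np.sqrt(2.0 * sigma**2 * dt / tau)
    for _ in range(int(T / dt)):
        eta += -eta * dt / tau + kick * rng.standard_normal()
        theta += dt * ((1.0 - np.cos(theta)) + (1.0 + np.cos(theta)) * (mu + eta))
        if theta > np.pi:          # spike fired: wrap the phase
            theta -= 2.0 * np.pi
            spikes += 1
    return spikes / T

rate = theta_neuron_rate()         # spikes per unit time
```

A convenient sanity check: for sigma = 0 and mu = 1 the phase velocity is constant and equal to 2, so the rate reduces to exactly 1/pi.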
Affiliation(s)
- Jannik Franzen
- Department of Physics, Humboldt-Universität zu Berlin, Newtonstr. 15, Berlin, 12489 Germany
- Lukas Ramlow
- Department of Physics, Humboldt-Universität zu Berlin, Newtonstr. 15, Berlin, 12489 Germany
- Bernstein Center for Computational Neuroscience Berlin, Philippstr. 13, Haus 2, Berlin, 10115 Germany
- Benjamin Lindner
- Department of Physics, Humboldt-Universität zu Berlin, Newtonstr. 15, Berlin, 12489 Germany
- Bernstein Center for Computational Neuroscience Berlin, Philippstr. 13, Haus 2, Berlin, 10115 Germany
3. The Mean Field Approach for Populations of Spiking Neurons. Adv Exp Med Biol 2022; 1359:125-157. DOI: 10.1007/978-3-030-89439-9_6
Abstract
Mean field theory is a device to analyze the collective behavior of a dynamical system comprising many interacting particles. The theory allows one to reduce the behavior of the system to the properties of a handful of parameters. In neural circuits, these parameters are typically the firing rates of distinct, homogeneous subgroups of neurons. Knowledge of the firing rates under conditions of interest can reveal essential information on both the dynamics of neural circuits and the way they can subserve brain function. The goal of this chapter is to provide an elementary introduction to the mean field approach for populations of spiking neurons. We introduce the general idea in networks of binary neurons, starting from the most basic results and then generalizing to more relevant situations. This allows us to derive the mean field equations in a simplified setting. We then derive the mean field equations for populations of integrate-and-fire neurons. An effort is made to derive the main equations of the theory using only elementary methods from calculus and probability theory. The chapter ends with a discussion of the assumptions of the theory and some of the consequences of violating those assumptions. This discussion includes an introduction to balanced and metastable networks and a brief catalogue of successful applications of the mean field approach to the study of neural circuits.
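To make the self-consistency idea concrete, the following sketch solves a mean-field equation m = f(J m + h) for a homogeneous population of binary neurons by fixed-point iteration. The logistic gain function and all parameter values are illustrative choices, not the chapter's:

```python
import numpy as np

def mean_field_rate(J, h, beta=2.0, m0=0.5, tol=1e-10, max_iter=10000):
    """Self-consistent mean activity of a homogeneous population of binary
    (0/1) neurons: each unit is active with probability sigmoid(beta*(J*m+h)),
    where m is the population mean that the units themselves create."""
    m = m0
    for _ in range(max_iter):
        m_new = 1.0 / (1.0 + np.exp(-beta * (J * m + h)))
        if abs(m_new - m) < tol:
            return m_new
        m = m_new
    return m

# Weak coupling: a unique fixed point, independent of the initial guess.
m_low = mean_field_rate(J=0.5, h=-0.25, m0=0.01)
m_high = mean_field_rate(J=0.5, h=-0.25, m0=0.99)

# Strong recurrent excitation: two stable fixed points (bistability).
b_low = mean_field_rate(J=4.0, h=-2.0, m0=0.01)
b_high = mean_field_rate(J=4.0, h=-2.0, m0=0.99)
```

With weak coupling the two initial guesses converge to the same activity; with strong recurrent excitation the same equation has two stable solutions, a toy version of the multistability that underlies metastable networks.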
4. Fonseca Dos Reis E, Li A, Masuda N. Generative models of simultaneously heavy-tailed distributions of interevent times on nodes and edges. Phys Rev E 2020; 102:052303. PMID: 33327065; DOI: 10.1103/PhysRevE.102.052303
Abstract
Intervals between discrete events representing human activities, as well as other types of events, often obey heavy-tailed distributions, and their impacts on collective dynamics on networks such as contagion processes have been intensively studied. The literature supports that such heavy-tailed distributions are present for interevent times associated with both individual nodes and individual edges in networks. However, the simultaneous presence of heavy-tailed distributions of interevent times for nodes and edges is a nontrivial phenomenon, and its origin has been elusive. In the present study, we propose a generative model and its variants to explain this phenomenon. We assume that each node independently transits between a high-activity and low-activity state according to a continuous-time two-state Markov process and that, for the main model, events on an edge occur at a high rate if and only if both end nodes of the edge are in the high-activity state. In other words, two nodes interact frequently only when both nodes prefer to interact with others. The model produces distributions of interevent times for both individual nodes and edges that resemble heavy-tailed distributions across some scales. It also produces positive correlation in consecutive interevent times, which is another stylized observation for empirical data of human activity. We expect that our modeling framework provides a useful benchmark for investigating dynamics on temporal networks driven by non-Poissonian event sequences.
Affiliation(s)
- Elohim Fonseca Dos Reis
- Department of Mathematics, State University of New York at Buffalo, Buffalo, New York 14260, USA
- Aming Li
- Department of Zoology, University of Oxford, Oxford OX1 3PS, United Kingdom
- Department of Biochemistry, University of Oxford, Oxford OX1 3QU, United Kingdom
- Naoki Masuda
- Department of Mathematics, State University of New York at Buffalo, Buffalo, New York 14260, USA
- Computational and Data-Enabled Science and Engineering Program, State University of New York at Buffalo, Buffalo, New York 14260, USA
- Faculty of Science and Engineering, Waseda University, 169-8555 Tokyo, Japan
5. Fang Y, Yu Z, Chen F. Noise Helps Optimization Escape From Saddle Points in the Synaptic Plasticity. Front Neurosci 2020; 14:343. PMID: 32410937; PMCID: PMC7201302; DOI: 10.3389/fnins.2020.00343
Abstract
Numerous experimental studies suggest that noise is inherent in the human brain. However, the functional importance of noise remains unknown. In particular, from a computational perspective, such stochasticity is potentially harmful to brain function. In machine learning, a large number of saddle points are surrounded by high error plateaus and give the illusion of a local minimum. As a result, being trapped in saddle points can dramatically impair learning, and adding noise can mitigate such saddle-point problems in high-dimensional optimization, especially under the strict saddle condition. Motivated by these arguments, we propose a biologically plausible noise structure and demonstrate that noise can efficiently improve the optimization performance of spiking neural networks based on stochastic gradient descent. The strict saddle condition for synaptic plasticity is deduced, and under this condition, noise helps optimization escape from saddle points on high-dimensional domains. The theoretical results explain the stochasticity of synapses and guide us on how to make use of noise. In addition, we provide biological interpretations of the proposed noise structures from two angles: one based on the free energy principle in neuroscience and another based on observations from in vivo experiments. Our simulation results show that, in the learning and test phases, the accuracy of synaptic sampling with noise is almost 20% higher than that without noise for a synthetic dataset, and the gain in accuracy with noise is at least 10% for the MNIST and CIFAR-10 datasets. Our study provides a new learning framework for the brain and sheds new light on deep noisy spiking neural networks.
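The core optimization claim can be demonstrated on a toy strict-saddle function. The sketch below uses an example function and step sizes of my own choosing (not the paper's spiking-network setup) and compares plain and noisy gradient descent starting on the saddle's stable manifold:

```python
import numpy as np

def grad(p):
    """Gradient of f(x, y) = x**2 - y**2 + y**4, which has a strict saddle
    at (0, 0) and two minima at y = +/- 1/sqrt(2)."""
    x, y = p
    return np.array([2.0 * x, -2.0 * y + 4.0 * y**3])

def descend(p0, lr=0.05, steps=2000, noise=0.0, seed=0):
    """(Noisy) gradient descent: p <- p - lr * grad(p) + noise * N(0, I)."""
    rng = np.random.default_rng(seed)
    p = np.array(p0, dtype=float)
    for _ in range(steps):
        p += -lr * grad(p) + noise * rng.standard_normal(2)
    return p

start = (0.5, 0.0)                    # on the saddle's stable manifold
p_plain = descend(start)              # converges to the saddle (0, 0): stuck
p_noisy = descend(start, noise=0.01)  # noise kicks y off zero; finds a minimum
```

Plain descent never leaves y = 0 because the gradient there has no y-component; an arbitrarily small isotropic perturbation is enough to escape along the negative-curvature direction.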
Affiliation(s)
- Ying Fang
- Department of Automation, Center for Brain-Inspired Computing Research, Tsinghua University, Beijing, China
- Beijing Innovation Center for Future Chip, Beijing, China
- Beijing Key Laboratory of Security in Big Data Processing and Application, Beijing, China
- Zhaofei Yu
- National Engineering Laboratory for Video Technology, School of Electronics Engineering and Computer Science, Peking University, Beijing, China
- Feng Chen
- Department of Automation, Center for Brain-Inspired Computing Research, Tsinghua University, Beijing, China
- Beijing Innovation Center for Future Chip, Beijing, China
- Beijing Key Laboratory of Security in Big Data Processing and Application, Beijing, China
6.
Abstract
'Bursting', defined as periods of high-frequency firing of a neuron separated by periods of quiescence, has been observed in various neuronal systems, both in vitro and in vivo. It has been associated with a range of neuronal processes, including efficient information transfer and the formation of functional networks during development, and has been shown to be sensitive to genetic and pharmacological manipulations. Accurate detection of periods of bursting activity is thus an important aspect of characterising both spontaneous and evoked neuronal network activity. A wide variety of computational methods have been developed to detect periods of bursting in spike trains recorded from neuronal networks. In this chapter, we review several of the most popular and successful of these methods.
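As one concrete example of the methods this chapter reviews, here is a minimal interspike-interval threshold detector (a simplified "max interval" style rule; the thresholds are illustrative, not recommended values):

```python
import numpy as np

def detect_bursts(spike_times, max_isi=0.01, min_spikes=3):
    """ISI-threshold burst detection: a burst is a run of at least
    `min_spikes` spikes whose consecutive interspike intervals are all
    <= max_isi (seconds). Returns a list of (start_time, end_time) pairs."""
    t = np.asarray(spike_times)
    bursts, i = [], 0
    while i < len(t) - 1:
        j = i
        while j < len(t) - 1 and t[j + 1] - t[j] <= max_isi:
            j += 1                     # extend the run of short ISIs
        if j - i + 1 >= min_spikes:
            bursts.append((t[i], t[j]))
        i = j + 1
    return bursts

# Two 5-spike bursts (1 ms ISIs) separated by quiescence, plus isolated spikes.
spikes = np.concatenate([
    0.100 + 0.001 * np.arange(5),
    [0.300],
    0.500 + 0.001 * np.arange(5),
    [0.900],
])
found = detect_bursts(spikes)
```

The isolated spikes at 0.3 s and 0.9 s are correctly ignored; only the two dense runs are reported.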
7. Bird AD, Richardson MJE. Transmission of temporally correlated spike trains through synapses with short-term depression. PLoS Comput Biol 2018; 14:e1006232. PMID: 29933363; PMCID: PMC6039054; DOI: 10.1371/journal.pcbi.1006232
Abstract
Short-term synaptic depression, caused by depletion of releasable neurotransmitter, modulates the strength of neuronal connections in a history-dependent manner. Quantifying the statistics of synaptic transmission requires stochastic models that link probabilistic neurotransmitter release with presynaptic spike-train statistics. Common approaches are to model the presynaptic spike train as either regular or a memory-less Poisson process: few analytical results are available that describe depressing synapses when the afferent spike train has more complex, temporally correlated statistics such as bursts. Here we present a series of analytical results—from vesicle release-site occupancy statistics, via neurotransmitter release, to the post-synaptic voltage mean and variance—for depressing synapses driven by correlated presynaptic spike trains. The class of presynaptic drive considered is that fully characterised by the inter-spike-interval distribution and encompasses a broad range of models used for neuronal circuit and network analyses, such as integrate-and-fire models with a complete post-spike reset and receiving sufficiently short-time correlated drive. We further demonstrate that the derived post-synaptic voltage mean and variance allow for a simple and accurate approximation of the firing rate of the post-synaptic neuron, using the exponential integrate-and-fire model as an example. These results extend the level of biological detail included in models of synaptic transmission and will allow for the incorporation of more complex and physiologically relevant firing patterns into future studies of neuronal networks. Synapses between neurons transmit signals with strengths that vary with the history of their activity, over scales from milliseconds to decades. Short-term changes in synaptic strength modulate and sculpt ongoing neuronal activity, whereas long-term changes underpin memory formation. 
Here we focus on changes of strength over timescales of less than a second caused by transitory depletion of the neurotransmitters that carry signals across the synapse. Neurotransmitters are stored in small vesicles that release their contents, with a certain probability, when the presynaptic neuron is active. Once a vesicle has been used it is replenished after a variable delay. There is therefore a complex interaction between the pattern of incoming signals to the synapse and the probabilistic release and restock of packaged neurotransmitter. Here we extend existing models to examine how correlated synaptic activity is transmitted through synapses and affects the voltage fluctuations and firing rate of the target neuron. Our results provide a framework that will allow for the inclusion of biophysically realistic synaptic behaviour in studies of neuronal circuits.
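A deliberately minimal caricature of this model class, a single probabilistic release site with exponential recovery driven by a gamma renewal train, already shows the interaction between input burstiness and depletion. All parameters below are illustrative, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(7)

def release_rate(rate, shape, n_spikes=20_000, p_r=0.8, tau_d=0.2):
    """Mean vesicle release rate of a single depressing release site driven
    by a gamma renewal spike train (shape=1 is Poisson; shape<1 is bursty,
    CV>1; shape>1 is more regular, CV<1). A presynaptic spike releases with
    probability p_r if the site is occupied; the emptied site then refills
    after an exponential delay with mean tau_d."""
    isi = rng.gamma(shape, 1.0 / (shape * rate), size=n_spikes)  # mean ISI = 1/rate
    t = np.cumsum(isi)
    ready_at, n_rel = 0.0, 0
    for s in t:
        if s >= ready_at and rng.random() < p_r:
            n_rel += 1
            ready_at = s + rng.exponential(tau_d)   # site depleted, refilling
    return n_rel / t[-1]

r_regular = release_rate(rate=20.0, shape=4.0)      # CV = 0.5 input
r_bursty = release_rate(rate=20.0, shape=0.25)      # CV = 2 input, same mean rate
```

With the mean input rate held fixed, the bursty train wastes spikes on a depleted site and so transmits at a lower mean release rate, one facet of the history dependence analysed in the paper.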
Affiliation(s)
- Alex D. Bird
- Warwick Systems Biology Centre, University of Warwick, Coventry, United Kingdom
- Ernst Strüngmann Institute for Neuroscience, Max Planck Society, Frankfurt, Germany
- Frankfurt Institute for Advanced Studies, Frankfurt, Germany
- Magnus J. E. Richardson
- Warwick Mathematics Institute, University of Warwick, Coventry, United Kingdom
8. Ocker GK, Hu Y, Buice MA, Doiron B, Josić K, Rosenbaum R, Shea-Brown E. From the statistics of connectivity to the statistics of spike times in neuronal networks. Curr Opin Neurobiol 2017; 46:109-119. PMID: 28863386; DOI: 10.1016/j.conb.2017.07.011
Abstract
An essential step toward understanding neural circuits is linking their structure and their dynamics. In general, this relationship can be almost arbitrarily complex. Recent theoretical work has, however, begun to identify some broad principles underlying collective spiking activity in neural circuits. The first is that local features of network connectivity can be surprisingly effective in predicting global statistics of activity across a network. The second is that, for the important case of large networks with excitatory-inhibitory balance, correlated spiking persists or vanishes depending on the spatial scales of recurrent and feedforward connectivity. We close by showing how these ideas, together with plasticity rules, can help to close the loop between network structure and activity statistics.
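The first principle has a compact linear-response formalization for linearly interacting, Hawkes-like units (a standard result in this literature, not necessarily the exact expressions of this review): the matrix of long-window spike-count covariances is C = (I - W)^-1 diag(r) (I - W)^-T, so connectivity motifs such as shared input appear directly in C. A toy sketch with a hypothetical three-neuron motif:

```python
import numpy as np

def spike_count_cov(W, r):
    """Linear-response (Hawkes-like) approximation: long-window spike-count
    covariance of a recurrent network with interaction matrix W (W[i, j] is
    the influence of neuron j on neuron i) and baseline rates r."""
    n = len(r)
    A = np.linalg.inv(np.eye(n) - W)
    return A @ np.diag(r) @ A.T

# Neurons 0 and 1 are unconnected but share input from neuron 2.
W = np.array([[0.0, 0.0, 0.3],
              [0.0, 0.0, 0.3],
              [0.0, 0.0, 0.0]])
r = np.array([5.0, 5.0, 5.0])
C = spike_count_cov(W, r)
```

Here neurons 0 and 1 have no direct connection, yet inherit covariance 0.3 * 0.3 * 5 = 0.45 from the shared presynaptic cell, which is exactly the motif-based reasoning the review describes.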
Affiliation(s)
- Yu Hu
- Center for Brain Science, Harvard University, United States
- Michael A Buice
- Allen Institute for Brain Science, United States; Department of Applied Mathematics, University of Washington, United States
- Brent Doiron
- Department of Mathematics, University of Pittsburgh, United States; Center for the Neural Basis of Cognition, Pittsburgh, United States
- Krešimir Josić
- Department of Mathematics, University of Houston, United States; Department of Biology and Biochemistry, University of Houston, United States; Department of BioSciences, Rice University, United States
- Robert Rosenbaum
- Department of Mathematics, University of Notre Dame, United States
- Eric Shea-Brown
- Allen Institute for Brain Science, United States; Department of Applied Mathematics, University of Washington, United States; Department of Physiology and Biophysics, and University of Washington Institute for Neuroengineering, United States.
9. Lai YM, de Kamps M. Population density equations for stochastic processes with memory kernels. Phys Rev E 2017; 95:062125. PMID: 28709222; DOI: 10.1103/PhysRevE.95.062125
Abstract
We present a method for solving population density equations (PDEs), a mean-field technique describing homogeneous populations of uncoupled neurons, where the populations can be subject to non-Markov noise for arbitrary distributions of jump sizes. The method combines recent developments in two different disciplines that traditionally have had limited interaction: computational neuroscience and the theory of random networks. The method uses a geometric binning scheme, based on the method of characteristics, to capture the deterministic neurodynamics of the population, separating the deterministic and stochastic processes cleanly. We can independently vary the choice of the deterministic model and the model for the stochastic process, leading to a highly modular numerical solution strategy. We demonstrate this by replacing the master equation implicit in many formulations of the PDE formalism by a generalization called the generalized Montroll-Weiss equation, a recent result from random network theory describing a random walker subject to transitions realized by a non-Markovian process. We demonstrate the method for leaky and quadratic integrate-and-fire neurons subject to spike trains with Poisson and gamma-distributed interspike intervals. We are able to model jump responses for both models accurately, for both excitatory and inhibitory input, under the assumption that all inputs are generated by one renewal process.
Affiliation(s)
- Yi Ming Lai
- Institute for Artificial and Biological Computation, School of Computing, University of Leeds, LS2 9JT Leeds, United Kingdom
- Marc de Kamps
- Institute for Artificial and Biological Computation, School of Computing, University of Leeds, LS2 9JT Leeds, United Kingdom
10. Towards a theory of cortical columns: From spiking neurons to interacting neural populations of finite size. PLoS Comput Biol 2017; 13:e1005507. PMID: 28422957; PMCID: PMC5415267; DOI: 10.1371/journal.pcbi.1005507
Abstract
Neural population equations such as neural mass or field models are widely used to study brain activity on a large scale. However, the relation of these models to the properties of single neurons is unclear. Here we derive an equation for several interacting populations at the mesoscopic scale starting from a microscopic model of randomly connected generalized integrate-and-fire neuron models. Each population consists of 50–2000 neurons of the same type but different populations account for different neuron types. The stochastic population equations that we find reveal how spike-history effects in single-neuron dynamics such as refractoriness and adaptation interact with finite-size fluctuations on the population level. Efficient integration of the stochastic mesoscopic equations reproduces the statistical behavior of the population activities obtained from microscopic simulations of a full spiking neural network model. The theory describes nonlinear emergent dynamics such as finite-size-induced stochastic transitions in multistable networks and synchronization in balanced networks of excitatory and inhibitory neurons. The mesoscopic equations are employed to rapidly integrate a model of a cortical microcircuit consisting of eight neuron types, which allows us to predict spontaneous population activities as well as evoked responses to thalamic input. Our theory establishes a general framework for modeling finite-size neural population dynamics based on single cell and synapse parameters and offers an efficient approach to analyzing cortical circuits and computations. Understanding the brain requires mathematical models on different spatial scales. On the “microscopic” level of nerve cells, neural spike trains can be well predicted by phenomenological spiking neuron models. On a coarse scale, neural activity can be modeled by phenomenological equations that summarize the total activity of many thousands of neurons. 
Such population models are widely used to model neuroimaging data such as EEG, MEG or fMRI data. However, it is largely unknown how large-scale models are connected to an underlying microscale model. Linking the scales is vital for a correct description of rapid changes and fluctuations of the population activity, and is crucial for multiscale brain models. The challenge is to treat realistic spiking dynamics as well as fluctuations arising from the finite number of neurons. We obtained such a link by deriving stochastic population equations on the mesoscopic scale of 100–1000 neurons from an underlying microscopic model. These equations can be efficiently integrated and reproduce results of a microscopic simulation while achieving a high speed-up factor. We expect that our novel population theory on the mesoscopic scale will be instrumental for understanding experimental data on information processing in the brain, and ultimately link microscopic and macroscopic activity patterns.
11. Weegink KJ, Bellette PA, Varghese JJ, Silburn PA, Meehan PA, Bradley AP. A Parametric Simulation of Neuronal Noise From Microelectrode Recordings. IEEE Trans Neural Syst Rehabil Eng 2017; 25:1-10. DOI: 10.1109/TNSRE.2016.2573318
12. Gliske SV, Stacey WC, Lim E, Holman KA, Fink CG. Emergence of Narrowband High Frequency Oscillations from Asynchronous, Uncoupled Neural Firing. Int J Neural Syst 2016; 27:1650049. PMID: 27712456; DOI: 10.1142/S0129065716500490
Abstract
Previous experimental studies have demonstrated the emergence of narrowband local field potential oscillations during epileptic seizures in which the underlying neural activity appears to be completely asynchronous. We derive a mathematical model explaining how this counterintuitive phenomenon may occur, showing that a population of independent, completely asynchronous neurons may produce narrowband oscillations if each neuron fires quasi-periodically, without requiring any intrinsic oscillatory cells or feedback inhibition. This quasi-periodicity can occur through cells with similar frequency-current (f-I) curves receiving a similar, high amount of uncorrelated synaptic noise. Thus, this source of oscillatory behavior is distinct from the usual cases (pacemaker cells entraining a network, or oscillations being an inherent property of the network structure), as it requires no oscillatory drive nor any specific network or cellular properties other than cells that repetitively fire with continual stimulus. We also deduce bounds on the degree of variability in neural spike-timing which will permit the emergence of such oscillations, both for action potential- and postsynaptic potential-dominated LFPs. These results suggest that even an uncoupled network may generate collective rhythms, implying that the breakdown of inhibition and high synaptic input often observed during epileptic seizures may generate narrowband oscillations. We propose that this mechanism may explain why so many disparate epileptic and normal brain mechanisms can produce similar high frequency oscillations.
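The mechanism can be reproduced in a few lines: many independent, phase-randomized, quasi-periodic spike trains summed into one population trace produce a clear spectral peak near the common firing rate, with no coupling at all. All numbers below are illustrative, not the paper's:

```python
import numpy as np

rng = np.random.default_rng(9)

fs, T = 2000.0, 10.0               # sampling rate (Hz) and duration (s)
n_bins = int(T * fs)
pop = np.zeros(n_bins)             # summed population spike-count trace
for _ in range(200):               # 200 uncoupled neurons
    t = rng.uniform(0.0, 0.01)     # random initial phase: asynchronous
    while t < T:
        pop[int(t * fs)] += 1
        t += rng.normal(0.01, 0.0005)   # quasi-periodic ISIs: ~100 Hz, CV = 0.05

power = np.abs(np.fft.rfft(pop - pop.mean())) ** 2
freqs = np.fft.rfftfreq(n_bins, d=1.0 / fs)
peak = freqs[1 + np.argmax(power[1:])]   # dominant frequency, skipping DC
```

Despite fully asynchronous phases, the population spectrum peaks near the 100 Hz single-cell rate, a minimal instance of the narrowband oscillation mechanism described above.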
Affiliation(s)
- Stephen V Gliske
- Department of Neurology, University of Michigan, 1500 E. Medical Center Drive, Ann Arbor, MI 48109, USA
- William C Stacey
- Departments of Biomedical Engineering and Neurology, University of Michigan, 1500 E. Medical Center Drive, Ann Arbor, MI 48109, USA
- Eugene Lim
- Department of Physics, Ohio Wesleyan University, 61 S. Sandusky St., Delaware, OH 43015, USA
- Katherine A Holman
- Department of Physics, Towson University, 8000 York Road, Towson, MD 21252, USA
- Christian G Fink
- Department of Physics and Neuroscience Program, Ohio Wesleyan University, 61 S. Sandusky St., Delaware, OH 43015, USA
13. Polito F, Scalas E. A generalization of the space-fractional Poisson process and its connection to some Lévy processes. Electron Commun Probab 2016. DOI: 10.1214/16-ECP4383
14. Rajdl K, Lansky P. Stein's neuronal model with pooled renewal input. Biol Cybern 2015; 109:389-399. PMID: 25910437; DOI: 10.1007/s00422-015-0650-x
Abstract
The input of Stein's model of a single neuron is usually described by using a Poisson process, which is assumed to represent the behaviour of spikes pooled from a large number of presynaptic spike trains. However, such a description of the input is not always appropriate as the variability cannot be separated from the intensity. Therefore, we create and study Stein's model with a more general input, a sum of equilibrium renewal processes. The mean and variance of the membrane potential are derived for this model. Using these formulas and numerical simulations, the model is analyzed to study the influence of the input variability on the properties of the membrane potential and the output spike trains. The generalized Stein's model is compared with the original Stein's model with Poissonian input using the relative difference of variances of membrane potential at steady state and the integral square error of output interspike intervals. Both of the criteria show large differences between the models for input with high variability.
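A hedged numerical sketch of the comparison (illustrative parameters, not the paper's): filter pooled gamma-renewal input through an exponentially decaying membrane and compare the steady-state voltage variance against Poisson input of the same pooled rate. For Poisson input the variance should sit near the shot-noise value a^2 * R * tau / 2:

```python
import numpy as np

rng = np.random.default_rng(3)

def stein_voltage_var(shape, n_inputs=50, unit_rate=2.0, a=0.02,
                      tau=0.1, dt=1e-3, T=200.0):
    """Steady-state voltage variance in a Stein-type model: V decays with
    time constant tau and jumps by `a` at every spike of a pool of n_inputs
    independent gamma renewal trains (shape=1 recovers Poisson input)."""
    n_steps = int(T / dt)
    counts = np.zeros(n_steps, dtype=int)
    for _ in range(n_inputs):
        isi = rng.gamma(shape, 1.0 / (shape * unit_rate),
                        size=int(unit_rate * T * 2) + 100)  # mean ISI = 1/unit_rate
        t = np.cumsum(isi)
        idx = (t[t < T] / dt).astype(int)
        np.add.at(counts, idx, 1)
    v, trace = 0.0, np.empty(n_steps)
    decay = np.exp(-dt / tau)
    for i in range(n_steps):
        v = v * decay + a * counts[i]
        trace[i] = v
    return trace[n_steps // 10:].var()     # discard the initial transient

var_poisson = stein_voltage_var(shape=1.0)   # pooled rate R = 100 spikes/s
var_regular = stein_voltage_var(shape=4.0)   # same rate, per-train CV = 0.5
```

With the pooled rate fixed at 100 spikes/s, the Poisson case should land near a^2 * R * tau / 2 = 2e-3, while the more regular renewal input suppresses the low-frequency input power and hence the membrane-filtered variance, the kind of model difference the paper quantifies.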
Affiliation(s)
- Kamil Rajdl
- Department of Mathematics and Statistics, Faculty of Science, Masaryk University, Kotlarska 2, 611 37 Brno, Czech Republic
15. Statistical structure of neural spiking under non-Poissonian or other non-white stimulation. J Comput Neurosci 2015; 39:29-51. PMID: 25936628; DOI: 10.1007/s10827-015-0560-x
Abstract
Nerve cells in the brain generate sequences of action potentials with complex statistics. Theoretical attempts to understand these statistics were largely limited to the case of a temporally uncorrelated input (Poissonian shot noise) from the neurons in the surrounding network. However, the stimulation from thousands of other neurons has various sorts of temporal structure. Firstly, input spike trains are temporally correlated because their firing rates can carry complex signals and because of cell-intrinsic properties like neural refractoriness, bursting, or adaptation. Secondly, at the connections between neurons, the synapses, usage-dependent changes in the synaptic weight (short-term plasticity) further shape the correlation structure of the effective input to the cell. From the theoretical side, it is poorly understood how these correlated stimuli, so-called colored noise, affect the spike train statistics. In particular, no standard method exists to solve the associated first-passage-time problem for the interspike-interval statistics with arbitrarily colored noise. Assuming that input fluctuations are weaker than the mean neuronal drive, we derive simple formulas for the essential interspike-interval statistics for a canonical model of a tonically firing neuron subjected to arbitrarily correlated input from the network. We verify our theory by numerical simulations for three paradigmatic situations that lead to input correlations: (i) rate-coded naturalistic stimuli in presynaptic spike trains; (ii) presynaptic refractoriness or bursting; (iii) synaptic short-term plasticity. In all cases, we find severe effects on interval statistics. Our results provide a framework for the interpretation of firing statistics measured in vivo in the brain.
|
16
|
Dummer B, Wieland S, Lindner B. Self-consistent determination of the spike-train power spectrum in a neural network with sparse connectivity. Front Comput Neurosci 2014; 8:104. [PMID: 25278869] [PMCID: PMC4166962] [DOI: 10.3389/fncom.2014.00104] [Citation(s) in RCA: 39] [Impact Index Per Article: 3.5] [Received: 05/25/2014] [Accepted: 08/13/2014] [Indexed: 11/13/2022]
Abstract
A major source of random variability in cortical networks is the quasi-random arrival of presynaptic action potentials from many other cells. In network studies as well as in the study of the response properties of single cells embedded in a network, synaptic background input is often approximated by Poissonian spike trains. However, the output statistics of the cells are in most cases far from Poisson. This is inconsistent with the assumption of similar spike-train statistics for pre- and postsynaptic cells in a recurrent network. Here we tackle this problem for the popular class of integrate-and-fire neurons and study the self-consistent statistics of input and output spectra of neural spike trains. Instead of actually using a large network, we use an iterative scheme, in which we simulate a single neuron over several generations. In each of these generations, the neuron is stimulated with surrogate stochastic input that has statistics similar to the output of the previous generation. For the surrogate input, we employ two distinct approximations: (i) a superposition of renewal spike trains with the same interspike interval density as observed in the previous generation and (ii) a Gaussian current with a power spectrum proportional to that observed in the previous generation. For input parameters that correspond to balanced input in the network, both the renewal and the Gaussian iteration procedure converge quickly and yield comparable results for the self-consistent spike-train power spectrum. We compare our results to large-scale simulations of a random sparsely connected network of leaky integrate-and-fire neurons (Brunel, 2000) and show that in the asynchronous regime close to a state of balanced synaptic input from the network, our iterative schemes provide an excellent approximation to the autocorrelation of spike trains in the recurrent network.
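The renewal variant of the iterative scheme can be condensed into a short numerical sketch. Everything below is illustrative rather than the authors' implementation: a current-based leaky integrate-and-fire neuron with hypothetical weights and time constants receives a superposition of surrogate renewal trains whose ISIs are resampled from the previous generation's output, corresponding to approximation (i), and the output is rescaled between generations so that only the interval shape is passed on.

```python
import numpy as np

rng = np.random.default_rng(1)

def superposed_train(isi_pool, n_trains, t_max):
    """Superpose n_trains independent renewal trains whose ISIs are drawn
    with replacement from isi_pool; returns sorted spike times."""
    spikes = []
    for _ in range(n_trains):
        t = rng.choice(isi_pool) * rng.random()  # random initial phase
        while t < t_max:
            spikes.append(t)
            t += rng.choice(isi_pool)
    return np.asarray(sorted(spikes))

def one_generation(isi_pool, n_exc=80, n_inh=20, w_exc=0.15, w_inh=-0.5,
                   tau_m=0.02, v_th=1.0, t_max=20.0, dt=1e-4):
    """Drive a current-based LIF neuron with superposed surrogate renewal
    input and return the ISIs of its output spike train."""
    nbins = int(t_max / dt)

    def binned(times):
        idx = np.minimum((times / dt).astype(int), nbins - 1)
        return np.bincount(idx, minlength=nbins)

    e = binned(superposed_train(isi_pool, n_exc, t_max))
    i = binned(superposed_train(isi_pool, n_inh, t_max))
    v, out = 0.0, []
    for k in range(nbins):
        v += -v * dt / tau_m + w_exc * e[k] + w_inh * i[k]
        if v >= v_th:            # threshold crossing: spike and reset
            out.append(k * dt)
            v = 0.0
    return np.diff(out)

# generation 0: Poissonian surrogate input (exponential ISIs, 20 Hz per train)
isis = rng.exponential(0.05, 2000)
for _ in range(3):
    isis = one_generation(isis)
    isis = isis / isis.mean() * 0.05  # hold the rate fixed, pass on ISI shape
cv = isis.std() / isis.mean()         # interval variability after iteration
```

With roughly balanced excitation and inhibition the iteration settles after a few generations; the quantity of interest is how the output interval statistics (here the CV) stabilize rather than remaining Poissonian.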
Affiliation(s)
- Benjamin Dummer
- Theory of Complex Systems and Neurophysics, Bernstein Center for Computational Neuroscience Berlin, Berlin, Germany; Department of Physics, Humboldt-Universität zu Berlin, Berlin, Germany
- Stefan Wieland
- Theory of Complex Systems and Neurophysics, Bernstein Center for Computational Neuroscience Berlin, Berlin, Germany; Department of Physics, Humboldt-Universität zu Berlin, Berlin, Germany
- Benjamin Lindner
- Theory of Complex Systems and Neurophysics, Bernstein Center for Computational Neuroscience Berlin, Berlin, Germany; Department of Physics, Humboldt-Universität zu Berlin, Berlin, Germany
|
17
|
Deniz T, Rotter S. Going beyond Poisson processes: a new statistical framework in neuronal modeling and data analysis. BMC Neurosci 2013. [PMCID: PMC3704707] [DOI: 10.1186/1471-2202-14-s1-p332] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Indexed: 11/10/2022]
|
18
|
Shinozaki T, Naruse Y, Câteau H. Gap junctions facilitate propagation of synchronous firing in the cortical neural population: a numerical simulation study. Neural Netw 2013; 46:91-8. [PMID: 23711746] [DOI: 10.1016/j.neunet.2013.04.011] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.2] [Received: 01/27/2012] [Revised: 01/22/2013] [Accepted: 04/26/2013] [Indexed: 10/26/2022]
Abstract
This study investigates the effect of gap junctions on firing propagation in a feedforward neural network by a numerical simulation with biologically plausible parameters. Gap junctions are electrical couplings between two cells, formed by the channel protein connexin. Recent electrophysiological studies have reported that a large number of inhibitory neurons in the mammalian cortex are mutually connected by gap junctions, and gap-junction-mediated synchronization, spreading over several hundred microns, suggests that they have a strong effect on the dynamics of the cortical network. However, the effect of gap junctions on firing propagation in cortical circuits has not been examined systematically. In this study, we perform numerical simulations using biologically plausible parameters to clarify this effect on population firing in a feedforward neural network. The results suggest that gap junctions switch the temporally uniform firing in a layer to temporally clustered firing in subsequent layers, resulting in an enhancement in the propagation of population firing in the feedforward network. Because gap junctions are often modulated in physiological conditions, we speculate that gap junctions could be related to a gating function of population firing in the brain.
Affiliation(s)
- Takashi Shinozaki
- Center for Neural Science, New York University, 4 Washington Place, New York, NY 10003, USA.
|
19
|
Reimer ICG, Staude B, Boucsein C, Rotter S. A new method to infer higher-order spike correlations from membrane potentials. J Comput Neurosci 2013; 35:169-86. [PMID: 23474914] [PMCID: PMC3766522] [DOI: 10.1007/s10827-013-0446-8] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Received: 08/22/2012] [Revised: 01/25/2013] [Accepted: 02/05/2013] [Indexed: 11/25/2022]
Abstract
What is the role of higher-order spike correlations for neuronal information processing? Common data analysis methods to address this question are devised for the application to spike recordings from multiple single neurons. Here, we present a new method which evaluates the subthreshold membrane potential fluctuations of one neuron, and infers higher-order correlations among the neurons that constitute its presynaptic population. This has two important advantages: Very large populations of up to several thousands of neurons can be studied, and spike sorting becomes unnecessary. Moreover, this new approach truly emphasizes the functional aspects of higher-order statistics, since we infer exactly those correlations which are seen by a neuron. Our approach is to represent the subthreshold membrane potential fluctuations as presynaptic activity filtered with a fixed kernel, as would be the case for a leaky integrator neuron model. This allows us to adapt the recently proposed method CuBIC (cumulant based inference of higher-order correlations from the population spike count; Staude et al., J Comput Neurosci 29(1-2):327-350, 2010c) with which the maximal order of correlation can be inferred. By numerical simulation we show that our new method is reasonably sensitive to weak higher-order correlations, and that only short stretches of membrane potential are required for their reliable inference. Finally, we demonstrate its remarkable robustness against violations of the simplifying assumptions made for its construction, and discuss how it can be employed to analyze in vivo intracellular recordings of membrane potentials.
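The modeling assumption at the heart of this approach, subthreshold fluctuations as presynaptic activity filtered with a fixed kernel, is easy to illustrate numerically. The sketch below uses hypothetical rates, a hypothetical weight, and a 10 ms exponential kernel; it is not the authors' CuBIC extension. The binned population spike count is convolved with a leaky-integrator kernel, and the mean of the resulting surrogate membrane potential agrees with Campbell's theorem, rate x weight x tau.

```python
import numpy as np

rng = np.random.default_rng(0)

# population of independent Poisson afferents (illustrative numbers)
n_neurons, rate, t_max, dt = 1000, 5.0, 200.0, 1e-3   # 5 Hz each, 200 s
tau, w = 0.01, 0.001                                  # 10 ms kernel, unit weight

nbins = int(t_max / dt)
# binned population spike count: each bin is Poisson(n_neurons * rate * dt)
pop_count = rng.poisson(n_neurons * rate * dt, size=nbins)

# exponential kernel of a leaky integrator: jump w per spike, decay tau
t_k = np.arange(0.0, 5 * tau, dt)
kernel = w * np.exp(-t_k / tau)

# surrogate membrane potential = filtered presynaptic activity
v = np.convolve(pop_count, kernel)[:nbins]

v_emp = v[5000:].mean()                    # skip the initial transient
v_mean_pred = n_neurons * rate * w * tau   # Campbell's theorem: rate * w * tau
```

The empirical mean matches the shot-noise prediction up to a small time-discretization bias; the CuBIC idea is that the higher cumulants of this filtered signal likewise carry the correlation structure of the presynaptic population.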
Affiliation(s)
- Imke C G Reimer
- Bernstein Center Freiburg and Faculty of Biology, University of Freiburg, Freiburg, Germany
|
20
|
Ko D, Wilson CJ, Lobb CJ, Paladini CA. Detection of bursts and pauses in spike trains. J Neurosci Methods 2012; 211:145-58. [PMID: 22939922] [DOI: 10.1016/j.jneumeth.2012.08.013] [Citation(s) in RCA: 36] [Impact Index Per Article: 2.8] [Received: 01/11/2012] [Revised: 08/03/2012] [Accepted: 08/13/2012] [Indexed: 10/28/2022]
Abstract
Midbrain dopaminergic neurons in vivo exhibit a wide range of firing patterns. They normally fire tonically at a low rate, speed up and fire a phasic burst when reward exceeds prediction, or pause when an expected reward does not occur. Therefore, the detection of bursts and pauses from spike train data is a critical problem when studying the role of phasic dopamine (DA) in reward-related learning, and other DA-dependent behaviors. However, few statistical methods have been developed that can identify bursts and pauses simultaneously. We propose a new statistical method, the Robust Gaussian Surprise (RGS) method, which performs an exhaustive search of bursts and pauses in spike trains simultaneously. We found that the RGS method is adaptable to various patterns of spike trains recorded in vivo, and is not influenced by baseline firing rate, making it applicable to all in vivo spike trains where baseline firing rates vary over time. We compare the performance of the RGS method to other methods of detecting bursts, such as the Poisson Surprise (PS), Rank Surprise (RS), and Template methods. Analysis of data using the RGS method reveals potential mechanisms underlying how bursts and pauses are controlled in DA neurons.
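For orientation, the classic Poisson Surprise statistic, one of the comparison methods named above rather than the RGS algorithm itself, can be sketched in a few lines. The surprise of a candidate window is S = -ln P(N >= n) under a Poisson null with the train's mean rate; the greedy scan and the threshold below are illustrative choices, not the published parameters.

```python
import math

def poisson_surprise(n, lam):
    """S = -ln P(N >= n) for N ~ Poisson(lam)."""
    p_lt = sum(math.exp(-lam) * lam ** k / math.factorial(k) for k in range(n))
    return -math.log(max(1.0 - p_lt, 1e-300))

def detect_bursts(spikes, min_spikes=3, s_thresh=10.0):
    """Greedy scan: from each start spike, pick the end spike that maximizes
    the surprise of the enclosed window; accept it if above s_thresh."""
    rate = (len(spikes) - 1) / (spikes[-1] - spikes[0])  # mean firing rate
    bursts, i = [], 0
    while i <= len(spikes) - min_spikes:
        best = None
        for j in range(i + min_spikes - 1, len(spikes)):
            lam = rate * (spikes[j] - spikes[i])
            s = poisson_surprise(j - i + 1, lam)
            if s > s_thresh and (best is None or s > best[0]):
                best = (s, j)
        if best is not None:
            bursts.append((spikes[i], spikes[best[1]]))
            i = best[1] + 1          # continue after the accepted burst
        else:
            i += 1
    return bursts

# demo: regular 2 Hz background with one embedded 100 Hz burst
spikes = [0.5 * k for k in range(10)] \
       + [4.5 + 0.01 * k for k in range(1, 7)] \
       + [4.56 + 0.5 * k for k in range(1, 4)]
bursts = detect_bursts(spikes)   # expected: a single burst around the dense spikes
```

A pause detector follows the same pattern with P(N <= n) in place of P(N >= n); the RGS method replaces the Poisson null with a robust Gaussian model of the log-ISI distribution, which is what makes it insensitive to drifting baseline rates.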
Affiliation(s)
- D Ko
- Department of Management Science and Statistics, University of Texas at San Antonio, One UTSA Circle, San Antonio, TX 78249, USA
|
21
|
Reimer ICG, Staude B, Ehm W, Rotter S. Modeling and analyzing higher-order correlations in non-Poissonian spike trains. J Neurosci Methods 2012; 208:18-33. [PMID: 22561088] [DOI: 10.1016/j.jneumeth.2012.04.015] [Citation(s) in RCA: 8] [Impact Index Per Article: 0.6] [Received: 12/05/2011] [Revised: 04/17/2012] [Accepted: 04/18/2012] [Indexed: 11/17/2022]
Abstract
Measuring pairwise and higher-order spike correlations is crucial for studying their potential impact on neuronal information processing. In order to avoid misinterpretation of results, the tools used for data analysis need to be carefully calibrated with respect to their sensitivity and robustness. This, in turn, requires surrogate data with statistical properties common to experimental spike trains. Here, we present a novel method to generate correlated non-Poissonian spike trains and study the impact of single-neuron spike statistics on the inference of higher-order correlations. Our method to mimic cooperative neuronal spike activity allows the realization of a large variety of renewal processes with controlled higher-order correlation structure. Based on surrogate data obtained by this procedure we investigate the robustness of the recently proposed empirical de-Poissonization method (Ehm et al., 2007). It assumes Poissonian spiking, an assumption common to many other estimation techniques. We observe that some degree of deviation from this assumption can generally be tolerated, that the results are more reliable for small analysis bins, and that the degree of misestimation depends on the detailed spike statistics. As a consequence of these findings we finally propose a strategy to assess the reliability of results for experimental data.
Affiliation(s)
- Imke C G Reimer
- Bernstein Center Freiburg and Faculty of Biology, Albert-Ludwig University, Freiburg, Germany
|
22
|
Scheller B, Castellano M, Vicente R, Pipa G. Spike train auto-structure impacts post-synaptic firing and timing-based plasticity. Front Comput Neurosci 2011; 5:60. [PMID: 22203800] [PMCID: PMC3243878] [DOI: 10.3389/fncom.2011.00060] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Received: 11/17/2010] [Accepted: 11/29/2011] [Indexed: 11/13/2022]
Abstract
Cortical neurons are typically driven by several thousand synapses. The precise spatiotemporal pattern formed by these inputs can modulate the response of a post-synaptic cell. In this work, we explore how the temporal structure of pre-synaptic inhibitory and excitatory inputs impacts the post-synaptic firing of a conductance-based integrate-and-fire neuron. Both the excitatory and the inhibitory input were modeled by renewal gamma processes with varying shape factors, covering regular as well as temporally random (Poissonian) activity. We demonstrate that the temporal structure of mutually independent inputs affects the post-synaptic firing, while the strength of the effect depends on the firing rates of both the excitatory and inhibitory inputs. In a second step, we explore the effect of temporal structure of mutually independent inputs on a simple version of Hebbian learning, i.e., hard-bound spike-timing-dependent plasticity. We explore both the equilibrium weight distribution and the speed of the transient weight dynamics for different mutually independent gamma processes. We find that both the equilibrium distribution of the synaptic weights and the speed of synaptic changes are modulated by the temporal structure of the input. Finally, we highlight that the sensitivity of both the post-synaptic firing as well as the spike-timing-dependent plasticity on the auto-structure of the input of a neuron could be used to modulate the learning rate of synaptic modification.
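The input processes used here are easy to reproduce: a renewal gamma process with shape a and rate r has ISIs drawn from a gamma distribution with shape a and scale 1/(a*r), giving mean ISI 1/r and CV 1/sqrt(a); shape 1 recovers the Poisson process. A minimal sketch with illustrative numbers (10 Hz, shape 4, hence CV 0.5):

```python
import numpy as np

rng = np.random.default_rng(42)

def gamma_train(rate, shape, n_isis):
    """Renewal spike train with gamma ISIs: mean ISI 1/rate, CV 1/sqrt(shape)."""
    isis = rng.gamma(shape, scale=1.0 / (shape * rate), size=n_isis)
    return np.cumsum(isis)

spikes = gamma_train(rate=10.0, shape=4.0, n_isis=20000)  # regular firing
isis = np.diff(spikes)
cv = isis.std() / isis.mean()   # theory: 1/sqrt(4) = 0.5
```

Shape factors above 1 give the regular trains, shape 1 the temporally random trains; feeding such trains into any conductance-based neuron model reproduces the auto-structure manipulation described in the abstract.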
Affiliation(s)
- Bertram Scheller
- Clinic for Anesthesia, Intensive Care Medicine and Pain Therapy, Johann Wolfgang Goethe University, Frankfurt am Main, Germany
- Marta Castellano
- Institute of Cognitive Science, University of Osnabrück, Osnabrück, Germany
- Department of Neurophysiology, Max-Planck-Institute for Brain Research, Frankfurt am Main, Germany
- Frankfurt Institute for Advanced Studies, Johann Wolfgang Goethe University, Frankfurt am Main, Germany
- Raul Vicente
- Department of Neurophysiology, Max-Planck-Institute for Brain Research, Frankfurt am Main, Germany
- Frankfurt Institute for Advanced Studies, Johann Wolfgang Goethe University, Frankfurt am Main, Germany
- Gordon Pipa
- Institute of Cognitive Science, University of Osnabrück, Osnabrück, Germany
- Department of Neurophysiology, Max-Planck-Institute for Brain Research, Frankfurt am Main, Germany
- Frankfurt Institute for Advanced Studies, Johann Wolfgang Goethe University, Frankfurt am Main, Germany
|
23
|
Deger M, Helias M, Boucsein C, Rotter S. Effective neuronal refractoriness dominates the statistics of superimposed spike trains. BMC Neurosci 2011. [PMCID: PMC3240382] [DOI: 10.1186/1471-2202-12-s1-p273] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.1] [Indexed: 11/10/2022]
|
24
|
Deger M, Helias M, Boucsein C, Rotter S. Statistical properties of superimposed stationary spike trains. J Comput Neurosci 2011; 32:443-63. [PMID: 21964584] [PMCID: PMC3343236] [DOI: 10.1007/s10827-011-0362-8] [Citation(s) in RCA: 20] [Impact Index Per Article: 1.4] [Received: 04/14/2011] [Revised: 09/07/2011] [Accepted: 09/08/2011] [Indexed: 11/28/2022]
Abstract
The Poisson process is an often employed model for the activity of neuronal populations. It is known, though, that superpositions of realistic, non-Poisson spike trains are not in general Poisson processes, not even for large numbers of superimposed processes. Here we construct superimposed spike trains from intracellular in vivo recordings from rat neocortex neurons and compare their statistics to specific point process models. The constructed superimposed spike trains reveal strong deviations from the Poisson model. We find that superpositions of model spike trains that take the effective refractoriness of the neurons into account yield a much better description. A minimal model of this kind is the Poisson process with dead-time (PPD). For this process, and for superpositions thereof, we obtain analytical expressions for some second-order statistical quantities, like the count variability, inter-spike interval (ISI) variability, and ISI correlations, and demonstrate the match with the in vivo data. We conclude that effective refractoriness is the key property that shapes the statistical properties of the superposition spike trains. We present new, efficient algorithms to generate superpositions of PPDs and of gamma processes that can be used to provide more realistic background input in simulations of networks of spiking neurons. Using these generators, we show in simulations that neurons which receive superimposed spike trains as input are highly sensitive to the statistical effects induced by neuronal refractoriness.
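A PPD train is trivial to generate, and its deviation from Poisson statistics can be checked numerically. The sketch below uses illustrative parameters (5 ms dead-time, 100 Hz output rate), not the recording-matched values of the paper: a single PPD has mean ISI d + 1/lambda and ISI CV (1/lambda)/(d + 1/lambda), and counts of a superposition of such trains stay sub-Poissonian, with a Fano factor well below the Poisson value of 1 in long windows.

```python
import numpy as np

rng = np.random.default_rng(7)

def ppd_train(lam, dead, t_max):
    """Poisson process with dead-time: each ISI = dead + Exp(mean 1/lam)."""
    n = int(1.5 * t_max / (dead + 1.0 / lam)) + 100   # generous ISI budget
    isis = dead + rng.exponential(1.0 / lam, n)
    t = np.cumsum(isis)
    return t[t < t_max]

lam, dead, t_max = 200.0, 0.005, 500.0
single = ppd_train(lam, dead, t_max)
isis = np.diff(single)
cv_isi = isis.std() / isis.mean()      # theory: 1/(1 + lam*dead) = 0.5

# superposition of 10 independent PPDs: count statistics stay sub-Poissonian
pooled = np.sort(np.concatenate([ppd_train(lam, dead, t_max)
                                 for _ in range(10)]))
counts = np.bincount((pooled // 1.0).astype(int),
                     minlength=int(t_max))[:int(t_max)]   # 1 s windows
fano = counts.var() / counts.mean()    # far below 1; Poisson would give 1
```

For long windows the Fano factor of a renewal train approaches the squared ISI CV (0.25 here), and superposition leaves it unchanged, which is exactly the non-Poissonian signature the abstract emphasizes.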
Affiliation(s)
- Moritz Deger
- Bernstein Center Freiburg & Faculty of Biology, Albert-Ludwig University, 79104 Freiburg, Germany.
|
25
|
Farkhooi F, Muller E, Nawrot MP. Adaptation reduces variability of the neuronal population code. Phys Rev E 2011; 83:050905. [PMID: 21728481] [DOI: 10.1103/physreve.83.050905] [Citation(s) in RCA: 25] [Impact Index Per Article: 1.8] [Received: 07/16/2010] [Revised: 03/22/2011] [Indexed: 05/31/2023]
Abstract
Sequences of events in noise-driven excitable systems with slow variables often show serial correlations among their intervals of events. Here, we employ a master equation for generalized non-renewal processes to calculate the interval and count statistics of superimposed processes governed by a slow adaptation variable. For an ensemble of neurons with spike-frequency adaptation, this results in the regularization of the population activity and an enhanced postsynaptic signal decoding. We confirm our theoretical results in a population of cortical neurons recorded in vivo.
Affiliation(s)
- Farzad Farkhooi
- Neuroinformatics and Theoretical Neuroscience, Freie Universität Berlin and BCCN-Berlin, Berlin, Germany.
|
26
|
Radulescu AR. Mechanisms explaining transitions between tonic and phasic firing in neuronal populations as predicted by a low dimensional firing rate model. PLoS One 2010; 5:e12695. [PMID: 20877649] [PMCID: PMC2943909] [DOI: 10.1371/journal.pone.0012695] [Citation(s) in RCA: 9] [Impact Index Per Article: 0.6] [Received: 11/20/2009] [Accepted: 08/13/2010] [Indexed: 11/18/2022]
Abstract
Several firing patterns experimentally observed in neural populations have been successfully correlated to animal behavior. Population bursting, here regarded as a period of high firing rate followed by a period of quiescence, is typically observed in groups of neurons during behavior. Biophysical membrane-potential models of single cell bursting involve at least three equations. Extending such models to study the collective behavior of neural populations involves thousands of equations and can be very expensive computationally. For this reason, low dimensional population models that capture biophysical aspects of networks are needed. The present paper uses a firing-rate model to study mechanisms that trigger and stop transitions between tonic and phasic population firing. These mechanisms are captured through a two-dimensional system, which can potentially be extended to include interactions between different areas of the nervous system with a small number of equations. The typical behavior of midbrain dopaminergic neurons in the rodent is used as an example to illustrate and interpret our results. The model presented here can be used as a building block to study interactions between networks of neurons. This theoretical approach may help contextualize and understand the factors involved in regulating burst firing in populations and how it may modulate distinct aspects of behavior.
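A generic two-dimensional rate model of this kind (fast rate with recurrent excitation plus slow adaptation; all parameters below are illustrative, not taken from the paper) already produces the tonic-to-phasic transitions described: the fast rate variable jumps between a high and a low branch while the slow adaptation variable sweeps back and forth, yielding relaxation-oscillation-like population bursts.

```python
import numpy as np

def f(x, theta=0.5, k=0.05):
    """Sigmoidal population gain function."""
    return 1.0 / (1.0 + np.exp(-(x - theta) / k))

# dr/dt = (-r + f(I + w*r - a)) / tau_r   (fast rate, recurrent excitation)
# da/dt = (b*r - a) / tau_a               (slow adaptation)
I, w, b = 0.6, 1.0, 1.2
tau_r, tau_a, dt, steps = 0.005, 0.3, 5e-4, 8000   # 4 s of simulated time

r, a = 0.0, 0.0
trace = np.empty(steps)
for n in range(steps):
    r += dt / tau_r * (-r + f(I + w * r - a))
    a += dt / tau_a * (b * r - a)
    trace[n] = r

# burst onsets = upward crossings of the mid-level r = 0.5
onsets = np.sum((trace[1:] >= 0.5) & (trace[:-1] < 0.5))
```

The recurrent term w*r makes the fast subsystem bistable over a range of adaptation values, and the slow variable drags the system back and forth across the two folds; the single interior fixed point is unstable, so the trajectory settles on a limit cycle of bursts separated by quiescence.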
Affiliation(s)
- Anca R Radulescu
- Department of Psychology, University of Colorado, Boulder, Colorado, USA.
|
27
|
Lu W, Rossoni E, Feng J. On a Gaussian neuronal field model. Neuroimage 2010; 52:913-33. [DOI: 10.1016/j.neuroimage.2010.02.075] [Citation(s) in RCA: 5] [Impact Index Per Article: 0.3] [Received: 07/28/2009] [Revised: 02/09/2010] [Accepted: 02/26/2010] [Indexed: 10/19/2022]
|
28
|
Staude B, Rotter S, Grün S. CuBIC: cumulant based inference of higher-order correlations in massively parallel spike trains. J Comput Neurosci 2010; 29:327-50. [PMID: 19862611] [PMCID: PMC2940040] [DOI: 10.1007/s10827-009-0195-x] [Citation(s) in RCA: 36] [Impact Index Per Article: 2.4] [Received: 12/20/2008] [Revised: 08/07/2009] [Accepted: 09/01/2009] [Indexed: 10/24/2022]
Abstract
Recent developments in electrophysiological and optical recording techniques enable the simultaneous observation of large numbers of neurons. A meaningful interpretation of the resulting multivariate data, however, presents a serious challenge. In particular, the estimation of higher-order correlations that characterize the cooperative dynamics of groups of neurons is impeded by the combinatorial explosion of the parameter space. The resulting requirements with respect to sample size and recording time have rendered the detection of coordinated neuronal groups exceedingly difficult. Here we describe a novel approach to infer higher-order correlations in massively parallel spike trains that is less susceptible to these problems. Based on the superimposed activity of all recorded neurons, the cumulant-based inference of higher-order correlations (CuBIC) presented here exploits the fact that the absence of higher-order correlations also imposes strong constraints on correlations of lower order. Thus, estimates of only a few lower-order cumulants suffice to infer higher-order correlations in the population. As a consequence, CuBIC is far more compatible with the constraints of in vivo recordings than previous approaches, as shown by a systematic analysis of its parameter dependence.
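The central object here is the population spike count and its low-order cumulants. The sketch below uses hypothetical parameters and the standard multiple-interaction scheme (thinned copies of a mother Poisson train) to inject correlations of all orders; it illustrates the statistical signature CuBIC exploits, not the CuBIC test itself. For independent Poisson neurons every cumulant of the binned population count equals its mean, whereas higher-order correlations inflate the third cumulant dramatically.

```python
import numpy as np

rng = np.random.default_rng(3)

n_neurons, eps = 20, 0.3            # copy probability -> correlations of all orders
mother_rate, t_max, bin_w = 20.0, 200.0, 0.005

nbins = int(t_max / bin_w)
# mother Poisson spike count per bin
mother = rng.poisson(mother_rate * bin_w, size=nbins)
# each mother spike is copied to each neuron with prob. eps, so the number of
# neurons firing per mother spike is Binomial(n_neurons, eps); the population
# count per bin is then Binomial(n_neurons * mother_count, eps)
pop = rng.binomial(n_neurons * mother, eps)

k1 = pop.mean()                     # first cumulant
k2 = pop.var()                      # second cumulant
k3 = np.mean((pop - k1) ** 3)       # third cumulant = third central moment
# independent Poisson neurons with the same rate would give k2 == k3 == k1
```

The excess of k3 over k1 is exactly the kind of lower-order constraint violation from which CuBIC infers a minimal order of correlation in the population.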
Affiliation(s)
- Benjamin Staude
- Unit of Statistical Neuroscience, RIKEN Brain Science Institute, Wako-Shi, Japan
- Bernstein Center for Computational Neuroscience, Freiburg & Faculty of Biology, Albert-Ludwig University, Hansastr. 9a, 79104 Freiburg, Germany
- Stefan Rotter
- Bernstein Center for Computational Neuroscience, Freiburg & Faculty of Biology, Albert-Ludwig University, Hansastr. 9a, 79104 Freiburg, Germany
- Sonja Grün
- Unit of Statistical Neuroscience, RIKEN Brain Science Institute, Wako-Shi, Japan
- Bernstein Center for Computational Neuroscience Berlin, Humboldt-Universität zu Berlin, Berlin, Germany
|
29
|
Werner G. Fractals in the nervous system: conceptual implications for theoretical neuroscience. Front Physiol 2010; 1:15. [PMID: 21423358] [PMCID: PMC3059969] [DOI: 10.3389/fphys.2010.00015] [Citation(s) in RCA: 84] [Impact Index Per Article: 5.6] [Received: 04/18/2010] [Accepted: 06/05/2010] [Indexed: 11/15/2022]
Abstract
This essay is presented with two principal objectives in mind: first, to document the prevalence of fractals at all levels of the nervous system, giving credence to the notion of their functional relevance; and second, to draw attention to the as yet still unresolved issues of the detailed relationships among power-law scaling, self-similarity, and self-organized criticality. As regards criticality, I will document that it has become a pivotal reference point in neurodynamics. Furthermore, I will emphasize the not yet fully appreciated significance of allometric control processes. For dynamic fractals, I will assemble reasons for attributing to them the capacity to adapt task execution to contextual changes across a range of scales. The final section consists of general reflections on the implications of the reviewed data, and identifies what appear to be issues of fundamental importance for future research in the rapidly evolving topic of this review.
Affiliation(s)
- Gerhard Werner
- Department of Biomedical Engineering, University of Texas at Austin, Austin, TX, USA.
|
30
|
Shotorban B. Dynamic least-squares kernel density modeling of Fokker-Planck equations with application to neural population. Phys Rev E 2010; 81:046706. [PMID: 20481859] [DOI: 10.1103/physreve.81.046706] [Citation(s) in RCA: 4] [Impact Index Per Article: 0.3] [Received: 12/01/2009] [Revised: 03/13/2010] [Indexed: 11/07/2022]
Abstract
The dynamic least-squares kernel density (LSQKD) model [C. Pantano and B. Shotorban, Phys. Rev. E 76, 066705 (2007)] is used to solve the Fokker-Planck equations. In this model the probability density function (PDF) is approximated by a linear combination of basis functions with unknown parameters whose governing equations are determined by a global least-squares approximation of the PDF in the phase space. In this work basis functions are set to be Gaussian for which the mean, variance, and covariances are governed by a set of partial differential equations (PDEs) or ordinary differential equations (ODEs) depending on what phase-space variables are approximated by Gaussian functions. Three sample problems of univariate double-well potential, bivariate bistable neurodynamical system [G. Deco and D. Martí, Phys. Rev. E 75, 031913 (2007)], and bivariate Brownian particles in a nonuniform gas are studied. The LSQKD is verified for these problems as its results are compared against the results of the method of characteristics in nondiffusive cases and the stochastic particle method in diffusive cases. For the double-well potential problem it is observed that for low to moderate diffusivity the dynamic LSQKD well predicts the stationary PDF for which there is an exact solution. A similar observation is made for the bistable neurodynamical system. In both these problems least-squares approximation is made on all phase-space variables resulting in a set of ODEs with time as the independent variable for the Gaussian function parameters. In the problem of Brownian particles in a nonuniform gas, this approximation is made only for the particle velocity variable leading to a set of PDEs with time and particle position as independent variables. Solving these PDEs, a very good performance by LSQKD is observed for a wide range of diffusivities.
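For a linear drift the Gaussian ansatz underlying such kernel models is exact, which allows a compact sanity check. For the Ornstein-Uhlenbeck process dx = -theta*x dt + sigma dW, a single Gaussian basis function with mean m(t) and variance v(t) obeys dm/dt = -theta*m and dv/dt = -2*theta*v + sigma^2, and relaxes to the known stationary density N(0, sigma^2/(2*theta)). The sketch below integrates only these reduced ODEs with illustrative numbers; it is not the LSQKD machinery itself.

```python
# Fokker-Planck equation for dx = -theta*x dt + sigma dW under a Gaussian
# ansatz p(x,t) = N(m(t), v(t)):
#   dm/dt = -theta * m
#   dv/dt = -2*theta * v + sigma**2
theta, sigma = 2.0, 0.8
dt, steps = 1e-3, 10000            # integrate 10 time units

m, v = 1.5, 0.01                   # sharp initial Gaussian away from the origin
for _ in range(steps):
    m += dt * (-theta * m)
    v += dt * (-2.0 * theta * v + sigma ** 2)

v_stat = sigma ** 2 / (2.0 * theta)   # exact stationary variance, 0.16 here
```

For nonlinear drifts (the double-well and bistable neurodynamical examples of the paper) a single Gaussian no longer suffices, which is precisely where the least-squares fit over a combination of Gaussian basis functions takes over.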
Affiliation(s)
- Babak Shotorban
- Department of Mechanical and Aerospace Engineering, The University of Alabama in Huntsville, Huntsville, Alabama 35899, USA
|
31
|
Rosenbaum RJ, Trousdale J, Josić K. Pooling and correlated neural activity. Front Comput Neurosci 2010; 4:9. [PMID: 20485451] [PMCID: PMC2870944] [DOI: 10.3389/fncom.2010.00009] [Citation(s) in RCA: 21] [Impact Index Per Article: 1.4] [Received: 11/26/2009] [Accepted: 03/24/2010] [Indexed: 11/13/2022]
Abstract
Correlations between spike trains can strongly modulate neuronal activity and affect the ability of neurons to encode information. Neurons integrate inputs from thousands of afferents. Similarly, a number of experimental techniques are designed to record pooled cell activity. We review and generalize a number of previous results that show how correlations between cells in a population can be amplified and distorted in signals that reflect their collective activity. The structure of the underlying neuronal response can significantly impact correlations between such pooled signals. Therefore care needs to be taken when interpreting pooled recordings, or modeling networks of cells that receive inputs from large presynaptic populations. We also show that the frequently observed runaway synchrony in feedforward chains is primarily due to the pooling of correlated inputs.
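The amplification effect has a closed form in the simplest case: for two disjoint pools of n units each, with uniform pairwise correlation rho and unit variances, the correlation between the pooled sums is n*rho / (1 + (n-1)*rho), which approaches 1 as n grows no matter how small rho is. A quick numerical check with hypothetical numbers (n = 50, rho = 0.05):

```python
import numpy as np

rng = np.random.default_rng(11)

n, rho, samples = 50, 0.05, 20000

# common-source construction: x_i = sqrt(rho)*z + sqrt(1-rho)*xi_i gives
# unit-variance signals with uniform pairwise correlation rho
z = rng.standard_normal(samples)
pool1 = (np.sqrt(rho) * z
         + np.sqrt(1 - rho) * rng.standard_normal((n, samples))).sum(axis=0)
pool2 = (np.sqrt(rho) * z
         + np.sqrt(1 - rho) * rng.standard_normal((n, samples))).sum(axis=0)

r_emp = np.corrcoef(pool1, pool2)[0, 1]
r_pred = n * rho / (1 + (n - 1) * rho)   # about 0.72 for these numbers
```

A weak single-unit correlation of 0.05 thus appears as a correlation above 0.7 between the pooled signals, which is the distortion the abstract warns about when interpreting pooled recordings.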
Affiliation(s)
- Robert J Rosenbaum
- Department of Mathematics, College of Natural Sciences and Mathematics, University of Houston, Houston, TX, USA
|
32
|
33
|
Shinozaki T, Okada M, Reyes AD, Câteau H. Flexible traffic control of the synfire-mode transmission by inhibitory modulation: nonlinear noise reduction. Phys Rev E 2010; 81:011913. [PMID: 20365405] [DOI: 10.1103/physreve.81.011913] [Citation(s) in RCA: 7] [Impact Index Per Article: 0.5] [Received: 05/18/2009] [Revised: 12/02/2009] [Indexed: 05/29/2023]
Abstract
Intermingled neural connections apparent in the brain make us wonder what controls the traffic of propagating activity in the brain to secure signal transmission without harmful crosstalk. Here, we reveal that inhibitory input but not excitatory input works as a particularly useful traffic controller because it bidirectionally controls both the degree of synchrony and the size of the population firing of neurons. Our dynamical system analysis reveals that the synchrony enhancement depends crucially on the nonlinear membrane potential dynamics and a hidden slow dynamical variable. Our electrophysiological study with rodent slice preparations shows that the phenomenon happens in real neurons. Furthermore, our analysis with the Fokker-Planck equations demonstrates the phenomenon in a semianalytical manner.
|
34
|
Wang S, Zhou C. Rate-synchrony relationship between input and output of spike trains in neuronal networks. Phys Rev E 2010; 81:011917. [PMID: 20365409] [DOI: 10.1103/physreve.81.011917] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.1] [Received: 08/27/2009] [Revised: 12/06/2009] [Indexed: 05/29/2023]
Abstract
Neuronal networks interact via spike trains. How the spike trains are transformed by neuronal networks is critical for understanding the underlying mechanism of information processing in the nervous system. Both the rate and synchrony of the spikes can affect the transmission, while the relationship between them has not been fully understood. Here we investigate the mapping between input and output spike trains of a neuronal network in terms of firing rate and synchrony. With a large enough input rate, the working mode of the neurons gradually changes from temporal integration to coincidence detection as the synchrony degree of input spike trains increases. Since the membrane potentials of the neurons can be depolarized to near the firing threshold by uncorrelated input spikes, small input synchrony can cause large output synchrony. On the other hand, the synchrony in the output may be reduced when the input rate is too small. The case of the feedforward network can be regarded as an iterative process of such an input-output relationship. The activity in deep layers of the feedforward network is all-or-none, depending on the input rate and synchrony.
Affiliation(s)
- Sentao Wang
- Department of Physics, Hong Kong Baptist University, Kowloon Tong, Hong Kong
35
Chen CC, Jasnow D. Mean-field theory of a plastic network of integrate-and-fire neurons. Phys Rev E Stat Nonlin Soft Matter Phys 2010; 81:011907. [PMID: 20365399 DOI: 10.1103/physreve.81.011907]
Abstract
We consider a noise-driven network of integrate-and-fire neurons. The network evolves as a result of the activities of the neurons following spike-timing-dependent plasticity rules. We apply a self-consistent mean-field theory to the system to obtain the mean activity level as a function of the mean synaptic weight, which predicts a first-order transition and hysteresis between a noise-dominated regime and a regime of persistent neural activity. Assuming Poisson firing statistics for the neurons, the plasticity dynamics of a synapse under the influence of the mean-field environment can be mapped to the dynamics of an asymmetric random walk in synaptic-weight space. Using a master equation for small steps, we predict a narrow distribution of synaptic weights that scales with the square root of the plasticity rate for the stationary state of the system, given plausible physiological parameter values describing neural transmission and plasticity. The dependence of the distribution on the synaptic weight of the mean-field environment allows us to determine the mean synaptic weight self-consistently. The effects of fluctuations in the total synaptic conductance and in the plasticity step sizes are also considered: such fluctuations result in a smoothing of the first-order transition for a low number of afferent synapses per neuron and a broadening of the synaptic-weight distribution, respectively.
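The random-walk picture can be illustrated with a minimal simulation (a sketch with illustrative parameters, not the paper's model): a synaptic weight takes potentiation or depression steps of fixed size, with a probability bias that drifts it toward a fixed point. The stationary spread of the weight then scales with the square root of the step size, mirroring the square-root scaling with plasticity rate.

```python
import random

def weight_walk_std(step, bias_gain=0.5, w_star=0.5, n_steps=200_000, seed=1):
    """Simulate an asymmetric random walk in synaptic-weight space.

    Each update is +step (potentiation) with probability
    p = 0.5 - bias_gain * (w - w_star), else -step (depression),
    so the walk drifts back toward w_star. Illustrative parameters only.
    """
    rng = random.Random(seed)
    w = w_star
    samples = []
    for t in range(n_steps):
        p_pot = 0.5 - bias_gain * (w - w_star)
        w += step if rng.random() < p_pot else -step
        w = min(max(w, 0.0), 1.0)      # keep the weight in [0, 1]
        if t > n_steps // 10:          # discard burn-in
            samples.append(w)
    mean = sum(samples) / len(samples)
    var = sum((x - mean) ** 2 for x in samples) / len(samples)
    return var ** 0.5

# Quartering the step size should roughly halve the stationary spread.
std_big = weight_walk_std(step=0.02)
std_small = weight_walk_std(step=0.005)
print(std_big / std_small)  # close to 2
```

The OU-like balance of drift and diffusion gives a stationary variance proportional to the step size, hence the square-root scaling of the weight spread.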
Affiliation(s)
- Chun-Chung Chen
- Department of Physics and Astronomy, University of Pittsburgh, Pittsburgh, Pennsylvania 15260, USA
36
Zhang X, You G, Chen T, Feng J. Maximum likelihood decoding of neuronal inputs from an interspike interval distribution. Neural Comput 2009; 21:3079-105. [PMID: 19635019 DOI: 10.1162/neco.2009.06-08-807]
Abstract
An expression for the probability distribution of the interspike interval of a leaky integrate-and-fire (LIF) model neuron is rigorously derived, based on recent developments in the theory of stochastic processes. This enables us, for the first time, to develop maximum likelihood estimates (MLEs) of the input information (e.g., afferent rate and variance) for an LIF neuron from a set of recorded spike trains. Dynamic inputs to pools of LIF neurons, both with and without interactions, are efficiently and reliably decoded by applying the MLE, even within time windows as short as 25 msec.
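The full LIF interspike-interval likelihood is beyond an abstract-level sketch, but the estimation logic can be shown in the simplest special case (a simplifying assumption, not the paper's derivation): for a Poisson neuron the ISIs are exponential, and the maximum likelihood estimate of the input rate is the reciprocal of the mean ISI.

```python
import random

def mle_rate_from_isis(isis):
    """MLE of the rate of an exponential ISI distribution.

    The exponential log-likelihood n*log(r) - r*sum(isis) is maximized
    at r = n / sum(isis), i.e., the reciprocal of the mean ISI.
    """
    return len(isis) / sum(isis)

rng = random.Random(0)
true_rate = 20.0                                     # spikes per second
isis = [rng.expovariate(true_rate) for _ in range(5000)]
rate_hat = mle_rate_from_isis(isis)
print(rate_hat)  # close to 20
```

For the LIF model the ISI density has no such closed-form maximizer, which is why the derived distribution in the paper matters; the workflow (record ISIs, maximize the likelihood over input parameters) is the same.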
Affiliation(s)
- Xuejuan Zhang
- Mathematical Department, Zhejiang Normal University, Jinhua, PR China.
37
Shinomoto S, Kim H, Shimokawa T, Matsuno N, Funahashi S, Shima K, Fujita I, Tamura H, Doi T, Kawano K, Inaba N, Fukushima K, Kurkin S, Kurata K, Taira M, Tsutsui KI, Komatsu H, Ogawa T, Koida K, Tanji J, Toyama K. Relating neuronal firing patterns to functional differentiation of cerebral cortex. PLoS Comput Biol 2009; 5:e1000433. [PMID: 19593378 PMCID: PMC2701610 DOI: 10.1371/journal.pcbi.1000433]
Abstract
It has been empirically established that the cerebral cortical areas defined by Brodmann one hundred years ago, solely on the basis of cellular organization, are closely correlated with their functions, such as sensation, association, and motion. Cytoarchitectonically distinct cortical areas have different densities and types of neurons, so signaling patterns may also vary among them. To examine how neuronal signaling patterns are related to innate cortical functions, we detected intrinsic features of cortical firing by devising a metric that efficiently isolates non-Poisson irregular characteristics, independent of the spike-rate fluctuations caused extrinsically by ever-changing behavioral conditions. Using the new metric, we analyzed spike trains from over 1,000 neurons in 15 cortical areas sampled by eight independent neurophysiological laboratories. Analysis of firing-pattern dissimilarities across cortical areas revealed a gradient of firing regularity that corresponded closely to the functional category of the cortical area: neuronal spiking patterns are regular in motor areas, random in visual areas, and bursty in the prefrontal area. Thus, signaling patterns may play an important role in function-specific cerebral cortical computation. Neurons, or nerve cells in the brain, communicate with each other using stereotyped electric pulses called spikes. It is believed that neurons convey information mainly through the frequency of the transmitted spikes, called the firing rate. In addition, neurons may communicate some information through the finer temporal patterns of the spikes. Neuronal firing patterns may depend on cellular organization, which varies among the regions of the brain according to the roles they play, such as sensation, association, and motion.
In order to examine the relationship among signals, structure, and function, we devised a metric to detect firing irregularity intrinsic and specific to individual neurons and analyzed spike sequences from over 1,000 neurons in 15 different cortical areas. Here we report two results of this study. First, we found that neurons exhibit stable firing patterns that can be characterized as “regular”, “random”, and “bursty”. Second, we observed a strong correlation between the type of signaling pattern exhibited by neurons in a given area and the function of that area. This suggests that, in addition to reflecting the cellular organization of the brain, neuronal signaling patterns may also play a role in specific types of neuronal computations.
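A widely used irregularity measure in this spirit is the local variation Lv (the paper's own metric is a refinement of it, so treat this as a related illustration, not the exact statistic): Lv compares each pair of adjacent ISIs, which makes it insensitive to slow rate modulation, and distinguishes regular, random, and bursty trains.

```python
import random

def local_variation(isis):
    """Local variation Lv of a sequence of interspike intervals.

    Lv = 3/(n-1) * sum(((I_i - I_{i+1}) / (I_i + I_{i+1}))**2).
    Adjacent-ISI comparison cancels slow rate drifts; Lv ~ 0 for
    regular firing, ~ 1 for Poisson firing, > 1 for bursty firing.
    """
    n = len(isis)
    s = sum(((isis[i] - isis[i + 1]) / (isis[i] + isis[i + 1])) ** 2
            for i in range(n - 1))
    return 3.0 * s / (n - 1)

rng = random.Random(42)
regular = [0.1] * 5000                                      # clock-like train
poisson = [rng.expovariate(10.0) for _ in range(5000)]      # random train
bursty = [0.01 if i % 2 == 0 else 1.0 for i in range(5000)] # short/long alternation

print(local_variation(regular))   # 0
print(local_variation(poisson))   # close to 1
print(local_variation(bursty))    # well above 1
```

For exponential ISIs each squared adjacent-difference term has expectation 1/3, so the factor of 3 normalizes Poisson firing to Lv = 1.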
Affiliation(s)
- Shigeru Shinomoto
- Graduate School of Science, Kyoto University, Sakyo-ku, Kyoto, Japan.
38
Ly C, Tranchina D. Spike train statistics and dynamics with synaptic input from any renewal process: a population density approach. Neural Comput 2009; 21:360-96. [PMID: 19431264 DOI: 10.1162/neco.2008.03-08-743]
Abstract
In the probability density function (PDF) approach to neural network modeling, a common simplifying assumption is that the arrival times of elementary postsynaptic events are governed by a Poisson process. This assumption ignores temporal correlations in the input that sometimes have important physiological consequences. We extend PDF methods to models with synaptic event times governed by any modulated renewal process. We focus on the integrate-and-fire neuron with instantaneous synaptic kinetics and a random elementary excitatory postsynaptic potential (EPSP), A. Between presynaptic events, the membrane voltage, v, decays exponentially toward rest, while s, the time since the last synaptic input event, evolves with unit velocity. When a synaptic event arrives, v jumps by A, and s is reset to zero. If v crosses the threshold voltage, an action potential occurs, and v is reset to v_reset. The probability per unit time of a synaptic event at time t, given the elapsed time s since the last event, h(s, t), depends on the specifics of the renewal process. We study how regularity of the train of synaptic input events affects the output spike rate, the PDF and coefficient of variation (CV) of the interspike interval, and the autocorrelation function of the output spike train. In the limit of a deterministic, clocklike train of input events, the PDF of the interspike interval converges to a sum of delta functions, with coefficients determined by the PDF for A. The limiting autocorrelation function of the output spike train is a sum of delta functions whose coefficients fall under a damped oscillatory envelope. When the EPSP CV, σ_A/μ_A, is equal to 0.45, a CV for the intersynaptic event interval of σ_T/μ_T = 0.35 is functionally equivalent to a deterministic periodic train of synaptic input events (CV = 0) with respect to spike statistics. We discuss the relevance to neural network simulations.
Affiliation(s)
- Cheng Ly
- Department of Mathematics, University of Pittsburgh, Pittsburgh, PA 15260, USA.
39
Wang Y, Lai YC, Zheng Z. Onset of colored-noise-induced synchronization in chaotic systems. Phys Rev E Stat Nonlin Soft Matter Phys 2009; 79:056210. [PMID: 19518539 DOI: 10.1103/physreve.79.056210]
Abstract
We develop and validate an algorithm for integrating stochastic differential equations under green noise. Utilizing it and the standard methods for computing dynamical systems under red and white noise, we address the problem of synchronization among chaotic oscillators in the presence of common colored noise. We find that colored noise can induce synchronization, but the onset of synchronization, as characterized by the value of the critical noise amplitude above which synchronization occurs, can be different for noise of different colors. A formula relating the critical noise amplitudes among red, green, and white noise is uncovered, which holds for both complete and phase synchronization. The formula suggests practical strategies for controlling the degree of synchronization by noise, e.g., utilizing noise filters to suppress synchronization.
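The paper's green-noise integrator is specific to its formulation, but red noise, one of the colors compared, can be generated with the standard exact discretization of an Ornstein-Uhlenbeck process; the parameters below are illustrative.

```python
import math, random

def ou_red_noise(n, dt=0.1, tau=1.0, sigma=1.0, seed=7):
    """Exact discretization of an Ornstein-Uhlenbeck ('red noise') process.

    The update x <- x*exp(-dt/tau) + sigma*sqrt(1 - exp(-2*dt/tau))*xi,
    with xi standard normal, is exact for any step size: the stationary
    variance is sigma**2 and the lag-dt autocorrelation is exp(-dt/tau).
    """
    rng = random.Random(seed)
    decay = math.exp(-dt / tau)
    drive = sigma * math.sqrt(1.0 - decay * decay)
    x, xs = 0.0, []
    for _ in range(n):
        x = x * decay + drive * rng.gauss(0.0, 1.0)
        xs.append(x)
    return xs

xs = ou_red_noise(100_000)
mean = sum(xs) / len(xs)
var = sum((v - mean) ** 2 for v in xs) / len(xs)
lag1 = (sum((xs[i] - mean) * (xs[i + 1] - mean) for i in range(len(xs) - 1))
        / (len(xs) - 1)) / var
print(var, lag1)   # variance near 1, lag-1 correlation near exp(-0.1)
```

Because the update is exact rather than an Euler approximation, the same driving noise can be fed to coupled chaotic oscillators at any integration step without distorting the noise spectrum.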
Affiliation(s)
- Yan Wang
- Department of Electrical Engineering, Arizona State University, Tempe, Arizona 85287, USA
40
Abstract
Long-term potentiation of synapse strength requires enlargement of dendritic spines on cerebral pyramidal neurons. Long-term depression is linked to spine shrinkage. Indeed, spines are dynamic structures: they form, change their shapes and volumes, or can disappear in the space of hours. Do all such changes result from synaptic activity, or do some changes result from intrinsic processes? How do enlargement and shrinkage of spines relate to elimination and generation of spines, and how do these processes contribute to the stationary distribution of spine volumes? To answer these questions, we recorded the volumes of many individual spines daily for several days using two-photon imaging of CA1 pyramidal neurons in cultured slices of rat hippocampus between postnatal days 17 and 23. With normal synaptic transmission, spines often changed volume or were created or eliminated, thereby showing activity-dependent plasticity. However, we found that spines changed volume even after we blocked synaptic activity, reflecting a native instability of these small structures over the long term. Such "intrinsic fluctuations" showed unique dependence on spine volume. A mathematical model constructed from these data and the theory of random fluctuations explains population behaviors of spines, such as rates of elimination and generation, stationary distribution of volumes, and the long-term persistence of large spines. Our study finds that generation and elimination of spines are more prevalent than previously believed, and spine volume shows significant correlation with its age and life expectancy. The population dynamics of spines also predict key psychological features of memory.
41
Liu CY, Nykamp DQ. A kinetic theory approach to capturing interneuronal correlation: the feed-forward case. J Comput Neurosci 2008; 26:339-68. [PMID: 18987967 DOI: 10.1007/s10827-008-0116-4]
Abstract
We present an approach for using kinetic theory to capture first and second order statistics of neuronal activity. We coarse grain neuronal networks into populations of neurons and calculate the population average firing rate and output cross-correlation in response to time varying correlated input. We derive coupling equations for the populations based on first and second order statistics of the network connectivity. This coupling scheme is based on the hypothesis that second order statistics of the network connectivity are sufficient to determine second order statistics of neuronal activity. We implement a kinetic theory representation of a simple feed-forward network and demonstrate that the kinetic theory model captures key aspects of the emergence and propagation of correlations in the network, as long as the correlations do not become too strong. By analyzing the correlated activity of feed-forward networks with a variety of connectivity patterns, we provide evidence supporting our hypothesis of the sufficiency of second order connectivity statistics.
Affiliation(s)
- Chin-Yueh Liu
- School of Mathematics, University of Minnesota, 206 Church St., Minneapolis, MN 55455, USA
42
Moreno-Bote R, Renart A, Parga N. Theory of input spike auto- and cross-correlations and their effect on the response of spiking neurons. Neural Comput 2008; 20:1651-705. [DOI: 10.1162/neco.2008.03-07-497]
Abstract
Spike correlations between neurons are ubiquitous in the cortex, but their role is not understood. Here we describe the firing response of a leaky integrate-and-fire (LIF) neuron when it receives temporally correlated input generated by presynaptic correlated neuronal populations. Input correlations are characterized in terms of the firing rates, Fano factors, correlation coefficients, and correlation timescale of the neurons driving the target neuron. We show that the sum of the presynaptic spike trains cannot be well described by a Poisson process. In fact, the total input current has a nontrivial two-point correlation function described by two main parameters: the correlation timescale (how precise the input correlations are in time) and the correlation magnitude (how strong they are). Therefore, the total current generated by the input spike trains is not well described by a white-noise Gaussian process. Instead, we model the total current as a colored Gaussian process with the same mean and two-point correlation function, leading to a formulation of the problem in terms of a Fokker-Planck equation. Solutions for the output firing rate are found in the limits of short and long correlation timescales. The solutions described here expand and improve on our previous results (Moreno, de la Rocha, Renart, & Parga, 2002) by presenting new analytical expressions for the output firing rate for general IF neurons, extending the validity of the results to arbitrarily large correlation magnitudes, and describing the differential effect of correlations in the mean-driven and noise-dominated firing regimes. The details of this novel formalism are also given here for the first time. We employ numerical simulations to confirm the analytical solutions and to study the firing response to sudden changes in the input correlations.
We expect this formalism to be useful for the study of correlations in neuronal networks and their role in neural processing and information transmission.
Affiliation(s)
- Rubén Moreno-Bote
- Departamento de Física Teórica. Universidad Autónoma de Madrid, Cantoblanco 28049, Madrid, Spain
- Alfonso Renart
- Departamento de Física Teórica. Universidad Autónoma de Madrid, Cantoblanco 28049, Madrid, Spain
- Néstor Parga
- Departamento de Física Teórica. Universidad Autónoma de Madrid, Cantoblanco 28049, Madrid, Spain
43
Pawlas Z, Klebanov LB, Prokop M, Lansky P. Parameters of spike trains observed in a short time window. Neural Comput 2008; 20:1325-43. [DOI: 10.1162/neco.2007.01-07-442]
Abstract
We study the estimation of statistical moments of interspike intervals based on observation of spike counts in many independent short time windows. This scenario corresponds to the situation in which a target neuron receives information from many neurons and has to respond within a short time interval. The precision of the estimation procedures is examined. As models for neuronal activity, two examples of stationary point processes are considered: a renewal process and a doubly stochastic Poisson process. Both moment and maximum likelihood estimators are investigated, and not only the mean but also the coefficient of variation is estimated. In accordance with our expectations, numerical studies confirm that the estimation of the mean interspike interval is more reliable than the estimation of the coefficient of variation. The error of estimation increases with increasing mean interspike interval, which is equivalent to decreasing the window size (fewer events are observed per window), and with a decreasing number of neurons (fewer windows).
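A moment-estimator version of this setup can be sketched as follows (illustrative parameters, not the paper's exact estimators): for a stationary renewal process observed in windows of length T much longer than the mean ISI μ, the mean count is T/μ and the count variance is approximately (T/μ)·CV², so both the mean ISI and the CV can be read off the window counts.

```python
import random

def window_count_estimates(isis, window):
    """Estimate mean ISI and CV from spike counts in disjoint windows.

    For a stationary renewal process, E[N] = T/mu and, for T >> mu,
    Var[N] ~ (T/mu) * CV**2, giving the moment estimators below.
    """
    times, t = [], 0.0
    for isi in isis:
        t += isi
        times.append(t)
    n_windows = int(t // window)
    counts = [0] * n_windows
    for s in times:
        k = int(s // window)
        if k < n_windows:
            counts[k] += 1
    mean_n = sum(counts) / n_windows
    var_n = sum((c - mean_n) ** 2 for c in counts) / n_windows
    return window / mean_n, (var_n / mean_n) ** 0.5   # (mean ISI, CV)

rng = random.Random(3)
mu = 0.05                                    # 20 Hz mean firing rate
poisson_isis = [rng.expovariate(1 / mu) for _ in range(100_000)]           # CV = 1
gamma_isis = [rng.gammavariate(4.0, mu / 4.0) for _ in range(100_000)]     # CV = 0.5

mu_p, cv_p = window_count_estimates(poisson_isis, window=2.0)
mu_g, cv_g = window_count_estimates(gamma_isis, window=2.0)
print(mu_p, cv_p)   # near 0.05 and 1
print(mu_g, cv_g)   # near 0.05 and 0.5
```

Consistent with the abstract, the mean-ISI estimate is tight while the CV estimate is noisier, since it depends on the count variance rather than the count mean.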
Affiliation(s)
- Zbyněk Pawlas
- Department of Probability and Mathematical Statistics, Faculty of Mathematics and Physics, Charles University, 186 75 Prague 8, Czech Republic
- Lev B. Klebanov
- Department of Probability and Mathematical Statistics, Faculty of Mathematics and Physics, Charles University, 186 75 Prague 8, Czech Republic
- Martin Prokop
- Department of Probability and Mathematical Statistics, Faculty of Mathematics and Physics, Charles University, 186 75 Prague 8, Czech Republic
- Petr Lansky
- Institute of Physiology, Academy of Sciences of the Czech Republic, 142 20 Prague 4, Czech Republic
44
Câteau H, Kitano K, Fukai T. Interplay between a phase response curve and spike-timing-dependent plasticity leading to wireless clustering. Phys Rev E Stat Nonlin Soft Matter Phys 2008; 77:051909. [PMID: 18643104 DOI: 10.1103/physreve.77.051909]
Abstract
A phase response curve (PRC) characterizes, in a minimal manner, the signal transduction between oscillators such as neurons on a fixed network, while spike-timing-dependent plasticity (STDP) characterizes how networks are rewired in an activity-dependent manner. This paper demonstrates that these two key properties, both of which depend on the timing of interactions between oscillators, work synergistically to carve functionally useful circuits. STDP acting on neurons that prefer asynchrony converts the initially asynchronous firing into clustered firing with synchrony within each cluster. The neurons become synchronized within a cluster despite their preference for asynchrony because STDP selectively disrupts intracluster connections, a phenomenon we call wireless clustering. Our PRC analysis reveals a triad mechanism: the network structure affects how the PRC is read out to determine the synchrony tendency, the synchrony tendency affects how STDP works, and STDP affects the network structure, closing the loop.
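The STDP side of this interplay is conventionally summarized by an exponential learning window; the amplitudes and time constants below are generic illustrative values, not the paper's fitted parameters.

```python
import math

def stdp_dw(dt_ms, a_plus=0.01, a_minus=0.012, tau_plus=20.0, tau_minus=20.0):
    """Weight change for a spike-time difference dt = t_post - t_pre (ms).

    Pre-before-post (dt > 0) potentiates and post-before-pre (dt < 0)
    depresses, each contribution decaying exponentially with the lag.
    """
    if dt_ms >= 0:
        return a_plus * math.exp(-dt_ms / tau_plus)
    return -a_minus * math.exp(dt_ms / tau_minus)

print(stdp_dw(10.0))    # positive: potentiation
print(stdp_dw(-10.0))   # negative: depression
```

It is the sign flip at dt = 0 that lets STDP selectively weaken connections between near-synchronous neurons within a cluster, as the abstract describes.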
Affiliation(s)
- Hideyuki Câteau
- Laboratory for Neural Circuit Theory, RIKEN Brain Science Institute, 2-1 Hirosawa, Wako, Saitama 351-0198, Japan
45
Kang S, Kitano K, Fukai T. Structure of spontaneous UP and DOWN transitions self-organizing in a cortical network model. PLoS Comput Biol 2008; 4:e1000022. [PMID: 18369421 PMCID: PMC2265465 DOI: 10.1371/journal.pcbi.1000022]
Abstract
Synaptic plasticity is considered to play a crucial role in the experience-dependent self-organization of local cortical networks. In the absence of sensory stimuli, cerebral cortex exhibits spontaneous membrane potential transitions between an UP and a DOWN state. To reveal how cortical networks develop spontaneous activity, or conversely, how spontaneous activity structures cortical networks, we analyze the self-organization of a recurrent network model of excitatory and inhibitory neurons, which is realistic enough to replicate UP-DOWN states, with spike-timing-dependent plasticity (STDP). The individual neurons in the self-organized network exhibit a variety of temporal patterns in the two-state transitions. In addition, the model develops a feed-forward network-like structure that produces a diverse repertoire of precise sequences of the UP state. Our model shows that the self-organized activity well resembles the spontaneous activity of cortical networks if STDP is accompanied by the pruning of weak synapses. These results suggest that the two-state membrane potential transitions play an active role in structuring local cortical circuits.
Affiliation(s)
- Siu Kang
- Laboratory for Neural Circuit Theory, RIKEN Brain Science Institute, Wako, Japan
- Katsunori Kitano
- Department of Computer Science, Ritsumeikan University, Shiga, Japan
- Tomoki Fukai
- Laboratory for Neural Circuit Theory, RIKEN Brain Science Institute, Wako, Japan
46
Balanced excitatory and inhibitory inputs to cortical neurons decouple firing irregularity from rate modulations. J Neurosci 2008; 27:13802-12. [PMID: 18077692 DOI: 10.1523/jneurosci.2452-07.2007]
Abstract
In vivo cortical neurons are known to exhibit highly irregular spike patterns. Because the intervals between successive spikes fluctuate greatly, irregular neuronal firing makes it difficult to estimate instantaneous firing rates accurately. If, however, the irregularity of spike timing is decoupled from rate modulations, the estimate of firing rate can be improved. Here, we introduce a novel coding scheme to make the firing irregularity orthogonal to the firing rate in information representation. The scheme is valid if an interspike interval distribution can be well fitted by the gamma distribution and the firing irregularity is constant over time. We investigated in a computational model whether fluctuating external inputs may generate gamma process-like spike outputs, and whether the two quantities are actually decoupled. Whole-cell patch-clamp recordings of cortical neurons were performed to confirm the predictions of the model. The output spikes were well fitted by the gamma distribution. The firing irregularity remained approximately constant regardless of the firing rate when we injected a balanced input, in which excitatory and inhibitory synapses are activated concurrently while keeping their conductance ratio fixed. The degree of irregular firing depended on the effective reversal potential set by the balance between excitation and inhibition. In contrast, when we modulated conductances out of balance, the irregularity varied with the firing rate. These results indicate that the balanced input may improve the efficiency of neural coding by clamping the firing irregularity of cortical neurons. We demonstrate how this novel coding scheme facilitates stimulus decoding.
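The decoupling at the heart of this coding scheme can be checked in a toy simulation (a sketch, not the recorded data): if ISIs follow a gamma distribution with a fixed shape parameter κ, the CV equals 1/sqrt(κ) regardless of the firing rate, so rate and irregularity are represented orthogonally.

```python
import random

def isi_cv(isis):
    """Coefficient of variation (std/mean) of interspike intervals."""
    m = sum(isis) / len(isis)
    var = sum((x - m) ** 2 for x in isis) / len(isis)
    return var ** 0.5 / m

def gamma_isis(rate_hz, shape, n, seed):
    """Gamma ISIs with mean 1/rate_hz and CV = 1/sqrt(shape)."""
    rng = random.Random(seed)
    scale = 1.0 / (rate_hz * shape)
    return [rng.gammavariate(shape, scale) for _ in range(n)]

# Same shape (irregularity), a tenfold change in rate.
cv_slow = isi_cv(gamma_isis(rate_hz=5.0, shape=4.0, n=20_000, seed=1))
cv_fast = isi_cv(gamma_isis(rate_hz=50.0, shape=4.0, n=20_000, seed=2))
print(cv_slow, cv_fast)   # both close to 0.5
```

This is the sense in which a balanced input "clamps" irregularity: the gamma shape stays constant while only the scale (and hence the rate) is modulated.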
47
Bianco S, Ignaccolo M, Rider MS, Ross MJ, Winsor P, Grigolini P. Brain, music, and non-Poisson renewal processes. Phys Rev E Stat Nonlin Soft Matter Phys 2007; 75:061911. [PMID: 17677304 DOI: 10.1103/physreve.75.061911]
Abstract
In this paper we show that both music composition and brain function, as revealed by electroencephalogram (EEG) analysis, are renewal non-Poisson processes living in the nonergodic dominion. To reach this important conclusion we process the data with the minimum spanning tree method so as to detect significant events, thereby building a sequence of times, which is the time series to analyze. Then we show that in both cases, EEG and music composition, these significant events are the signature of a non-Poisson renewal process. This conclusion is reached using a technique of statistical analysis recently developed by our group, the aging experiment (AE). First, we find that in both cases the distances between two consecutive events are described by nonexponential histograms, thereby proving the non-Poisson nature of these processes. The corresponding survival probabilities Ψ(t) are well fitted by stretched exponentials [Ψ(t) ∝ exp(−(γt)^α), with 0.5 < α < 1]. The second step rests on the adoption of the AE, which shows that these are renewal processes. We show that the stretched exponential, due to its renewal character, is the emerging tip of an iceberg whose underwater part has slow tails with an inverse-power-law structure with power index μ = 1 + α. Adopting the AE procedure we find that both EEG and music composition yield μ < 2. On the basis of the recently discovered complexity matching effect, according to which a complex system S with μ_S < 2 responds only to a complex driving signal P with μ_P ≤ μ_S, we conclude that the results of our analysis may explain the influence of music on the human brain.
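The stretched-exponential survival function is straightforward to sample by inverse transform (a hedged sketch; the aging-experiment analysis itself is beyond this snippet): setting Ψ(t) = U for uniform U gives t = (−ln U)^(1/α)/γ, i.e., a Weibull distribution with shape α < 1.

```python
import math, random

def stretched_exp_times(n, gamma=1.0, alpha=0.7, seed=11):
    """Sample waiting times with survival Psi(t) = exp(-(gamma*t)**alpha).

    Inverse-transform sampling: Psi(t) = U  =>  t = (-ln U)**(1/alpha) / gamma.
    With alpha < 1 this gives the heavy-ish tails typical of the regime
    0.5 < alpha < 1 discussed in the abstract.
    """
    rng = random.Random(seed)
    return [(-math.log(rng.random())) ** (1.0 / alpha) / gamma
            for _ in range(n)]

times = stretched_exp_times(100_000)
# Empirical survival at t = 1 should match Psi(1) = exp(-1) ~ 0.368.
surv_at_1 = sum(t > 1.0 for t in times) / len(times)
print(surv_at_1)
```

Note that the paper's inverse-power-law tail with index μ = 1 + α belongs to the underlying renewal process revealed by the aging experiment, not to the Weibull fit itself; the snippet only reproduces the fitted survival curve.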
Affiliation(s)
- Simone Bianco
- Center for Nonlinear Science, University of North Texas, P.O. Box 311427, Denton, Texas 76203-1427, USA
48
Apfaltrer F, Ly C, Tranchina D. Population density methods for stochastic neurons with realistic synaptic kinetics: firing rate dynamics and fast computational methods. Network 2006; 17:373-418. [PMID: 17162461 DOI: 10.1080/09548980601069787]
Abstract
An outstanding problem in computational neuroscience is how to use population density function (PDF) methods to model neural networks with realistic synaptic kinetics in a computationally efficient manner. We explore an application of two-dimensional (2-D) PDF methods to simulating electrical activity in networks of excitatory integrate-and-fire neurons. We formulate a pair of coupled partial differential-integral equations describing the evolution of the PDFs for neurons in non-refractory and refractory pools. The population firing rate is given by the total flux of probability across the threshold voltage. We use an operator-splitting method to reduce computation time. We report on the speed and accuracy of the PDF results and compare them to those from direct Monte Carlo simulations. We compute temporal frequency response functions for the transduction from the rate of postsynaptic input to population firing rate, and examine their dependence on the background synaptic input rate. The behaviors in the 1-D and 2-D cases--corresponding to instantaneous and non-instantaneous synaptic kinetics, respectively--differ markedly from those for a somewhat different transduction: from injected current input to population firing rate output (Brunel et al. 2001; Fourcaud & Brunel 2002). We extend our method by adding inhibitory input, consider a 3-D to 2-D dimension-reduction method, demonstrate its limitations, and suggest directions for future study.
Affiliation(s)
- Felix Apfaltrer
- Courant Institute of Mathematical Sciences, New York University, New York, NY 10012, USA
49
Gourévitch B, Eggermont JJ. A nonparametric approach for detection of bursts in spike trains. J Neurosci Methods 2006; 160:349-58. [PMID: 17070926 DOI: 10.1016/j.jneumeth.2006.09.024]
Abstract
In spike-train data, bursts are considered as a unit of neural information and are of potential interest in studies of responses to any sensory stimulus. Consequently, burst detection appears to be a critical problem for which the Poisson-surprise (PS) method has been widely used for 20 years. However, this method has faced some recurrent criticism about the underlying assumptions regarding the interspike interval (ISI) distributions. In this paper, we avoid such assumptions by using a nonparametric approach for burst detection based on the ranks of ISI in the entire spike train. Similar to the PS statistic, a "Rank surprise" (RS) statistic is extracted. A new algorithm performing an exhaustive search of bursts in the spike trains is also presented. Compared to the performances of the PS method on realizations of gamma renewal processes and spike trains recorded in cat auditory cortex, we show that the RS method is very robust for any type of ISI distribution and is based on an elementary formalization of the definition of a burst. It presents an alternative to the PS method for non-Poisson spike trains and is simple to implement.
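A stripped-down version of the rank idea can be sketched as follows (the published RS method uses the exact distribution of a sum of discrete uniform ranks and an exhaustive search; the sliding window and normal approximation below are simplifying assumptions): rank all ISIs, slide a window over the rank sequence, and flag windows whose rank sum is improbably small.

```python
import math, random

def rank_burst_scan(isis, q=8):
    """Scan for bursts as windows of q ISIs with an improbably small rank sum.

    Under the null hypothesis of exchangeable ISIs, a window's rank sum
    has mean q*(n+1)/2 and variance q*(n-q)*(n+1)/12 (sampling q ranks
    without replacement from 1..n); a strongly negative z-score marks a
    candidate burst. Returns (best window start, its z-score).
    """
    n = len(isis)
    order = sorted(range(n), key=lambda i: isis[i])
    rank = [0] * n
    for r, i in enumerate(order):
        rank[i] = r + 1                    # ranks 1..n, 1 = shortest ISI
    mean = q * (n + 1) / 2.0
    std = math.sqrt(q * (n - q) * (n + 1) / 12.0)
    zs = []
    s = sum(rank[:q])
    for start in range(n - q + 1):
        zs.append((s - mean) / std)
        if start + q < n:
            s += rank[start + q] - rank[start]   # slide the window
    best = min(range(len(zs)), key=lambda i: zs[i])
    return best, zs[best]

rng = random.Random(5)
isis = [rng.expovariate(1.0) for _ in range(100)]
isis += [1e-3] * 8                         # implant a burst at index 100
isis += [rng.expovariate(1.0) for _ in range(50)]
start, z = rank_burst_scan(isis)
print(start, z)    # window near index 100, strongly negative z-score
```

Because only ranks are used, the detector is indifferent to the shape of the ISI distribution, which is the key advantage the abstract claims over the Poisson-surprise statistic.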
Affiliation(s)
- Boris Gourévitch
- Department of Physiology and Biophysics, Department of Psychology, University of Calgary, 2500 University Drive N.W., Calgary, Alberta, Canada.
50
Doiron B, Rinzel J, Reyes A. Stochastic synchronization in finite size spiking networks. Phys Rev E Stat Nonlin Soft Matter Phys 2006; 74:030903. [PMID: 17025585 DOI: 10.1103/physreve.74.030903]
Abstract
We study a stochastic synchronization of spiking activity in feedforward networks of integrate-and-fire model neurons. A stochastic mean field analysis shows that synchronization occurs only when the network size is sufficiently small. This gives evidence that the dynamics, and hence processing, of finite size populations can be drastically different from that observed in the infinite size limit. Our results agree with experimentally observed synchrony in cortical networks, and further strengthen the link between synchrony and propagation in cortical systems.
Affiliation(s)
- Brent Doiron
- Center for Neural Science, New York University, New York, New York 10003, USA