1. Mosheiff N, Ermentrout B, Huang C. Chaotic dynamics in spatially distributed neuronal networks generate population-wide shared variability. PLoS Comput Biol 2023;19:e1010843. PMID: 36626362; PMCID: PMC9870129; DOI: 10.1371/journal.pcbi.1010843.
Abstract
Neural activity in the cortex is highly variable in response to repeated stimuli. Population recordings across the cortex demonstrate that the variability of neuronal responses is shared among large groups of neurons and concentrates in a low dimensional space. However, the source of the population-wide shared variability is unknown. In this work, we analyzed the dynamical regimes of spatially distributed networks of excitatory and inhibitory neurons. We found chaotic spatiotemporal dynamics in networks with similar excitatory and inhibitory projection widths, an anatomical feature of the cortex. The chaotic solutions contain broadband frequency power in rate variability and have distance-dependent and low-dimensional correlations, in agreement with experimental findings. In addition, rate chaos can be induced by globally correlated noisy inputs. These results suggest that spatiotemporal chaos in cortical networks can explain the shared variability observed in neuronal population responses.
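The spatially distributed network described above can be sketched in a few lines. The following is our own illustrative toy model, not the authors' code: a ring of excitatory and inhibitory rate units with wrapped-Gaussian projections of similar widths; all parameter values (couplings, widths, inputs) are assumptions chosen only to produce a running example.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 100                      # neurons per population on the ring
sigma_e, sigma_i = 0.1, 0.1  # similar E and I projection widths (ring units)

def ring_kernel(sigma):
    # wrapped-Gaussian connectivity as a function of distance on the ring
    d = np.abs(np.arange(N)[:, None] - np.arange(N)[None, :]) / N
    d = np.minimum(d, 1 - d)
    k = np.exp(-d**2 / (2 * sigma**2))
    return k / k.sum(axis=1, keepdims=True)

We, Wi = ring_kernel(sigma_e), ring_kernel(sigma_i)
Jee, Jei, Jie, Jii = 10.0, -8.0, 12.0, -7.0   # assumed coupling strengths
phi = lambda x: np.tanh(np.maximum(x, 0))     # rectified, saturating rate nonlinearity
re = rng.random(N) * 0.1                      # E rates
ri = rng.random(N) * 0.1                      # I rates
dt, tau = 0.1, 1.0
rates = []
for _ in range(2000):
    ue = Jee * We @ re + Jei * Wi @ ri + 1.0  # E input: recurrent + feedforward
    ui = Jie * We @ re + Jii * Wi @ ri + 0.8  # I input
    re = re + dt / tau * (-re + phi(ue))
    ri = ri + dt / tau * (-ri + phi(ui))
    rates.append(re.copy())
rates = np.array(rates)                       # (time, neuron) rate history
```

Diagnostics such as the spatial correlation of `rates` as a function of ring distance, or the dimensionality of its covariance, would be the natural next step to compare against the paper's findings.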
Affiliations
- Noga Mosheiff: Department of Neuroscience, University of Pittsburgh, Pittsburgh, Pennsylvania, USA; Center for the Neural Basis of Cognition, Pittsburgh, Pennsylvania, USA
- Bard Ermentrout: Department of Mathematics, University of Pittsburgh, Pittsburgh, Pennsylvania, USA
- Chengcheng Huang: Department of Neuroscience, University of Pittsburgh, Pittsburgh, Pennsylvania, USA; Center for the Neural Basis of Cognition, Pittsburgh, Pennsylvania, USA; Department of Mathematics, University of Pittsburgh, Pittsburgh, Pennsylvania, USA
2. Gradient-based learning drives robust representations in recurrent neural networks by balancing compression and expansion. Nat Mach Intell 2022. DOI: 10.1038/s42256-022-00498-0.
3. Baker C, Zhu V, Rosenbaum R. Nonlinear stimulus representations in neural circuits with approximate excitatory-inhibitory balance. PLoS Comput Biol 2020;16:e1008192. PMID: 32946433; PMCID: PMC7526938; DOI: 10.1371/journal.pcbi.1008192.
Abstract
Balanced excitation and inhibition is widely observed in cortex. How does this balance shape neural computations and stimulus representations? This question is often studied using computational models of neuronal networks in a dynamically balanced state. But balanced network models predict a linear relationship between stimuli and population responses. So how do cortical circuits implement nonlinear representations and computations? We show that every balanced network architecture admits stimuli that break the balanced state, and that these breaks in balance push the network into a "semi-balanced state" characterized by excess inhibition to some neurons but an absence of excess excitation. The semi-balanced state produces nonlinear stimulus representations and nonlinear computations, is unavoidable in networks driven by multiple stimuli, is consistent with cortical recordings, and has a direct mathematical relationship to artificial neural networks.
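The linearity that the semi-balanced state departs from can be made concrete with the classical balanced mean-field relation. In the strong-coupling limit, population rates satisfy W r + X ≈ 0, so r = -W⁻¹X is linear in the stimulus X whenever the solution is non-negative; stimuli whose solution would have a negative component break balance. The connectivity values below are illustrative assumptions, not numbers from the paper:

```python
import numpy as np

# Mean-field connectivity between E and I populations (assumed values).
W = np.array([[1.0, -2.0],    # E<-E, E<-I
              [1.5, -2.5]])   # I<-E, I<-I

def balanced_rates(X):
    """Rates solving W r + X = 0; None signals a broken balanced state."""
    r = -np.linalg.solve(W, X)
    return r if (r >= 0).all() else None

r1 = balanced_rates(np.array([1.0, 0.8]))   # admissible stimulus
r2 = balanced_rates(np.array([2.0, 1.6]))   # doubled stimulus -> doubled rates
r3 = balanced_rates(np.array([0.0, 1.0]))   # drive to I alone: balance breaks
```

Here `r2` is exactly twice `r1` (the linear regime), while the third stimulus would require negative rates, illustrating the kind of balance-breaking input the abstract describes.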
Affiliations
- Cody Baker: Department of Applied and Computational Mathematics and Statistics, University of Notre Dame, Notre Dame, IN, USA
- Vicky Zhu: Department of Applied and Computational Mathematics and Statistics, University of Notre Dame, Notre Dame, IN, USA
- Robert Rosenbaum: Department of Applied and Computational Mathematics and Statistics, University of Notre Dame, Notre Dame, IN, USA; Interdisciplinary Center for Network Science and Applications, University of Notre Dame, Notre Dame, IN, USA
4. Ceni A, Olmi S, Torcini A, Angulo-Garcia D. Cross frequency coupling in next generation inhibitory neural mass models. Chaos 2020;30:053121. PMID: 32491891; DOI: 10.1063/1.5125216.
Abstract
Coupling among neural rhythms is one of the most important mechanisms underlying cognitive processes in the brain. In this study, we consider a neural mass model, rigorously derived from the microscopic dynamics of an inhibitory spiking network with exponential synapses, that is able to autonomously generate collective oscillations (COs). These oscillations emerge via a supercritical Hopf bifurcation, and their frequencies are controlled by the synaptic time scale, the synaptic coupling, and the excitability of the neural population. Furthermore, we show that two inhibitory populations in a master-slave configuration with different synaptic time scales can display various collective dynamical regimes: damped oscillations toward a stable focus, periodic and quasi-periodic oscillations, and chaos. Finally, when bidirectionally coupled, the two inhibitory populations can exhibit different types of θ-γ cross-frequency couplings (CFCs): phase-phase and phase-amplitude CFC. The coupling between θ and γ COs is enhanced in the presence of an external θ forcing, reminiscent of the type of modulation induced in hippocampal and cortical circuits via optogenetic drive.
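A single inhibitory next-generation neural mass of this family can be sketched from the exact mean-field reduction of a QIF population (after Montbrió, Pazó, and Roxin) with an exponential synaptic variable appended. This is a hedged illustration: the parameter values below are arbitrary assumptions and do not reproduce the paper's specific model or regimes.

```python
import numpy as np

# Exact mean-field of a QIF population with Lorentzian heterogeneity,
# plus an exponential synapse s filtering the population rate r.
tau, tau_s = 1.0, 2.0     # membrane and synaptic time constants (assumed)
delta, eta = 0.3, 5.0     # Lorentzian half-width and median excitability
J, I = -10.0, 0.0         # inhibitory coupling, external drive
r, v, s = 0.1, -1.0, 0.0  # population rate, mean voltage, synaptic activity
dt, T = 1e-3, 20000
trace = np.empty(T)
for t in range(T):
    dr = (delta / (np.pi * tau) + 2 * r * v) / tau
    dv = (v**2 + eta + J * tau * s + I - (np.pi * tau * r)**2) / tau
    ds = (-s + r) / tau_s
    r, v, s = r + dt * dr, v + dt * dv, s + dt * ds   # forward Euler
    trace[t] = r
```

Scanning `J`, `tau_s`, or `eta` and inspecting `trace` for sustained oscillations would be the rough analogue of the bifurcation analysis the abstract summarizes.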
Affiliations
- Andrea Ceni: Department of Computer Science, College of Engineering, Mathematics and Physical Sciences, University of Exeter, Exeter EX4 4QF, United Kingdom
- Simona Olmi: Inria Sophia Antipolis Méditerranée Research Centre, 2004 Route des Lucioles, 06902 Valbonne, France
- Alessandro Torcini: Laboratoire de Physique Théorique et Modélisation, Université de Cergy-Pontoise, CNRS, UMR 8089, 95302 Cergy-Pontoise cedex, France
- David Angulo-Garcia: Grupo de Modelado Computacional-Dinámica y Complejidad de Sistemas, Instituto de Matemáticas Aplicadas, Universidad de Cartagena, Carrera 6 #36-100, 130001 Cartagena de Indias, Colombia
5. Puelma Touzel M, Wolf F. Statistical mechanics of spike events underlying phase space partitioning and sequence codes in large-scale models of neural circuits. Phys Rev E 2019;99:052402. PMID: 31212548; DOI: 10.1103/physreve.99.052402.
Abstract
Cortical circuits operate in an inhibition-dominated regime of spiking activity. Recently, it was found that spiking circuit models in this regime can, despite disordered connectivity and asynchronous irregular activity, exhibit a locally stable dynamics that may be used for neural computation. The lack of existing mathematical tools has precluded analytical insight into this phase. Here we present analytical methods tailored to the granularity of spike-based interactions for analyzing attractor geometry in high-dimensional spiking dynamics. We apply them to reveal the properties of the complex geometry of trajectories of population spiking activity in a canonical model of locally stable spiking dynamics. We find that attractor basin boundaries are the preimages of spike-time collision events involving connected neurons. These spike-based instabilities control the divergence rate of neighboring basins and have no equivalent in rate-based models. They are located, according to the disordered connectivity, at a random subset of edges in a hypercube representation of the phase space. Iterating these edges backward under the stable dynamics induces a partition refinement on this space that converges to the attractor basins. We formulate a statistical theory of the locations of such events relative to attracting trajectories via a tractable representation of local trajectory ensembles. Averaging over the disorder, we derive the basin diameter distribution, whose characteristic scale emerges from the relative strengths of the stabilizing inhibitory coupling and the destabilizing spike interactions. Our study provides an approach to analytically dissect how connectivity, coupling strength, and single-neuron dynamics shape the phase space geometry in the locally stable regime of spiking neural circuit dynamics.
Affiliations
- Maximilian Puelma Touzel: Max Planck Institute for Dynamics and Self-Organization, 37077 Göttingen, Germany; Mila, Université de Montréal, Montréal, Quebec, Canada H2S 3H1
- Fred Wolf: Max Planck Institute for Dynamics and Self-Organization, 37077 Göttingen, Germany; Faculty of Physics, Georg August University, 37077 Göttingen, Germany; Bernstein Center for Computational Neuroscience, 37077 Göttingen, Germany; Kavli Institute for Theoretical Physics, University of California, Santa Barbara, Santa Barbara, California 93106-4111, USA
6.
Affiliations
- Adrienne L Fairhall: Department of Physiology and Biophysics and UW Institute for Neuroengineering, University of Washington, Seattle, Washington, USA
7.
Abstract
Implicit expectations induced by predictable stimulus sequences affect neuronal responses to upcoming stimuli at both the single-cell and neural-population levels. Temporally regular sensory streams also phase-entrain ongoing low-frequency brain oscillations, but how and why this happens is unknown. Here we investigate how random recurrent neural networks without plasticity respond to stimulus streams containing oddballs. We found that the neuronal correlates of sensory stream adaptation emerge if networks generate chaotic oscillations that can be phase-entrained by stimulus streams. The resultant activity patterns are close to critical and support history-dependent responses on long timescales. Because critical network entrainment is a slow process, stimulus responses adapt gradually over multiple repetitions. Repeated stimuli generate suppressed responses, but oddball responses are large and distinct. Oscillatory mismatch responses persist in population activity for long periods after stimulus offset, while individual-cell mismatch responses are strongly phasic. These effects are weakened in temporally irregular sensory streams. Thus, we show that network phase entrainment provides a biologically plausible mechanism for neural oddball detection. Our results do not depend on specific network characteristics, are consistent with experimental studies, and may be relevant for multiple pathologies that show altered mismatch processing, such as schizophrenia and depression.
Affiliations
- Adam Ponzi: IBM T.J. Watson Research Center, Yorktown Heights, NY, USA; Okinawa Institute of Science and Technology Graduate University (OIST), Okinawa, Japan
8. Pyle R, Rosenbaum R. Spatiotemporal Dynamics and Reliable Computations in Recurrent Spiking Neural Networks. Phys Rev Lett 2017;118:018103. PMID: 28106418; DOI: 10.1103/physrevlett.118.018103.
Abstract
Randomly connected networks of excitatory and inhibitory spiking neurons provide a parsimonious model of neural variability, but are notoriously unreliable for performing computations. We show that this difficulty is overcome by incorporating the well-documented dependence of connection probability on distance. Spatially extended spiking networks exhibit symmetry-breaking bifurcations and generate spatiotemporal patterns that can be trained to perform dynamical computations under a reservoir computing framework.
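The reservoir computing framework mentioned above can be illustrated with a minimal echo-state sketch: a fixed random recurrent network is driven by an input stream, and only a linear readout is trained, here by ridge regression on a delayed-recall task. This rate-based toy is our own stand-in for the spiking networks of the paper; all sizes, scalings, and the task itself are assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
N, T = 200, 2000
# Random reservoir, rescaled toward the echo-state regime (heuristic).
Wres = rng.normal(0, 1.0, (N, N)) / np.sqrt(N)
Wres *= 0.9 / np.max(np.abs(np.linalg.eigvals(Wres)))
Win = rng.uniform(-1, 1, N)
b = rng.uniform(-0.2, 0.2, N)            # bias breaks the odd symmetry of tanh
u = np.sin(0.1 * np.arange(T))           # input stream
target = np.roll(u, 5)                   # task: recall the input 5 steps back

x = np.zeros(N)
X = np.empty((T, N))
for t in range(T):
    x = np.tanh(Wres @ x + Win * u[t] + b)
    X[t] = x

wash = 200                               # discard the transient
A = np.hstack([X[wash:], np.ones((T - wash, 1))])   # states + constant column
w = np.linalg.solve(A.T @ A + 1e-6 * np.eye(N + 1), A.T @ target[wash:])
mse = np.mean((A @ w - target[wash:])**2)
```

The point of the paper is that spatially structured connectivity makes the analogous spiking reservoir reliable enough to train; this sketch shows only the readout-training step of the framework.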
Affiliations
- Ryan Pyle: Department of Applied and Computational Mathematics and Statistics, University of Notre Dame, Notre Dame, Indiana 46556, USA
- Robert Rosenbaum: Department of Applied and Computational Mathematics and Statistics, University of Notre Dame, Notre Dame, Indiana 46556, USA; Interdisciplinary Center for Network Science and Applications, University of Notre Dame, Notre Dame, Indiana 46556, USA
9. Lajoie G, Lin KK, Thivierge JP, Shea-Brown E. Encoding in Balanced Networks: Revisiting Spike Patterns and Chaos in Stimulus-Driven Systems. PLoS Comput Biol 2016;12:e1005258. PMID: 27973557; PMCID: PMC5156368; DOI: 10.1371/journal.pcbi.1005258.
Abstract
Highly connected recurrent neural networks often produce chaotic dynamics, meaning their precise activity is sensitive to small perturbations. What are the consequences of chaos for how such networks encode streams of temporal stimuli? On the one hand, chaos is a strong source of randomness, suggesting that small changes in stimuli will be obscured by intrinsically generated variability. On the other hand, recent work shows that the type of chaos that occurs in spiking networks can have a surprisingly low-dimensional structure, suggesting that there may be room for fine stimulus features to be precisely resolved. Here we show that strongly chaotic networks produce patterned spikes that reliably encode time-dependent stimuli: using a decoder sensitive to spike times on timescales of tens of milliseconds, one can easily distinguish responses to very similar inputs. Moreover, recurrence serves to distribute signals throughout chaotic networks so that small groups of cells can encode substantial information about signals arriving elsewhere. A conclusion is that the presence of strong chaos in recurrent networks need not exclude precise encoding of temporal stimuli via spike patterns.

Recurrently connected populations of excitatory and inhibitory neurons found in cortex are known to produce rich and irregular spiking activity, with complex trial-to-trial variability in response to input stimuli. Many theoretical studies found this firing regime to be associated with chaos, where tiny perturbations explode to impact subsequent neural activity. As a result, the precise spiking patterns produced by such networks would be expected to be too fragile to carry any valuable information about stimuli, since inevitable sources of noise such as synaptic failure or ion channel fluctuations would be amplified by chaotic dynamics on repeated trials. In this article we revisit the implications of chaos in input-driven networks and directly measure its impact on evoked population spike patterns. We find that chaotic network dynamics can, in fact, produce highly patterned spiking activity which can be used by a simple decoder to perform input-classification tasks. This can be explained by the presence of low-dimensional, input-specific chaotic attractors, leading to a form of trial-to-trial variability that is intermittent, rather than uniformly random. We propose that chaos is a manageable by-product of recurrent connectivity, which serves to efficiently distribute information about stimuli throughout a network.
Affiliations
- Guillaume Lajoie: University of Washington Institute for Neuroengineering, University of Washington, Seattle, Washington, USA; Department of Nonlinear Dynamics, Max Planck Institute for Dynamics and Self-Organization, Göttingen, Germany
- Kevin K. Lin: School of Mathematics, University of Arizona, Tucson, Arizona, USA
- Eric Shea-Brown: University of Washington Institute for Neuroengineering, University of Washington, Seattle, Washington, USA; Department of Applied Mathematics, University of Washington, Seattle, Washington, USA; Department of Physiology and Biophysics, University of Washington, Seattle, Washington, USA
10. Vincent-Lamarre P, Lajoie G, Thivierge JP. Driving reservoir models with oscillations: a solution to the extreme structural sensitivity of chaotic networks. J Comput Neurosci 2016;41:305-322. DOI: 10.1007/s10827-016-0619-3.
11.
Abstract
This work is part of an effort to understand the neural basis for our visual system's ability, or failure, to accurately track moving visual signals. We consider here a ring model of spiking neurons, intended as a simplified computational model of a single hypercolumn of the primary visual cortex of primates. Signals that consist of edges with time-varying orientations localized in space are considered. Our model is calibrated to produce spontaneous and driven firing rates roughly consistent with experiments, and our two main findings, for which we offer dynamical explanation on the level of neuronal interactions, are the following. First, we have documented consistent transient overshoots in signal perception following signal switches due to emergent interactions of the E- and I-populations. Second, for continuously moving signals, we have found that accuracy is considerably lower at reversals of orientation than when continuing in the same direction (as when the signal is a rotating bar). To measure performance, we use two metrics, called fidelity and reliability, to compare signals reconstructed by the system to the ones presented and assess trial-to-trial variability. We propose that the same population mechanisms responsible for orientation selectivity also impose constraints on dynamic signal tracking that manifest in perception failures consistent with psychophysical observations.
Affiliations
- Guillaume Lajoie: Institute for Neuroengineering, University of Washington, Seattle, WA 98195, USA
- Lai-Sang Young: Courant Institute of Mathematical Sciences, New York University, New York, NY 10012, USA
12. Pyle R, Rosenbaum R. Highly connected neurons spike less frequently in balanced networks. Phys Rev E 2016;93:040302. PMID: 27176240; DOI: 10.1103/physreve.93.040302.
Abstract
Biological neuronal networks exhibit highly variable spiking activity. Balanced networks offer a parsimonious model of this variability in which strong excitatory synaptic inputs are canceled by strong inhibitory inputs on average, and irregular spiking activity is driven by fluctuating synaptic currents. Most previous studies of balanced networks assume a homogeneous or distance-dependent connectivity structure, but connectivity in biological cortical networks is more intricate. We use a heterogeneous mean-field theory of balanced networks to show that heterogeneous in-degrees can break balance. Moreover, heterogeneous architectures that achieve balance promote lower firing rates in neurons with larger in-degrees, consistent with some recent experimental observations.
Affiliations
- Ryan Pyle: Department of Applied and Computational Mathematics and Statistics, University of Notre Dame, Notre Dame, Indiana 46556, USA
- Robert Rosenbaum: Department of Applied and Computational Mathematics and Statistics, University of Notre Dame, Notre Dame, Indiana 46556, USA; Interdisciplinary Center for Network Science and Applications, University of Notre Dame, Notre Dame, Indiana 46556, USA
13. Thomas PJ. Commentary on "Structured chaos shapes spike-response noise entropy in balanced neural networks," by Lajoie, Thivierge, and Shea-Brown. Front Comput Neurosci 2015;9:23. PMID: 25805988; PMCID: PMC4354338; DOI: 10.3389/fncom.2015.00023.
14. Lajoie G, Thivierge JP, Shea-Brown E. Structured chaos shapes spike-response noise entropy in balanced neural networks. Front Comput Neurosci 2014;8:123. PMID: 25324772; PMCID: PMC4183092; DOI: 10.3389/fncom.2014.00123.
Abstract
Large networks of sparsely coupled excitatory and inhibitory cells occur throughout the brain. For many models of these networks, a striking feature is that their dynamics are chaotic and thus sensitive to small perturbations. How does this chaos manifest in the neural code? Specifically, how variable are the spike patterns that such a network produces in response to an input signal? To answer this, we derive a bound for a general measure of variability: spike-train entropy. This leads to important insights on the variability of multi-cell spike pattern distributions in large recurrent networks of spiking neurons responding to fluctuating inputs. The analysis is based on results from random dynamical systems theory and is complemented by detailed numerical simulations. We find that the spike pattern entropy is an order of magnitude lower than what would be extrapolated from single cells. This holds despite the fact that network coupling becomes vanishingly sparse as network size grows, a phenomenon that depends on "extensive chaos," as previously discovered for balanced networks without stimulus drive. Moreover, we show how spike pattern entropy is controlled by temporal features of the inputs. Our findings provide insight into how neural networks may encode stimuli in the presence of inherently chaotic dynamics.
Affiliations
- Guillaume Lajoie: Nonlinear Dynamics Department, Max Planck Institute for Dynamics and Self-Organization, Göttingen, Germany; Bernstein Center for Computational Neuroscience, Max Planck Institute for Dynamics and Self-Organization, Göttingen, Germany; Applied Mathematics Department, University of Washington, Seattle, WA, USA
- Jean-Philippe Thivierge: School of Psychology and Center for Neural Dynamics, University of Ottawa, Ottawa, ON, Canada
- Eric Shea-Brown: Applied Mathematics Department, University of Washington, Seattle, WA, USA; Physiology and Biophysics Department, University of Washington, Seattle, WA, USA
15. Lajoie G, Thivierge JP, Shea-Brown E. Structured chaos shapes joint spike-response noise entropy in temporally driven balanced networks. BMC Neurosci 2014. PMCID: PMC4126493; DOI: 10.1186/1471-2202-15-s1-p48.
16. Wolf F, Engelken R, Puelma-Touzel M, Weidinger JDF, Neef A. Dynamical models of cortical circuits. Curr Opin Neurobiol 2014;25:228-36. PMID: 24658059; DOI: 10.1016/j.conb.2014.01.017.
Abstract
Cortical neurons operate within recurrent neuronal circuits. Dissecting their operation is key to understanding information processing in the cortex and requires transparent and adequate dynamical models of circuit function. Convergent evidence from experimental and theoretical studies indicates that strong feedback inhibition shapes the operating regime of cortical circuits. For circuits operating in inhibition-dominated regimes, mathematical and computational studies over the past several years achieved substantial advances in understanding response modulation and heterogeneity, emergent stimulus selectivity, inter-neuron correlations, and microstate dynamics. The latter indicate a surprisingly strong dependence of the collective circuit dynamics on the features of single neuron action potential generation. New approaches are needed to definitely characterize the cortical operating regime.
Affiliations
- Fred Wolf, Rainer Engelken, Maximilian Puelma-Touzel, Juan Daniel Flórez Weidinger, Andreas Neef: Max Planck Institute for Dynamics and Self-Organization, Göttingen, Germany; Bernstein Center for Computational Neuroscience, Göttingen, Germany; Bernstein Focus Neurotechnology, Göttingen, Germany; Faculty of Physics, Göttingen University, Göttingen, Germany
17. Taillefumier T, Magnasco M. A transition to sharp timing in stochastic leaky integrate-and-fire neurons driven by frozen noisy input. Neural Comput 2014;26:819-59. PMID: 24555453; DOI: 10.1162/neco_a_00577.
Abstract
The firing activity of intracellularly stimulated neurons in cortical slices has been demonstrated to be profoundly affected by the temporal structure of the injected current (Mainen & Sejnowski, 1995). This suggests that the timing features of the neural response may be controlled as much by its own biophysical characteristics as by how a neuron is wired within a circuit. Modeling studies have shown that the interplay between internal noise and the fluctuations of the driving input controls the reliability and the precision of neuronal spiking (Cecchi et al., 2000; Tiesinga, 2002; Fellous, Rudolph, Destexhe, & Sejnowski, 2003). In order to investigate this interplay, we focus on the stochastic leaky integrate-and-fire neuron and identify the Hölder exponent H of the integrated input as the key mathematical property dictating the regime of firing of a single-unit neuron. We have recently provided numerical evidence (Taillefumier & Magnasco, 2013) for the existence of a phase transition when H becomes less than the statistical Hölder exponent associated with internal Gaussian white noise (H=1/2). Here we describe the theoretical and numerical framework devised for the study of a neuron that is periodically driven by frozen noisy inputs with exponent H>0. In doing so, we account for the existence of a transition between two regimes of firing when H=1/2, and we show that spiking times have a continuous density when the Hölder exponent satisfies H>1/2. The transition at H=1/2 formally separates rate codes, for which the neural firing probability varies smoothly, from temporal codes, for which the neuron fires at sharply defined times regardless of the intensity of internal noise.
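A frozen-noise experiment of the kind analyzed here can be mocked up with a stochastic leaky integrate-and-fire neuron: one noisy input realization is frozen and replayed on every trial, while the internal noise differs across trials. This sketch is our illustration only; it does not vary the Hölder exponent of the integrated input, which is the quantity the paper's transition concerns, and all parameters are assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)
dt, T, n_trials, sigma = 1e-3, 5000, 20, 0.1

# Frozen drive: one low-pass-filtered noise realization shared by all trials,
# centered on a suprathreshold mean (threshold = 1).
xi = rng.normal(0, 1, T)
drive = np.zeros(T)
a = 0.99
for t in range(1, T):
    drive[t] = a * drive[t-1] + (1 - a) * 10.0 * xi[t]
drive = 1.5 + drive

spike_times = []
for trial in range(n_trials):
    v, spikes = 0.0, []
    eta = rng.normal(0, 1, T)                # private internal noise per trial
    for t in range(T):
        v += dt * (-v + drive[t]) + sigma * np.sqrt(dt) * eta[t]
        if v >= 1.0:                         # threshold crossing
            spikes.append(t * dt)
            v = 0.0                          # reset
    spike_times.append(spikes)
counts = np.array([len(s) for s in spike_times])
```

Overlaying `spike_times` across trials as a raster is the standard way to see whether the frozen fluctuations pin spike times despite the trial-to-trial internal noise.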
Affiliations
- Thibaud Taillefumier: Laboratory of Mathematical Physics, Rockefeller University, New York, NY 10065, USA; Lewis-Sigler Institute for Integrative Genomics, Princeton University, Princeton, NJ 08544, USA
18. Yu N, Li YX, Kuske R. A computational study of spike time reliability in two types of threshold dynamics. J Math Neurosci 2013;3:11. PMID: 23945258; PMCID: PMC3849148; DOI: 10.1186/2190-8567-3-11.
Abstract
Spike time reliability (STR) refers to the phenomenon in which repetitive applications of a frozen copy of one stochastic signal to a neuron trigger spikes with reliable timing while a constant signal fails to do so. Observed and explored in numerous experimental and theoretical studies, STR is a complex dynamic phenomenon depending on the nature of external inputs as well as the intrinsic properties of a neuron. The neuron under consideration could be either quiescent or spontaneously spiking in the absence of the external stimulus. Focusing on the situation in which the unstimulated neuron is quiescent but close to a switching point to oscillations, we numerically analyze STR, treating each spike occurrence as a time-localized event in a model neuron. We study both the averaged properties and the individual features of spike-evoking epochs (SEEs). The effects of interactions between spikes are minimized by selecting signals that generate spikes with relatively long interspike intervals (ISIs). Under these conditions, the frequency content of the input signal has little impact on STR. We study two distinct cases: Type I, in which the f-I relation (f for frequency, I for applied current) is continuous, and Type II, where the f-I relation exhibits a jump. STR in the two types shows a number of similar features and differs in some others. SEEs that are capable of triggering spikes show great variety in amplitude and time profile. On average, reliable spike timing is associated with an accelerated increase in the "action" of the signal as a threshold for spike generation is approached. Here, "action" is defined as the average amount of current delivered during a fixed time interval. When individual SEEs are studied, however, their time profiles are found to be important for triggering more precisely timed spikes. The SEEs that have a more favorable time profile are capable of triggering spikes with higher precision, even at lower action levels.
Affiliations
- Na Yu: Department of Mathematics, University of British Columbia, Vancouver, BC, Canada V6T 1Z2; Department of Cell Biology and Anatomy, Louisiana State University Health Sciences Center, New Orleans, LA 70112, USA
- Yue-Xian Li: Department of Mathematics, University of British Columbia, Vancouver, BC, Canada V6T 1Z2
- Rachel Kuske: Department of Mathematics, University of British Columbia, Vancouver, BC, Canada V6T 1Z2
19. Lajoie G, Lin KK, Shea-Brown E. Chaos and reliability in balanced spiking networks with temporal drive. Phys Rev E 2013;87:052901. PMID: 23767592; PMCID: PMC4124755; DOI: 10.1103/physreve.87.052901.
Abstract
Biological information processing is often carried out by complex networks of interconnected dynamical units. A basic question about such networks is that of reliability: if the same signal is presented many times with the network in different initial states, will the system entrain to the signal in a repeatable way? Reliability is of particular interest in neuroscience, where large, complex networks of excitatory and inhibitory cells are ubiquitous. These networks are known to autonomously produce strongly chaotic dynamics, an obvious threat to reliability. Here, we show that such chaos persists in the presence of weak and strong stimuli, but that even in the presence of chaos, intermittent periods of highly reliable spiking often coexist with unreliable activity. We elucidate the local dynamical mechanisms involved in this intermittent reliability, and investigate the relationship between this phenomenon and certain time-dependent attractors arising from the dynamics. A conclusion is that chaotic dynamics do not have to be an obstacle to precise spike responses, a fact with implications for signal coding in large networks.
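The chaos referred to above is quantified by a positive largest Lyapunov exponent. The sketch below estimates it for a standard random rate network (a Sompolinsky-Crisanti-Sommers-type model with gain g = 2, our stand-in for the spiking networks studied here) by tracking the divergence of two nearby trajectories with repeated renormalization; all sizes and step sizes are assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)
N, g, dt = 200, 2.0, 0.05
J = rng.normal(0, g / np.sqrt(N), (N, N))   # random coupling, chaotic for g > 1

def step(x):
    # Euler step of dx/dt = -x + J tanh(x)
    return x + dt * (-x + J @ np.tanh(x))

x = rng.normal(0, 1, N)
for _ in range(2000):                        # discard the transient
    x = step(x)

y = x + 1e-8 * rng.normal(0, 1, N) / np.sqrt(N)
d0 = np.linalg.norm(y - x)
lyap, steps = 0.0, 4000
for _ in range(steps):
    x, y = step(x), step(y)
    d = np.linalg.norm(y - x)
    lyap += np.log(d / d0)
    y = x + (y - x) * (d0 / d)               # renormalize the separation
lyap /= steps * dt                           # exponent per unit time
```

A positive `lyap` certifies sensitive dependence on initial conditions; the paper's point is that this coexists with intermittently reliable, stimulus-locked spiking.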
Affiliations
- Guillaume Lajoie: Department of Applied Mathematics, University of Washington, Seattle, Washington 98195, USA