1
Similar local neuronal dynamics may lead to different collective behavior. Phys Rev E 2021; 104:064309. PMID: 35030861. DOI: 10.1103/physreve.104.064309.
Abstract
This report is concerned with the relevance of the microscopic rules that implement individual neuronal activation in determining the collective dynamics under variations of the network topology. To fix ideas, we study the dynamics of two cellular automaton models commonly used, rather indistinctly, as the building blocks of large-scale neuronal networks. One model, due to Greenberg and Hastings (GH), can be described by evolution equations mimicking an integrate-and-fire process, while the other, due to Kinouchi and Copelli (KC), represents an abstract branching process in which a single active neuron activates a given number of postsynaptic neurons according to a prescribed "activity" branching ratio. Despite the apparent similarity between the local neuronal dynamics of the two models, we show that they exhibit very different collective dynamics as a function of the network topology. The GH model shows qualitatively different dynamical regimes as the network topology is varied, including transients to a ground (inactive) state as well as continuous and discontinuous dynamical phase transitions. In contrast, the KC model exhibits only a continuous phase transition, independently of the network topology. These results highlight the importance of the microscopic rules chosen to model interneuronal interactions in large-scale numerical simulations, particularly when the network topology is far from a mean-field description. One such case is the extensive work being done in the context of the Human Connectome, where a wide variety of model types are used to understand the brain's collective dynamics.
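The GH rule summarized above can be sketched in a few lines. This is an illustrative reconstruction, not the authors' code; the ring topology and the optional spontaneous-activation probability `p_spont` are assumptions:

```python
import random

def gh_step(states, neighbors, p_spont=0.0):
    """One update of a Greenberg-Hastings automaton.
    states[i] is 0 (quiescent), 1 (excited), or 2 (refractory)."""
    new = []
    for i, s in enumerate(states):
        if s == 1:
            new.append(2)        # excited -> refractory
        elif s == 2:
            new.append(0)        # refractory -> quiescent
        else:                    # quiescent: excited by any active neighbor
            hit = any(states[j] == 1 for j in neighbors[i])
            new.append(1 if hit or random.random() < p_spont else 0)
    return new

# Ring of 5 neurons; a single seed propagates as two counter-moving waves.
neighbors = {i: [(i - 1) % 5, (i + 1) % 5] for i in range(5)}
print(gh_step([1, 0, 0, 0, 0], neighbors))  # [2, 1, 0, 0, 1]
```

The KC dynamics differs in that a quiescent unit is activated probabilistically, per active neighbor, rather than deterministically by any one of them.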
2
Neuronal avalanches in Watts-Strogatz networks of stochastic spiking neurons. Phys Rev E 2021; 104:014137. PMID: 34412363. DOI: 10.1103/physreve.104.014137.
Abstract
Networks of stochastic leaky integrate-and-fire neurons, both at the mean-field level and on square lattices, present a continuous absorbing phase transition with power-law neuronal avalanches at the critical point. Here we complement these results by showing that small-world Watts-Strogatz networks have mean-field critical exponents for any rewiring probability p>0. For the ring (p=0), the exponents are those of the d=1 directed-percolation class. In the model, firings are stochastic and occur in discrete time steps, based on a sigmoidal firing probability function. Each neuron has a membrane potential that integrates the signals received from its neighbors and is subject to a leakage parameter. We study topologies with varied numbers of neuron connections and different values of the leakage parameter. Results indicate that the dynamic range is larger for p=0. We also study a homeostatic synaptic depression mechanism that self-organizes the network towards the critical region. The resulting stochastic oscillations are characteristic of so-called self-organized quasicriticality.
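The single-neuron rule described here (leaky integration plus a sigmoidal firing probability) can be sketched as follows. The particular sigmoid, the reset-to-zero convention, and all parameter names are illustrative assumptions, not the paper's exact choices:

```python
import math
import random

def lif_step(V, spiked, W, mu=0.5, gamma=5.0, theta=1.0):
    """One discrete-time update of stochastic leaky integrate-and-fire units:
    leaky integration of presynaptic spikes, then stochastic firing through
    a sigmoidal probability Phi(v)."""
    n = len(V)
    new_V, new_spiked = [], []
    for i in range(n):
        if spiked[i]:
            v = 0.0                   # reset after a spike (assumed convention)
        else:
            # leakage mu, plus weighted input from neighbors that spiked
            v = mu * V[i] + sum(W[i][j] * spiked[j] for j in range(n))
        p = 1.0 / (1.0 + math.exp(-gamma * (v - theta)))  # sigmoidal firing prob.
        new_spiked.append(random.random() < p)
        new_V.append(v)
    return new_V, new_spiked

# Two neurons, one just spiked: its potential resets, the other integrates.
V, s = lif_step([1.0, 0.0], [False, True], [[0.0, 0.5], [0.5, 0.0]])
print(V)  # [1.0, 0.0]
```

With mu = 1 and no reset the unit would be a perfect integrator; it is the leakage that pushes isolated activity back toward the absorbing state.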
3
Single-neuron dynamical effects of dendritic pruning implicated in aging and neurodegeneration: towards a measure of neuronal reserve. Sci Rep 2021; 11:1309. PMID: 33446683. PMCID: PMC7809359. DOI: 10.1038/s41598-020-78815-z.
Abstract
Aging is a main risk factor for neurodegenerative disorders including Alzheimer's disease. It is often accompanied by reduced cognitive functions, gray-matter volume, and dendritic integrity. Although age-related brain structural changes have been observed across multiple scales, their functional implications remain largely unknown. Here we simulate the aging effects on neuronal morphology as dendritic pruning and characterize its dynamical implications. Utilizing a detailed computational modeling approach, we simulate the dynamics of digitally reconstructed neurons obtained from Neuromorpho.org. We show that dendritic pruning affects neuronal integrity: firing rate is reduced, causing a reduction in energy consumption, energy efficiency, and dynamic range. Pruned neurons require less energy but their function is often impaired, which can explain the diminished ability to distinguish between similar experiences (pattern separation) in older people. Our measures indicate that the resilience of neuronal dynamics is neuron-specific, heterogeneous, and strongly affected by dendritic topology and the position of the soma. Based on the emergent neuronal dynamics, we propose to classify the effects of dendritic deterioration, and put forward a topological measure of “neuronal reserve” that quantifies the resilience of neuronal dynamics to dendritic pruning. Moreover, our findings suggest that increasing dendritic excitability could partially mitigate the dynamical effects of aging.
4
Critical network cascades with re-excitable nodes: Why treelike approximations usually work, when they break down, and how to correct them. Phys Rev E 2020; 101:062304. PMID: 32688572. DOI: 10.1103/physreve.101.062304.
Abstract
Network science is a rapidly expanding field, with a large and growing body of work on network-based dynamical processes. Most theoretical results in this area rely on the so-called locally treelike approximation. This is, however, usually an "uncontrolled" approximation, in the sense that the magnitude of the error is typically unknown, although numerical results show that this error is often surprisingly small. In this paper we place this approximation on more rigorous footing by calculating the magnitude of deviations from tree-based theories in the context of discrete-time critical network cascades with re-excitable nodes. We discuss the conditions under which treelike approximations give good results for calculating network criticality, and explain the reasons for deviations from this approximation in terms of the density of certain kinds of network motifs. Using this understanding, we derive results for network criticality that apply to general networks that explicitly do not satisfy the locally treelike approximation. In particular, we focus on the biparallel motif, the smallest motif relevant to the failure of a tree-based theory in this context, and derive the corrections due to such motifs on the conditions for criticality. We verify our claims on computer-generated networks and confirm that our theory accurately predicts the observed deviations from criticality. Using our theory, we explain why numerical simulations often show that deviations from a tree-based theory are surprisingly small. More specifically, we show that these deviations are negligible for networks whose average degree is even modestly large compared to one, justifying why tree-based theories appear to work well for most real-world networks.
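The biparallel motif mentioned above has a simple combinatorial characterization that can be counted directly from the adjacency matrix; this sketch illustrates that count and is not the authors' derivation:

```python
def biparallel_count(A):
    """Count biparallel motifs: ordered node pairs (i, j) joined by two
    directed 2-paths i->a->j and i->b->j with distinct middle nodes a != b.
    The number of 2-paths from i to j is (A^2)_ij; each unordered pair of
    middle nodes contributes one motif, hence the binomial coefficient."""
    n = len(A)
    total = 0
    for i in range(n):
        for j in range(n):
            paths = sum(A[i][k] * A[k][j] for k in range(n))
            total += paths * (paths - 1) // 2
    return total

# Smallest example: 0 -> {1, 2} -> 3 gives exactly one biparallel motif.
A = [[0, 1, 1, 0],
     [0, 0, 0, 1],
     [0, 0, 0, 1],
     [0, 0, 0, 0]]
print(biparallel_count(A))  # 1
```

A network dense in such motifs is exactly where a locally treelike theory starts to need the corrections derived in the paper.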
5
Oscillations and collective excitability in a model of stochastic neurons under excitatory and inhibitory coupling. Phys Rev E 2019; 100:062416. PMID: 31962449. DOI: 10.1103/physreve.100.062416.
Abstract
We study a model with excitable neurons modeled as stochastic units with three states, representing quiescence, firing, and refractoriness. The transition rates between quiescence and firing depend exponentially on the number of firing neighbors, whereas all other rates are kept constant. This model class was shown to exhibit collective oscillations (synchronization) if neurons are spiking autonomously, but not if neurons are in the excitable regime. In both cases, neurons were restricted to interact through excitatory coupling. Here we show that a plethora of collective phenomena appear if inhibitory coupling is added. Besides the usual transition between an absorbing and an active phase, the model with excitatory and inhibitory neurons can also undergo reentrant transitions to an oscillatory phase. In the mean-field description, oscillations can emerge through supercritical or subcritical Hopf bifurcations, as well as through infinite period bifurcations. The model has bistability between active and oscillating behavior, as well as collective excitability, a regime where the system can display a peak of global activity when subject to a sufficiently strong perturbation. We employ a variant of the Shinomoto-Kuramoto order parameter to characterize the phase transitions and their system-size dependence.
6
Criticality in the brain: A synthesis of neurobiology, models and cognition. Prog Neurobiol 2017; 158:132-152. PMID: 28734836. DOI: 10.1016/j.pneurobio.2017.07.002.
Abstract
Cognitive function requires the coordination of neural activity across many scales, from neurons and circuits to large-scale networks. As such, it is unlikely that an explanatory framework focused upon any single scale will yield a comprehensive theory of brain activity and cognitive function. Modelling and analysis methods for neuroscience should aim to accommodate multiscale phenomena. Emerging research now suggests that multi-scale processes in the brain arise from so-called critical phenomena that occur very broadly in the natural world. Criticality arises in complex systems perched between order and disorder, and is marked by fluctuations that do not have any privileged spatial or temporal scale. We review the core nature of criticality, the evidence supporting its role in neural systems and its explanatory potential in brain health and disease.
7
Approximate-master-equation approach for the Kinouchi-Copelli neural model on networks. Phys Rev E 2017; 95:012310. PMID: 28208444. DOI: 10.1103/physreve.95.012310.
Abstract
In this work, we use the approximate-master-equation approach to study the dynamics of the Kinouchi-Copelli neural model on various networks. By categorizing each neuron in terms of its own state and the states of its neighbors, we are able to uncover how the coupled system evolves with respect to time by directly solving a set of ordinary differential equations. In particular, we can easily calculate the statistical properties of the time evolution of the network's instantaneous response, the network response curve, the dynamic range, and the critical point within the framework of the approximate-master-equation approach. The possible application of the proposed theoretical approach to other spreading phenomena is briefly discussed.
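For reference, the Kinouchi-Copelli microscopic dynamics that the master-equation approach coarse-grains can be sketched as a cyclic n-state automaton. This is an illustrative reconstruction; parameter names and the external-drive convention are assumptions:

```python
import random

def kc_step(states, neighbors, p, h=0.0, n_states=3):
    """One step of a Kinouchi-Copelli automaton: a quiescent unit (state 0)
    activates with probability p per active neighbor (state 1), or via an
    external Poisson-like rate h; non-quiescent units cycle deterministically
    through the refractory states back to quiescence."""
    new = []
    for i, s in enumerate(states):
        if s == 0:
            active = random.random() < h
            if not active:
                for j in neighbors[i]:
                    if states[j] == 1 and random.random() < p:
                        active = True
                        break
            new.append(1 if active else 0)
        else:
            new.append((s + 1) % n_states)   # 1 -> 2 -> ... -> 0
    return new

# Star of three units, p = 1: the active hub recruits both leaves.
print(kc_step([1, 0, 0], {0: [1, 2], 1: [0], 2: [0]}, 1.0))  # [2, 1, 1]
```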
8
Abstract
As few real systems comprise indistinguishable units, diversity is a hallmark of nature. Diversity among interacting units shapes properties of collective behavior such as synchronization and information transmission. However, the benefits of diversity for information processing at the edge of a phase transition, ordinarily assumed to emerge from identical elements, remain largely unexplored. Analyzing a general model of excitable systems with heterogeneous excitability, we find that diversity can greatly enhance optimal performance (by two orders of magnitude) when distinguishing incoming inputs. Heterogeneous systems possess a subset of specialized elements whose capability greatly exceeds that of the nonspecialized elements. We also find that diversity can yield multiple percolation transitions, with performance optimized at tricriticality. Our results are robust in specific and more realistic neuronal systems comprising a combination of excitatory and inhibitory units, and indicate that diversity-induced amplification can be harnessed by neuronal systems for evaluating stimulus intensities.
9
Undersampled critical branching processes on small-world and random networks fail to reproduce the statistics of spike avalanches. PLoS One 2014; 9:e94992. PMID: 24751599. PMCID: PMC3994033. DOI: 10.1371/journal.pone.0094992.
Abstract
The power-law size distributions obtained experimentally for neuronal avalanches are important evidence of criticality in the brain. This evidence is supported by the fact that a critical branching process exhibits the same exponent. Models at criticality have been employed to mimic avalanche propagation and explain the statistics observed experimentally. However, a crucial aspect of neuronal recordings has been almost completely neglected in the models: undersampling. While a typical multielectrode array records hundreds of neurons, the same area of neuronal tissue can contain tens of thousands. Here we investigate the consequences of undersampling in models with three different topologies (two-dimensional, small-world, and random network) and three different dynamical regimes (subcritical, critical, and supercritical). We found that undersampling modifies avalanche size distributions, extinguishing the power laws observed in critical systems. Distributions from subcritical systems are also modified, but the shape of the undersampled distributions is more similar to that of a fully sampled system. Undersampled supercritical systems can recover the general characteristics of the fully sampled version, provided that enough neurons are measured. Undersampling in two-dimensional and small-world networks leads to similar effects, while the random network is insensitive to sampling density due to the lack of a well-defined neighborhood. We conjecture that neuronal avalanches recorded from local field potentials avoid undersampling effects due to the nature of this signal, but the same does not hold for spike avalanches. We conclude that undersampled branching-process-like models in these topologies fail to reproduce the statistics of spike avalanches.
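One crude way to mimic undersampling, in the spirit of the paper though not its exact procedure, is to thin each time bin's spike count with an observation probability f and re-cut avalanches at the resulting empty bins:

```python
import random

def subsample_avalanches(spike_counts, f):
    """Observe each spike independently with probability f, then define
    avalanches as runs of nonempty bins (size = summed observed spikes).
    A crude stand-in for sparse multielectrode sampling."""
    sizes, current = [], 0
    for c in spike_counts:
        observed = sum(random.random() < f for _ in range(c))
        if observed:
            current += observed
        elif current:
            sizes.append(current)   # an empty bin closes the avalanche
            current = 0
    if current:
        sizes.append(current)
    return sizes

# Full sampling (f = 1) recovers the two true avalanches of size 5.
print(subsample_avalanches([3, 2, 0, 1, 4, 0], 1.0))  # [5, 5]
```

For f < 1, large avalanches fragment into several small ones, which is the qualitative mechanism behind the distorted size distributions the paper reports.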
10
Single-neuron criticality optimizes analog dendritic computation. Sci Rep 2013; 3:3222. PMID: 24226045. PMCID: PMC3827605. DOI: 10.1038/srep03222.
Abstract
Active dendritic branchlets enable the propagation of dendritic spikes, whose computational functions remain an open question. Here we propose a concrete function for the active channels in large dendritic trees. Modelling the input-output response of large active dendritic arbors subjected to complex spatio-temporal inputs and exhibiting non-stereotyped dendritic spikes, we find that the dendritic arbor can undergo a continuous phase transition from a quiescent to an active state, thereby exhibiting spontaneous and self-sustained localized activity, as suggested by experiments. Analogously to the critical brain hypothesis, which states that neuronal networks self-organize near criticality to take advantage of its specific properties, here we propose that neurons with large dendritic arbors optimize their capacity to distinguish incoming stimuli at the critical state. We suggest that "computation at the edge of a phase transition" is more compatible with the view that dendritic arbors perform an analog rather than a digital dendritic computation.
11
Scaling behavior in probabilistic neuronal cellular automata. Phys Rev E 2013; 87:012704. PMID: 23410356. DOI: 10.1103/physreve.87.012704.
Abstract
We study a neural network model of interacting stochastic discrete two-state cellular automata on a regular lattice. The system is externally tuned to a critical point which varies with the degree of stochasticity (or the effective temperature). There are avalanches of neuronal activity, namely, spatially and temporally contiguous sites of activity; a detailed numerical study of these activity avalanches is presented, and single, joint, and marginal probability distributions are computed. At the critical point, we find that the scaling exponents for the variables are in good agreement with a mean-field theory.
12
Signal integration enhances the dynamic range in neuronal systems. Phys Rev E 2012; 85:040902. PMID: 22680413. DOI: 10.1103/physreve.85.040902.
Abstract
The dynamic range measures the capacity of a system to discriminate the intensity of an external stimulus. Such an ability is fundamental for living beings to survive: to leverage resources and to avoid danger. Consequently, the larger the dynamic range, the greater the probability of survival. We investigate how the integration of different input signals affects the dynamic range, and in general the collective behavior, of a network of excitable units. By means of numerical simulations and a mean-field approach, we explore the nonequilibrium phase transition in the presence of integration. We show that the firing rate in random and scale-free networks undergoes a discontinuous phase transition depending on both the integration time and the density of integrator units. Moreover, in the presence of external stimuli, we find that a system of excitable integrator units operating in a bistable regime largely enhances its dynamic range.
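The dynamic range invoked here is conventionally defined in this literature as Delta = 10*log10(h_90/h_10), where h_10 and h_90 produce 10% and 90% of the response span. A sketch of that computation on a tabulated response curve; linear interpolation and monotonicity are assumptions:

```python
import math

def dynamic_range(h, F):
    """Delta = 10*log10(h_90/h_10) for a monotonic response curve F(h),
    with h_10 and h_90 found by linear interpolation between samples."""
    lo, hi = min(F), max(F)
    targets = [lo + 0.1 * (hi - lo), lo + 0.9 * (hi - lo)]
    cross = []
    for t in targets:
        for k in range(len(F) - 1):
            if F[k] <= t <= F[k + 1]:
                frac = (t - F[k]) / (F[k + 1] - F[k])
                cross.append(h[k] + frac * (h[k + 1] - h[k]))
                break
    h10, h90 = cross
    return 10 * math.log10(h90 / h10)

# Purely linear response F = h on [1, 100]: h10 = 10.9, h90 = 90.1.
print(round(dynamic_range([1, 100], [1, 100]), 2))  # 9.17
```

A sigmoidal response curve that rises over many decades of h yields a much larger Delta, which is what "enhancing the dynamic range" means quantitatively.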
13
Statistical physics approach to dendritic computation: the excitable-wave mean-field approximation. Phys Rev E 2012; 85:011911. PMID: 22400595. DOI: 10.1103/physreve.85.011911.
Abstract
We analytically study the input-output properties of a neuron whose active dendritic tree, modeled as a Cayley tree of excitable elements, is subjected to Poisson stimulus. Both single-site and two-site mean-field approximations incorrectly predict a nonequilibrium phase transition which is not allowed in the model. We propose an excitable-wave mean-field approximation which shows good agreement with previously published simulation results [Gollo et al., PLoS Comput. Biol. 5, e1000402 (2009)] and accounts for finite-size effects. We also discuss the relevance of our results to experiments in neuroscience, emphasizing the role of active dendrites in the enhancement of dynamic range and in gain control modulation.
14
Effects of network topology, transmission delays, and refractoriness on the response of coupled excitable systems to a stochastic stimulus. Chaos 2011; 21:025117. PMID: 21721795. PMCID: PMC3183795. DOI: 10.1063/1.3600760.
Abstract
We study the effects of network topology on the response of networks of coupled discrete excitable systems to an external stochastic stimulus. We extend recent results that characterize the response in terms of spectral properties of the adjacency matrix by allowing distributions in the transmission delays and in the number of refractory states and by developing a nonperturbative approximation to the steady state network response. We confirm our theoretical results with numerical simulations. We find that the steady state response amplitude is inversely proportional to the duration of refractoriness, which reduces the maximum attainable dynamic range. We also find that transmission delays alter the time required to reach steady state. Importantly, neither delays nor refractoriness impact the general prediction that criticality and maximum dynamic range occur when the largest eigenvalue of the adjacency matrix is unity.
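The criticality criterion stated here, that the largest eigenvalue of the adjacency matrix equals unity, is easy to check numerically. A minimal power-iteration sketch (illustrative, not the authors' code):

```python
def largest_eigenvalue(A, iters=200):
    """Power iteration for the leading eigenvalue of a nonnegative matrix.
    In this class of excitable-network models, lambda_max = 1 marks
    criticality and maximum dynamic range."""
    n = len(A)
    v = [1.0] * n
    lam = 0.0
    for _ in range(iters):
        w = [sum(A[i][j] * v[j] for j in range(n)) for i in range(n)]
        lam = max(abs(x) for x in w)
        if lam == 0:
            return 0.0
        v = [x / lam for x in w]       # renormalize by the max-norm
    return lam

# Uniform all-to-all coupling 1/(n-1), no self-loops: branching ratio 1.
n = 4
A = [[0.0 if i == j else 1.0 / (n - 1) for j in range(n)] for i in range(n)]
print(round(largest_eigenvalue(A), 6))  # 1.0
```

Rescaling all weights by c rescales lambda_max by c, so a network can be tuned to the critical point by dividing its weight matrix by its current leading eigenvalue.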
15
Abstract
BACKGROUND Scale-invariant neuronal avalanches have been observed in cell cultures and slices as well as in anesthetized and awake brains, suggesting that the brain operates near criticality, i.e., within a narrow margin between avalanche propagation and extinction. In theory, criticality provides many desirable features for the behaving brain, optimizing computational capabilities, information transmission, sensitivity to sensory stimuli, and the size of memory repertoires. However, a thorough characterization of neuronal avalanches in freely behaving (FB) animals is still missing, raising doubts about their relevance for brain function. METHODOLOGY/PRINCIPAL FINDINGS To address this issue, we employed chronically implanted multielectrode arrays (MEA) to record avalanches of action potentials (spikes) from the cerebral cortex and hippocampus of 14 rats as they spontaneously traversed the wake-sleep cycle, explored novel objects, or were subjected to anesthesia (AN). We then modeled spike avalanches to evaluate the impact of sparse MEA sampling on their statistics. We found that the size distributions of spike avalanches are well fit by lognormal distributions in FB animals and by truncated power laws in the AN group. Surrogation of the FB data markedly decreases the tail of the distribution, i.e., spike shuffling destroys the largest avalanches. The FB data are also characterized by multiple key features compatible with criticality in the temporal domain, such as 1/f spectra and long-term correlations as measured by detrended fluctuation analysis. These signatures are very stable across waking, slow-wave sleep, and rapid-eye-movement sleep, but collapse during anesthesia. Likewise, waiting time distributions obey a single scaling function during all natural behavioral states, but not during anesthesia. Results are equivalent for neuronal ensembles recorded from visual and tactile areas of the cerebral cortex, as well as from the hippocampus.
CONCLUSIONS/SIGNIFICANCE Altogether, the data provide a comprehensive link between behavior and brain criticality, revealing a unique scale-invariant regime of spike avalanches across all major behaviors.
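Detrended fluctuation analysis, used above to quantify long-term correlations, integrates the signal, removes a linear trend within windows of size w, and examines how the residual fluctuation F(w) scales with w. A bare-bones sketch with first-order detrending only (the published analyses may use higher orders and overlapping windows):

```python
import math

def dfa(x, window_sizes):
    """First-order detrended fluctuation analysis.
    Returns F(w) for each window size w; the slope of log F(w) vs log w
    estimates the scaling exponent alpha."""
    mean = sum(x) / len(x)
    y, c = [], 0.0
    for v in x:                      # integrated (cumulative-sum) profile
        c += v - mean
        y.append(c)
    out = []
    for w in window_sizes:
        sq, nwin = 0.0, len(y) // w
        for b in range(nwin):
            seg = y[b * w:(b + 1) * w]
            t = list(range(w))
            tb, sb = sum(t) / w, sum(seg) / w
            num = sum((ti - tb) * (si - sb) for ti, si in zip(t, seg))
            den = sum((ti - tb) ** 2 for ti in t)
            slope = num / den        # least-squares line within the window
            sq += sum((si - (sb + slope * (ti - tb))) ** 2
                      for ti, si in zip(t, seg))
        out.append(math.sqrt(sq / (nwin * w)))
    return out

# A constant signal has a flat profile, so all fluctuations vanish.
print(dfa([1.0] * 100, [10]))  # [0.0]
```

Uncorrelated noise gives alpha near 0.5, while the long-range correlations reported for the freely behaving data show up as alpha clearly above 0.5.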
16
Discontinuous nonequilibrium phase transitions in a nonlinearly pulse-coupled excitable lattice model. Phys Rev E 2009; 80:061105. PMID: 20365116. DOI: 10.1103/physreve.80.061105.
Abstract
We study a modified version of the stochastic susceptible-infected-refractory-susceptible (SIRS) model by employing a nonlinear (exponential) reinforcement in the contagion rate and no diffusion. We run simulations for complete and random graphs as well as d-dimensional hypercubic lattices (for d=3,2,1). For weak nonlinearity, a continuous nonequilibrium phase transition between an absorbing and an active phase is obtained, as in the usual stochastic SIRS model [Joo and Lebowitz, Phys. Rev. E 70, 036114 (2004)]. However, for strong nonlinearity, the nonequilibrium transition between the two phases can be discontinuous for d ≥ 2, which is confirmed by well-characterized hysteresis cycles and bistability. Analytical mean-field results correctly predict the overall structure of the phase diagram. Furthermore, contrary to what was observed in a model of phase-coupled stochastic oscillators with a similar nonlinearity in the coupling [Wood, Phys. Rev. Lett. 96, 145701 (2006)], we did not find a transition to a stable (partially) synchronized state in our nonlinearly pulse-coupled excitable elements. For long enough refractory times and high enough nonlinearity, however, the system can exhibit collective excitability and unstable stochastic oscillations.
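A mean-field caricature of such a model, with the contagion rate reinforced exponentially in the active fraction, can be integrated with a simple Euler step. The exact parametrization below is an assumption for illustration, not the paper's:

```python
import math

def sirs_mean_field_step(s, i, r, lam, a, gamma_i, gamma_r, dt=0.01):
    """One Euler step of a mean-field SIRS with nonlinear contagion
    rate lam * exp(a * i); s, i, r are the susceptible, infected (active),
    and refractory fractions, and s + i + r = 1 is preserved."""
    infect = lam * math.exp(a * i) * i * s   # exponentially reinforced contagion
    ds = -infect + gamma_r * r               # refractory -> susceptible
    di = infect - gamma_i * i                # infected -> refractory
    dr = gamma_i * i - gamma_r * r
    return s + dt * ds, i + dt * di, r + dt * dr

# The three fractions always sum to one (up to floating-point error).
s, i, r = sirs_mean_field_step(0.8, 0.1, 0.1, lam=1.0, a=2.0,
                               gamma_i=1.0, gamma_r=0.5)
print(round(s + i + r, 10))  # 1.0
```

Iterating this map for weak a gives the usual continuous transition in the stationary i, while strong a can produce the bistability behind the discontinuous transition reported above.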
17
Abstract
Since the first experimental evidence of active conductances in dendrites, most neurons have been shown to exhibit dendritic excitability through the expression of a variety of voltage-gated ion channels. However, despite the experimental and theoretical efforts of the past decades, the role of this excitability in dendritic computation has remained elusive. Here we show that, owing to very general properties of excitable media, the average output of a model of an active dendritic tree is a highly nonlinear function of its afferent rate, attaining extremely large dynamic ranges (above 50 dB). Moreover, the model yields double-sigmoid response functions, as experimentally observed in retinal ganglion cells. We claim that enhancement of dynamic range is the primary functional role of active dendritic conductances. We predict that neurons with larger dendritic trees should have larger dynamic ranges and that blocking active conductances should lead to a decrease in dynamic range. Most neurons present cellular tree-like extensions known as dendrites, which receive input signals from synapses with other cells. Some neurons have very large and impressive dendritic arbors. What is the function of such elaborate and costly structures? The functional role of dendrites is not obvious because, if dendrites were an electrically passive medium, signals from their periphery could not influence the neuron's output activity. Dendrites, however, are not passive, but rather active media that amplify and support pulses (dendritic spikes). These voltage pulses do not simply add, but can also annihilate each other when they collide. To understand the net effect of the complex interactions among dendritic spikes under massive synaptic input, here we examine a computational model of excitable dendritic trees.
We show that, in contrast to passive trees, they have a very large dynamic range, which implies a greater capacity of the neuron to distinguish among the widely different intensities of input that it receives. Our results provide an explanation for the concentration invariance property observed in olfactory processing, due to the very similar responses to different inputs. In addition, our modeling approach suggests a microscopic neural basis for century-old psychophysical laws.
18
Deterministic excitable media under Poisson drive: power law responses, spiral waves, and dynamic range. Phys Rev E 2008; 77:051911. PMID: 18643106. DOI: 10.1103/physreve.77.051911.
Abstract
When each site of a spatially extended excitable medium is independently driven by a Poisson stimulus with rate h, the interplay between creation and annihilation of excitable waves leads to an average activity F. It has recently been suggested that in the low-stimulus regime (h ≈ 0) the response function F(h) of hypercubic deterministic systems behaves as a power law, F ~ h^m. Moreover, the response exponent m has been predicted to depend only on the dimensionality d of the lattice, m = 1/(1+d) [T. Ohta and T. Yoshimura, Physica D 205, 189 (2005)]. In order to test this prediction, we study the response function of excitable lattices modeled by either coupled Morris-Lecar equations or Greenberg-Hastings cellular automata. We show that the prediction is verified in our model systems for d = 1, 2, and 3, provided that a minimum set of conditions is satisfied. Under these conditions, the dynamic range, which measures the range of stimulus intensities that can be coded by the network activity, increases with the dimensionality d of the network. The power-law scenario breaks down, however, if the system can exhibit self-sustained activity (spiral waves). In this case, we recover a scenario that is common to probabilistic excitable media: as a function of the conductance coupling G among the excitable elements, the dynamic range is maximized precisely at the critical value G_c above which self-sustained activity becomes stable. We discuss the implications of these results in the context of neural coding.
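The exponent m in F ~ h^m is typically estimated from the low-h response curve by a least-squares fit in log-log coordinates; a minimal sketch with synthetic data matching the d = 1 prediction m = 1/(1+d) = 1/2:

```python
import math

def powerlaw_exponent(h, F):
    """Least-squares slope of log F versus log h, an estimate of m
    in F ~ h^m (assumes all samples lie in the power-law regime)."""
    X = [math.log(v) for v in h]
    Y = [math.log(v) for v in F]
    n = len(X)
    xb, yb = sum(X) / n, sum(Y) / n
    num = sum((x - xb) * (y - yb) for x, y in zip(X, Y))
    den = sum((x - xb) ** 2 for x in X)
    return num / den

# Synthetic low-stimulus data with F = sqrt(h), i.e. m = 0.5.
h = [1e-4, 1e-3, 1e-2, 1e-1]
F = [v ** 0.5 for v in h]
print(round(powerlaw_exponent(h, F), 3))  # 0.5
```

On real simulation output, restricting the fit to the smallest decades of h matters, since the curve saturates once F approaches its maximum.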