1. Ramlow L, Falcke M, Lindner B. An integrate-and-fire approach to Ca2+ signaling. Part II: Cumulative refractoriness. Biophys J 2023; 122:4710-4729. PMID: 37981761; PMCID: PMC10754692; DOI: 10.1016/j.bpj.2023.11.015.
Abstract
Inositol 1,4,5-trisphosphate-induced Ca2+ signaling is a second messenger system used by almost all eukaryotic cells. The agonist concentration stimulating Ca2+ signals is encoded in the frequency of a Ca2+ concentration spike sequence. When a cell is stimulated, the interspike intervals (ISIs) often show a distinct transient during which they gradually increase, a system property we refer to as cumulative refractoriness. We extend a previously published stochastic model to include the Ca2+ concentration in the intracellular Ca2+ store as a slow adaptation variable. This model can reproduce both stationary and transient statistics of experimentally observed ISI sequences. We derive approximate expressions for the mean and coefficient of variation of the stationary ISIs. We also consider the response to the onset of a constant stimulus and estimate the length of the transient and the strength of the adaptation of the ISI. We show that the adaptation sets the coefficient of variation in agreement with current ideas derived from experiments. Moreover, we explain why, despite a pronounced transient behavior, ISI correlations can be weak, as often observed in experiments. Finally, we fit our model to reproduce the transient statistics of experimentally observed ISI sequences in stimulated HEK cells. The fitted model is able to qualitatively reproduce the relationship between the stationary interval correlations and the number of transient intervals, as well as the strength of the ISI adaptation. We also find positive correlations in the experimental sequence that cannot be explained by our model.
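The stationary and transient ISI statistics described above can be illustrated with a toy sequence generator (not the authors' two-variable model; all parameter values below are invented for illustration): a slow adaptation variable relaxes geometrically after stimulus onset, standing in for the depleting Ca2+ store, and stretches successive intervals.

```python
import numpy as np

rng = np.random.default_rng(0)

def isi_sequence(n, T0=1.0, a_inf=1.0, rho=0.8, sigma=0.1):
    """Toy ISI sequence with cumulative refractoriness: an adaptation
    variable builds up geometrically after stimulus onset (a stand-in
    for slow depletion of the intracellular Ca2+ store) and lengthens
    successive interspike intervals."""
    i = np.arange(n)
    adaptation = a_inf * (1.0 - rho ** i)      # 0 at onset -> a_inf when stationary
    noise = 1.0 + sigma * rng.standard_normal(n)
    return T0 * (1.0 + adaptation) * noise

isis = isi_sequence(200)
stationary = isis[50:]                         # discard the transient
mean_isi = stationary.mean()
cv_isi = stationary.std() / mean_isi
print(f"mean ISI ~ {mean_isi:.2f}, CV ~ {cv_isi:.2f}, first ISI ~ {isis[0]:.2f}")
```

With these hypothetical parameters the first interval is roughly half the stationary one, mimicking the gradual ISI increase after stimulus onset; the relaxation factor rho sets the number of transient intervals.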
Affiliation(s)
- Lukas Ramlow
- Bernstein Center for Computational Neuroscience Berlin, Berlin, Germany; Department of Physics, Humboldt University Berlin, Berlin, Germany; Max Delbrück Center for Molecular Medicine, Berlin, Germany
- Martin Falcke
- Department of Physics, Humboldt University Berlin, Berlin, Germany; Max Delbrück Center for Molecular Medicine, Berlin, Germany
- Benjamin Lindner
- Bernstein Center for Computational Neuroscience Berlin, Berlin, Germany; Department of Physics, Humboldt University Berlin, Berlin, Germany
2. Stern M, Istrate N, Mazzucato L. A reservoir of timescales emerges in recurrent circuits with heterogeneous neural assemblies. eLife 2023; 12:e86552. PMID: 38084779; PMCID: PMC10810607; DOI: 10.7554/elife.86552.
Abstract
The temporal activity of many physical and biological systems, from complex networks to neural circuits, exhibits fluctuations simultaneously varying over a large range of timescales. Long-tailed distributions of intrinsic timescales have been observed across neurons simultaneously recorded within the same cortical circuit. The mechanisms leading to this striking temporal heterogeneity are yet unknown. Here, we show that neural circuits, endowed with heterogeneous neural assemblies of different sizes, naturally generate multiple timescales of activity spanning several orders of magnitude. We develop an analytical theory using rate networks, supported by simulations of spiking networks with cell-type specific connectivity, to explain how neural timescales depend on assembly size and show that our model can naturally explain the long-tailed timescale distribution observed in the awake primate cortex. When driving recurrent networks of heterogeneous neural assemblies by a time-dependent broadband input, we found that large and small assemblies preferentially entrain slow and fast spectral components of the input, respectively. Our results suggest that heterogeneous assemblies can provide a biologically plausible mechanism for neural circuits to demix complex temporal input signals by transforming temporal into spatial neural codes via frequency-selective neural assemblies.
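A minimal sketch of how an "intrinsic timescale" is read off a unit's activity, using Ornstein-Uhlenbeck surrogates in place of assembly rates (the timescale values and the 1/e criterion are illustrative assumptions, not taken from the paper; in the paper's picture, the slow trace would correspond to a large assembly):

```python
import numpy as np

rng = np.random.default_rng(1)

def ou_trace(tau, n=200_000, dt=0.01):
    """Ornstein-Uhlenbeck surrogate for the rate of one assembly; its
    autocorrelation decays as exp(-lag/tau)."""
    x = np.zeros(n)
    kicks = rng.standard_normal(n) * np.sqrt(2.0 * dt / tau)
    for t in range(1, n):
        x[t] = x[t - 1] - dt * x[t - 1] / tau + kicks[t]
    return x

def intrinsic_timescale(x, dt=0.01, max_lag=1500):
    """Estimate the timescale as the lag at which the autocorrelation
    first drops below 1/e."""
    x = x - x.mean()
    ac = np.array([np.dot(x[:-k], x[k:]) for k in range(1, max_lag)]) / np.dot(x, x)
    return dt * (1 + np.argmax(ac < np.exp(-1.0)))

tau_fast = intrinsic_timescale(ou_trace(tau=0.5))
tau_slow = intrinsic_timescale(ou_trace(tau=5.0))
print(f"estimated timescales: {tau_fast:.2f} and {tau_slow:.2f}")
```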
Affiliation(s)
- Merav Stern
- Institute of Neuroscience, University of Oregon, Eugene, United States
- Faculty of Medicine, The Hebrew University of Jerusalem, Jerusalem, Israel
- Nicolae Istrate
- Institute of Neuroscience and Department of Physics, University of Oregon, Eugene, United States
- Luca Mazzucato
- Institute of Neuroscience and Departments of Physics, Mathematics and Biology, University of Oregon, Eugene, United States
3. Schlungbaum M, Lindner B. Detecting a periodic signal by a population of spiking neurons in the weakly nonlinear response regime. Eur Phys J E Soft Matter 2023; 46:108. PMID: 37930460; PMCID: PMC10627932; DOI: 10.1140/epje/s10189-023-00371-x.
Abstract
Motivated by experimental observations, we investigate a variant of the cocktail party problem: the detection of a weak periodic stimulus in the presence of fluctuations and another periodic stimulus which is stronger than the periodic signal to be detected. Specifically, we study the response of a population of stochastic leaky integrate-and-fire (LIF) neurons to two periodic signals and focus in particular on the question, whether the presence of one of the stimuli can be detected from the population activity. As a detection criterion, we use a simple threshold-crossing of the population activity over a certain time window. We show by means of the receiver operating characteristics (ROC) that the detectability depends only weakly on the time window of observation but rather strongly on the stimulus amplitude. Counterintuitively, the detection of the weak periodic signal can be facilitated by the presence of a strong periodic input current depending on the frequencies of the two signals and on the dynamical regime in which the neurons operate. Beside numerical simulations of the model, we present an analytical approximation for the ROC curve that is based on the weakly nonlinear response theory for a stochastic LIF neuron.
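The threshold-crossing detection scheme can be sketched without the full LIF population: below, the summed population count in the observation window is reduced to a Poisson variable whose mean is shifted by the stimulus, and the ROC is swept over all integer thresholds. Neuron number, window length, and rate shift are hypothetical, chosen only to illustrate the procedure.

```python
import numpy as np

rng = np.random.default_rng(2)

def population_count(n_trials, rate, signal, n_neurons=100, window=0.5):
    """Summed spike count of a population in an observation window, with
    the (weak periodic) stimulus reduced to a small shift of the mean rate."""
    return rng.poisson(n_neurons * window * (rate + signal), size=n_trials)

def roc(counts_off, counts_on):
    """Hit and false-alarm rates of a threshold-crossing detector,
    swept over all integer thresholds."""
    ths = np.arange(0, max(counts_off.max(), counts_on.max()) + 2)
    fa = np.array([(counts_off >= th).mean() for th in ths])
    hit = np.array([(counts_on >= th).mean() for th in ths])
    return fa, hit

off = population_count(5000, rate=10.0, signal=0.0)   # stimulus absent
on = population_count(5000, rate=10.0, signal=1.0)    # stimulus present
fa, hit = roc(off, on)
# Area under the ROC curve via the rank statistic: the probability that an
# "on" trial beats an "off" trial, counting ties as one half.
auc = (on[:, None] > off[None, :]).mean() + 0.5 * (on[:, None] == off[None, :]).mean()
print(f"AUC ~ {auc:.3f}")
```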
Affiliation(s)
- Maria Schlungbaum
- Physics Department, Humboldt University Berlin, Berlin, Germany
- Bernstein Center for Computational Neuroscience Berlin, Berlin, Germany
- Benjamin Lindner
- Physics Department, Humboldt University Berlin, Berlin, Germany
- Bernstein Center for Computational Neuroscience Berlin, Berlin, Germany
4. Clark DG, Abbott LF, Litwin-Kumar A. Dimension of Activity in Random Neural Networks. Phys Rev Lett 2023; 131:118401. PMID: 37774280; DOI: 10.1103/physrevlett.131.118401.
Abstract
Neural networks are high-dimensional nonlinear dynamical systems that process information through the coordinated activity of many connected units. Understanding how biological and machine-learning networks function and learn requires knowledge of the structure of this coordinated activity, information contained, for example, in cross covariances between units. Self-consistent dynamical mean field theory (DMFT) has elucidated several features of random neural networks-in particular, that they can generate chaotic activity-however, a calculation of cross covariances using this approach has not been provided. Here, we calculate cross covariances self-consistently via a two-site cavity DMFT. We use this theory to probe spatiotemporal features of activity coordination in a classic random-network model with independent and identically distributed (i.i.d.) couplings, showing an extensive but fractionally low effective dimension of activity and a long population-level timescale. Our formulas apply to a wide range of single-unit dynamics and generalize to non-i.i.d. couplings. As an example of the latter, we analyze the case of partially symmetric couplings.
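One common way to quantify the "effective dimension of activity" from cross covariances is the participation ratio of the covariance eigenvalues. A self-contained sketch on surrogate data (a random mixing of d latent signals, not the chaotic network analyzed in the paper; all sizes and the noise level are made up):

```python
import numpy as np

rng = np.random.default_rng(3)

def participation_ratio(cov):
    """Effective dimension from the covariance spectrum:
    PR = (sum_i lambda_i)^2 / sum_i lambda_i^2."""
    lam = np.linalg.eigvalsh(cov)
    return lam.sum() ** 2 / (lam ** 2).sum()

# Surrogate activity: N units driven by d latent signals plus weak private noise.
N, T, d = 200, 5000, 10
latent = rng.standard_normal((d, T))
mixing = rng.standard_normal((N, d))
activity = mixing @ latent + 0.1 * rng.standard_normal((N, T))
pr = participation_ratio(np.cov(activity))
print(f"effective dimension ~ {pr:.1f} of N = {N} units")
```

The estimate recovers roughly the number of latent signals, far below N, which is the sense in which the paper's "fractionally low" effective dimension is measured.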
Affiliation(s)
- David G Clark
- Zuckerman Institute, Department of Neuroscience, Columbia University, New York, New York 10027, USA
- L F Abbott
- Zuckerman Institute, Department of Neuroscience, Columbia University, New York, New York 10027, USA
- Ashok Litwin-Kumar
- Zuckerman Institute, Department of Neuroscience, Columbia University, New York, New York 10027, USA
5. Winston CN, Mastrovito D, Shea-Brown E, Mihalas S. Heterogeneity in Neuronal Dynamics Is Learned by Gradient Descent for Temporal Processing Tasks. Neural Comput 2023; 35:555-592. PMID: 36827598; PMCID: PMC10044000; DOI: 10.1162/neco_a_01571.
Abstract
Individual neurons in the brain have complex intrinsic dynamics that are highly diverse. We hypothesize that the complex dynamics produced by networks of complex and heterogeneous neurons may contribute to the brain's ability to process and respond to temporally complex data. To study the role of complex and heterogeneous neuronal dynamics in network computation, we develop a rate-based neuronal model, the generalized-leaky-integrate-and-fire-rate (GLIFR) model, which is a rate equivalent of the generalized-leaky-integrate-and-fire model. The GLIFR model has multiple dynamical mechanisms, which add to the complexity of its activity while maintaining differentiability. We focus on the role of after-spike currents, currents induced or modulated by neuronal spikes, in producing rich temporal dynamics. We use machine learning techniques to learn both synaptic weights and parameters underlying intrinsic dynamics to solve temporal tasks. The GLIFR model allows the use of standard gradient descent techniques rather than surrogate gradient descent, which has been used in spiking neural networks. After establishing the ability to optimize parameters using gradient descent in single neurons, we ask how networks of GLIFR neurons learn and perform on temporally challenging tasks, such as sequential MNIST. We find that these networks learn diverse parameters, which gives rise to diversity in neuronal dynamics, as demonstrated by clustering of neuronal parameters. GLIFR networks have mixed performance when compared to vanilla recurrent neural networks, with higher performance in pixel-by-pixel MNIST but lower in line-by-line MNIST. However, they appear to be more robust to random silencing. We find that the ability to learn heterogeneity and the presence of after-spike currents contribute to these gains in performance. 
Our work demonstrates both the computational robustness of neuronal complexity and diversity in networks and a feasible method of training such models using exact gradients.
Affiliation(s)
- Chloe N Winston
- Departments of Neuroscience and Computer Science, University of Washington, Seattle, WA 98195, U.S.A.
- University of Washington Computational Neuroscience Center, Seattle, WA 98195, U.S.A.
- Dana Mastrovito
- Allen Institute for Brain Science, Seattle, WA 98109, U.S.A.
- Eric Shea-Brown
- University of Washington Computational Neuroscience Center, Seattle, WA 98195, U.S.A.
- Allen Institute for Brain Science, Seattle, WA 98109, U.S.A.
- Department of Applied Mathematics, University of Washington, Seattle, WA 98195, U.S.A.
- Stefan Mihalas
- University of Washington Computational Neuroscience Center, Seattle, WA 98195, U.S.A.
- Allen Institute for Brain Science, Seattle, WA 98109, U.S.A.
- Department of Applied Mathematics, University of Washington, Seattle, WA 98195, U.S.A.
6. Joshi SN, Joshi AN, Joshi ND. Interplay between biochemical processes and network properties generates neuronal up and down states at the tripartite synapse. Phys Rev E 2023; 107:024415. PMID: 36932559; DOI: 10.1103/physreve.107.024415.
Abstract
Neuronal up and down states have long been known to exist both in vitro and in vivo. A variety of functions and mechanisms have been proposed for their generation, but there has not been a clear connection between the functions and mechanisms. We explore the potential contribution of cellular-level biochemistry to the network-level mechanisms thought to underlie the generation of up and down states. We develop a neurochemical model of a single tripartite synapse, assumed to be within a network of similar tripartite synapses, to investigate possible function-mechanism links for the appearance of up and down states. We characterize the behavior of our model in different regions of parameter space and show that resource limitation at the tripartite synapse affects its ability to faithfully transmit input signals, leading to extinction-down states. Recovery of resources allows for "reignition" into up states. The tripartite synapse exhibits distinctive "regimes" of operation depending on whether ATP, neurotransmitter (glutamate), both, or neither, is limiting. Our model qualitatively matches the behavior of six disparate experimental systems, including both in vitro and in vivo models, without changing any model parameters except those related to the experimental conditions. We also explore the effects of varying different critical parameters within the model. Here we show that availability of energy, represented by ATP, and glutamate for neurotransmission at the cellular level are intimately related, and are capable of promoting state transitions at the network level as ignition and extinction phenomena. Our model is complementary to existing models of neuronal up and down states in that it focuses on cellular-level dynamics while still retaining essential network-level processes. 
Our model predicts the existence of a "final common pathway" of behavior at the tripartite synapse arising from scarcity of resources and may explain use dependence in the phenomenon of "local sleep." Ultimately, sleeplike behavior may be a fundamental property of networks of tripartite synapses.
Affiliation(s)
- Shubhada N Joshi
- National Center for Adaptive Neurotechnologies (NCAN), David Axelrod Institute, Wadsworth Center, New York State Department of Health, 120 New Scotland Ave., Albany, New York 12208, USA
- Aditya N Joshi
- Stanford University School of Medicine, 300 Pasteur Dr., Stanford, California 94305, USA
- Narendra D Joshi
- General Electric Global Research, 1 Research Circle, Niskayuna, New York 12309, USA
7. Input correlations impede suppression of chaos and learning in balanced firing-rate networks. PLoS Comput Biol 2022; 18:e1010590. DOI: 10.1371/journal.pcbi.1010590.
Abstract
Neural circuits exhibit complex activity patterns, both spontaneously and evoked by external stimuli. Information encoding and learning in neural circuits depend on how well time-varying stimuli can control spontaneous network activity. We show that in firing-rate networks in the balanced state, external control of recurrent dynamics, i.e., the suppression of internally-generated chaotic variability, strongly depends on correlations in the input. A distinctive feature of balanced networks is that, because common external input is dynamically canceled by recurrent feedback, it is far more difficult to suppress chaos with common input into each neuron than through independent input. To study this phenomenon, we develop a non-stationary dynamic mean-field theory for driven networks. The theory explains how the activity statistics and the largest Lyapunov exponent depend on the frequency and amplitude of the input, recurrent coupling strength, and network size, for both common and independent input. We further show that uncorrelated inputs facilitate learning in balanced networks.
8. Zhang S, Liu A, Zhou Z, Huang Z, Cheng J, Chen D, Zhong Q, Yu Q, Peng Z, Hong M. Clinical features and power spectral entropy of electroencephalography in Wilson's disease with dystonia. Brain Behav 2022; 12:e2791. PMID: 36282481; PMCID: PMC9759124; DOI: 10.1002/brb3.2791.
Abstract
OBJECTIVE: To study the clinical features and power spectral entropy (PSE) of electroencephalography signals in Wilson's disease (WD) patients with dystonia. METHODS: Several scale evaluations were performed to assess the clinical features of WD patients. Demographic information and electroencephalography signals were obtained for all subjects. RESULTS: Thirty-four WD patients with dystonia were recruited into the case group and 24 patients without dystonia into the control group; 20 healthy individuals formed the healthy control group. The mean body mass index (BMI) in the case group was significantly lower than in the controls (p < .05). The case group had significantly higher SAS, SDS, and Bucco-Facial-Apraxia Assessment scores (p < .05). Total BADS scores in the case group were lower than those in the control group (p < .01). Of the case group, 94.11% presented with dysarthria and 70.59% suffered from dysphagia; the dysphagia was mainly related to the oral preparatory and oral stages. Mean PSE values in the case group differed significantly (p < .05) from those in the control and healthy groups across the different tasks. CONCLUSIONS: Patients with dystonia usually presented with low BMI, anxiety, depression, apraxia, executive dysfunction, dysarthria, and dysphagia. The cortical activity of the WD patients with dystonia appeared more chaotic during the eyes-closed and reading tasks, but less so during the swallowing stages, than that of the control group.
Affiliation(s)
- Shaoru Zhang, Aiqun Liu, Zhihua Zhou, Zheng Huang, Jing Cheng, Danping Chen, Qizhi Zhong, Qingyun Yu, Zhongxing Peng, Mingfan Hong
- Department of Neurology, The First Affiliated Hospital, Clinical Medicine College of Guangdong Pharmaceutical University, Guangzhou, Guangdong, China
9. Rohlfs C. A descriptive analysis of olfactory sensation and memory in Drosophila and its relation to artificial neural networks. Neurocomputing 2022. DOI: 10.1016/j.neucom.2022.10.068.
10. Cottam R, Vounckx R. Chaos, complexity and computation in the evolution of biological systems. Biosystems 2022; 217:104671. DOI: 10.1016/j.biosystems.2022.104671.
11. Knoll G, Lindner B. Information transmission in recurrent networks: Consequences of network noise for synchronous and asynchronous signal encoding. Phys Rev E 2022; 105:044411. PMID: 35590546; DOI: 10.1103/physreve.105.044411.
Abstract
Information about natural time-dependent stimuli encoded by the sensory periphery or communication between cortical networks may span a large frequency range or be localized to a smaller frequency band. Biological systems have been shown to multiplex such disparate broadband and narrow-band signals and then discriminate them in later populations by employing either an integration (low-pass) or coincidence detection (bandpass) encoding strategy. Analytical expressions have been developed for both encoding methods in feedforward populations of uncoupled neurons and confirm that the integration of a population's output low-pass filters the information, whereas synchronous output encodes less information overall and retains signal information in a selected frequency band. The present study extends the theory to recurrent networks and shows that recurrence may sharpen the synchronous bandpass filter. The frequency of the pass band is significantly influenced by the synaptic strengths, especially for inhibition-dominated networks. Synchronous information transfer is also increased when network models take into account heterogeneity that arises from the stochastic distribution of the synaptic weights.
Affiliation(s)
- Gregory Knoll
- Bernstein Center for Computational Neuroscience Berlin, Philippstr. 13, Haus 2, 10115 Berlin, Germany and Physics Department of Humboldt University Berlin, Newtonstr. 15, 12489 Berlin, Germany
- Benjamin Lindner
- Bernstein Center for Computational Neuroscience Berlin, Philippstr. 13, Haus 2, 10115 Berlin, Germany and Physics Department of Humboldt University Berlin, Newtonstr. 15, 12489 Berlin, Germany
12. Krishnamurthy K, Can T, Schwab DJ. Theory of Gating in Recurrent Neural Networks. Phys Rev X 2022; 12:011011. PMID: 36545030; PMCID: PMC9762509; DOI: 10.1103/physrevx.12.011011.
Abstract
Recurrent neural networks (RNNs) are powerful dynamical models, widely used in machine learning (ML) and neuroscience. Prior theoretical work has focused on RNNs with additive interactions. However, gating, i.e., multiplicative interactions, is ubiquitous in real neurons and is also the central feature of the best-performing RNNs in ML. Here, we show that gating offers flexible control of two salient features of the collective dynamics: (i) timescales and (ii) dimensionality. The gate controlling timescales leads to a novel marginally stable state, in which the network functions as a flexible integrator. Unlike previous approaches, gating permits this important function without parameter fine-tuning or special symmetries. Gates also provide a flexible, context-dependent mechanism to reset the memory trace, thus complementing the memory function. The gate modulating the dimensionality can induce a novel, discontinuous chaotic transition, in which inputs push a stable system to strong chaotic activity, in contrast to the typically stabilizing effect of inputs. At this transition, unlike in additive RNNs, the proliferation of critical points (topological complexity) is decoupled from the appearance of chaotic dynamics (dynamical complexity). The rich dynamics are summarized in phase diagrams, thus providing a map for principled parameter initialization choices to ML practitioners.
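The timescale-controlling gate can be sketched in a few lines: a multiplicative update gate scales the rate of change of each unit and thereby its effective time constant (a gate near zero freezes the unit). This is a hedged toy version with invented weight scales, not the specific gated architecture analyzed in the paper.

```python
import numpy as np

rng = np.random.default_rng(4)

def sigmoid(u):
    return 1.0 / (1.0 + np.exp(-u))

def gated_step(x, J, Jz, dt=0.1):
    """One Euler step of a gated rate network: the update gate z multiplies
    the rate of change of each unit and thus sets its effective timescale
    (z -> 0 freezes the unit, letting the network hold a memory trace)."""
    z = sigmoid(Jz @ np.tanh(x))                       # state-dependent update gate
    return x + dt * z * (-x + J @ np.tanh(x))

N, g = 100, 1.5
J = rng.standard_normal((N, N)) * g / np.sqrt(N)       # additive recurrent weights
Jz = rng.standard_normal((N, N)) / np.sqrt(N)          # gate weights
x = rng.standard_normal(N)
for _ in range(500):
    x = gated_step(x, J, Jz)
print(f"activity norm after 500 steps: {np.linalg.norm(x):.2f}")
```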
Affiliation(s)
- Kamesh Krishnamurthy
- Joseph Henry Laboratories of Physics and PNI, Princeton University, Princeton, New Jersey 08544, USA
- Tankut Can
- Institute for Advanced Study, Princeton, New Jersey 08540, USA
- David J. Schwab
- Initiative for Theoretical Sciences, Graduate Center, CUNY, New York, New York 10016, USA
13. Gastaldi C, Schwalger T, De Falco E, Quiroga RQ, Gerstner W. When shared concept cells support associations: Theory of overlapping memory engrams. PLoS Comput Biol 2021; 17:e1009691. PMID: 34968383; PMCID: PMC8754331; DOI: 10.1371/journal.pcbi.1009691.
Abstract
Assemblies of neurons, called concept cells, encode acquired concepts in the human medial temporal lobe. Those concept cells that are shared between two assemblies have been hypothesized to encode associations between concepts. Here we test this hypothesis in a computational model of attractor neural networks. We find that for concepts encoded in sparse neural assemblies there is a minimal fraction c_min of neurons shared between assemblies below which associations cannot be reliably implemented, and a maximal fraction c_max of shared neurons above which single concepts can no longer be retrieved. In the presence of a periodically modulated background signal, such as hippocampal oscillations, recall takes the form of association chains reminiscent of those postulated by theories of free recall of words. Predictions of an iterative overlap-generating model match experimental data on the number of concepts to which a neuron responds.

Experimental evidence suggests that associations between concepts are encoded in the hippocampus by cells shared between neuronal assemblies ("overlap" of concepts). What is the necessary overlap that ensures a reliable encoding of associations? Under which conditions can associations induce a simultaneous or a chain-like activation of concepts? Our theoretical model shows that the ideal overlap presents a tradeoff: the overlap should be larger than a minimum value in order to reliably encode associations, but lower than a maximum value to prevent loss of individual memories. Our theory explains experimental data from the human medial temporal lobe and provides a mechanism for chain-like recall in the presence of inhibition, while still allowing for simultaneous recall if inhibition is weak.
Affiliation(s)
- Chiara Gastaldi
- School of Computer and Communication Sciences and School of Life Sciences, École Polytechnique Fédérale de Lausanne, Lausanne, Switzerland
- Tilo Schwalger
- Institut für Mathematik, Technische Universität Berlin, Berlin, Germany
- Emanuela De Falco
- School of Life Sciences, École Polytechnique Fédérale de Lausanne, Lausanne, Switzerland
- Rodrigo Quian Quiroga
- Centre for Systems Neuroscience, University of Leicester, Leicester, United Kingdom
- Peng Cheng Laboratory, Shenzhen, China
- Wulfram Gerstner
- School of Computer and Communication Sciences and School of Life Sciences, École Polytechnique Fédérale de Lausanne, Lausanne, Switzerland
14. Ramlow L, Lindner B. Interspike interval correlations in neuron models with adaptation and correlated noise. PLoS Comput Biol 2021; 17:e1009261. PMID: 34449771; PMCID: PMC8428727; DOI: 10.1371/journal.pcbi.1009261.
Abstract
The generation of neural action potentials (spikes) is random but nevertheless may result in a rich statistical structure of the spike sequence. In particular, contrary to the popular renewal assumption of theoreticians, the intervals between adjacent spikes are often correlated. Experimentally, different patterns of interspike-interval correlations have been observed and computational studies have identified spike-frequency adaptation and correlated noise as the two main mechanisms that can lead to such correlations. Analytical studies have focused on the single cases of either correlated (colored) noise or adaptation currents in combination with uncorrelated (white) noise. For low-pass filtered noise or adaptation, the serial correlation coefficient can be approximated as a single geometric sequence of the lag between the intervals, providing an explanation for some of the experimentally observed patterns. Here we address the problem of interval correlations for a widely used class of models, multidimensional integrate-and-fire neurons subject to a combination of colored and white noise sources and a spike-triggered adaptation current. Assuming weak noise, we derive a simple formula for the serial correlation coefficient, a sum of two geometric sequences, which accounts for a large class of correlation patterns. The theory is confirmed by means of numerical simulations in a number of special cases including the leaky, quadratic, and generalized integrate-and-fire models with colored noise and spike-frequency adaptation. Furthermore we study the case in which the adaptation current and the colored noise share the same time scale, corresponding to a slow stochastic population of adaptation channels; we demonstrate that our theory can account for a nonmonotonic dependence of the correlation coefficient on the channel’s time scale. Another application of the theory is a neuron driven by network-noise-like fluctuations (green noise). 
We also discuss the range of validity of our weak-noise theory and show that by changing the relative strength of white and colored noise sources, we can change the sign of the correlation coefficient. Finally, we apply our theory to a conductance-based model, which demonstrates its broad applicability.

The elementary processing units in the central nervous system are neurons that transmit information by short electrical pulses, so-called action potentials or spikes. The generation of the action potential is a random process that can be shaped by correlated fluctuations (colored noise) and by adaptation. A consequence of these two ubiquitous features is that the successive time intervals between spikes, the interspike intervals, are not independent but correlated. As these correlations can significantly improve information transmission and weak-signal detection, it is an important task to develop analytical approaches to these statistics for well-established computational models. Here we present a theory of interval correlations for a widely used class of integrate-and-fire models endowed with an adaptation mechanism and subject to correlated fluctuations. We demonstrate which patterns of interval correlations can be expected from the interplay of colored noise, adaptation and intrinsic nonlinear dynamics.
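The simplest case covered by this class of models is easy to reproduce: a leaky integrate-and-fire neuron with white noise and a spike-triggered adaptation current, whose serial correlation coefficients rho_k are then estimated from the simulated intervals. All parameter values below are illustrative; for this mechanism (adaptation plus white-only noise) one expects negative adjacent-interval correlations.

```python
import numpy as np

rng = np.random.default_rng(5)

def adaptive_lif_isis(n_spikes, mu=1.5, tau_a=5.0, delta_a=0.3, D=0.01, dt=0.005):
    """Interspike intervals of a leaky integrate-and-fire neuron (threshold 1,
    reset 0) driven by white noise, with a spike-triggered adaptation
    current a that jumps by delta_a at each spike and decays with tau_a."""
    v, a, t, t_last = 0.0, 0.0, 0.0, 0.0
    noise = np.sqrt(2.0 * D * dt)
    isis = []
    while len(isis) < n_spikes:
        v += dt * (mu - v - a) + noise * rng.standard_normal()
        a -= dt * a / tau_a
        t += dt
        if v >= 1.0:
            v = 0.0
            a += delta_a
            isis.append(t - t_last)
            t_last = t
    return np.array(isis)

def scc(isis, k):
    """Serial correlation coefficient rho_k between intervals lag k apart."""
    x = isis - isis.mean()
    return np.dot(x[:-k], x[k:]) / np.dot(x, x)

isis = adaptive_lif_isis(2000)[50:]        # drop the adaptation transient
rho = [scc(isis, k) for k in (1, 2, 3)]
print("rho_1, rho_2, rho_3:", np.round(rho, 3))
```

Adding a colored-noise source to this sketch is what, per the theory above, can flip the sign of the correlations.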
Affiliation(s)
- Lukas Ramlow
- Bernstein Center for Computational Neuroscience Berlin, Berlin, Germany
- Physics Department, Humboldt University zu Berlin, Berlin, Germany
- Benjamin Lindner
- Bernstein Center for Computational Neuroscience Berlin, Berlin, Germany
- Physics Department, Humboldt University zu Berlin, Berlin, Germany
15
Pietras B, Gallice N, Schwalger T. Low-dimensional firing-rate dynamics for populations of renewal-type spiking neurons. Phys Rev E 2021; 102:022407. [PMID: 32942450 DOI: 10.1103/physreve.102.022407] [Citation(s) in RCA: 10] [Impact Index Per Article: 3.3] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/20/2020] [Accepted: 06/29/2020] [Indexed: 11/07/2022]
Abstract
The macroscopic dynamics of large populations of neurons can be mathematically analyzed using low-dimensional firing-rate or neural-mass models. However, these models fail to capture spike synchronization effects and nonstationary responses of the population activity to rapidly changing stimuli. Here we derive low-dimensional firing-rate models for homogeneous populations of neurons modeled as time-dependent renewal processes. The class of renewal neurons includes integrate-and-fire models driven by white noise and has been frequently used to model neuronal refractoriness and spike synchronization dynamics. The derivation is based on an eigenmode expansion of the associated refractory density equation, which generalizes previous spectral methods for Fokker-Planck equations to arbitrary renewal models. We find a simple relation between the eigenvalues characterizing the timescales of the firing rate dynamics and the Laplace transform of the interspike interval density, for which explicit expressions are available for many renewal models. Retaining only the first eigenmode already yields a reliable low-dimensional approximation of the firing-rate dynamics that captures spike synchronization effects and fast transient dynamics at stimulus onset. We explicitly demonstrate the validity of our model for a large homogeneous population of Poisson neurons with absolute refractoriness and other renewal models that admit an explicit analytical calculation of the eigenvalues. The eigenmode expansion presented here provides a systematic framework for alternative firing-rate models in computational neuroscience based on spiking neuron dynamics with refractoriness.
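For the Poisson neuron with absolute refractoriness mentioned above, the Laplace transform of the ISI density is indeed available in closed form. A small sketch (rate and refractory period are illustrative values) checks the closed form against direct numerical integration of the defining integral:

```python
import numpy as np

# ISI density of a Poisson neuron with rate r and absolute refractory
# period Delta: rho(t) = r * exp(-r * (t - Delta)) for t >= Delta, else 0.
# Its Laplace transform has the closed form rho~(s) = r * exp(-s*Delta) / (s + r).
r, Delta = 5.0, 0.02                    # illustrative values

def isi_laplace_closed(s):
    return r * np.exp(-s * Delta) / (s + r)

def isi_laplace_numeric(s, tmax=10.0, n=400_000):
    # midpoint-rule quadrature of integral_{Delta}^{tmax} exp(-s t) rho(t) dt
    dt = (tmax - Delta) / n
    t = Delta + dt * (np.arange(n) + 0.5)
    rho = r * np.exp(-r * (t - Delta))
    return np.sum(np.exp(-s * t) * rho) * dt

# the two agree to high accuracy for moderate s
```

It is such explicit transforms that, via the paper's relation between the eigenvalues and the Laplace transform of the ISI density, make the eigenmode expansion analytically tractable for many renewal models.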
Affiliation(s)
- Bastian Pietras
- Institute of Mathematics, Technical University Berlin, 10623 Berlin, Germany
- Bernstein Center for Computational Neuroscience Berlin, 10115 Berlin, Germany
- Noé Gallice
- Brain Mind Institute, École polytechnique fédérale de Lausanne (EPFL), Station 15, CH-1015 Lausanne, Switzerland
- Tilo Schwalger
- Institute of Mathematics, Technical University Berlin, 10623 Berlin, Germany
- Bernstein Center for Computational Neuroscience Berlin, 10115 Berlin, Germany
16
Bondanelli G, Deneux T, Bathellier B, Ostojic S. Network dynamics underlying OFF responses in the auditory cortex. eLife 2021; 10:e53151. [PMID: 33759763 PMCID: PMC8057817 DOI: 10.7554/elife.53151] [Citation(s) in RCA: 10] [Impact Index Per Article: 3.3] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/30/2019] [Accepted: 03/19/2021] [Indexed: 11/13/2022] Open
Abstract
Across sensory systems, complex spatio-temporal patterns of neural activity arise following the onset (ON) and offset (OFF) of stimuli. While ON responses have been widely studied, the mechanisms generating OFF responses in cortical areas have so far not been fully elucidated. We examine here the hypothesis that OFF responses are single-cell signatures of recurrent interactions at the network level. To test this hypothesis, we performed population analyses of two-photon calcium recordings in the auditory cortex of awake mice listening to auditory stimuli, and compared them to linear single-cell and network models. While the single-cell model explained some prominent features of the data, it could not capture the structure across stimuli and trials. In contrast, the network model accounted for the low-dimensional organization of population responses and their global structure across stimuli, where distinct stimuli activated mostly orthogonal dimensions in the neural state-space.
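The network hypothesis can be illustrated with a toy linear model (two units, assumed parameters, far simpler than the fitted network model of the paper): in a stable linear network with non-normal, feedforward-like connectivity, the state reached at stimulus offset is transiently amplified before decaying, producing an OFF-like response:

```python
import numpy as np

# Transient amplification in a linear recurrent network dx/dt = A x with
# A = -I + W. A non-normal (feedforward-like) W yields a large transient
# from the state at stimulus offset, even though the dynamics are stable.
W = np.array([[0.0, 0.0],
              [8.0, 0.0]])            # strong feedforward coupling: non-normal
A = -np.eye(2) + W                    # both eigenvalues of A equal -1 (stable)

x = np.array([1.0, 0.0])              # state reached at stimulus offset
dt, steps = 1e-3, 5000
norms = []
for _ in range(steps):                # forward-Euler integration
    x = x + dt * (A @ x)
    norms.append(np.linalg.norm(x))

peak = max(norms)
# the activity norm transiently grows well above its initial value of 1
# before decaying back toward zero
```

The exact solution here is x2(t) = 8 t e^(-t), which peaks near t = 1 at about 8/e before decaying, so purely stable recurrent interactions alone can generate a pronounced OFF transient.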
Affiliation(s)
- Giulio Bondanelli
- Laboratoire de Neurosciences Cognitives et Computationelles, Département d’études cognitives, ENS, PSL University, INSERM, Paris, France
- Neural Computation Laboratory, Center for Human Technologies, Istituto Italiano di Tecnologia (IIT), Genoa, Italy
- Thomas Deneux
- Départment de Neurosciences Intégratives et Computationelles (ICN), Institut des Neurosciences Paris-Saclay (NeuroPSI), UMR 9197 CNRS, Université Paris Sud, Gif-sur-Yvette, France
- Brice Bathellier
- Départment de Neurosciences Intégratives et Computationelles (ICN), Institut des Neurosciences Paris-Saclay (NeuroPSI), UMR 9197 CNRS, Université Paris Sud, Gif-sur-Yvette, France
- Institut Pasteur, INSERM, Institut de l’Audition, Paris, France
- Srdjan Ostojic
- Laboratoire de Neurosciences Cognitives et Computationelles, Département d’études cognitives, ENS, PSL University, INSERM, Paris, France
17
Unpredictable Oscillations for Hopfield-Type Neural Networks with Delayed and Advanced Arguments. MATHEMATICS 2021. [DOI: 10.3390/math9050571] [Citation(s) in RCA: 7] [Impact Index Per Article: 2.3] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 11/16/2022]
Abstract
This is the first time that the method for the investigation of unpredictable solutions of differential equations has been extended to unpredictable oscillations of neural networks with a generalized piecewise constant argument, which is both delayed and advanced. The existence and exponential stability of the unique unpredictable oscillation are proven. According to the theory, the presence of unpredictable oscillations is strong evidence for Poincaré chaos. Consequently, the paper is a contribution to chaos applications in neuroscience. The model is inspired by chaotic time-varying stimuli, which allow studying the distribution of chaotic signals in neural networks. Unpredictable inputs create an excitation wave of neurons that transmit chaotic signals. The technique of analysis includes the ideas used for differential equations with a piecewise constant argument. The results are illustrated by examples and simulations, carried out in MATLAB Simulink to demonstrate the simplicity of the diagrammatic approach.
18
Li Z, Dong Z, Bai X, Liu M. Characterizing the orientation selectivity in V1 and V4 of macaques by quadratic phase coupling. J Neural Eng 2020; 17:036028. [PMID: 32480396 DOI: 10.1088/1741-2552/ab9843] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/12/2022]
Abstract
OBJECTIVE Orientation selectivity is one of the defining characteristics of neurons in the primary visual cortex (V1). Some neurons in extrastriate visual cortical areas also exhibit a degree of orientation selectivity, but how this selectivity is generated is still not well understood. Most previous studies of orientation selectivity are based on the spike firing rate. However, spikes are prone to bias introduced by detection and sorting algorithms. In this paper, the local field potential (LFP) is therefore adopted to investigate the mechanism of orientation selectivity. APPROACH We used quadratic phase coupling (QPC), calculated by wavelet bicoherence, to describe the characteristics of orientation selectivity in V1 and V4. The raw wideband neural signals were recorded by two chronically implanted multi-electrode arrays, placed in V1 and V4 of two macaques performing a selective visual attention task. MAIN RESULTS There is a strong correlation between the total bicoherence (TotalBic), which quantifies the overall QPC of frequency pairs in the gamma band, and the grating orientation. Furthermore, the QPC distribution at the non-preferred orientation is mainly concentrated in the low frequencies (30-40 Hz) of the gamma band, while the QPC distribution at the preferred orientation concentrates in both the low frequencies and the high frequencies (60-80 Hz). In addition, the TotalBic of the gamma-band LFP between V1 and V4 varies with the grating orientation, indicating that QPC is present in the feedforward pathway and that the gamma-band LFP in V1 modulates the QPC in V4. SIGNIFICANCE The QPC reflects the orientation of the sinusoidal grating and describes the interaction of gamma-band LFPs between different brain regions. Our results suggest that QPC offers an effective alternative avenue for exploring the mechanism that generates the orientation selectivity of visual neurons.
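Quadratic phase coupling and bicoherence can be illustrated on synthetic data (a segment-averaged Fourier bicoherence rather than the wavelet bicoherence used in the paper; all frequencies and sizes are arbitrary): three cosines at f1, f2, and f1+f2 give a bicoherence near 1 when the third phase is locked to the sum of the first two, and near 0 when it is independent.

```python
import numpy as np

rng = np.random.default_rng(0)

fs, n_seg, seg_len = 1000, 200, 1000     # seg_len = fs puts tones on exact bins
f1, f2 = 50, 80                          # candidate coupled pair; f1 + f2 = 130 Hz
k1 = f1 * seg_len // fs                  # FFT bin indices of the two tones
k2 = f2 * seg_len // fs

def bicoherence(coupled):
    """Segment-averaged bicoherence at (f1, f2) for a 3-tone test signal."""
    num, d1, d2 = 0j, 0.0, 0.0
    t = np.arange(seg_len) / fs
    for _ in range(n_seg):
        p1, p2 = rng.uniform(0, 2 * np.pi, 2)
        # phase of the f1+f2 component: locked (QPC) or independent
        p3 = p1 + p2 if coupled else rng.uniform(0, 2 * np.pi)
        x = (np.cos(2 * np.pi * f1 * t + p1)
             + np.cos(2 * np.pi * f2 * t + p2)
             + np.cos(2 * np.pi * (f1 + f2) * t + p3)
             + 0.1 * rng.standard_normal(seg_len))
        X = np.fft.rfft(x)
        num += X[k1] * X[k2] * np.conj(X[k1 + k2])
        d1 += abs(X[k1] * X[k2]) ** 2
        d2 += abs(X[k1 + k2]) ** 2
    return abs(num) / np.sqrt(d1 * d2)
```

With phase locking the triple products of all segments align, so the normalized magnitude approaches 1; with independent phases they average out toward roughly 1/sqrt(n_seg).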
Affiliation(s)
- Zhaohui Li
- School of Information Science and Engineering, Yanshan University, Qinhuangdao, People's Republic of China. Hebei Key Laboratory of Information Transmission and Signal Processing, Yanshan University, Qinhuangdao, People's Republic of China
19
Muscinelli SP, Gerstner W, Schwalger T. How single neuron properties shape chaotic dynamics and signal transmission in random neural networks. PLoS Comput Biol 2019; 15:e1007122. [PMID: 31181063 PMCID: PMC6586367 DOI: 10.1371/journal.pcbi.1007122] [Citation(s) in RCA: 20] [Impact Index Per Article: 4.0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/11/2019] [Revised: 06/20/2019] [Accepted: 05/22/2019] [Indexed: 02/07/2023] Open
Abstract
While most models of randomly connected neural networks assume single-neuron models with simple dynamics, neurons in the brain exhibit complex intrinsic dynamics over multiple timescales. We analyze how the dynamical properties of single neurons and recurrent connections interact to shape the effective dynamics in large randomly connected networks. A novel dynamical mean-field theory for strongly connected networks of multi-dimensional rate neurons shows that the power spectrum of the network activity in the chaotic phase emerges from a nonlinear sharpening of the frequency response function of single neurons. For the case of two-dimensional rate neurons with strong adaptation, we find that the network exhibits a state of "resonant chaos", characterized by robust, narrow-band stochastic oscillations. The coherence of stochastic oscillations is maximal at the onset of chaos and their correlation time scales with the adaptation timescale of single units. Surprisingly, the resonance frequency can be predicted from the properties of isolated neurons, even in the presence of heterogeneity in the adaptation parameters. In the presence of these internally-generated chaotic fluctuations, the transmission of weak, low-frequency signals is strongly enhanced by adaptation, whereas signal transmission is not influenced by adaptation in the non-chaotic regime. Our theoretical framework can be applied to other mechanisms at the level of single neurons, such as synaptic filtering, refractoriness or spike synchronization. These results advance our understanding of the interaction between the dynamics of single units and recurrent connectivity, which is a fundamental step toward the description of biologically realistic neural networks.
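The role of the single-neuron frequency response can be sketched for a two-dimensional rate unit with adaptation (all parameters are assumptions for illustration): its linear response function develops a resonance at nonzero frequency when adaptation is strong, and it is this peak that, according to the paper, the recurrent network nonlinearly sharpens into narrow-band stochastic oscillations.

```python
import numpy as np

# Linear response (susceptibility) of a single rate unit with adaptation:
#   tau_m x' = -x - g a + I(t),   tau_a a' = x - a
# For input I ~ exp(i w t), eliminating a gives
#   chi(w) = 1 / (1 + i w tau_m + g / (1 + i w tau_a)).
# Illustrative parameters with strong adaptation (large g, slow tau_a).
tau_m, tau_a, g = 0.1, 1.0, 5.0

w = np.linspace(0.0, 50.0, 5001)
chi = 1.0 / (1.0 + 1j * w * tau_m + g / (1.0 + 1j * w * tau_a))
gain = np.abs(chi)

w_res = w[np.argmax(gain)]   # resonance frequency of the isolated unit
# the gain peaks at w_res > 0 and well above the zero-frequency gain 1/(1+g)
```

At low frequencies the adaptation feedback suppresses the response (gain 1/(1+g)), at high frequencies the membrane filter takes over, and in between the gain peaks, which is the single-unit resonance referred to in the abstract.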
Affiliation(s)
- Samuel P. Muscinelli
- School of Computer and Communication Sciences and School of Life Sciences, École polytechnique fédérale de Lausanne, Station 15, CH-1015 Lausanne EPFL, Switzerland
- Wulfram Gerstner
- School of Computer and Communication Sciences and School of Life Sciences, École polytechnique fédérale de Lausanne, Station 15, CH-1015 Lausanne EPFL, Switzerland
- Tilo Schwalger
- Bernstein Center for Computational Neuroscience, 10115 Berlin, Germany
- Institut für Mathematik, Technische Universität Berlin, 10623 Berlin, Germany
20
Beiran M, Ostojic S. Contrasting the effects of adaptation and synaptic filtering on the timescales of dynamics in recurrent networks. PLoS Comput Biol 2019; 15:e1006893. [PMID: 30897092 PMCID: PMC6445477 DOI: 10.1371/journal.pcbi.1006893] [Citation(s) in RCA: 11] [Impact Index Per Article: 2.2] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/19/2018] [Revised: 04/02/2019] [Accepted: 02/19/2019] [Indexed: 11/19/2022] Open
Abstract
Neural activity in awake behaving animals exhibits a vast range of timescales that can be several fold larger than the membrane time constant of individual neurons. Two types of mechanisms have been proposed to explain this conundrum. One possibility is that large timescales are generated by a network mechanism based on positive feedback, but this hypothesis requires fine-tuning of the strength or structure of the synaptic connections. A second possibility is that large timescales in the neural dynamics are inherited from large timescales of underlying biophysical processes, two prominent candidates being intrinsic adaptive ionic currents and synaptic transmission. How the timescales of adaptation or synaptic transmission influence the timescale of the network dynamics has however not been fully explored. To address this question, here we analyze large networks of randomly connected excitatory and inhibitory units with additional degrees of freedom that correspond to adaptation or synaptic filtering. We determine the fixed points of the systems, their stability to perturbations and the corresponding dynamical timescales. Furthermore, we apply dynamical mean field theory to study the temporal statistics of the activity in the fluctuating regime, and examine how the adaptation and synaptic timescales transfer from individual units to the whole population. Our overarching finding is that synaptic filtering and adaptation in single neurons have very different effects at the network level. Unexpectedly, the macroscopic network dynamics do not inherit the large timescale present in adaptive currents. In contrast, the timescales of network activity increase proportionally to the time constant of the synaptic filter. 
Altogether, our study demonstrates that the timescales of different biophysical processes have different effects on the network level, so that the slow processes within individual neurons do not necessarily induce slow activity in large recurrent neural networks.
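The contrast between the two mechanisms can be previewed at the single-unit level (a toy linearization with assumed parameters, not the network mean-field analysis of the paper): negative feedback from adaptation speeds up the decay of the slow mode by roughly the feedback gain, whereas a synaptic filter passes its time constant through unchanged.

```python
import numpy as np

tau_m, tau_slow, g = 0.02, 1.0, 1.0   # membrane and slow time constants, gain

# (a) adaptation: tau_m x' = -x - g a + I,   tau_slow a' = x - a
A_adapt = np.array([[-1.0 / tau_m, -g / tau_m],
                    [1.0 / tau_slow, -1.0 / tau_slow]])

# (b) synaptic filtering: tau_m x' = -x + s,   tau_slow s' = -s + I
A_syn = np.array([[-1.0 / tau_m, 1.0 / tau_m],
                  [0.0, -1.0 / tau_slow]])

# slowest decay rate = smallest |Re(lambda)| of the dynamics matrix
slow_adapt = np.min(np.abs(np.linalg.eigvals(A_adapt).real))
slow_syn = np.min(np.abs(np.linalg.eigvals(A_syn).real))
# slow_syn equals 1/tau_slow exactly, while slow_adapt is faster by
# roughly the factor (1 + g) for tau_slow >> tau_m
```

This single-unit picture is only a preview; the abstract's main point is that the same asymmetry survives, and is sharpened, at the level of large random networks.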
Affiliation(s)
- Manuel Beiran
- Group for Neural Theory, Laboratoire de Neurosciences Cognitives Computationnelles, Département d’Études Cognitives, École Normale Supérieure, INSERM U960, PSL University, Paris, France
- Srdjan Ostojic
- Group for Neural Theory, Laboratoire de Neurosciences Cognitives Computationnelles, Département d’Études Cognitives, École Normale Supérieure, INSERM U960, PSL University, Paris, France