51. Zirkle J, Rubchinsky LL. Spike-Timing Dependent Plasticity Effect on the Temporal Patterning of Neural Synchronization. Front Comput Neurosci 2020; 14:52. [PMID: 32595464; PMCID: PMC7303326; DOI: 10.3389/fncom.2020.00052]
Abstract
Neural synchrony in the brain at rest is usually variable and intermittent; thus, intervals of predominantly synchronized activity are interrupted by intervals of desynchronized activity. Prior studies suggested that this temporal structure of the weakly synchronous activity might be functionally significant: many short desynchronizations may be functionally different from a few long desynchronizations even if the average synchrony level is the same. In this study, we used computational neuroscience methods to investigate the effects of spike-timing dependent plasticity (STDP) on the temporal patterns of synchronization in a simple model. We employed a small network of conductance-based model neurons that were connected via excitatory plastic synapses. The dynamics of this network was subjected to the time-series analysis methods used in prior experimental studies. We found that STDP could alter the synchronized dynamics in the network in several ways, depending on the time scale that plasticity acts on. However, in general, the action of STDP in the simple network considered here is to promote dynamics with short desynchronizations (i.e., dynamics reminiscent of that observed in experimental studies). A complex interplay of the cellular and synaptic dynamics may lead to the activity-dependent adjustment of synaptic strength in such a way as to facilitate experimentally observed short desynchronizations in the intermittently synchronized neural activity.
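For orientation, the study builds on a pair-based STDP rule acting on excitatory synapses. The sketch below shows the generic exponential pair-based update only; the amplitudes, time constants, and weight bound are arbitrary placeholders, not the parameters of the paper's conductance-based model.

```python
import numpy as np

# Minimal pair-based STDP rule (illustrative sketch; constants are arbitrary,
# not those of Zirkle & Rubchinsky, 2020).
A_PLUS, A_MINUS = 0.005, 0.00525   # potentiation / depression amplitudes
TAU_PLUS, TAU_MINUS = 20.0, 20.0   # time constants (ms)
W_MAX = 1.0

def stdp_update(w, t_pre, t_post):
    """Update one excitatory weight from a single pre/post spike-time pair."""
    dt = t_post - t_pre               # post minus pre spike time (ms)
    if dt >= 0:                       # pre before post: potentiation
        w += A_PLUS * np.exp(-dt / TAU_PLUS)
    else:                             # post before pre: depression
        w -= A_MINUS * np.exp(dt / TAU_MINUS)
    return float(np.clip(w, 0.0, W_MAX))

# Example: a weight repeatedly strengthened by causal (pre-before-post) pairings
w = 0.5
for t in range(0, 1000, 100):
    w = stdp_update(w, t_pre=t, t_post=t + 5)
print(w)
```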
Affiliation(s)
- Joel Zirkle: Department of Mathematical Sciences, Indiana University Purdue University Indianapolis, Indianapolis, IN, United States
- Leonid L Rubchinsky: Department of Mathematical Sciences, Indiana University Purdue University Indianapolis, Indianapolis, IN, United States; Stark Neurosciences Research Institute, Indiana University School of Medicine, Indianapolis, IN, United States
52. Tanaka T, Nakajima K, Aoyagi T. Effect of recurrent infomax on the information processing capability of input-driven recurrent neural networks. Neurosci Res 2020; 156:225-233. [DOI: 10.1016/j.neures.2020.02.001]
53. Casal MA, Galella S, Vilarroya O, Garcia-Ojalvo J. Soft-wired long-term memory in a natural recurrent neuronal network. Chaos 2020; 30:061101. [PMID: 32611119; DOI: 10.1063/5.0009709]
Abstract
Recurrent neuronal networks are known to be endowed with fading (short-term) memory, whereas long-term memory is usually considered to be hard-wired in the network connectivity via Hebbian learning, for instance. Here, we use the neuronal network of the roundworm C. elegans to show that recurrent architectures in living organisms can exhibit long-term memory without relying on specific hard-wired modules. We applied a genetic algorithm, using a binary genome that encodes for inhibitory-excitatory connectivity, to solve the unconstrained optimization problem of fitting the experimentally observed dynamics of the worm's neuronal network. Our results show that the network operates in a complex chaotic regime, as measured by the permutation entropy. In that complex regime, the response of the system to repeated presentations of a time-varying stimulus reveals a consistent behavior that can be interpreted as long-term memory. This memory is soft-wired, since it does not require structural changes in the network connectivity, but relies only on the system dynamics for encoding.
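Since the study characterizes the network regime with permutation entropy, a minimal sketch of the standard Bandt-Pompe estimate is given below for reference; the embedding order, delay, and test signal are illustrative choices, not those used by the authors.

```python
import math
import numpy as np

def permutation_entropy(x, order=3, delay=1):
    """Normalized permutation entropy of a 1-D time series (Bandt-Pompe scheme).
    A minimal sketch; order and delay are illustrative choices."""
    x = np.asarray(x, dtype=float)
    n = len(x) - (order - 1) * delay
    # ordinal pattern (rank order) of each embedded window
    patterns = [tuple(np.argsort(x[i:i + order * delay:delay])) for i in range(n)]
    _, counts = np.unique(patterns, axis=0, return_counts=True)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum() / np.log2(math.factorial(order)))

# Example: a noisy sine wave falls between perfectly ordered and fully random signals
t = np.linspace(0, 20 * np.pi, 2000)
print(permutation_entropy(np.sin(t) + 0.2 * np.random.randn(t.size)))
```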
Affiliation(s)
- Miguel A Casal: Department of Experimental and Health Sciences, Universitat Pompeu Fabra, Dr. Aiguader 88, 08003 Barcelona, Spain
- Santiago Galella: Department of Experimental and Health Sciences, Universitat Pompeu Fabra, Dr. Aiguader 88, 08003 Barcelona, Spain
- Oscar Vilarroya: Department of Psychiatry and Legal Medicine, Universitat Autònoma de Barcelona, Cerdanyola del Vallès 08193, Spain
- Jordi Garcia-Ojalvo: Department of Experimental and Health Sciences, Universitat Pompeu Fabra, Dr. Aiguader 88, 08003 Barcelona, Spain
54. Capone C, Rebollo B, Muñoz A, Illa X, Del Giudice P, Sanchez-Vives MV, Mattia M. Slow Waves in Cortical Slices: How Spontaneous Activity is Shaped by Laminar Structure. Cereb Cortex 2020; 29:319-335. [PMID: 29190336; DOI: 10.1093/cercor/bhx326]
Abstract
Cortical slow oscillations (SO) of neural activity spontaneously emerge and propagate during deep sleep and anesthesia and are also expressed in isolated brain slices and cortical slabs. We lack full understanding of how SO integrate the different structural levels underlying local excitability of cell assemblies and their mutual interaction. Here, we focus on ongoing slow waves (SWs) in cortical slices reconstructed from a 16-electrode array designed to probe the neuronal activity at multiple spatial scales. In spite of the variable propagation patterns observed, we reproducibly found a smooth strip of loci leading the SW fronts, overlapping cortical layers 4 and 5, along which Up states were the longest and displayed the highest firing rate. Propagation modes were uncorrelated in time, signaling a memoryless generation of SWs. All these features could be modeled by a multimodular large-scale network of spiking neurons with a specific balance between local and intermodular connectivity. Modules work as relaxation oscillators with a weakly stable Down state and a peak of local excitability to model layers 4 and 5. These conditions allow for both optimal sensitivity to the network structure and richness of propagation modes, both of which are potential substrates for dynamic flexibility in more general contexts.
Affiliation(s)
- Cristiano Capone: PhD Program in Physics, Sapienza University, Rome, Italy; Istituto Superiore di Sanità, Rome, Italy
- Beatriz Rebollo: IDIBAPS (Institut d'Investigacions Biomèdiques August Pi i Sunyer), Barcelona, Spain
- Xavi Illa: IMB-CNM-CSIC (Instituto de Microelectrónica de Barcelona), Universitat Autónoma de Barcelona, Barcelona, Spain; CIBER-BBN, Networking Center on Bioengineering, Biomaterials and Nanomedicine, Zaragoza, Spain
- Paolo Del Giudice: Istituto Superiore di Sanità, Rome, Italy; INFN-Roma1 (Istituto Nazionale di Fisica Nucleare), Rome, Italy
- Maria V Sanchez-Vives: IDIBAPS (Institut d'Investigacions Biomèdiques August Pi i Sunyer), Barcelona, Spain; ICREA (Institució Catalana de Recerca i Estudis Avançats), Barcelona, Spain
55. Morozov A, Abbott K, Cuddington K, Francis T, Gellner G, Hastings A, Lai YC, Petrovskii S, Scranton K, Zeeman ML. Long transients in ecology: Theory and applications. Phys Life Rev 2020; 32:1-40. [PMID: 31982327; DOI: 10.1016/j.plrev.2019.09.004]
Abstract
This paper discusses the recent progress in understanding the properties of transient dynamics in complex ecological systems. Predicting long-term trends as well as sudden changes and regime shifts in ecosystem dynamics is a major issue for ecology, as such changes often result in population collapse and extinctions. Analysis of population dynamics has traditionally been focused on their long-term, asymptotic behavior whilst largely disregarding the effect of transients. However, there is a growing understanding that in ecosystems the asymptotic behavior is rarely seen. A big new challenge for theoretical and empirical ecology is to understand the implications of long transients. It is believed that the identification of the corresponding mechanisms along with the knowledge of scaling laws of the transient's lifetime should substantially improve the quality of long-term forecasting and crisis anticipation. Although transient dynamics have received considerable attention in the physics literature, research into ecological transients is in its infancy and systematic studies are lacking. This text aims to partially bridge this gap and facilitate further progress in quantitative analysis of long transients in ecology. By revisiting and critically examining a broad variety of mathematical models used in ecological applications as well as empirical facts, we reveal several main mechanisms leading to the emergence of long transients and hence lay the basis for a unifying theory.
Affiliation(s)
- Andrew Morozov: Mathematics, University of Leicester, UK; Shirshov Institute of Oceanology, Moscow, Russia
- Tessa Francis: Tacoma Puget Sound Institute, University of Washington, USA
- Alan Hastings: Environmental Science and Policy, University of California, Davis, USA; Santa Fe Institute, Santa Fe, New Mexico, USA
- Ying-Cheng Lai: Electrical, Computer and Energy Engineering, Arizona State University, Tempe, USA
- Sergei Petrovskii: Mathematics, University of Leicester, UK; Peoples Friendship University of Russia (RUDN University), Moscow, Russia
56. Bondanelli G, Ostojic S. Coding with transient trajectories in recurrent neural networks. PLoS Comput Biol 2020; 16:e1007655. [PMID: 32053594; PMCID: PMC7043794; DOI: 10.1371/journal.pcbi.1007655]
Abstract
Following a stimulus, the neural response typically strongly varies in time and across neurons before settling to a steady-state. While classical population coding theory disregards the temporal dimension, recent works have argued that trajectories of transient activity can be particularly informative about stimulus identity and may form the basis of computations through dynamics. Yet the dynamical mechanisms needed to generate a population code based on transient trajectories have not been fully elucidated. Here we examine transient coding in a broad class of high-dimensional linear networks of recurrently connected units. We start by reviewing a well-known result that leads to a distinction between two classes of networks: networks in which all inputs lead to weak, decaying transients, and networks in which specific inputs elicit amplified transient responses and are mapped onto output states during the dynamics. These two classes are simply distinguished based on the spectrum of the symmetric part of the connectivity matrix. For the second class of networks, which is a sub-class of non-normal networks, we provide a procedure to identify transiently amplified inputs and the corresponding readouts. We first apply these results to standard randomly-connected and two-population networks. We then build minimal, low-rank networks that robustly implement trajectories mapping a specific input onto a specific orthogonal output state. Finally, we demonstrate that the capacity of the obtained networks increases proportionally with their size.
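The distinction drawn in the abstract, between networks whose transients all decay and networks that transiently amplify specific inputs, is usually stated for linear rate dynamics of the form dx/dt = -x + Wx. The sketch below, under that assumption, checks both the stability of W and whether the symmetric part of W has an eigenvalue above one; the example matrix is a toy feedforward motif, not taken from the paper.

```python
import numpy as np

def classify_network(W):
    """Classify a linear rate network dx/dt = -x + W x (sketch of the criterion
    the abstract refers to; assumes this standard linear form).
    Stable:      all eigenvalues of W have real part < 1.
    Amplifying:  largest eigenvalue of the symmetric part (W + W.T)/2 exceeds 1,
                 so some inputs are transiently amplified before decaying."""
    stable = np.max(np.linalg.eigvals(W).real) < 1.0
    amplifying = np.max(np.linalg.eigvalsh((W + W.T) / 2.0)) > 1.0
    return stable, amplifying

# Example: a feedforward (non-normal) motif that is stable yet strongly amplifying
W = np.array([[0.0, 0.0],
              [3.0, 0.0]])        # unit 1 drives unit 2 with a large weight
print(classify_network(W))       # -> (True, True)
```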
Affiliation(s)
- Giulio Bondanelli: Laboratoire de Neurosciences Cognitives et Computationelles, Département d’Études Cognitives, École Normale Supérieure, INSERM U960, PSL University, Paris, France
- Srdjan Ostojic: Laboratoire de Neurosciences Cognitives et Computationelles, Département d’Études Cognitives, École Normale Supérieure, INSERM U960, PSL University, Paris, France
57. Dynamical Emergence Theory (DET): A Computational Account of Phenomenal Consciousness. Minds Mach (Dordr) 2020. [DOI: 10.1007/s11023-020-09516-9]
58. Pessoa L. Neural dynamics of emotion and cognition: From trajectories to underlying neural geometry. Neural Netw 2019; 120:158-166. [PMID: 31522827; PMCID: PMC6899176; DOI: 10.1016/j.neunet.2019.08.007]
Abstract
How can we study, characterize, and understand the neural underpinnings of cognitive-emotional behaviors as inherently dynamic processes? In the past 50 years, Stephen Grossberg has developed a research program that embraces the themes of dynamics, decentralized computation, emergence, selection and competition, and autonomy. The present paper discusses how these principles can be heeded by experimental scientists to advance the understanding of the brain basis of behavior. It is suggested that a profitable way forward is to focus on investigating the dynamic multivariate structure of brain data. Accordingly, central research problems involve characterizing "neural trajectories" and the associated geometry of the underlying "neural space." Finally, it is argued that, at a time when the development of neurotechniques has reached a fever pitch, neuroscience needs to redirect its focus and invest comparable energy in the conceptual and theoretical dimensions of its research endeavor. Otherwise we run the risk of being able to measure "every atom" in the brain in a theoretical vacuum.
Affiliation(s)
- Luiz Pessoa: Department of Psychology, Department of Electrical and Computer Engineering, Maryland Neuroimaging Center, University of Maryland, College Park, USA
59. Hemberger M, Shein-Idelson M, Pammer L, Laurent G. Reliable Sequential Activation of Neural Assemblies by Single Pyramidal Cells in a Three-Layered Cortex. Neuron 2019; 104:353-369.e5. [PMID: 31439429; DOI: 10.1016/j.neuron.2019.07.017]
Abstract
Recent studies reveal the occasional impact of single neurons on surround firing statistics and even simple behaviors. Exploiting the advantages of a simple cortex, we examined the influence of single pyramidal neurons on surrounding cortical circuits. Brief activation of single neurons triggered reliable sequences of firing in tens of other excitatory and inhibitory cortical neurons, reflecting cascading activity through local networks, as indicated by delayed yet precisely timed polysynaptic subthreshold potentials. The evoked patterns were specific to the pyramidal cell of origin, extended over hundreds of micrometers from their source, and unfolded over up to 200 ms. Simultaneous activation of pyramidal cell pairs indicated balanced control of population activity, preventing paroxysmal amplification. Single cortical pyramidal neurons can thus trigger reliable postsynaptic activity that can propagate in a reliable fashion through cortex, generating rapidly evolving and non-random firing sequences reminiscent of those observed in mammalian hippocampus during "replay" and in avian song circuits.
Affiliation(s)
- Mike Hemberger: Max Planck Institute for Brain Research, Frankfurt am Main, 60438 Germany
- Mark Shein-Idelson: Max Planck Institute for Brain Research, Frankfurt am Main, 60438 Germany; Department of Neurobiology, George S. Wise Faculty of Life Sciences, Sagol School for Neuroscience, Tel-Aviv University, Tel Aviv, Israel
- Lorenz Pammer: Max Planck Institute for Brain Research, Frankfurt am Main, 60438 Germany
- Gilles Laurent: Max Planck Institute for Brain Research, Frankfurt am Main, 60438 Germany
60. Lagzi F, Atay FM, Rotter S. Bifurcation analysis of the dynamics of interacting subnetworks of a spiking network. Sci Rep 2019; 9:11397. [PMID: 31388027; PMCID: PMC6684592; DOI: 10.1038/s41598-019-47190-9]
Abstract
We analyze the collective dynamics of hierarchically structured networks of densely connected spiking neurons. These networks of sub-networks may represent interactions between cell assemblies or different nuclei in the brain. The dynamical activity pattern that results from these interactions depends on the strength of synaptic coupling between them. Importantly, the overall dynamics of a brain region in the absence of external input, so-called ongoing brain activity, has been attributed to the dynamics of such interactions. In our study, two different network scenarios are considered: a system with one inhibitory and two excitatory subnetworks, and a network representation with three inhibitory subnetworks. To study the effect of synaptic strength on the global dynamics of the network, two parameters for relative couplings between these subnetworks are considered. For each case, a bifurcation analysis is performed and the results have been compared to large-scale network simulations. Our analysis shows that Generalized Lotka-Volterra (GLV) equations, well-known in predator-prey studies, yield a meaningful population-level description of the collective behavior of spiking neuronal networks with a hierarchical structure. In particular, we observed a striking equivalence between the bifurcation diagrams of spiking neuronal networks and their corresponding GLV equations. This study gives new insight into the behavior of neuronal assemblies, and can potentially suggest new mechanisms for altering the dynamical patterns of spiking networks based on changing the synaptic strength between some groups of neurons.
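For reference, the Generalized Lotka-Volterra description mentioned here takes the form dx_i/dt = x_i (r_i + sum_j A_ij x_j) for the subnetwork rates x_i. The sketch below integrates a three-population GLV system with a forward-Euler step; the growth rates and interaction matrix are placeholders, not values derived from the paper's spiking-network fits.

```python
import numpy as np

def simulate_glv(r, A, x0, dt=1e-3, steps=20000):
    """Forward-Euler integration of dx_i/dt = x_i * (r_i + sum_j A_ij x_j)."""
    x = np.array(x0, dtype=float)
    traj = np.empty((steps, len(x)))
    for k in range(steps):
        x = x + dt * x * (r + A @ x)
        traj[k] = x
    return traj

# Placeholder parameters: two excitatory pools and one inhibitory pool
r = np.array([1.0, 1.0, -0.5])
A = np.array([[-1.0, -0.2, -1.5],
              [-0.2, -1.0, -1.5],
              [ 1.0,  1.0, -0.5]])
traj = simulate_glv(r, A, x0=[0.1, 0.12, 0.1])
print(traj[-1])   # approximate steady-state rates of the three pools
```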
Affiliation(s)
- Fereshteh Lagzi: Bernstein Center Freiburg, Freiburg, Germany; Faculty of Biology, University of Freiburg, Freiburg, Germany
- Fatihcan M Atay: Department of Mathematics, Bilkent University, Ankara, Turkey
- Stefan Rotter: Bernstein Center Freiburg, Freiburg, Germany; Faculty of Biology, University of Freiburg, Freiburg, Germany
61. Remington ED, Narain D, Hosseini EA, Jazayeri M. Flexible Sensorimotor Computations through Rapid Reconfiguration of Cortical Dynamics. Neuron 2019; 98:1005-1019.e5. [PMID: 29879384; DOI: 10.1016/j.neuron.2018.05.020]
Abstract
Neural mechanisms that support flexible sensorimotor computations are not well understood. In a dynamical system whose state is determined by interactions among neurons, computations can be rapidly reconfigured by controlling the system's inputs and initial conditions. To investigate whether the brain employs such control mechanisms, we recorded from the dorsomedial frontal cortex of monkeys trained to measure and produce time intervals in two sensorimotor contexts. The geometry of neural trajectories during the production epoch was consistent with a mechanism wherein the measured interval and sensorimotor context exerted control over cortical dynamics by adjusting the system's initial condition and input, respectively. These adjustments, in turn, set the speed at which activity evolved in the production epoch, allowing the animal to flexibly produce different time intervals. These results provide evidence that the language of dynamical systems can be used to parsimoniously link brain activity to sensorimotor computations.
Affiliation(s)
- Evan D Remington: McGovern Institute for Brain Research, Massachusetts Institute of Technology, Cambridge, MA, USA
- Devika Narain: McGovern Institute for Brain Research, Massachusetts Institute of Technology, Cambridge, MA, USA; Department of Brain & Cognitive Sciences, Massachusetts Institute of Technology, Cambridge, MA, USA; Netherlands Institute for Neuroscience, Amsterdam, the Netherlands; Erasmus Medical Center, Rotterdam, the Netherlands
- Eghbal A Hosseini: Department of Brain & Cognitive Sciences, Massachusetts Institute of Technology, Cambridge, MA, USA
- Mehrdad Jazayeri: McGovern Institute for Brain Research, Massachusetts Institute of Technology, Cambridge, MA, USA; Department of Brain & Cognitive Sciences, Massachusetts Institute of Technology, Cambridge, MA, USA
62. Dudkowski D, Czołczyński K, Kapitaniak T. Traveling amplitude death in coupled pendula. Chaos 2019; 29:083124. [PMID: 31472496; DOI: 10.1063/1.5111191]
Abstract
We investigate the phenomenon of amplitude death, in two scenarios (traveling (TAD) and stationary), in coupled pendula with escapement mechanisms. The possible dynamics of the network is examined in the plane of coupling parameters, and the corresponding examples of attractors are discussed. We analyze the properties of the observed patterns, studying the period of one full cycle of TAD under the influence of the system's parameters, as well as the mechanism of its existence. It is shown, using the energy balance method, that the strict energy transfer between the pendula determines the direction in which the amplitude death travels from one unit to another. The occurrence of TAD is investigated as a result of a simple perturbation procedure, which shows that the transient dynamics on the road from complete synchronization to amplitude death is not straightforward. The pendula behavior during the transient processes is studied, and the influence of parameters and perturbation magnitude on the possible network's response is described. Finally, we analyze the energy transfer during the transient motion, indicating the potential triggers leading to the desired state. The obtained results suggest that the occurrence of traveling amplitude death is related to the chaotic dynamics and that the phenomenon appears as a result of a completely random process.
Affiliation(s)
- Dawid Dudkowski: Division of Dynamics, Lodz University of Technology, Stefanowskiego 1/15, 90-924 Lodz, Poland
- Krzysztof Czołczyński: Division of Dynamics, Lodz University of Technology, Stefanowskiego 1/15, 90-924 Lodz, Poland
- Tomasz Kapitaniak: Division of Dynamics, Lodz University of Technology, Stefanowskiego 1/15, 90-924 Lodz, Poland
63. Koppe G, Toutounji H, Kirsch P, Lis S, Durstewitz D. Identifying nonlinear dynamical systems via generative recurrent neural networks with applications to fMRI. PLoS Comput Biol 2019; 15:e1007263. [PMID: 31433810; PMCID: PMC6719895; DOI: 10.1371/journal.pcbi.1007263]
Abstract
A major tenet in theoretical neuroscience is that cognitive and behavioral processes are ultimately implemented in terms of the neural system dynamics. Accordingly, a major aim for the analysis of neurophysiological measurements should lie in the identification of the computational dynamics underlying task processing. Here we advance a state space model (SSM) based on generative piecewise-linear recurrent neural networks (PLRNN) to assess dynamics from neuroimaging data. In contrast to many other nonlinear time series models which have been proposed for reconstructing latent dynamics, our model is easily interpretable in neural terms, amenable to systematic dynamical systems analysis of the resulting set of equations, and can straightforwardly be transformed into an equivalent continuous-time dynamical system. The major contributions of this paper are the introduction of a new observation model suitable for functional magnetic resonance imaging (fMRI) coupled to the latent PLRNN, an efficient stepwise training procedure that forces the latent model to capture the 'true' underlying dynamics rather than just fitting (or predicting) the observations, and an empirical measure based on the Kullback-Leibler divergence to evaluate from empirical time series how well this goal of approximating the underlying dynamics has been achieved. We validate and illustrate the power of our approach on simulated 'ground-truth' dynamical systems as well as on experimental fMRI time series, and demonstrate that the learnt dynamics harbors task-related nonlinear structure that a linear dynamical model fails to capture. Given that fMRI is one of the most common techniques for measuring brain activity non-invasively in human subjects, this approach may provide a novel step toward analyzing aberrant (nonlinear) dynamics for clinical assessment or neuroscientific research.
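As a schematic of the model class named here, the sketch below iterates a piecewise-linear RNN latent state of the generic form z_t = A z_{t-1} + W max(z_{t-1}, 0) + h + noise with a plain linear readout. The paper's actual fMRI observation model (including hemodynamic convolution) and its training procedure are not reproduced; all matrices and sizes are random placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)
dim_z, dim_x, T = 5, 3, 200

# Placeholder parameters of a generic PLRNN-style latent model
A = np.diag(rng.uniform(0.2, 0.6, dim_z))   # diagonal linear 'memory' term
W = rng.normal(0, 0.1, (dim_z, dim_z))
np.fill_diagonal(W, 0.0)                     # off-diagonal nonlinear coupling
h = rng.normal(0, 0.1, dim_z)                # bias
B = rng.normal(0, 1.0, (dim_x, dim_z))       # linear readout (stand-in observation model)

z = np.zeros(dim_z)
X = np.empty((T, dim_x))
for t in range(T):
    z = A @ z + W @ np.maximum(z, 0.0) + h + 0.01 * rng.normal(size=dim_z)
    X[t] = B @ np.maximum(z, 0.0) + 0.05 * rng.normal(size=dim_x)
print(X.shape)   # simulated observations to which such a latent model would be fit
```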
Affiliation(s)
- Georgia Koppe: Department of Theoretical Neuroscience, Central Institute of Mental Health, Medical Faculty Mannheim, Heidelberg University, Mannheim, Germany; Department of Psychiatry and Psychotherapy, Central Institute of Mental Health, Medical Faculty Mannheim, Heidelberg University, Mannheim, Germany
- Hazem Toutounji: Department of Theoretical Neuroscience, Central Institute of Mental Health, Medical Faculty Mannheim, Heidelberg University, Mannheim, Germany; Institute of Neuroinformatics, University of Zurich and ETH Zurich, Zurich, Switzerland
- Peter Kirsch: Department of Clinical Psychology, Central Institute of Mental Health, Medical Faculty Mannheim, Heidelberg University, Mannheim, Germany
- Stefanie Lis: Institute for Psychiatric and Psychosomatic Psychotherapy, Central Institute of Mental Health, Medical Faculty Mannheim, Heidelberg University, Mannheim, Germany
- Daniel Durstewitz: Department of Theoretical Neuroscience, Central Institute of Mental Health, Medical Faculty Mannheim, Heidelberg University, Mannheim, Germany; Faculty of Physics and Astronomy, Heidelberg University, Heidelberg, Germany
64. Sohn H, Narain D, Meirhaeghe N, Jazayeri M. Bayesian Computation through Cortical Latent Dynamics. Neuron 2019; 103:934-947.e5. [PMID: 31320220; DOI: 10.1016/j.neuron.2019.06.012]
Abstract
Statistical regularities in the environment create prior beliefs that we rely on to optimize our behavior when sensory information is uncertain. Bayesian theory formalizes how prior beliefs can be leveraged and has had a major impact on models of perception, sensorimotor function, and cognition. However, it is not known how recurrent interactions among neurons mediate Bayesian integration. By using a time-interval reproduction task in monkeys, we found that prior statistics warp neural representations in the frontal cortex, allowing the mapping of sensory inputs to motor outputs to incorporate prior statistics in accordance with Bayesian inference. Analysis of recurrent neural network models performing the task revealed that this warping was enabled by a low-dimensional curved manifold and allowed us to further probe the potential causal underpinnings of this computational strategy. These results uncover a simple and general principle whereby prior beliefs exert their influence on behavior by sculpting cortical latent dynamics.
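The Bayesian integration probed in this study is commonly formalized as a Bayes least-squares estimate of a sample interval measured with scalar noise under a prior over intervals. The sketch below illustrates only that textbook computation; the prior range and noise level are illustrative values, not the task parameters used with the monkeys.

```python
import numpy as np

def bls_estimate(measured_ms, prior_lo=600.0, prior_hi=1000.0, weber=0.1):
    """Bayes least-squares estimate of a time interval from one noisy measurement,
    assuming scalar (Weber-like) measurement noise and a uniform prior on
    [prior_lo, prior_hi]. All numbers are illustrative placeholders."""
    ts = np.linspace(prior_lo, prior_hi, 1000)                  # candidate true intervals
    sigma = weber * ts                                           # noise scales with the interval
    likelihood = np.exp(-0.5 * ((measured_ms - ts) / sigma) ** 2) / sigma
    posterior = likelihood / likelihood.sum()                    # uniform prior cancels
    return float((ts * posterior).sum())                         # posterior mean

# Estimates for short and long measurements are biased toward the middle of the prior
print(bls_estimate(620.0), bls_estimate(980.0))
```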
Affiliation(s)
- Hansem Sohn: Department of Brain and Cognitive Sciences, McGovern Institute for Brain Research, Massachusetts Institute of Technology, Cambridge, MA 02139, USA
- Devika Narain: Department of Brain and Cognitive Sciences, McGovern Institute for Brain Research, Massachusetts Institute of Technology, Cambridge, MA 02139, USA; Erasmus Medical Center, Rotterdam 3015CN, the Netherlands
- Nicolas Meirhaeghe: Harvard-MIT Division of Health Sciences and Technology, Cambridge, MA 02139, USA
- Mehrdad Jazayeri: Department of Brain and Cognitive Sciences, McGovern Institute for Brain Research, Massachusetts Institute of Technology, Cambridge, MA 02139, USA
65
Abstract
By studying different sources of temporal variability in central pattern generator (CPG) circuits, we unveil fundamental aspects of the instantaneous balance between flexibility and robustness in sequential dynamics, a property that characterizes many systems that display neural rhythms. Our analysis of the triphasic rhythm of the pyloric CPG (Carcinus maenas) shows strong robustness of transient dynamics in keeping not only the activation sequences but also specific cycle-by-cycle temporal relationships in the form of strong linear correlations between pivotal time intervals, i.e. dynamical invariants. The level of variability and coordination was characterized using intrinsic time references and intervals in long recordings of both regular and irregular rhythms. Out of the many possible combinations of time intervals studied, only two cycle-by-cycle dynamical invariants were identified, existing even outside steady states. While executing a neural sequence, dynamical invariants reflect constraints to optimize functionality by shaping the actual intervals in which activity emerges to build the sequence. Our results indicate that such boundaries to the adaptability arise from the interaction between the rich dynamics of neurons and connections. We suggest that invariant temporal sequence relationships could be present in other networks, including those shaping sequences of functional brain rhythms, and underlie rhythm programming and functionality.
66. Moyal R, Edelman S. Dynamic Computation in Visual Thalamocortical Networks. Entropy (Basel) 2019; 21:E500. [PMID: 33267214; PMCID: PMC7514988; DOI: 10.3390/e21050500]
Abstract
Contemporary neurodynamical frameworks, such as coordination dynamics and winnerless competition, posit that the brain approximates symbolic computation by transitioning between metastable attractive states. This article integrates these accounts with electrophysiological data suggesting that coherent, nested oscillations facilitate information representation and transmission in thalamocortical networks. We review the relationship between criticality, metastability, and representational capacity, outline existing methods for detecting metastable oscillatory patterns in neural time series data, and evaluate plausible spatiotemporal coding schemes based on phase alignment. We then survey the circuitry and the mechanisms underlying the generation of coordinated alpha and gamma rhythms in the primate visual system, with particular emphasis on the pulvinar and its role in biasing visual attention and awareness. To conclude the review, we begin to integrate this perspective with longstanding theories of consciousness and cognition.
Affiliation(s)
- Roy Moyal: Department of Psychology, Cornell University, Ithaca, NY 14853, USA
67. Allen WE, Chen MZ, Pichamoorthy N, Tien RH, Pachitariu M, Luo L, Deisseroth K. Thirst regulates motivated behavior through modulation of brainwide neural population dynamics. Science 2019; 364:253. [PMID: 30948440; PMCID: PMC6711472; DOI: 10.1126/science.aav3932]
Abstract
Physiological needs produce motivational drives, such as thirst and hunger, that regulate behaviors essential to survival. Hypothalamic neurons sense these needs and must coordinate relevant brainwide neuronal activity to produce the appropriate behavior. We studied dynamics from ~24,000 neurons in 34 brain regions during thirst-motivated choice behavior in 21 mice as they consumed water and became sated. Water-predicting sensory cues elicited activity that rapidly spread throughout the brain of thirsty animals. These dynamics were gated by a brainwide mode of population activity that encoded motivational state. After satiation, focal optogenetic activation of hypothalamic thirst-sensing neurons returned global activity to the pre-satiation state. Thus, motivational states specify initial conditions that determine how a brainwide dynamical system transforms sensory input into behavioral output.
Affiliation(s)
- William E Allen: Department of Bioengineering, Stanford University, Stanford, CA 94305, USA; Department of Biology, Stanford University, Stanford, CA 94305, USA; Neurosciences Graduate Program, Stanford University, Stanford, CA 94305, USA
- Michael Z Chen: Department of Bioengineering, Stanford University, Stanford, CA 94305, USA; Department of Biology, Stanford University, Stanford, CA 94305, USA
- Rebecca H Tien: Department of Bioengineering, Stanford University, Stanford, CA 94305, USA
- Liqun Luo: Department of Biology, Stanford University, Stanford, CA 94305, USA; Howard Hughes Medical Institute, Stanford University, Stanford, CA 94305, USA
- Karl Deisseroth: Department of Bioengineering, Stanford University, Stanford, CA 94305, USA; Howard Hughes Medical Institute, Stanford University, Stanford, CA 94305, USA; Department of Psychiatry and Behavioral Sciences, Stanford University, Stanford, CA 94305, USA
68. Papo D, Buldú JM. Brain synchronizability, a false friend. Neuroimage 2019; 196:195-199. [PMID: 30986500; DOI: 10.1016/j.neuroimage.2019.04.029]
Abstract
Synchronization plays a fundamental role in healthy cognitive and motor function. However, how synchronization depends on the interplay between local dynamics, coupling and topology, and how prone to synchronization a network is, given its topological organization, are still poorly understood issues. To investigate the synchronizability of both anatomical and functional brain networks, various studies have resorted to the Master Stability Function (MSF) formalism, an elegant tool which allows analysing the stability of synchronous states in a dynamical system consisting of many coupled oscillators. Here, we argue that brain dynamics does not fulfil the formal criteria under which synchronizability is usually quantified and, perhaps more importantly, this measure refers to a global dynamical condition that never holds in the brain (not even in the most pathological conditions), and therefore no neurophysiological conclusions should be drawn based on it. We discuss the meaning of synchronizability and its applicability to neuroscience and propose alternative ways to quantify brain network synchronization.
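The synchronizability index at issue is typically the eigenratio of the graph Laplacian that falls out of the Master Stability Function formalism, where a smaller ratio is read as easier synchronization. A minimal sketch of that computation on a toy ring network is given below; it shows the quantity being critiqued, not a recommended analysis, and the example graph is arbitrary.

```python
import numpy as np

def laplacian_eigenratio(adjacency):
    """Eigenratio lambda_N / lambda_2 of the graph Laplacian (assumes a connected,
    undirected, weighted graph); commonly used as a 'synchronizability' index."""
    A = np.asarray(adjacency, dtype=float)
    L = np.diag(A.sum(axis=1)) - A
    eig = np.sort(np.linalg.eigvalsh(L))
    return eig[-1] / eig[1]

# Toy example: an unweighted ring of six nodes
ring = np.zeros((6, 6))
for i in range(6):
    ring[i, (i + 1) % 6] = ring[(i + 1) % 6, i] = 1.0
print(laplacian_eigenratio(ring))   # 4.0 for this ring
```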
Affiliation(s)
- D Papo: SCALab UMR CNRS 9193, Université de Lille, Villeneuve d'Ascq, France
- J M Buldú: Laboratory of Biological Networks, Center for Biomedical Technology (UPM), 28223, Pozuelo de Alarcón, Madrid, Spain; Complex Systems Group & G.I.S.C., Universidad Rey Juan Carlos, 28933, Móstoles, Madrid, Spain
69. Ceni A, Ashwin P, Livi L. Interpreting Recurrent Neural Networks Behaviour via Excitable Network Attractors. Cognit Comput 2019. [DOI: 10.1007/s12559-019-09634-2]
70. Vanag VK. "Cognitive" modes in small networks of almost identical chemical oscillators with pulsatile inhibitory coupling. Chaos 2019; 29:033106. [PMID: 30927858; DOI: 10.1063/1.5063322]
Abstract
The Lavrova-Vanag (LV) model of the periodical Belousov-Zhabotinsky (BZ) reaction has been investigated under pulsed self-perturbations, when a sharp spike of the BZ reaction induces a short inhibitory pulse that perturbs the BZ reaction after some time τ since each spike. The dynamics of this BZ system is strongly dependent on the amplitude Cinh of the perturbing pulses. At Cinh > Ccr, a new pseudo-steady state (SS) emerges far away from the limit cycle of the unperturbed BZ oscillator. The perturbed BZ system spends a rather long time in the vicinity of this pseudo-SS, which serves as a trap for phase trajectories. As a result, the dynamics of the BZ system changes qualitatively. We observe new modes with packed spikes separated by either long "silent" dynamics or small-amplitude oscillations around the pseudo-SS, depending on Cinh. Networks of two or three LV-BZ oscillators with strong pulsatile coupling and self-inhibition are able to generate so-called "cognitive" modes, which are very sensitive to small changes in Cinh. We demonstrate how the coupling between the BZ oscillators in these networks should be organized to find "cognitive" modes.
Affiliation(s)
- Vladimir K Vanag: Centre for Nonlinear Chemistry, Immanuel Kant Baltic Federal University, 14 A. Nevskogo Str., Kaliningrad 236041, Russia
71. Adler A, Zhao R, Shin ME, Yasuda R, Gan WB. Somatostatin-Expressing Interneurons Enable and Maintain Learning-Dependent Sequential Activation of Pyramidal Neurons. Neuron 2019; 102:202-216.e7. [PMID: 30792151; DOI: 10.1016/j.neuron.2019.01.036]
Abstract
The activities of neuronal populations exhibit temporal sequences that are thought to mediate spatial navigation, cognitive processing, and motor actions. The mechanisms underlying the generation and maintenance of sequential neuronal activity remain unclear. We found that layer 2 and/or 3 pyramidal neurons (PNs) showed sequential activation in the mouse primary motor cortex during motor skill learning. Concomitantly, the activity of somatostatin (SST)-expressing interneurons increased and decreased in a task-specific manner. Activating SST interneurons during motor training, either directly or via inhibiting vasoactive-intestinal-peptide-expressing interneurons, prevented learning-induced sequential activities of PNs and behavioral improvement. Conversely, inactivating SST interneurons during the learning of a new motor task reversed sequential activities and behavioral improvement that occurred during a previous task. Furthermore, the control of SST interneurons over sequential activation of PNs required CaMKII-dependent synaptic plasticity. These findings indicate that SST interneurons enable and maintain synaptic plasticity-dependent sequential activation of PNs during motor skill learning.
Affiliation(s)
- Avital Adler: Skirball Institute, Department of Neuroscience and Physiology, Department of Anesthesiology, New York University School of Medicine, New York, NY 10016, USA
- Ruohe Zhao: Skirball Institute, Department of Neuroscience and Physiology, Department of Anesthesiology, New York University School of Medicine, New York, NY 10016, USA
- Myung Eun Shin: Max Planck Florida Institute of Neuroscience, Jupiter, FL 33458, USA
- Ryohei Yasuda: Max Planck Florida Institute of Neuroscience, Jupiter, FL 33458, USA; Department of Neurobiology, Duke University Medical Center, Durham, NC 27710, USA
- Wen-Biao Gan: Skirball Institute, Department of Neuroscience and Physiology, Department of Anesthesiology, New York University School of Medicine, New York, NY 10016, USA
72. Latorre R, Varona P, Rabinovich MI. Rhythmic control of oscillatory sequential dynamics in heteroclinic motifs. Neurocomputing 2019. [DOI: 10.1016/j.neucom.2018.11.056]
73. Williamson RC, Doiron B, Smith MA, Yu BM. Bridging large-scale neuronal recordings and large-scale network models using dimensionality reduction. Curr Opin Neurobiol 2019; 55:40-47. [PMID: 30677702; DOI: 10.1016/j.conb.2018.12.009]
Abstract
A long-standing goal in neuroscience has been to bring together neuronal recordings and neural network modeling to understand brain function. Neuronal recordings can inform the development of network models, and network models can in turn provide predictions for subsequent experiments. Traditionally, neuronal recordings and network models have been related using single-neuron and pairwise spike train statistics. We review here recent studies that have begun to relate neuronal recordings and network models based on the multi-dimensional structure of neuronal population activity, as identified using dimensionality reduction. This approach has been used to study working memory, decision making, motor control, and more. Dimensionality reduction has provided common ground for incisive comparisons and tight interplay between neuronal recordings and network models.
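A minimal example of the kind of dimensionality reduction this review discusses: PCA applied to a neurons-by-time activity matrix to extract a few latent "neural trajectory" dimensions. The data below are random placeholders, and PCA stands in for the wider family of methods (factor analysis, GPFA, and related approaches) covered by the review.

```python
import numpy as np

rng = np.random.default_rng(1)
rates = rng.poisson(5.0, size=(80, 300)).astype(float)       # 80 neurons x 300 time bins

centered = rates - rates.mean(axis=1, keepdims=True)         # remove each neuron's mean rate
U, S, Vt = np.linalg.svd(centered, full_matrices=False)      # PCA via SVD
var_explained = S**2 / np.sum(S**2)
k = int(np.searchsorted(np.cumsum(var_explained), 0.9) + 1)  # dimensions for 90% variance
latents = U[:, :k].T @ centered                              # k x time "neural trajectories"
print(k, latents.shape)
```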
Affiliation(s)
- Ryan C Williamson: Center for the Neural Basis of Cognition, Pittsburgh, PA, USA; Department of Machine Learning, Carnegie Mellon University, Pittsburgh, PA, USA; School of Medicine, University of Pittsburgh, Pittsburgh, PA, USA
- Brent Doiron: Center for the Neural Basis of Cognition, Pittsburgh, PA, USA; Department of Mathematics, University of Pittsburgh, Pittsburgh, PA, USA
- Matthew A Smith: Center for the Neural Basis of Cognition, Pittsburgh, PA, USA; Department of Ophthalmology, University of Pittsburgh, Pittsburgh, PA, USA; Department of Bioengineering, University of Pittsburgh, Pittsburgh, PA, USA
- Byron M Yu: Center for the Neural Basis of Cognition, Pittsburgh, PA, USA; Department of Electrical Engineering, Carnegie Mellon University, Pittsburgh, PA, USA; Department of Biomedical Engineering, Carnegie Mellon University, Pittsburgh, PA, USA
74. Smelov PS, Proskurkin IS, Vanag VK. Controllable switching between stable modes in a small network of pulse-coupled chemical oscillators. Phys Chem Chem Phys 2019; 21:3033-3043. [DOI: 10.1039/c8cp07374k]
Abstract
Switching between stable oscillatory modes in a network of four Belousov–Zhabotinsky oscillators unidirectionally coupled in a ring is analysed computationally and experimentally.
Affiliation(s)
- Pavel S. Smelov: Centre for Nonlinear Chemistry, Immanuel Kant Baltic Federal University, Kaliningrad, Russia
- Ivan S. Proskurkin: Centre for Nonlinear Chemistry, Immanuel Kant Baltic Federal University, Kaliningrad, Russia
- Vladimir K. Vanag: Centre for Nonlinear Chemistry, Immanuel Kant Baltic Federal University, Kaliningrad, Russia
75. Carrillo-Medina JL, Latorre R. Detection of Activation Sequences in Spiking-Bursting Neurons by means of the Recognition of Intraburst Neural Signatures. Sci Rep 2018; 8:16726. [PMID: 30425274; PMCID: PMC6233224; DOI: 10.1038/s41598-018-34757-1]
Abstract
Bursting activity is present in many cells of different nervous systems playing important roles in neural information processing. Multiple assemblies of bursting neurons act cooperatively to produce coordinated spatio-temporal patterns of sequential activity. A major goal in neuroscience is unveiling the mechanisms underlying neural information processing based on this sequential dynamics. Experimental findings have revealed the presence of precise cell-type-specific intraburst firing patterns in the activity of some bursting neurons. This characteristic neural signature coexists with the information encoded in other aspects of the spiking-bursting signals, and its functional meaning is still unknown. We investigate the ability of a neuron conductance-based model to detect specific presynaptic activation sequences taking advantage of intraburst fingerprints identifying the source of the signals building up a sequential pattern of activity. Our simulations point out that a reader neuron could use this information to contextualize incoming signals and accordingly compute a characteristic response by relying on precise phase relationships among the activity of different emitters. This would provide individual neurons enhanced capabilities to control and negotiate sequential dynamics. In this regard, we discuss the possible implications of the proposed contextualization mechanism for neural information processing.
Affiliation(s)
- José Luis Carrillo-Medina: Departamento de Eléctrica y Electrónica, Universidad de las Fuerzas Armadas - ESPE, Sangolquí, Ecuador
- Roberto Latorre: Grupo de Neurocomputación Biológica, Dpto. Ingeniería Informática, Universidad Autónoma de Madrid, 28049, Madrid, Spain
76. Hardy NF, Goudar V, Romero-Sosa JL, Buonomano DV. A model of temporal scaling correctly predicts that motor timing improves with speed. Nat Commun 2018; 9:4732. [PMID: 30413692; PMCID: PMC6226482; DOI: 10.1038/s41467-018-07161-6]
Abstract
Timing is fundamental to complex motor behaviors: from tying a knot to playing the piano. A general feature of motor timing is temporal scaling: the ability to produce motor patterns at different speeds. One theory of temporal processing proposes that the brain encodes time in dynamic patterns of neural activity (population clocks). Here we first examine whether recurrent neural network (RNN) models can account for temporal scaling. Appropriately trained RNNs exhibit temporal scaling over a range similar to that of humans and capture a signature of motor timing, Weber's law, but predict that temporal precision improves at faster speeds. Human psychophysics experiments confirm this prediction: the variability of responses in absolute time is lower at faster speeds. These results establish that RNNs can account for temporal scaling and suggest a novel psychophysical principle: the Weber-Speed effect.
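The psychophysical quantity behind the prediction is the Weber fraction, the standard deviation of produced intervals divided by their mean. The sketch below just shows that bookkeeping at two playback speeds with simulated numbers; the values are placeholders, not the paper's data. A smaller fraction at the faster speed is what the authors call the Weber-Speed effect.

```python
import numpy as np

rng = np.random.default_rng(2)

def weber_fraction(produced_intervals):
    """Coefficient of variation of produced intervals (Weber fraction)."""
    x = np.asarray(produced_intervals, dtype=float)
    return x.std(ddof=1) / x.mean()

# Simulated productions of the same pattern at 1x and 2x speed (ms); placeholder numbers
slow = rng.normal(loc=1000.0, scale=80.0, size=200)
fast = rng.normal(loc=500.0, scale=30.0, size=200)
print(weber_fraction(slow), weber_fraction(fast))
```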
Affiliation(s)
- Nicholas F Hardy: Neuroscience Interdepartmental Program, University of California Los Angeles, Los Angeles, CA, 90095, USA; Departments of Neurobiology, University of California Los Angeles, Los Angeles, CA, 90095, USA
- Vishwa Goudar: Departments of Neurobiology, University of California Los Angeles, Los Angeles, CA, 90095, USA
- Juan L Romero-Sosa: Departments of Neurobiology, University of California Los Angeles, Los Angeles, CA, 90095, USA
- Dean V Buonomano: Neuroscience Interdepartmental Program, University of California Los Angeles, Los Angeles, CA, 90095, USA; Departments of Neurobiology, University of California Los Angeles, Los Angeles, CA, 90095, USA; Departments of Psychology, University of California Los Angeles, Los Angeles, CA, 90095, USA
77. Romera M, Talatchian P, Tsunegi S, Abreu Araujo F, Cros V, Bortolotti P, Trastoy J, Yakushiji K, Fukushima A, Kubota H, Yuasa S, Ernoult M, Vodenicarevic D, Hirtzlin T, Locatelli N, Querlioz D, Grollier J. Vowel recognition with four coupled spin-torque nano-oscillators. Nature 2018; 563:230-234. [PMID: 30374193; DOI: 10.1038/s41586-018-0632-y]
Abstract
In recent years, artificial neural networks have become the flagship algorithm of artificial intelligence [1]. In these systems, neuron activation functions are static, and computing is achieved through standard arithmetic operations. By contrast, a prominent branch of neuroinspired computing embraces the dynamical nature of the brain and proposes to endow each component of a neural network with dynamical functionality, such as oscillations, and to rely on emergent physical phenomena, such as synchronization [2-6], for solving complex problems with small networks [7-11]. This approach is especially interesting for hardware implementations, because emerging nanoelectronic devices can provide compact and energy-efficient nonlinear auto-oscillators that mimic the periodic spiking activity of biological neurons [12-16]. The dynamical couplings between oscillators can then be used to mediate the synaptic communication between the artificial neurons. One challenge for using nanodevices in this way is to achieve learning, which requires fine control and tuning of their coupled oscillations [17]; the dynamical features of nanodevices can be difficult to control and prone to noise and variability [18]. Here we show that the outstanding tunability of spintronic nano-oscillators, that is, the possibility of accurately controlling their frequency across a wide range through electrical current and magnetic field, can be used to address this challenge. We successfully train a hardware network of four spin-torque nano-oscillators to recognize spoken vowels by tuning their frequencies according to an automatic real-time learning rule. We show that the high experimental recognition rates stem from the ability of these oscillators to synchronize. Our results demonstrate that non-trivial pattern classification tasks can be achieved with small hardware neural networks by endowing them with nonlinear dynamical features such as oscillations and synchronization.
Affiliation(s)
- Miguel Romera: Unité Mixte de Physique, CNRS, Thales, Université Paris-Sud, Université Paris-Saclay, Palaiseau, France
- Philippe Talatchian: Unité Mixte de Physique, CNRS, Thales, Université Paris-Sud, Université Paris-Saclay, Palaiseau, France
- Sumito Tsunegi: National Institute of Advanced Industrial Science and Technology (AIST), Spintronics Research Center, Tsukuba, Ibaraki, Japan
- Flavio Abreu Araujo: Unité Mixte de Physique, CNRS, Thales, Université Paris-Sud, Université Paris-Saclay, Palaiseau, France; Institute of Condensed Matter and Nanosciences, UC Louvain, Louvain-la-Neuve, Belgium
- Vincent Cros: Unité Mixte de Physique, CNRS, Thales, Université Paris-Sud, Université Paris-Saclay, Palaiseau, France
- Paolo Bortolotti: Unité Mixte de Physique, CNRS, Thales, Université Paris-Sud, Université Paris-Saclay, Palaiseau, France
- Juan Trastoy: Unité Mixte de Physique, CNRS, Thales, Université Paris-Sud, Université Paris-Saclay, Palaiseau, France
- Kay Yakushiji: National Institute of Advanced Industrial Science and Technology (AIST), Spintronics Research Center, Tsukuba, Ibaraki, Japan
- Akio Fukushima: National Institute of Advanced Industrial Science and Technology (AIST), Spintronics Research Center, Tsukuba, Ibaraki, Japan
- Hitoshi Kubota: National Institute of Advanced Industrial Science and Technology (AIST), Spintronics Research Center, Tsukuba, Ibaraki, Japan
- Shinji Yuasa: National Institute of Advanced Industrial Science and Technology (AIST), Spintronics Research Center, Tsukuba, Ibaraki, Japan
- Maxence Ernoult: Unité Mixte de Physique, CNRS, Thales, Université Paris-Sud, Université Paris-Saclay, Palaiseau, France; Centre de Nanosciences et de Nanotechnologies, CNRS, Université Paris-Sud, Université Paris-Saclay, Orsay, France
- Damir Vodenicarevic: Centre de Nanosciences et de Nanotechnologies, CNRS, Université Paris-Sud, Université Paris-Saclay, Orsay, France
- Tifenn Hirtzlin: Centre de Nanosciences et de Nanotechnologies, CNRS, Université Paris-Sud, Université Paris-Saclay, Orsay, France
- Nicolas Locatelli: Centre de Nanosciences et de Nanotechnologies, CNRS, Université Paris-Sud, Université Paris-Saclay, Orsay, France
- Damien Querlioz: Centre de Nanosciences et de Nanotechnologies, CNRS, Université Paris-Sud, Université Paris-Saclay, Orsay, France
- Julie Grollier: Unité Mixte de Physique, CNRS, Thales, Université Paris-Sud, Université Paris-Saclay, Palaiseau, France
78. Dehghani N. Theoretical Principles of Multiscale Spatiotemporal Control of Neuronal Networks: A Complex Systems Perspective. Front Comput Neurosci 2018; 12:81. [PMID: 30349469; PMCID: PMC6187923; DOI: 10.3389/fncom.2018.00081]
Abstract
Success in the fine control of the nervous system depends on a deeper understanding of how neural circuits control behavior. There is, however, a wide gap between the components of neural circuits and behavior. We advance the idea that a suitable approach for narrowing this gap has to be based on a multiscale information-theoretic description of the system. We evaluate the possibility that brain-wide complex neural computations can be dissected into a hierarchy of computational motifs that rely on smaller circuit modules interacting at multiple scales. In doing so, we draw attention to the importance of formalizing the goals of stimulation in terms of neural computations so that the possible implementations are matched in scale to the underlying circuit modules.
Affiliation(s)
- Nima Dehghani: Department of Physics, Massachusetts Institute of Technology, Cambridge, MA, United States; Center for Brains, Minds and Machines, Massachusetts Institute of Technology, Cambridge, MA, United States
79
|
Cavanna F, Vilas MG, Palmucci M, Tagliazucchi E. Dynamic functional connectivity and brain metastability during altered states of consciousness. Neuroimage 2018; 180:383-395. [DOI: 10.1016/j.neuroimage.2017.09.065] [Citation(s) in RCA: 76] [Impact Index Per Article: 10.9] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/06/2017] [Revised: 09/01/2017] [Accepted: 09/29/2017] [Indexed: 11/16/2022] Open
|
80
|
Delshams A, Guillamon A, Huguet G. Quasiperiodic perturbations of heteroclinic attractor networks. CHAOS (WOODBURY, N.Y.) 2018; 28:103111. [PMID: 30384643 DOI: 10.1063/1.5050081] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.1] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Received: 07/27/2018] [Accepted: 09/23/2018] [Indexed: 06/08/2023]
Abstract
We consider heteroclinic attractor networks motivated by models of competition between neural populations during binocular rivalry. We show that gamma distributions of dominance times observed experimentally in binocular rivalry and other forms of bistable perception, commonly explained by means of noise in the models, can be achieved with quasiperiodic perturbations. For this purpose, we present a methodology based on the separatrix map to model the dynamics close to heteroclinic networks with quasiperiodic perturbations. Our methodology unifies two different approaches, one based on Melnikov integrals and the other one based on variational equations. We apply it to two models: first, to the Duffing equation, which comes from the perturbation of a Hamiltonian system and, second, to a heteroclinic attractor network for binocular rivalry, for which we develop a suitable method based on Melnikov integrals for non-Hamiltonian systems. In both models, the perturbed system shows chaotic behavior, while dominance times achieve good agreement with gamma distributions. Moreover, the separatrix map provides a new (discrete) model for bistable perception which, in addition, replaces the numerical integration of time-continuous models and, consequently, reduces the computational cost and avoids numerical instabilities.
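For concreteness, a quasiperiodically forced Duffing oscillator of the general kind referred to above can be written as follows; the damping, forcing amplitude and the pair of incommensurate frequencies are placeholder symbols chosen here for illustration, not the values used in the paper:

\ddot{x} + \delta\,\dot{x} - x + x^{3} = \varepsilon\left[\cos(\omega_{1} t) + \cos(\omega_{2} t)\right], \qquad \omega_{1}/\omega_{2} \notin \mathbb{Q}.

Because the two forcing frequencies are incommensurate, the perturbation is quasiperiodic rather than periodic, which is what lets deterministic forcing play the role usually assigned to noise in rivalry models.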
Collapse
Affiliation(s)
- Amadeu Delshams
- Departament de Matemàtiques, Universitat Politècnica de Catalunya, Avda. Dr. Marañon 44-50, 08028 Barcelona, Spain
| | - Antoni Guillamon
- Departament de Matemàtiques, Universitat Politècnica de Catalunya, Avda. Dr. Marañon 44-50, 08028 Barcelona, Spain
| | - Gemma Huguet
- Departament de Matemàtiques, Universitat Politècnica de Catalunya, Avda. Diagonal 647, 08028 Barcelona, Spain
| |
Collapse
|
81
|
Williams NJ, Daly I, Nasuto SJ. Markov Model-Based Method to Analyse Time-Varying Networks in EEG Task-Related Data. Front Comput Neurosci 2018; 12:76. [PMID: 30297993 PMCID: PMC6160873 DOI: 10.3389/fncom.2018.00076] [Citation(s) in RCA: 13] [Impact Index Per Article: 1.9] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/30/2018] [Accepted: 08/20/2018] [Indexed: 12/27/2022] Open
Abstract
The dynamic nature of functional brain networks is being increasingly recognized in cognitive neuroscience, and methods to analyse such time-varying networks in EEG/MEG data are required. In this work, we propose a pipeline to characterize time-varying networks in single-subject EEG task-related data and, further, evaluate its validity on both simulated and experimental datasets. Pre-processing is done to remove channel-wise and trial-wise differences in activity. Functional networks are estimated from short non-overlapping time windows within each “trial,” using a sparse-MVAR (Multi-Variate Auto-Regressive) model. Functional “states” are then identified by partitioning the entire space of functional networks into a small number of groups/symbols via k-means clustering. The multi-trial sequence of symbols is then described by a Markov Model (MM). We show validity of this pipeline on realistic electrode-level simulated EEG data, by demonstrating its ability to discriminate “trials” from two experimental conditions in a range of scenarios. We then apply it to experimental data from two individuals using a Brain-Computer Interface (BCI) via a P300 oddball task. Using just the Markov Model parameters, we obtain statistically significant discrimination between target and non-target trials. The functional networks characterizing each “state” were also highly similar between the two individuals. This work marks the first application of the Markov Model framework to infer time-varying networks from EEG/MEG data. Due to the pre-processing, results from the pipeline are orthogonal to those from conventional ERP averaging or a typical EEG microstate analysis. The results provide powerful proof-of-concept for a Markov model-based approach to analyzing the data, paving the way for its use to track rapid changes in interaction patterns as a task is being performed. MATLAB code for the entire pipeline has been made available.
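Read as an algorithm, the pipeline above has three stages: fit a network per time window, cluster the networks into a small symbol alphabet, and model the symbol sequence as a Markov chain. The sketch below illustrates that flow under assumptions worth flagging: a dense least-squares VAR fit stands in for the sparse-MVAR estimator, and all sizes, variable names and the synthetic data are invented for illustration; the authors' own implementation is the MATLAB code they reference.

import numpy as np
from sklearn.cluster import KMeans

def mvar_connectivity(window, order=1):
    """Fit a (dense) order-1 VAR model to one window and return the flattened
    coefficient matrix as a crude stand-in for a functional network."""
    X_past, X_next = window[:-order], window[order:]          # time x channels
    A, *_ = np.linalg.lstsq(X_past, X_next, rcond=None)       # X_past @ A ~ X_next
    return A.T.ravel()

def trial_to_symbols(trial, win_len, kmeans):
    nets = [mvar_connectivity(trial[i:i + win_len])
            for i in range(0, len(trial) - win_len + 1, win_len)]
    return kmeans.predict(np.array(nets))

def markov_transition_matrix(symbol_seqs, n_states):
    T = np.zeros((n_states, n_states))
    for seq in symbol_seqs:
        for a, b in zip(seq[:-1], seq[1:]):
            T[a, b] += 1
    return T / np.maximum(T.sum(axis=1, keepdims=True), 1)    # row-normalize

# Toy usage: 40 trials, 500 samples, 8 channels, 50-sample windows, 5 "states".
rng = np.random.default_rng(0)
trials = [rng.standard_normal((500, 8)) for _ in range(40)]
all_nets = np.array([mvar_connectivity(t[i:i + 50])
                     for t in trials for i in range(0, len(t) - 50 + 1, 50)])
km = KMeans(n_clusters=5, n_init=10, random_state=0).fit(all_nets)
seqs = [trial_to_symbols(t, 50, km) for t in trials]
print(markov_transition_matrix(seqs, 5))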
Collapse
Affiliation(s)
- Nitin J Williams
- Neuroscience Center, Helsinki Institute of Life Science, University of Helsinki, Helsinki, Finland
| | - Ian Daly
- Brain-Computer Interfaces and Neural Engineering Laboratory, School of Computer Science and Electronic Engineering, University of Essex, Colchester, United Kingdom
| | - Slawomir J Nasuto
- Biomedical Sciences and Biomedical Engineering Division, School of Biological Sciences, University of Reading, Reading, United Kingdom
| |
Collapse
|
82
|
Rabinovich MI, Varona P. Discrete Sequential Information Coding: Heteroclinic Cognitive Dynamics. Front Comput Neurosci 2018; 12:73. [PMID: 30245621 PMCID: PMC6137616 DOI: 10.3389/fncom.2018.00073] [Citation(s) in RCA: 18] [Impact Index Per Article: 2.6] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/31/2018] [Accepted: 08/14/2018] [Indexed: 12/22/2022] Open
Abstract
Discrete sequential information coding is a key mechanism that transforms complex cognitive brain activity into a low-dimensional dynamical process based on the sequential switching among finite numbers of patterns. The storage size of the corresponding process is large because of the permutation capacity as a function of control signals in ensembles of these patterns. Extracting low-dimensional functional dynamics from multiple large-scale neural populations is a central problem in both the neuro- and cognitive sciences. Experimental results in the last decade represent a solid base for the creation of low-dimensional models of different cognitive functions and allow moving toward a dynamical theory of consciousness. We discuss here a methodology to build simple kinetic equations that can be the mathematical skeleton of this theory. Models of the corresponding discrete information processing can be designed using the following dynamical principles: (i) clusterization of the neural activity in space and time and formation of information patterns; (ii) robustness of the sequential dynamics based on heteroclinic chains of metastable clusters; and (iii) sensitivity of such sequential dynamics to intrinsic and external informational signals. We analyze sequential discrete coding based on winnerless competition low-frequency dynamics. Under such dynamics, entrainment and heteroclinic coordination lead to a large variety of coding regimes that are invariant in time.
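The kinetic equations typically employed for winnerless competition of this kind are of generalized Lotka-Volterra type; a minimal form, with notation chosen here for illustration rather than copied from the paper, is

\dot{x}_{i} = x_{i}\Big(\sigma_{i}(S) - \sum_{j=1}^{N} \rho_{ij}\, x_{j}\Big), \qquad x_{i} \ge 0,

where x_i is the activity of the i-th metastable pattern, \sigma_i(S) is its growth rate as a function of the control or stimulus signal S, and an asymmetric competition matrix \rho_{ij} links the saddle equilibria into a stable heteroclinic chain, producing the robust yet input-sensitive sequential switching described above.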
Collapse
Affiliation(s)
- Mikhail I Rabinovich
- BioCircuits Institute, University of California, San Diego, La Jolla, CA, United States
| | - Pablo Varona
- Grupo de Neurocomputación Biológica, Departamento de Ingeniería Informática, Escuela Politécnica Superior, Universidad Autónoma de Madrid, Madrid, Spain
| |
Collapse
|
83
|
Hastings A, Abbott KC, Cuddington K, Francis T, Gellner G, Lai YC, Morozov A, Petrovskii S, Scranton K, Zeeman ML. Transient phenomena in ecology. Science 2018; 361:eaat6412. [PMID: 30190378 DOI: 10.1126/science.aat6412] [Citation(s) in RCA: 222] [Impact Index Per Article: 31.7] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/03/2018] [Accepted: 07/02/2018] [Indexed: 05/15/2025]
Abstract
The importance of transient dynamics in ecological systems and in the models that describe them has become increasingly recognized. However, previous work has typically treated each instance of these dynamics separately. We review both empirical examples and model systems, and outline a classification of transient dynamics based on ideas and concepts from dynamical systems theory. This classification provides ways to understand the likelihood of transients for particular systems, and to guide investigations to determine the timing of sudden switches in dynamics and other characteristics of transients. Implications for both management and underlying ecological theories emerge.
Collapse
Affiliation(s)
- Alan Hastings
- Department of Environmental Science and Policy, University of California, Davis, CA 95616, USA.
| | - Karen C Abbott
- Department of Biology, Case Western Reserve University, Cleveland, OH 44106, USA
| | - Kim Cuddington
- Department of Biology, University of Waterloo, Waterloo, Ontario N2L 3G1, Canada
| | - Tessa Francis
- Puget Sound Institute, University of Washington, Tacoma, WA 98421, USA
| | - Gabriel Gellner
- Department of Biology, Colorado State University, Fort Collins, CO 80523, USA
| | - Ying-Cheng Lai
- School of Electrical, Computer and Energy Engineering, Arizona State University, Tempe, AZ 85287, USA
| | - Andrew Morozov
- Department of Mathematics, University of Leicester, Leicester LE1 7RH, UK
- Shirshov Institute of Oceanology, Moscow 117851, Russia
| | - Sergei Petrovskii
- Department of Mathematics, University of Leicester, Leicester LE1 7RH, UK
| | - Katherine Scranton
- Department of Ecology and Evolutionary Biology, University of California, Los Angeles, CA 90095, USA
| | - Mary Lou Zeeman
- Department of Mathematics, Bowdoin College, Brunswick, ME 04011, USA
| |
Collapse
|
84
|
Robust, Transient Neural Dynamics during Conscious Perception. Trends Cogn Sci 2018; 22:563-565. [DOI: 10.1016/j.tics.2018.04.005] [Citation(s) in RCA: 12] [Impact Index Per Article: 1.7] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/17/2018] [Revised: 04/09/2018] [Accepted: 04/24/2018] [Indexed: 11/17/2022]
|
85
|
Heitmann S, Breakspear M. Putting the "dynamic" back into dynamic functional connectivity. Netw Neurosci 2018; 2:150-174. [PMID: 30215031 PMCID: PMC6130444 DOI: 10.1162/netn_a_00041] [Citation(s) in RCA: 35] [Impact Index Per Article: 5.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/27/2017] [Accepted: 12/30/2017] [Indexed: 01/17/2023] Open
Abstract
The study of fluctuations in time-resolved functional connectivity is a topic of substantial current interest. As the term "dynamic functional connectivity" implies, such fluctuations are believed to arise from dynamics in the neuronal systems generating these signals. While considerable activity currently attends to methodological and statistical issues regarding dynamic functional connectivity, less attention has been paid to its candidate causes. Here, we review candidate scenarios for dynamic (functional) connectivity that arise in dynamical systems with two or more subsystems: generalized synchronization, itinerancy (a form of metastability), and multistability. Each of these scenarios arises under different configurations of local dynamics and intersystem coupling: We show how they generate time series data with nonlinear and/or nonstationary multivariate statistics. The key issue is that time series generated by coupled nonlinear systems contain a richer temporal structure than matched multivariate (linear) stochastic processes. In turn, this temporal structure yields many of the phenomena proposed as important to large-scale communication and computation in the brain, such as phase-amplitude coupling, complexity, and flexibility. The code for simulating these dynamics is available in a freeware software platform, the Brain Dynamics Toolbox.
Collapse
|
86
|
Abstract
Functional oscillator networks, such as neuronal networks in the brain, exhibit switching between metastable states involving many oscillators. We give exact results showing how such global dynamics can arise in paradigmatic phase oscillator networks: higher-order network interactions give rise to metastable chimeras (localized frequency synchrony patterns), which are joined by heteroclinic connections. Moreover, we illuminate the mechanisms that underlie the switching dynamics in these experimentally accessible networks.
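As a generic illustration of what a higher-order (nonpairwise) interaction means in this setting, a phase-oscillator network can include terms coupling triplets of phases in addition to the usual pairwise Kuramoto-Sakaguchi term; the specific coupling functions and parameters below are placeholders and not necessarily those analysed in the paper:

\dot{\theta}_{i} = \omega + \frac{K_{2}}{N}\sum_{j}\sin(\theta_{j}-\theta_{i}+\alpha_{2}) + \frac{K_{3}}{N^{2}}\sum_{j,k}\sin(\theta_{j}+\theta_{k}-2\theta_{i}+\alpha_{3}).

Such nonpairwise terms cannot be rewritten as a sum of purely pairwise interactions, and it is this extra structure that the abstract credits with producing localized frequency-synchrony patterns joined by heteroclinic connections.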
Collapse
Affiliation(s)
- Christian Bick
- Oxford Centre for Industrial and Applied Mathematics, Mathematical Institute, University of Oxford, OX2 6GG, United Kingdom and Department of Mathematics and Centre for Systems Dynamics and Control, University of Exeter, EX4 4QF, United Kingdom
| |
Collapse
|
87
|
Frégnac Y. Big data and the industrialization of neuroscience: A safe roadmap for understanding the brain? Science 2018; 358:470-477. [PMID: 29074766 DOI: 10.1126/science.aan8866] [Citation(s) in RCA: 55] [Impact Index Per Article: 7.9] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 12/22/2022]
Abstract
New technologies in neuroscience generate reams of data at an exponentially increasing rate, spurring the design of very-large-scale data-mining initiatives. Several supranational ventures are contemplating the possibility of achieving, within the next decade(s), full simulation of the human brain.
Collapse
Affiliation(s)
- Yves Frégnac
- Unité de Neuroscience, Information et Complexité (UNIC-CNRS), Gif-sur-Yvette, France.
| |
Collapse
|
88
|
Qiao J, Wang G, Li X, Li W. A self-organizing deep belief network for nonlinear system modeling. Appl Soft Comput 2018. [DOI: 10.1016/j.asoc.2018.01.019] [Citation(s) in RCA: 39] [Impact Index Per Article: 5.6] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/15/2022]
|
89
|
Goudar V, Buonomano DV. Encoding sensory and motor patterns as time-invariant trajectories in recurrent neural networks. eLife 2018. [PMID: 29537963 PMCID: PMC5851701 DOI: 10.7554/elife.31134] [Citation(s) in RCA: 39] [Impact Index Per Article: 5.6] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 01/03/2023] Open
Abstract
Much of the information the brain processes and stores is temporal in nature—a spoken word or a handwritten signature, for example, is defined by how it unfolds in time. However, it remains unclear how neural circuits encode complex time-varying patterns. We show that by tuning the weights of a recurrent neural network (RNN), it can recognize and then transcribe spoken digits. The model elucidates how neural dynamics in cortical networks may resolve three fundamental challenges: first, encode multiple time-varying sensory and motor patterns as stable neural trajectories; second, generalize across relevant spatial features; third, identify the same stimuli played at different speeds—we show that this temporal invariance emerges because the recurrent dynamics generate neural trajectories with appropriately modulated angular velocities. Together our results generate testable predictions as to how recurrent networks may use different mechanisms to generalize across the relevant spatial and temporal features of complex time-varying stimuli.
Collapse
Affiliation(s)
- Vishwa Goudar
- Departments of Neurobiology, University of California, Los Angeles, Los Angeles, United States
| | - Dean V Buonomano
- Departments of Neurobiology, University of California, Los Angeles, Los Angeles, United States.,Integrative Center for Learning and Memory, University of California, Los Angeles, Los Angeles, United States.,Departments of Psychology, University of California, Los Angeles, Los Angeles, United States
| |
Collapse
|
90
|
Vanag VK, Yasuk VO. Dynamic modes in a network of five oscillators with inhibitory all-to-all pulse coupling. CHAOS (WOODBURY, N.Y.) 2018; 28:033105. [PMID: 29604639 DOI: 10.1063/1.5004015] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.4] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 06/08/2023]
Abstract
The dynamic modes of five almost identical oscillators with pulsatile inhibitory coupling with time delay have been studied theoretically. The models of the Belousov-Zhabotinsky reaction and phase oscillators with all-to-all coupling have been considered. In the parametric plane C_inh-τ, where C_inh is the coupling strength and τ is the time delay between a spike in one oscillator and pulsed perturbations of all other oscillators, three main regimes have been found: regular modes, when each oscillator gives only one spike during the global period T, C (complex) modes, when the number of pulses of different oscillators is different, and OS (oscillations-suppression) modes, when at least one oscillator is suppressed. The regular modes consist of several cluster modes and are found at relatively small C_inh. The C and OS modes observed at larger C_inh intertwine in the C_inh-τ plane. In a relatively narrow range of C_inh, the dynamics of the C modes are very sensitive to small changes in C_inh and τ, as well as to the initial conditions, which are the characteristic features of the chaos. On the other hand, the dynamics of the C modes are periodic (but with different periods) and well reproducible. The number of different C modes is enormously large. At still larger C_inh, the C modes lose sensitivity to small changes in the parameters and finally vanish, while the OS modes survive.
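A stripped-down version of such a system is easy to simulate: each oscillator is reduced to a phase that grows at a constant rate, and every spike schedules, after a delay tau, an inhibitory phase reset of all the other oscillators. The sketch below follows that scheme; the phase-resetting rule, parameter values and time step are illustrative assumptions and are not taken from the Belousov-Zhabotinsky or phase-oscillator models analysed in the paper.

import numpy as np

N, T0, dt = 5, 1.0, 1e-3          # oscillators, natural period, time step
C_inh, tau = 0.15, 0.35           # inhibitory strength, coupling delay
rng = np.random.default_rng(1)

phase = rng.uniform(0.0, 1.0, N)  # phases in [0, 1); spike when a phase hits 1
pending = []                      # (delivery_time, source) for delayed pulses
spikes = []                       # (time, oscillator) record

t = 0.0
while t < 50.0:
    phase += dt / T0              # free-running phase growth
    # deliver delayed inhibitory pulses to all oscillators except the source
    due = [p for p in pending if p[0] <= t]
    pending = [p for p in pending if p[0] > t]
    for _, src in due:
        for i in range(N):
            if i != src:
                phase[i] = (1.0 - C_inh) * phase[i]   # multiplicative reset
    # threshold crossing: record spike, schedule delayed pulses, reset phase
    for i in np.where(phase >= 1.0)[0]:
        spikes.append((round(t, 3), int(i)))
        pending.append((t + tau, int(i)))
        phase[i] = 0.0
    t += dt

print(spikes[:20])   # inspect the emerging firing order / clustering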
Collapse
Affiliation(s)
- Vladimir K Vanag
- Immanuel Kant Baltic Federal University, 14 A. Nevskogo str., Kaliningrad 236041, Russia
| | - Vitaly O Yasuk
- Immanuel Kant Baltic Federal University, 14 A. Nevskogo str., Kaliningrad 236041, Russia
| |
Collapse
|
91
|
Varona P, Rabinovich MI. Hierarchical dynamics of informational patterns and decision-making. Proc Biol Sci 2017; 283:rspb.2016.0475. [PMID: 27252020 DOI: 10.1098/rspb.2016.0475] [Citation(s) in RCA: 14] [Impact Index Per Article: 1.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/02/2016] [Accepted: 05/05/2016] [Indexed: 12/22/2022] Open
Abstract
Traditional studies on the interaction of cognitive functions in healthy and disordered brains have used the analyses of the connectivity of several specialized brain networks, the functional connectome. However, emerging evidence suggests that both brain networks and functional spontaneous brain-wide network communication are intrinsically dynamic. In the light of studies investigating the cooperation between different cognitive functions, we consider here the dynamics of hierarchical networks in cognitive space. We show, using an example of behavioural decision-making based on sequential episodic memory, how the description of metastable pattern dynamics underlying basic cognitive processes helps to understand and predict complex processes like sequential episodic memory recall and competition among decision strategies. The mathematical images of the discussed phenomena in the phase space of the corresponding cognitive model are hierarchical heteroclinic networks. One of the most important features of such networks is the robustness of their dynamics. Different kinds of instabilities of these dynamics can be related to 'dynamical signatures' of creativity and different psychiatric disorders. The suggested approach can also be useful for the understanding of the dynamical processes that are the basis of consciousness.
Collapse
Affiliation(s)
- Pablo Varona
- Grupo de Neurocomputación Biológica, Departamento de Ingeniería Informática, Escuela Politécnica Superior, Universidad Autónoma de Madrid, 28049 Madrid, Spain
| | - Mikhail I Rabinovich
- BioCircuits Institute, University of California, San Diego, 9500 Gilman Drive #0328, La Jolla, CA 92093-0328, USA
| |
Collapse
|
92
|
Abstract
Implicit expectations induced by predictable stimulus sequences affect neuronal responses to upcoming stimuli at both single cell and neural population levels. Temporally regular sensory streams also phase entrain ongoing low frequency brain oscillations, but how and why this happens is unknown. Here we investigate how random recurrent neural networks without plasticity respond to stimulus streams containing oddballs. We found that the neuronal correlates of sensory stream adaptation emerge if networks generate chaotic oscillations which can be phase entrained by stimulus streams. The resultant activity patterns are close to critical and support history-dependent responses on long timescales. Because critical network entrainment is a slow process, stimulus responses adapt gradually over multiple repetitions. Repeated stimuli generate suppressed responses, but oddball responses are large and distinct. Oscillatory mismatch responses persist in population activity for long periods after stimulus offset, while individual cell mismatch responses are strongly phasic. These effects are weakened in temporally irregular sensory streams. Thus we show that network phase entrainment provides a biologically plausible mechanism for neural oddball detection. Our results do not depend on specific network characteristics, are consistent with experimental studies, and may be relevant for multiple pathologies demonstrating altered mismatch processing, such as schizophrenia and depression.
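A toy version of the setup described above is a random, non-plastic rate network with recurrent gain g > 1 (so the isolated network is chaotic), driven by a temporally regular stream of identical input patterns containing a single deviant. The sketch below sets this up; the network size, input patterns, gain and readout are illustrative assumptions, and this minimal version is not guaranteed to reproduce the reported entrainment and mismatch effects.

import numpy as np

rng = np.random.default_rng(2)
N, g, dt, tau = 400, 1.5, 1e-2, 1.0
J = g * rng.standard_normal((N, N)) / np.sqrt(N)     # random recurrent weights

std_in = rng.standard_normal(N)                      # repeated "standard" pattern
odd_in = rng.standard_normal(N)                      # single "oddball" pattern
stim_times = np.arange(5.0, 50.0, 2.0)               # regular stream, period 2 s
oddball_idx = 15                                      # which presentation deviates

n_steps = int(55.0 / dt)
stim_steps = {int(round(s / dt)): ("odd" if i == oddball_idx else "std")
              for i, s in enumerate(stim_times)}

x = 0.1 * rng.standard_normal(N)                      # network state
response = np.zeros(n_steps)                          # population response norm
for step in range(n_steps):
    tag = stim_steps.get(step)
    inp = 3.0 * (odd_in if tag == "odd" else std_in) if tag else np.zeros(N)
    x += dt / tau * (-x + J @ np.tanh(x) + inp)       # rate-network dynamics
    response[step] = np.linalg.norm(np.tanh(x))

# compare the average response after the oddball vs. after standard presentations
win = int(1.0 / dt)
onsets = [int(round(s / dt)) for s in stim_times]
std_resp = np.mean([response[o:o + win].mean()
                    for i, o in enumerate(onsets) if i != oddball_idx])
odd_resp = response[onsets[oddball_idx]:onsets[oddball_idx] + win].mean()
print(f"standard: {std_resp:.3f}   oddball: {odd_resp:.3f}")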
Collapse
Affiliation(s)
- Adam Ponzi
- IBM T.J. Watson Research Center, Yorktown Heights, NY, USA.
- Okinawa Institute of Science and Technology Graduate University (OIST), Okinawa, Japan.
| |
Collapse
|
93
|
Baria AT, Maniscalco B, He BJ. Initial-state-dependent, robust, transient neural dynamics encode conscious visual perception. PLoS Comput Biol 2017; 13:e1005806. [PMID: 29176808 PMCID: PMC5720802 DOI: 10.1371/journal.pcbi.1005806] [Citation(s) in RCA: 50] [Impact Index Per Article: 6.3] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/06/2017] [Revised: 12/07/2017] [Accepted: 10/01/2017] [Indexed: 01/09/2023] Open
Abstract
Recent research has identified late-latency, long-lasting neural activity as a robust correlate of conscious perception. Yet, the dynamical nature of this activity is poorly understood, and the mechanisms governing its presence or absence and the associated conscious perception remain elusive. We applied dynamic-pattern analysis to whole-brain slow (< 5 Hz) cortical dynamics recorded by magnetoencephalography (MEG) in human subjects performing a threshold-level visual perception task. Up to 1 second before stimulus onset, brain activity pattern across widespread cortices significantly predicted whether a threshold-level visual stimulus was later consciously perceived. This initial state of brain activity interacts nonlinearly with stimulus input to shape the evolving cortical activity trajectory, with seen and unseen trials following well separated trajectories. We observed that cortical activity trajectories during conscious perception are fast evolving and robust to small variations in the initial state. In addition, spontaneous brain activity pattern prior to stimulus onset also influences unconscious perceptual decision-making in unseen trials. Together, these results suggest that brain dynamics underlying conscious visual perception belong to the class of initial-state-dependent, robust, transient neural dynamics. What brain mechanisms underlie conscious perception? A commonly adopted paradigm for studying this question is to present human subjects with threshold-level stimuli. When shown repeatedly, the same stimulus is sometimes consciously perceived, sometimes not. Using magnetoencephalography, we shed light on the neural mechanisms governing whether the stimulus is consciously perceived in a given trial. We observed that depending on the initial brain state defined by widespread activity pattern in the slow cortical potential (<5 Hz) range, a physically identical, brief (30–60 ms) stimulus input triggers distinct sequences of activity pattern evolution over time that correspond to either consciously perceiving the stimulus or not. Such activity pattern evolution forms a “trajectory” in the state space and affords significant single-trial decoding of perceptual outcome from 1 sec before to 3 sec after stimulus onset. While previous theories on conscious perception have emphasized sustained, high-level activity, we found that brain dynamics underlying conscious perception exhibit fast-changing activity patterns. These results significantly further our understanding of the neural mechanisms governing conscious access to a stimulus and the dynamical nature of distributed neural activity underlying conscious perception.
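One way to make the single-trial decoding of perceptual outcome over time concrete is to train a classifier on the multichannel activity pattern at each time point and score it with cross-validation, as sketched below. The data shapes, the logistic-regression classifier and the injected toy effect are illustrative assumptions, not the authors' MEG pipeline or dynamic-pattern analysis.

import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(3)
n_trials, n_channels, n_times = 120, 60, 40     # e.g. -1 s to +3 s in 100 ms bins
X = rng.standard_normal((n_trials, n_channels, n_times))
y = rng.integers(0, 2, n_trials)                # 1 = seen, 0 = unseen (toy labels)

# inject a weak label-dependent pattern after the "stimulus onset" bin (bin 10)
X[y == 1, :, 10:] += 0.3 * rng.standard_normal(n_channels)[None, :, None]

# decode the perceptual outcome separately at every time bin
accuracy = np.array([
    cross_val_score(LogisticRegression(max_iter=1000), X[:, :, t], y, cv=5).mean()
    for t in range(n_times)
])
print("peak decoding accuracy:", accuracy.max().round(3),
      "at time bin", int(accuracy.argmax()))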
Collapse
Affiliation(s)
- Alexis T. Baria
- National Institute of Neurological Disorders and Stroke, National Institutes of Health, Bethesda, Maryland, United States of America
| | - Brian Maniscalco
- National Institute of Neurological Disorders and Stroke, National Institutes of Health, Bethesda, Maryland, United States of America
- Neuroscience Institute, New York University Langone Medical Center, New York, NY, United States of America
| | - Biyu J. He
- National Institute of Neurological Disorders and Stroke, National Institutes of Health, Bethesda, Maryland, United States of America
- Neuroscience Institute, New York University Langone Medical Center, New York, NY, United States of America
- Departments of Neurology, Neuroscience and Physiology, and Radiology, New York University Langone Medical Center, New York, NY, United States of America
| |
Collapse
|
94
|
Chaisangmongkon W, Swaminathan SK, Freedman DJ, Wang XJ. Computing by Robust Transience: How the Fronto-Parietal Network Performs Sequential, Category-Based Decisions. Neuron 2017; 93:1504-1517.e4. [PMID: 28334612 DOI: 10.1016/j.neuron.2017.03.002] [Citation(s) in RCA: 101] [Impact Index Per Article: 12.6] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/24/2016] [Revised: 09/30/2016] [Accepted: 02/27/2017] [Indexed: 10/19/2022]
Abstract
Decision making involves dynamic interplay between internal judgements and external perception, which has been investigated in delayed match-to-category (DMC) experiments. Our analysis of neural recordings shows that, during DMC tasks, LIP and PFC neurons demonstrate mixed, time-varying, and heterogeneous selectivity, but previous theoretical work has not established the link between these neural characteristics and population-level computations. We trained a recurrent network model to perform DMC tasks and found that the model can remarkably reproduce key features of neuronal selectivity at the single-neuron and population levels. Analysis of the trained networks elucidates that robust transient trajectories of the neural population are the key driver of sequential categorical decisions. The directions of trajectories are governed by network self-organized connectivity, defining a "neural landscape" consisting of a task-tailored arrangement of slow states and dynamical tunnels. With this model, we can identify functionally relevant circuit motifs and generalize the framework to solve other categorization tasks.
Collapse
Affiliation(s)
- Warasinee Chaisangmongkon
- Department of Neurobiology and Kavli Institute for Neuroscience, Yale University School of Medicine, New Haven, CT 06511, USA; Institute of Field Robotics, King Mongkut's University of Technology Thonburi, Bangkok 10140, Thailand
| | | | - David J Freedman
- Department of Neurobiology, The University of Chicago, Chicago, IL 60637, USA; Grossman Institute for Neuroscience, Quantitative Biology, and Human Behavior, Chicago, IL 60637, USA
| | - Xiao-Jing Wang
- Department of Neurobiology and Kavli Institute for Neuroscience, Yale University School of Medicine, New Haven, CT 06511, USA; Center for Neural Science, New York University, New York, NY 10003, USA; NYU-ECNU Joint Institute of Brain and Cognitive Science, NYU-Shanghai, Shanghai 200122, China.
| |
Collapse
|
95
|
Dechery JB, MacLean JN. Emergent cortical circuit dynamics contain dense, interwoven ensembles of spike sequences. J Neurophysiol 2017; 118:1914-1925. [PMID: 28724786 DOI: 10.1152/jn.00394.2017] [Citation(s) in RCA: 6] [Impact Index Per Article: 0.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/30/2017] [Revised: 07/05/2017] [Accepted: 07/14/2017] [Indexed: 01/30/2023] Open
Abstract
Temporal codes are theoretically powerful encoding schemes, but their precise form in the neocortex remains unknown in part because of the large number of possible codes and the difficulty in disambiguating informative spikes from statistical noise. A biologically plausible and computationally powerful temporal coding scheme is the Hebbian assembly phase sequence (APS), which predicts reliable propagation of spikes between functionally related assemblies of neurons. Here, we sought to measure the inherent capacity of neocortical networks to produce reliable sequences of spikes, as would be predicted by an APS code. To record microcircuit activity, the scale at which computation is implemented, we used two-photon calcium imaging to densely sample spontaneous activity in murine neocortical networks ex vivo. We show that the population spike histogram is sufficient to produce a spatiotemporal progression of activity across the population. To more comprehensively evaluate the capacity for sequential spiking that cannot be explained by the overall population spiking, we identify statistically significant spike sequences. We found a large repertoire of sequence spikes that collectively comprise the majority of spiking in the circuit. Sequences manifest probabilistically and share neuron membership, resulting in unique ensembles of interwoven sequences characterizing individual spatiotemporal progressions of activity. Distillation of population dynamics into its constituent sequences provides a way to capture trial-to-trial variability and may prove to be a powerful decoding substrate in vivo. Informed by these data, we suggest that the Hebbian APS be reformulated as interwoven sequences with flexible assembly membership due to shared overlapping neurons. NEW & NOTEWORTHY Neocortical computation occurs largely within microcircuits comprised of individual neurons and their connections within small volumes (<500 μm³). We found evidence for a long-postulated temporal code, the Hebbian assembly phase sequence, by identifying repeated and co-occurring sequences of spikes. Variance in population activity across trials was explained in part by the ensemble of active sequences. The presence of interwoven sequences suggests that neuronal assembly structure can be variable and is determined by previous activity.
Collapse
Affiliation(s)
- Joseph B Dechery
- Committee on Computational Neuroscience, University of Chicago, Chicago, Illinois
| | - Jason N MacLean
- Committee on Computational Neuroscience, University of Chicago, Chicago, Illinois.,Department of Neurobiology, University of Chicago, Illinois
| |
Collapse
|
96
|
Maslennikov OV, Shchapin DS, Nekorkin VI. Transient sequences in a hypernetwork generated by an adaptive network of spiking neurons. PHILOSOPHICAL TRANSACTIONS. SERIES A, MATHEMATICAL, PHYSICAL, AND ENGINEERING SCIENCES 2017; 375:20160288. [PMID: 28507233 PMCID: PMC5434079 DOI: 10.1098/rsta.2016.0288] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.4] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Subscribe] [Scholar Register] [Accepted: 11/24/2016] [Indexed: 05/24/2023]
Abstract
We propose a model of an adaptive network of spiking neurons that gives rise to a hypernetwork of its dynamic states at the upper level of description. Left to itself, the network exhibits a sequence of transient clustering which relates to traffic in the hypernetwork in the form of a random walk. Receiving inputs, the system is able to generate reproducible sequences corresponding to stimulus-specific paths in the hypernetwork. We illustrate these basic notions by a simple network of discrete-time spiking neurons together with its FPGA realization and analyse their properties. This article is part of the themed issue 'Mathematical methods in medicine: neuroscience, cardiology and pathology'.
Collapse
Affiliation(s)
- Oleg V Maslennikov
- Institute of Applied Physics of the Russian Academy of Sciences, 46 Ulyanov Street, 603950 Nizhny Novgorod, Russia
| | - Dmitry S Shchapin
- Institute of Applied Physics of the Russian Academy of Sciences, 46 Ulyanov Street, 603950 Nizhny Novgorod, Russia
| | - Vladimir I Nekorkin
- Institute of Applied Physics of the Russian Academy of Sciences, 46 Ulyanov Street, 603950 Nizhny Novgorod, Russia
| |
Collapse
|
97
|
Deco G, Kringelbach ML, Jirsa VK, Ritter P. The dynamics of resting fluctuations in the brain: metastability and its dynamical cortical core. Sci Rep 2017; 7:3095. [PMID: 28596608 PMCID: PMC5465179 DOI: 10.1038/s41598-017-03073-5] [Citation(s) in RCA: 278] [Impact Index Per Article: 34.8] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/16/2015] [Accepted: 04/06/2017] [Indexed: 01/22/2023] Open
Abstract
In the human brain, spontaneous activity during resting state consists of rapid transitions between functional network states over time, but the underlying mechanisms are not understood. We use connectome-based computational brain network modeling to reveal fundamental principles of how the human brain generates large-scale activity observable by noninvasive neuroimaging. We used structural and functional neuroimaging data to construct whole-brain models. With this novel approach, we reveal that the human brain during resting state operates at maximum metastability, i.e., in a state of maximum network switching. In addition, we investigate cortical heterogeneity across areas. Optimization of the spectral characteristics of each local brain region revealed the dynamical cortical core of the human brain, which is driving the activity of the rest of the whole brain. Brain network modelling goes beyond correlational neuroimaging analysis and reveals non-trivial network mechanisms underlying non-invasive observations. Our novel findings significantly pertain to the important role of computational connectomics in understanding principles of brain function.
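The notion of metastability as maximum network switching is often operationalized as the temporal variability of global synchrony, for example the standard deviation over time of the Kuramoto order parameter. The sketch below computes that index for toy phase time series; it is an illustrative measure on synthetic data, not the connectome-based whole-brain model used in the paper.

import numpy as np

def metastability(phases):
    """phases: array (n_timepoints, n_regions) of instantaneous phases."""
    order_param = np.abs(np.exp(1j * phases).mean(axis=1))   # R(t) in [0, 1]
    return order_param.std()                                  # variability of R(t)

# Toy comparison: phase-locked vs. drifting (detuned) oscillators.
rng = np.random.default_rng(4)
t = np.linspace(0, 100, 5000)[:, None]
locked   = 2 * np.pi * 0.1 * t + rng.uniform(0, 0.1, 20)           # near-identical
drifting = 2 * np.pi * (0.1 + 0.02 * rng.standard_normal(20)) * t  # detuned
print("locked:  ", round(metastability(locked), 3))
print("drifting:", round(metastability(drifting), 3))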
Collapse
Affiliation(s)
- Gustavo Deco
- Center for Brain and Cognition, Computational Neuroscience Group, Department of Information and Communication Technologies, Universitat Pompeu Fabra, Roc Boronat 138, Barcelona, 08018, Spain.,Institució Catalana de la Recerca i Estudis Avançats (ICREA), Passeig Lluís Companys 23, Barcelona, 08010, Spain.,Department of Neuropsychology, Max Planck Institute for Human Cognitive and Brain Sciences, 04103, Leipzig, Germany.,School of Psychological Sciences, Monash University, Melbourne, Clayton VIC 3800, Australia
| | - Morten L Kringelbach
- Department of Psychiatry, University of Oxford, Oxford, UK.,Center of Music in the Brain (MIB), Clinical Medicine, Aarhus University, Aarhus C, DK, Denmark.
| | - Viktor K Jirsa
- Aix Marseille Univ, INSERM, INS, Inst Neurosci Syst, Marseille, France
| | - Petra Ritter
- Max-Planck Institute for Cognitive and Brain Sciences, Leipzig, Germany.,Department of Neurology, Charité, Charitéplatz 1, 10117, Berlin, Germany
| |
Collapse
|
98
|
Working Memory Requires a Combination of Transient and Attractor-Dominated Dynamics to Process Unreliably Timed Inputs. Sci Rep 2017; 7:2473. [PMID: 28559576 PMCID: PMC5449410 DOI: 10.1038/s41598-017-02471-z] [Citation(s) in RCA: 11] [Impact Index Per Article: 1.4] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/29/2016] [Accepted: 04/11/2017] [Indexed: 12/20/2022] Open
Abstract
Working memory stores and processes information received as a stream of continuously incoming stimuli. This requires accurate sequencing, and it remains puzzling how this can be reliably achieved by the neuronal system, as our perceptual inputs show a high degree of temporal variability. One hypothesis is that accurate timing is achieved by purely transient neuronal dynamics; by contrast, a second hypothesis states that the underlying network dynamics are dominated by attractor states. In this study, we resolve this contradiction by theoretically investigating the performance of the system using stimuli with differently accurate timing. Interestingly, only the combination of attractor and transient dynamics enables the network to perform with a low error rate. Further analysis reveals that the transient dynamics of the system are used to process information, while the attractor states store it. The interaction between both types of dynamics yields experimentally testable predictions, and we show that this way the system can reliably interact with a timing-unreliable Hebbian network representing long-term memory. Thus, this study provides a potential solution to the long-standing problem of the basic neuronal dynamics underlying working memory.
Collapse
|
99
|
Balaguer-Ballester E. Cortical Variability and Challenges for Modeling Approaches. Front Syst Neurosci 2017; 11:15. [PMID: 28420968 PMCID: PMC5378710 DOI: 10.3389/fnsys.2017.00015] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.4] [Reference Citation Analysis] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/18/2016] [Accepted: 03/06/2017] [Indexed: 11/16/2022] Open
Affiliation(s)
- Emili Balaguer-Ballester
- Department of Computing and Informatics, Faculty of Science and Technology, Bournemouth University, Bournemouth, UK.,Bernstein Center for Computational Neuroscience, Medical Faculty Mannheim and Heidelberg University, Mannheim, Germany
| |
Collapse
|
100
|
Neves FS, Voit M, Timme M. Noise-constrained switching times for heteroclinic computing. CHAOS (WOODBURY, N.Y.) 2017; 27:033107. [PMID: 28364740 DOI: 10.1063/1.4977552] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.4] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 05/20/2023]
Abstract
Heteroclinic computing offers a novel paradigm for universal computation by collective system dynamics. In such a paradigm, input signals are encoded as complex periodic orbits approaching specific sequences of saddle states. Without inputs, the relevant states together with the heteroclinic connections between them form a network of states, the heteroclinic network. Systems of pulse-coupled oscillators or spiking neurons naturally exhibit such heteroclinic networks of saddles, thereby providing a substrate for general analog computations. Several challenges need to be resolved before it becomes possible to effectively realize heteroclinic computing in hardware. The time scales on which computations are performed crucially depend on the switching times between saddles, which in turn are jointly controlled by the system's intrinsic dynamics and the level of external and measurement noise. The nonlinear dynamics of pulse-coupled systems often strongly deviate from that of time-continuously coupled (e.g., phase-coupled) systems. The factors impacting switching times in pulse-coupled systems are still not well understood. Here we systematically investigate how switching times depend on the levels of noise and intrinsic dissipation in the system. We specifically reveal how local responses to pulses coact with external noise. Our findings confirm that, like in time-continuous phase-coupled systems, piecewise-continuous pulse-coupled systems exhibit switching times that transiently increase exponentially with the number of switches up to some order of magnitude set by the noise level. Complementarily, we show that switching times may constitute a good predictor for the computation reliability, indicating how often an input signal must be reiterated. By characterizing switching times between two saddles in conjunction with the reliability of a computation, our results provide a first step beyond the coding of input signal identities toward a complementary coding for the intensity of those signals. The results offer insights on how future heteroclinic computing systems may operate under natural, and thus noisy, conditions.
Collapse
Affiliation(s)
- Fabio Schittler Neves
- Network Dynamics, Max Planck Institute for Dynamics and Self-Organization, 37077 Göttingen, Germany
| | - Maximilian Voit
- Network Dynamics, Max Planck Institute for Dynamics and Self-Organization, 37077 Göttingen, Germany
| | - Marc Timme
- Network Dynamics, Max Planck Institute for Dynamics and Self-Organization, 37077 Göttingen, Germany
| |
Collapse
|