1. Morrison M, Kutz JN. Nonlinear control of networked dynamical systems. IEEE Trans Netw Sci Eng 2021; 8:174-189. PMID: 33997094; PMCID: PMC8117950; DOI: 10.1109/tnse.2020.3032117.
Abstract
We develop a principled mathematical framework for controlling nonlinear, networked dynamical systems. Our method integrates dimensionality reduction, bifurcation theory, and emerging model discovery tools to find low-dimensional subspaces where feed-forward control can be used to manipulate a system to a desired outcome. The method leverages the fact that many high-dimensional networked systems have many fixed points, allowing for the computation of control signals that will move the system between any pair of fixed points. The sparse identification of nonlinear dynamics (SINDy) algorithm is used to fit a nonlinear dynamical system to the evolution on the dominant, low-rank subspace. This then allows us to use bifurcation theory to find collections of constant control signals that will produce the desired objective path for a prescribed outcome. Specifically, we can destabilize a given fixed point while making the target fixed point an attractor. The discovered control signals can be easily projected back to the original high-dimensional state and control space. We illustrate our nonlinear control procedure on established bistable, low-dimensional biological systems, showing how control signals are found that generate switches between the fixed points. We then demonstrate our control procedure for high-dimensional systems on random high-dimensional networks and Hopfield memory networks.
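The sparse-regression step underlying SINDy can be sketched with a sequentially thresholded least-squares iteration. The toy bistable system (xdot = x - x^3), the polynomial library, and the threshold value below are illustrative choices for a minimal sketch, not parameters taken from the paper:

```python
import numpy as np

def sindy(X, Xdot, library, threshold=0.1, n_iter=10):
    """Sequentially thresholded least squares: a minimal SINDy sketch."""
    Theta = library(X)                              # candidate-function matrix
    Xi = np.linalg.lstsq(Theta, Xdot, rcond=None)[0]
    for _ in range(n_iter):
        small = np.abs(Xi) < threshold              # prune small coefficients
        Xi[small] = 0.0
        for k in range(Xi.shape[1]):                # refit on the active terms
            big = ~small[:, k]
            if big.any():
                Xi[big, k] = np.linalg.lstsq(Theta[:, big], Xdot[:, k],
                                             rcond=None)[0]
    return Xi

# Toy bistable scalar system: xdot = x - x**3 (pitchfork normal form)
x = np.linspace(-2, 2, 200).reshape(-1, 1)
xdot = x - x**3
lib = lambda X: np.hstack([np.ones_like(X), X, X**2, X**3])
Xi = sindy(x, xdot, lib)   # recovers coefficients [0, 1, 0, -1]
```

On this noiseless example the regression recovers the two active terms exactly and zeroes out the constant and quadratic candidates, which is the behavior the control framework relies on when fitting dynamics on the low-rank subspace.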
Affiliation(s)
- Megan Morrison
- Department of Applied Mathematics, University of Washington, Seattle, WA, 98195 USA
- J Nathan Kutz
- Department of Applied Mathematics, University of Washington, Seattle, WA, 98195 USA
2. Fieseler C, Zimmer M, Kutz JN. Unsupervised learning of control signals and their encodings in Caenorhabditis elegans whole-brain recordings. J R Soc Interface 2020; 17:20200459. PMID: 33292096; PMCID: PMC7811586; DOI: 10.1098/rsif.2020.0459.
Abstract
A major goal of computational neuroscience is to understand the relationship between synapse-level structure and network-level functionality. Caenorhabditis elegans is a model organism to probe this relationship due to the historic availability of the synaptic structure (connectome) and recent advances in whole-brain calcium imaging techniques. Recent work has applied the concept of network controllability to neuronal networks, discovering some neurons that are able to drive the network to a certain state. However, previous work uses a linear model of the network dynamics, and it is unclear if the real neuronal network conforms to this assumption. Here, we propose a method to build a global, low-dimensional model of the dynamics, whereby an underlying global linear dynamical system is actuated by temporally sparse control signals. A key novelty of this method is discovering candidate control signals that the network uses to control itself. We analyse these control signals in two ways, showing they are interpretable and biologically plausible. First, these control signals are associated with transitions between behaviours, which were previously annotated via expert-generated features. Second, these signals can be predicted not only from neurons previously implicated in behavioural transitions but also from additional neurons previously unassociated with these behaviours. The proposed mathematical framework is generic and can be generalized to other neurosensory systems, potentially revealing transitions and their encodings in a completely unsupervised way.
Affiliation(s)
- Charles Fieseler
- Department of Physics, University of Washington, Seattle, WA 98195, USA
- Manuel Zimmer
- Department of Neurobiology, University of Vienna, Althanstrasse 14, 1090 Vienna, Austria
- Research Institute of Molecular Pathology (IMP), Vienna Biocenter (VBC), Campus-Vienna-Biocenter 1, 1F030 Vienna, Austria
- J. Nathan Kutz
- Department of Applied Mathematics, University of Washington, Seattle, WA 98195, USA
3. Short term depression, presynaptic inhibition and local neuron diversity play key functional roles in the insect antennal lobe. J Comput Neurosci 2020; 48:213-227. PMID: 32388764; DOI: 10.1007/s10827-020-00747-4.
Abstract
As the oldest but least understood sensory system in evolution, the olfactory system represents one of the most challenging research targets in sensory neurobiology. Although a large number of computational models of the olfactory system have been proposed, they do not account for the diversity in physiology and connectivity of local neurons, or for several recent discoveries in the insect antennal lobe, a major olfactory organ in insects. Recent studies revealed that the responses of some projection neurons were reduced by application of a GABA antagonist, and that insects are sensitive to odor pulse frequency. To account for these observations, we propose a spiking neural circuit model of the insect antennal lobe. Based on recent anatomical and physiological studies, we included three sub-types of local neurons as well as synaptic short-term depression (STD) in the model and showed that the interaction between STD and local neurons resulted in frequency-sensitive responses. We further discovered that the unexpected response of the projection neurons to the GABA antagonist is the result of complex interactions between STD and presynaptic inhibition, which is required for enhancing sensitivity to odor stimuli. Finally, we found that odor discrimination is improved if the innervation of the local neurons in the glomeruli follows a specific pattern. Our findings suggest that STD, presynaptic inhibition, and the diverse physiology and connectivity of local neurons are not independent properties; rather, they interact to play key roles in the function of antennal lobes.
4. Neural Circuit Dynamics for Sensory Detection. J Neurosci 2020; 40:3408-3423. PMID: 32165416; DOI: 10.1523/jneurosci.2185-19.2020.
Abstract
We consider the question of how sensory networks enable the detection of sensory stimuli in a combinatorial coding space. We are specifically interested in the olfactory system, wherein recent experimental studies have reported the existence of rich, enigmatic response patterns associated with stimulus onset and offset. This study aims to identify the functional relevance of such response patterns (i.e., what benefits does such neural activity provide in the context of detecting stimuli in a natural environment). We study this problem through the lens of normative, optimization-based modeling. Here, we define the notion of a low-dimensional latent representation of stimulus identity, which is generated through action of the sensory network. The objective of our optimization framework is to ensure high-fidelity tracking of a nominal representation in this latent space in an energy-efficient manner. It turns out that the optimal motifs emerging from this framework possess morphologic similarity with prototypical onset and offset responses observed in vivo in locusts (Schistocerca americana) of either sex. Furthermore, this objective can be exactly achieved by a network with reciprocal excitatory-inhibitory competitive dynamics, similar to interactions between projection neurons and local neurons in the early olfactory system of insects. The derived model also makes several predictions regarding maintenance of robust latent representations in the presence of confounding background information and trade-offs between the energy of sensory activity and resultant behavioral measures such as speed and accuracy of stimulus detection.
SIGNIFICANCE STATEMENT: A key area of study in olfactory coding involves understanding the transformation from high-dimensional sensory stimulus to low-dimensional decoded representation. Here, we examine not only the dimensionality reduction of this mapping but also its temporal dynamics, with specific focus on stimuli that are temporally continuous. Through optimization-based synthesis, we examine how sensory networks can track representations without prior assumption of discrete trial structure. We show that such tracking can be achieved by canonical network architectures and dynamics, and that the resulting responses resemble observations from neurons in the insect olfactory system. Thus, our results provide hypotheses regarding the functional role of olfactory circuit activity at both single neuronal and population scales.
5. Kim J, Leahy W, Shlizerman E. Neural Interactome: Interactive Simulation of a Neuronal System. Front Comput Neurosci 2019; 13:8. PMID: 30930759; PMCID: PMC6425397; DOI: 10.3389/fncom.2019.00008.
Abstract
Connectivity and biophysical processes determine the functionality of neuronal networks. We therefore developed a real-time framework, called Neural Interactome, to simultaneously visualize and interact with the structure and dynamics of such networks. Neural Interactome is a cross-platform framework that combines graph visualization with simulation of neural dynamics, or with experimentally recorded multineuronal time series, allowing stimuli to be applied to neurons and network responses to be examined. In addition, Neural Interactome supports structural changes, such as disconnection of neurons from the network (an ablation feature). Neural dynamics can be explored on a single-neuron level (using a zoom feature), back in time (using a review feature), and recorded (using a presets feature). The development of Neural Interactome was guided by generic concepts, so as to be applicable to neuronal networks with different connectivity and dynamics. We implement the framework using a model of the nervous system of the Caenorhabditis elegans (C. elegans) nematode, a model organism with a resolved connectome and neural dynamics. We show that Neural Interactome assists in studying neural response patterns associated with locomotion and other stimuli. In particular, we demonstrate how stimulation and ablation help in identifying neurons that shape particular dynamics. We examine scenarios that were experimentally studied, such as the touch-response circuit, and explore new scenarios that have not undergone elaborate experimental study.
Affiliation(s)
- Jimin Kim
- Department of Electrical and Computer Engineering, University of Washington, Seattle, WA, United States
- William Leahy
- Department of Applied Mathematics, University of Washington, Seattle, WA, United States
- Eli Shlizerman
- Department of Electrical and Computer Engineering, University of Washington, Seattle, WA, United States
- Department of Applied Mathematics, University of Washington, Seattle, WA, United States
6. Delahunt CB, Riffell JA, Kutz JN. Biological Mechanisms for Learning: A Computational Model of Olfactory Learning in the Manduca sexta Moth, With Applications to Neural Nets. Front Comput Neurosci 2018; 12:102. PMID: 30618694; PMCID: PMC6306094; DOI: 10.3389/fncom.2018.00102.
Abstract
The insect olfactory system, which includes the antennal lobe (AL), mushroom body (MB), and ancillary structures, is a relatively simple neural system capable of learning. Its structural features, which are widespread in biological neural systems, process olfactory stimuli through a cascade of networks where large dimension shifts occur from stage to stage and where sparsity and randomness play a critical role in coding. Learning is partly enabled by a neuromodulatory reward mechanism of octopamine stimulation of the AL, whose increased activity induces synaptic weight updates in the MB through Hebbian plasticity. Enforced sparsity in the MB focuses Hebbian growth on neurons that are the most important for the representation of the learned odor. Based upon current biophysical knowledge, we have constructed an end-to-end computational firing-rate model of the Manduca sexta moth olfactory system which includes the interaction of the AL and MB under octopamine stimulation. Our model is able to robustly learn new odors, and neural firing rates in our simulations match the statistical features of in vivo firing rate data. From a biological perspective, the model provides a valuable tool for examining the role of neuromodulators, like octopamine, in learning, and gives insight into critical interactions between sparsity, Hebbian growth, and stimulation during learning. Our simulations also inform predictions about structural details of the olfactory system that are not currently well-characterized. From a machine learning perspective, the model yields bio-inspired mechanisms that are potentially useful in constructing neural nets for rapid learning from very few samples. These mechanisms include high-noise layers, sparse layers as noise filters, and a biologically-plausible optimization method to train the network based on octopamine stimulation, sparse layers, and Hebbian growth.
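The interaction of enforced sparsity, reward gating, and Hebbian growth described above can be illustrated with a toy update rule. The layer sizes, learning rate, winner count, and random stimulus below are arbitrary illustrative choices and not parameters of the moth model:

```python
import numpy as np

rng = np.random.default_rng(0)

n_in, n_out, k = 50, 200, 10          # input units, output units, active winners
W = 0.01 * rng.random((n_out, n_in))  # weak initial weights
odor = rng.random(n_in)               # one training stimulus

for _ in range(20):
    h = W @ odor                       # responses of the downstream layer
    sparse = np.zeros_like(h)
    winners = np.argsort(h)[-k:]       # enforced sparsity: keep k most active
    sparse[winners] = h[winners]
    reward = 1.0                       # octopamine-like reward gate (on)
    W += 0.1 * reward * np.outer(sparse, odor)  # Hebbian outer-product growth

grown = (W.max(axis=1) > 1.0).sum()    # units whose weights grew substantially
```

In this sketch only the k winning units accumulate weight (the toy rule has no normalization, so their growth compounds each pass), while the other 190 units are untouched: sparsity focuses Hebbian growth on the units most important for representing the trained stimulus.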
Affiliation(s)
- Charles B. Delahunt
- Department of Electrical Engineering, University of Washington, Seattle, WA, United States
- Computational Neuroscience Center, University of Washington, Seattle, WA, United States
- Jeffrey A. Riffell
- Department of Biology, University of Washington, Seattle, WA, United States
- J. Nathan Kutz
- Department of Applied Mathematics, University of Washington, Seattle, WA, United States
7.
Abstract
The nervous system extracts information from its environment and distributes and processes that information to inform and drive behaviour. In this task, the nervous system faces a type of data analysis problem, for, while a visual scene may be overflowing with information, reaching for the television remote before us requires extraction of only a relatively small fraction of that information. We could care about an almost infinite number of visual stimulus patterns, but we don't: we distinguish two actors' faces with ease but two different images of television static with significant difficulty. Equally, we could respond with an almost infinite number of movements, but we don't: the motions executed to pick up the remote are highly stereotyped and related to many other grasping motions. If we were to look at what was going on inside the brain during this task, we would find populations of neurons whose electrical activity was highly structured and correlated with the images on the screen and the action of localizing and picking up the remote.
Affiliation(s)
- Rich Pang
- Neuroscience Graduate Program, University of Washington, Box 357270, T-471 Health Sciences Ctr, Seattle, WA 98195, USA
- Benjamin J Lansdell
- Department of Applied Mathematics, University of Washington, Lewis Hall #202, Box 353925, Seattle, WA 98195, USA
- Adrienne L Fairhall
- Department of Physiology and Biophysics, University of Washington, 1705 NE Pacific Street, Box 357290, Seattle, WA 98195, USA
- WRF UW Institute for Neuroengineering, University of Washington, Box Seattle, WA 98195, USA
- Center for Sensorimotor Neural Engineering, University of Washington, Box 37, 1414 NE 42nd St., Suite 204, Seattle, WA 98105, USA
8. Barreiro AK, Kutz JN, Shlizerman E. Symmetries Constrain Dynamics in a Family of Balanced Neural Networks. J Math Neurosci 2017; 7:10. PMID: 29019105; PMCID: PMC5635020; DOI: 10.1186/s13408-017-0052-6.
Abstract
We examine a family of random firing-rate neural networks in which we enforce the neurobiological constraint of Dale's law: each neuron makes either excitatory or inhibitory connections onto its post-synaptic targets. We find that this constrained system may be described as a perturbation from a system with nontrivial symmetries. We analyze the symmetric system using the tools of equivariant bifurcation theory and demonstrate that the symmetry-implied structures remain evident in the perturbed system. In comparison, spectral characteristics of the network coupling matrix are relatively uninformative about the behavior of the constrained system.
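A random coupling matrix obeying Dale's law is built by fixing one sign per column (per presynaptic neuron) rather than per entry. The network size, excitatory fraction, and 1/sqrt(n) scaling below are generic illustrative choices, not the specific ensemble analyzed in the paper:

```python
import numpy as np

rng = np.random.default_rng(1)

n, frac_exc = 100, 0.5
# Each presynaptic neuron (column) is excitatory (+1) or inhibitory (-1)
col_signs = np.where(rng.random(n) < frac_exc, 1.0, -1.0)
# Nonnegative magnitudes times a column-wise sign enforce Dale's law
J = np.abs(rng.normal(size=(n, n))) / np.sqrt(n) * col_signs

# Check: every column has a single sign
dale_ok = all((J[:, j] >= 0).all() or (J[:, j] <= 0).all() for j in range(n))
```

Note the contrast with an unconstrained random matrix, whose entries mix signs within each column; the column-sign structure is exactly the perturbation-from-symmetry viewpoint the paper exploits.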
Affiliation(s)
- Andrea K. Barreiro
- Department of Mathematics, Southern Methodist University, POB 750156, Dallas, TX 75275 USA
- J. Nathan Kutz
- Department of Applied Mathematics, University of Washington, Box 353925, Seattle, WA 98195-3925 USA
- Eli Shlizerman
- Department of Applied Mathematics, University of Washington, Box 353925, Seattle, WA 98195-3925 USA
9. Blaszka D, Sanders E, Riffell JA, Shlizerman E. Classification of Fixed Point Network Dynamics from Multiple Node Timeseries Data. Front Neuroinform 2017; 11:58. PMID: 28979202; PMCID: PMC5611511; DOI: 10.3389/fninf.2017.00058.
Abstract
Fixed point networks are dynamic networks encoding stimuli via distinct output patterns. Although such networks are common in neural systems, their structures are typically unknown or poorly characterized. It is therefore valuable to use a supervised approach for resolving how a network encodes inputs of interest, and the superposition of those inputs, from sampled multiple node time series. In this paper, we show that accomplishing such a task involves finding a low-dimensional state space from supervised noisy recordings. We demonstrate that while standard methods for dimension reduction are unable to provide optimal separation of fixed points and transient trajectories approaching them, the combination of dimension reduction with selection (clustering) and optimization can successfully provide such functionality. Specifically, we propose two methods, Exclusive Threshold Reduction (ETR) and Optimal Exclusive Threshold Reduction (OETR), for finding a basis for the classification state space. We show that the classification space, constructed through the combination of dimension reduction and optimal separation, can directly facilitate recognition of stimuli and classify complex inputs (mixtures) into similarity classes. We test our methodology on a benchmark dataset recorded from the olfactory system and use the benchmark to compare our results with the state of the art. The comparison shows that our methods are capable of constructing classification spaces and performing recognition at a significantly higher rate than previously proposed approaches.
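The following is not ETR or OETR themselves but a sketch of the generic pipeline they refine: project noisy multi-node trials onto a low-dimensional basis, then separate fixed-point classes there. The synthetic fixed points, noise level, and nearest-centroid classifier are all illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

# Two noisy fixed-point response patterns in a 20-dimensional "node" space
fp_a, fp_b = rng.normal(size=20), rng.normal(size=20)
trials = np.vstack([fp_a + 0.1 * rng.normal(size=(30, 20)),
                    fp_b + 0.1 * rng.normal(size=(30, 20))])
labels = np.array([0] * 30 + [1] * 30)

# Dimension reduction: project onto the top two right singular vectors
centered = trials - trials.mean(axis=0)
_, _, Vt = np.linalg.svd(centered, full_matrices=False)
Z = centered @ Vt[:2].T

# Nearest-centroid classification in the reduced state space
centroids = np.stack([Z[labels == c].mean(axis=0) for c in (0, 1)])
pred = np.argmin(np.linalg.norm(Z[:, None, :] - centroids, axis=2), axis=1)
accuracy = (pred == labels).mean()
```

On well-separated synthetic fixed points this simple pipeline already classifies perfectly; the paper's contribution is precisely the harder regime where unsupervised projections like this fail to separate fixed points from the transients approaching them.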
Affiliation(s)
- David Blaszka
- Department of Applied Mathematics, University of Washington, Seattle, WA, United States
- Elischa Sanders
- Department of Biology, University of Washington, Seattle, WA, United States
- Jeffrey A Riffell
- Department of Biology, University of Washington, Seattle, WA, United States
- Eli Shlizerman
- Department of Applied Mathematics, University of Washington, Seattle, WA, United States
- Department of Electrical Engineering, University of Washington, Seattle, WA, United States
10. Kunert-Graf JM, Shlizerman E, Walker A, Kutz JN. Multistability and Long-Timescale Transients Encoded by Network Structure in a Model of C. elegans Connectome Dynamics. Front Comput Neurosci 2017; 11:53. PMID: 28659783; PMCID: PMC5468412; DOI: 10.3389/fncom.2017.00053.
Abstract
The neural dynamics of the nematode Caenorhabditis elegans are experimentally low-dimensional and may be understood as long-timescale transitions between multiple low-dimensional attractors. Previous modeling work has found that dynamic models of the worm's full neuronal network are capable of generating reasonable dynamic responses to certain inputs, even when all neurons are treated as identical save for their connectivity. This study investigates such a model of C. elegans neuronal dynamics, finding that a wide variety of multistable responses are generated in response to varied inputs. Specifically, we generate bifurcation diagrams for all possible single-neuron inputs, showing the existence of fixed points and limit cycles for different input regimes. The nature of the dynamical response is seen to vary according to the type of neuron receiving input; for example, input into sensory neurons is more likely to drive a bifurcation in the system than input into motor neurons. As a specific example we consider compound input into the neuron pairs PLM and ASK, discovering bistability of a limit cycle and a fixed point. The transient timescales in approaching each of these states are much longer than any intrinsic timescales of the system. This suggests consistency of our model with the characterization of dynamics in neural systems as long-timescale transitions between discrete, low-dimensional attractors corresponding to behavioral states.
Affiliation(s)
- Eli Shlizerman
- Department of Applied Mathematics, University of Washington, Seattle, WA, United States
- Department of Electrical Engineering, University of Washington, Seattle, WA, United States
- Andrew Walker
- Department of Applied Mathematics, University of Washington, Seattle, WA, United States
- J Nathan Kutz
- Department of Physics, University of Washington, Seattle, WA, United States
- Department of Applied Mathematics, University of Washington, Seattle, WA, United States
11. Maia PD, Kutz JN. Reaction time impairments in decision-making networks as a diagnostic marker for traumatic brain injuries and neurological diseases. J Comput Neurosci 2017; 42:323-347. PMID: 28393281; DOI: 10.1007/s10827-017-0643-y.
Abstract
The presence of diffuse Focal Axonal Swellings (FAS) is a hallmark cellular feature in many neurological diseases and traumatic brain injury. Among other things, the FAS have a significant impact on spike-train encodings that propagate through the affected neurons, leading to compromised signal processing on a neuronal network level. This work merges, for the first time, three fields of study: (i) signal processing in excitatory-inhibitory (EI) networks of neurons via population codes, (ii) decision-making theory driven by the production of evidence from stimulus, and (iii) compromised spike-train propagation through FAS. As such, we demonstrate a mathematical architecture capable of characterizing compromised decision-making driven by cellular mechanisms. The computational model also leads to several novel predictions and diagnostics for understanding injury level and cognitive deficits, including a key finding that decision-making reaction times, rather than accuracy, are indicative of network level damage. The results have a number of translational implications, including that the level of network damage can be characterized by the reaction times in simple cognitive and motor tests.
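The link between weakened evidence production and lengthened reaction times can be illustrated with a standard drift-diffusion model (a generic accumulator, not the paper's excitatory-inhibitory population model); the drift values standing in for "healthy" versus "injured" networks are arbitrary illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(2)

def mean_reaction_time(drift, threshold=1.0, noise=1.0, dt=2e-3, n_trials=500):
    """Mean first-passage time of a drift-diffusion evidence accumulator."""
    rts = []
    for _ in range(n_trials):
        evidence, t = 0.0, 0.0
        while abs(evidence) < threshold:       # accumulate until a bound is hit
            evidence += drift * dt + noise * np.sqrt(dt) * rng.normal()
            t += dt
        rts.append(t)
    return float(np.mean(rts))

# Weaker drift (a stand-in for FAS-compromised evidence production)
# lengthens reaction times even when the correct bound is still reached
rt_healthy = mean_reaction_time(drift=2.0)
rt_impaired = mean_reaction_time(drift=0.5)
```

The qualitative effect mirrors the paper's key finding: degrading the rate of evidence production shows up first in reaction times rather than in accuracy.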
Affiliation(s)
- Pedro D Maia
- Department of Applied Mathematics, University of Washington, Seattle, WA, 98195-3925, USA.
- J Nathan Kutz
- Department of Applied Mathematics, University of Washington, Seattle, WA, 98195-3925, USA
12. Sargsyan S, Brunton SL, Kutz JN. Nonlinear model reduction for dynamical systems using sparse sensor locations from learned libraries. Phys Rev E 2015; 92:033304. PMID: 26465583; DOI: 10.1103/physreve.92.033304.
Abstract
We demonstrate the synthesis of sparse sampling and dimensionality reduction to characterize and model nonlinear dynamical systems over a range of bifurcation parameters. First, we construct modal libraries using the classical proper orthogonal decomposition in order to expose the dominant low-rank coherent structures. Libraries of the nonlinear terms are also constructed in order to take advantage of the discrete empirical interpolation method (DEIM), whose projection allows nonlinear terms to be approximated from a sparse number of grid points. The selected grid points are shown to be effective sensing and measurement locations for characterizing the underlying dynamics, stability, and bifurcations of nonlinear dynamical systems. The use of empirical interpolation points and sparse representation facilitates a family of local reduced-order models for each physical regime, rather than a single higher-order global model, which has the benefit of physical interpretability of energy transfer between coherent structures. The method also allows for orders-of-magnitude improvement in computational speed and memory requirements. To illustrate the method, the discrete interpolation points and nonlinear modal libraries are used for sparse representation in order to classify and reconstruct the dynamic bifurcation regimes in the complex Ginzburg-Landau equation. It is also shown that point measurements of the nonlinearity are more effective than linear measurements when sensor noise is present.
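The proper orthogonal decomposition step reduces to an SVD of a snapshot matrix. Here a synthetic rank-two spatio-temporal field (both spatial structures and time scales invented for illustration, not taken from the Ginzburg-Landau example) is recovered exactly by its first two POD modes:

```python
import numpy as np

# Snapshot matrix: rows are spatial points, columns are time snapshots,
# built from two coherent structures so the field has exact rank 2
x = np.linspace(0, 1, 100)
t = np.linspace(0, 2 * np.pi, 80)
field = (np.outer(np.sin(2 * np.pi * x), np.cos(t)) +
         0.5 * np.outer(np.sin(4 * np.pi * x), np.sin(2 * t)))

# POD: modes are the left singular vectors of the snapshot matrix
U, S, Vt = np.linalg.svd(field, full_matrices=False)
rank2 = U[:, :2] @ np.diag(S[:2]) @ Vt[:2]        # two-mode reconstruction
residual = np.linalg.norm(field - rank2) / np.linalg.norm(field)
```

The residual is at machine precision because the field truly lives in a two-mode subspace; for real data one instead truncates where the singular value spectrum decays, which is what exposes the "dominant low-rank coherent structures" the paper builds its libraries from.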
Affiliation(s)
- Syuzanna Sargsyan
- Department of Applied Mathematics, University of Washington, Seattle, Washington 98195-3925, USA
- Steven L Brunton
- Department of Mechanical Engineering, University of Washington, Seattle, Washington 98195, USA
- J Nathan Kutz
- Department of Applied Mathematics, University of Washington, Seattle, Washington 98195-3925, USA