1
Huh N, Kim SP, Lee J, Sohn JW. Extracting single-trial neural interaction using latent dynamical systems model. Mol Brain 2021; 14:32. PMID: 33588875; PMCID: PMC7885376; DOI: 10.1186/s13041-021-00740-7.
Abstract
In systems neuroscience, advances in simultaneous recording technology have helped reveal the population dynamics that underlie the complex neural correlates of animal behavior and cognitive processes. To investigate these correlates, neural interactions are typically abstracted from spike trains of pairs of neurons accumulated over the course of many trials. However, the resulting averaged values cannot capture neural computation, in which population responses are highly variable even under identical external conditions; accordingly, neural interactions within the population also fluctuate strongly. In the present study, we introduce an analysis method that reflects the temporal variation of neural interactions, in which cross-correlograms on rate estimates are applied via a latent dynamical systems model. Using this method, we were able to predict time-varying neural interactions within a single trial. In addition, the pairwise connections estimated in our analysis increased along behavioral epochs among neurons categorized within similar functional groups. Thus, our analysis method revealed that neurons in the same groups communicate more as the population engages in the assigned task. We also showed that the characteristics of neural interaction from our model differ from the results of a typical model employing cross-correlation coefficients, suggesting that our model can extract nonoverlapping information about network topology.
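The pairwise starting point this abstract describes — a cross-correlogram accumulated from the spike trains of a neuron pair — can be sketched as below. This is a generic illustration, not the paper's latent dynamical systems model; the function name and the ±50 ms window are illustrative choices.

```python
import numpy as np

def cross_correlogram(spikes_a, spikes_b, max_lag=0.05, bin_width=0.005):
    """Histogram of spike-time differences (b - a), in seconds, within +/- max_lag.

    spikes_a, spikes_b: 1-D numpy arrays of spike times for two neurons.
    """
    diffs = []
    for t in spikes_a:
        # collect spikes of neuron b falling in the window around each a-spike
        near = spikes_b[(spikes_b >= t - max_lag) & (spikes_b <= t + max_lag)]
        diffs.extend(near - t)
    edges = np.arange(-max_lag, max_lag + bin_width, bin_width)
    counts, _ = np.histogram(diffs, bins=edges)
    return counts, edges
```

A peak away from zero lag suggests a lagged interaction; trial-averaging such histograms is exactly the step whose single-trial variability the paper's method is designed to recover.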
Affiliation(s)
- Namjung Huh
- Department of Medical Science, College of Medicine, Catholic Kwandong University, Gangneung, 25601, Republic of Korea.
- Sung-Phil Kim
- Department of Biomedical Engineering, Ulsan National Institute of Science and Technology, Ulsan, 44919, Republic of Korea
- Joonyeol Lee
- Center for Neuroscience Imaging Research, Institute for Basic Science (IBS), Suwon, 16419, Republic of Korea
- Department of Biomedical Engineering, Sungkyunkwan University (SKKU), Suwon, 16419, Republic of Korea
- Jeong-Woo Sohn
- Department of Medical Science, College of Medicine, Catholic Kwandong University, Gangneung, 25601, Republic of Korea
- Translational Brain Research Center, International St. Mary's Hospital, Catholic Kwandong University, Incheon, 22711, Republic of Korea
2
Humphries MD. Dynamical networks: Finding, measuring, and tracking neural population activity using network science. Netw Neurosci 2017; 1:324-338. PMID: 30090869; PMCID: PMC6063717; DOI: 10.1162/netn_a_00020.
Abstract
Systems neuroscience is in a headlong rush to record from as many neurons at the same time as possible. As the brain computes and codes using neuron populations, it is hoped these data will uncover the fundamentals of neural computation. But with hundreds, thousands, or more simultaneously recorded neurons come the inescapable problems of visualizing, describing, and quantifying their interactions. Here I argue that network science provides a set of scalable, analytical tools that already solve these problems. By treating neurons as nodes and their interactions as links, a single network can visualize and describe an arbitrarily large recording. I show that with this description we can quantify the effects of manipulating a neural circuit, track changes in population dynamics over time, and quantitatively define theoretical concepts of neural populations such as cell assemblies. Using network science as a core part of analyzing population recordings will thus provide both qualitative and quantitative advances to our understanding of neural computation.
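The neurons-as-nodes, interactions-as-links description above can be made concrete with a minimal sketch; the correlation measure and the 0.5 threshold are illustrative assumptions, not prescriptions from the paper:

```python
import numpy as np

def functional_network(rates, threshold=0.5):
    """Binary adjacency matrix linking neurons whose firing-rate
    correlation magnitude exceeds the threshold.

    rates: (n_neurons, n_timebins) array of firing rates.
    """
    corr = np.corrcoef(rates)          # n_neurons x n_neurons
    adj = np.abs(corr) > threshold     # treat strong +/- correlation as a link
    np.fill_diagonal(adj, False)       # no self-loops
    return adj
```

Degree, clustering, and community structure can then be read off the adjacency matrix with standard network-science tools, which is the scalability argument the abstract makes.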
Affiliation(s)
- Mark D. Humphries
- Faculty of Biology, Medicine and Health, University of Manchester, Manchester, United Kingdom
3
de Assis JM, Santos MO, de Assis FM. Auditory Stimuli Coding by Postsynaptic Potential and Local Field Potential Features. PLoS One 2016; 11:e0160089. PMID: 27513950; PMCID: PMC4981406; DOI: 10.1371/journal.pone.0160089.
Abstract
The relation between physical stimuli and neurophysiological responses, such as action potentials (spikes) and local field potentials (LFPs), has recently been investigated in order to explain how neurons encode auditory information. However, none of these experiments included analyses of postsynaptic potentials (PSPs). In the present study, we estimated information values between auditory stimuli and the amplitudes/latencies of PSPs and LFPs in anesthetized rats in vivo. To obtain these values, a new method of information estimation was used. This method produced more accurate estimates than the traditional binning method, a fact corroborated by simulated data; the binning method could not attain such accuracy even when adjusted by quadratic extrapolation. We found that the information obtained from LFP amplitude variation was significantly greater than that obtained from PSP amplitude variation, consistent with the fact that the LFP reflects the action of many PSPs. The results show that the auditory cortex codes more information about stimulus frequency with slow oscillations in groups of neurons than with slow oscillations in individual neurons.
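The "traditional binning method" used as the baseline here is the plug-in estimator of mutual information; a minimal sketch follows (the paper's own, more accurate estimator is not reproduced, and the function name is mine):

```python
import numpy as np

def plugin_mutual_info(x, y, bins=8):
    """Plug-in (binning) estimate of mutual information, in bits,
    between two 1-D samples x and y."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)    # marginal of x
    py = pxy.sum(axis=0, keepdims=True)    # marginal of y
    nz = pxy > 0                           # skip empty cells to avoid log(0)
    return float((pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])).sum())
```

The plug-in estimate is biased upward for finite samples, which is why corrections such as the quadratic extrapolation mentioned above exist.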
Affiliation(s)
- Juliana M. de Assis
- Department of Electrical Engineering, Federal University of Campina Grande, Campina Grande, Paraíba, Brazil
- Mikaelle O. Santos
- Department of Electrical Engineering, Federal University of Campina Grande, Campina Grande, Paraíba, Brazil
- Francisco M. de Assis
- Department of Electrical Engineering, Federal University of Campina Grande, Campina Grande, Paraíba, Brazil
4
Abstract
Intrinsic neuronal variability significantly limits information encoding in the primary visual cortex (V1). Certain stimuli can suppress this intertrial variability to increase the reliability of neuronal responses. In particular, responses to natural scenes, which have broadband spatiotemporal statistics, are more reliable than responses to stimuli such as gratings. However, very little is known about which stimulus statistics modulate reliable coding and how this occurs at the neural ensemble level. Here, we sought to elucidate the role that spatial correlations in natural scenes play in reliable coding. We developed a novel noise-masking method to systematically alter spatial correlations in natural movies, without altering their edge structure. Using high-speed two-photon calcium imaging in vivo, we found that responses in mouse V1 were much less reliable at both the single neuron and population level when spatial correlations were removed from the image. This change in reliability was due to a reorganization of between-neuron correlations. Strongly correlated neurons formed ensembles that reliably and accurately encoded visual stimuli, whereas reducing spatial correlations reduced the activation of these ensembles, leading to an unreliable code. Together with an ensemble-specific normalization model, these results suggest that the coordinated activation of specific subsets of neurons underlies the reliable coding of natural scenes.
5
Emergence of assortative mixing between clusters of cultured neurons. PLoS Comput Biol 2014; 10:e1003796. PMID: 25188377; PMCID: PMC4154651; DOI: 10.1371/journal.pcbi.1003796.
Abstract
The analysis of the activity of neuronal cultures is considered to be a good proxy of the functional connectivity of in vivo neuronal tissues. Thus, the functional complex network inferred from activity patterns is a promising way to unravel the interplay between structure and functionality of neuronal systems. Here, we monitor the spontaneous self-sustained dynamics in neuronal cultures formed by interconnected aggregates of neurons (clusters). Dynamics is characterized by the fast activation of groups of clusters in sequences termed bursts. The analysis of the time delays between clusters' activations within the bursts allows the reconstruction of the directed functional connectivity of the network. We propose a method to statistically infer this connectivity and analyze the resulting properties of the associated complex networks. Surprisingly enough, in contrast to what has been reported for many biological networks, the clustered neuronal cultures present assortative mixing connectivity values, meaning that there is a preference for clusters to link to other clusters that share similar functional connectivity, as well as a rich-club core, which shapes a 'connectivity backbone' in the network. These results point out that the grouping of neurons and the assortative connectivity between clusters are intrinsic survival mechanisms of the culture.
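Assortative mixing by degree — the property reported above — is conventionally quantified as the Pearson correlation between the degrees at the two ends of each link. A sketch for undirected edge lists (the study itself works with directed functional links, which this toy version does not handle):

```python
import numpy as np

def degree_assortativity(edges):
    """Pearson correlation of endpoint degrees over an undirected edge list."""
    deg = {}
    for u, v in edges:
        deg[u] = deg.get(u, 0) + 1
        deg[v] = deg.get(v, 0) + 1
    # each undirected edge contributes both orientations
    x = [deg[u] for u, v in edges] + [deg[v] for u, v in edges]
    y = [deg[v] for u, v in edges] + [deg[u] for u, v in edges]
    return float(np.corrcoef(x, y)[0, 1])
```

Positive values mean high-degree nodes preferentially link to high-degree nodes, as in the assortative cultures described above; a star graph, where a hub links only to leaves, gives -1.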
6
Faghihi F, Kolodziejski C, Fiala A, Wörgötter F, Tetzlaff C. An information theoretic model of information processing in the Drosophila olfactory system: the role of inhibitory neurons for system efficiency. Front Comput Neurosci 2013; 7:183. PMID: 24391579; PMCID: PMC3868887; DOI: 10.3389/fncom.2013.00183.
Abstract
Fruit flies (Drosophila melanogaster) rely on their olfactory system to process environmental information. This information has to be transmitted without system-relevant loss by the olfactory system to deeper brain areas for learning. Here we study the role of several parameters of the fly's olfactory system and the environment and how they influence olfactory information transmission. We have designed an abstract model of the antennal lobe, the mushroom body and the inhibitory circuitry. Mutual information between the olfactory environment, simulated in terms of different odor concentrations, and a sub-population of intrinsic mushroom body neurons (Kenyon cells) was calculated to quantify the efficiency of information transmission. With this method we study, on the one hand, the effect of different connectivity rates between olfactory projection neurons and firing thresholds of Kenyon cells. On the other hand, we analyze the influence of inhibition on mutual information between environment and mushroom body. Our simulations show an expected linear relation between the connectivity rate between the antennal lobe and the mushroom body and the firing threshold of the Kenyon cells to obtain maximum mutual information for both low and high odor concentrations. However, contradicting everyday experience, high odor concentrations cause a drastic, and unrealistic, decrease in mutual information for all connectivity rates compared to low concentrations. But when inhibition on the mushroom body is included, mutual information remains at high levels independent of other system parameters. This finding points to a pivotal role of inhibition in fly information processing, without which system efficiency would be substantially reduced.
Affiliation(s)
- Faramarz Faghihi
- Department of Computational Neuroscience, Bernstein Center for Computational Neuroscience, III. Institute of Physics - Biophysics, Georg-August-Universität Göttingen, Germany
- Christoph Kolodziejski
- Department of Computational Neuroscience, Bernstein Center for Computational Neuroscience, III. Institute of Physics - Biophysics, Georg-August-Universität Göttingen, Germany
- André Fiala
- Molecular Neurobiology of Behavior, Johann-Friedrich-Blumenbach-Institute for Zoology and Anthropology, Georg-August-Universität Göttingen, Germany
- Florentin Wörgötter
- Department of Computational Neuroscience, Bernstein Center for Computational Neuroscience, III. Institute of Physics - Biophysics, Georg-August-Universität Göttingen, Germany
- Christian Tetzlaff
- Department of Computational Neuroscience, Bernstein Center for Computational Neuroscience, III. Institute of Physics - Biophysics, Georg-August-Universität Göttingen, Germany
7
Stetter O, Battaglia D, Soriano J, Geisel T. Model-free reconstruction of excitatory neuronal connectivity from calcium imaging signals. PLoS Comput Biol 2012; 8:e1002653. PMID: 22927808; PMCID: PMC3426566; DOI: 10.1371/journal.pcbi.1002653.
Abstract
A systematic assessment of global neural network connectivity through direct electrophysiological assays has remained technically infeasible, even in simpler systems like dissociated neuronal cultures. We introduce an improved algorithmic approach based on Transfer Entropy to reconstruct structural connectivity from network activity monitored through calcium imaging. We focus in this study on the inference of excitatory synaptic links. Based on information theory, our method requires no prior assumptions on the statistics of neuronal firing and neuronal connections. The performance of our algorithm is benchmarked on surrogate time series of calcium fluorescence generated by the simulated dynamics of a network with known ground-truth topology. We find that the functional network topology revealed by Transfer Entropy depends qualitatively on the time-dependent dynamic state of the network (bursting or non-bursting). Thus by conditioning with respect to the global mean activity, we improve the performance of our method. This allows us to focus the analysis to specific dynamical regimes of the network in which the inferred functional connectivity is shaped by monosynaptic excitatory connections, rather than by collective synchrony. Our method can discriminate between actual causal influences between neurons and spurious non-causal correlations due to light scattering artifacts, which inherently affect the quality of fluorescence imaging. Compared to other reconstruction strategies such as cross-correlation or Granger Causality methods, our method based on improved Transfer Entropy is remarkably more accurate. In particular, it provides a good estimation of the excitatory network clustering coefficient, allowing for discrimination between weakly and strongly clustered topologies. 
Finally, we demonstrate the applicability of our method to analyses of real recordings of in vitro disinhibited cortical cultures, where we suggest that excitatory connections are characterized by an elevated (although not extreme) level of clustering compared to a random graph and can be markedly non-local.

Unraveling the general organizing principles of connectivity in neural circuits is a crucial step towards understanding brain function. However, even the simpler task of assessing the global excitatory connectivity of a culture in vitro, where neurons form self-organized networks in the absence of external stimuli, remains challenging. Neuronal cultures undergo spontaneous switching between episodes of synchronous bursting and quieter inter-burst periods. We introduce here a novel algorithm which aims at inferring the connectivity of neuronal cultures from calcium fluorescence recordings of their network dynamics. To achieve this goal, we develop a suitable generalization of Transfer Entropy, an information-theoretic measure of causal influences between time series. Unlike previous algorithmic approaches to reconstruction, Transfer Entropy is data-driven and does not rely on specific assumptions about neuronal firing statistics or network topology. We generate simulated calcium signals from networks with controlled ground-truth topology and purely excitatory interactions and show that, by restricting the analysis to inter-burst periods, Transfer Entropy robustly achieves a good reconstruction performance for disparate network connectivities. Finally, we apply our method to real data and find evidence of non-random features in cultured networks, such as the existence of highly connected hub excitatory neurons and of an elevated (but not extreme) level of clustering.
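For discrete time series with history length 1, Transfer Entropy is TE(source → target) = Σ p(t1, t0, s0) log2[p(t1 | t0, s0) / p(t1 | t0)], where t1 is the target's next value, t0 its past value, and s0 the source's past value. A toy estimator is sketched below; the paper's generalization to calcium signals and its conditioning on global mean activity are deliberately omitted:

```python
import numpy as np
from collections import Counter

def transfer_entropy(source, target):
    """TE(source -> target), in bits, for discrete sequences; history length 1."""
    n = len(target) - 1
    trip = Counter(zip(target[1:], target[:-1], source[:-1]))  # (t1, t0, s0)
    pair_tt = Counter(zip(target[1:], target[:-1]))            # (t1, t0)
    pair_ts = Counter(zip(target[:-1], source[:-1]))           # (t0, s0)
    hist = Counter(target[:-1])                                # t0 alone
    te = 0.0
    for (t1, t0, s0), c in trip.items():
        # ratio p(t1|t0,s0) / p(t1|t0) rewritten with raw counts
        te += (c / n) * np.log2((c * hist[t0]) / (pair_tt[(t1, t0)] * pair_ts[(t0, s0)]))
    return float(te)
```

TE is asymmetric, which is what lets it separate directed causal influences from the symmetric, correlation-style couplings mentioned above.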
Affiliation(s)
- Olav Stetter
- Max Planck Institute for Dynamics and Self-Organization, Göttingen, Germany
- Georg August University, Physics Department, Göttingen, Germany
- Bernstein Center for Computational Neuroscience, Göttingen, Germany
- Demian Battaglia
- Max Planck Institute for Dynamics and Self-Organization, Göttingen, Germany
- Bernstein Center for Computational Neuroscience, Göttingen, Germany
- Jordi Soriano
- Departament d'ECM, Facultat de Física, Universitat de Barcelona, Barcelona, Spain
- Theo Geisel
- Max Planck Institute for Dynamics and Self-Organization, Göttingen, Germany
- Georg August University, Physics Department, Göttingen, Germany
- Bernstein Center for Computational Neuroscience, Göttingen, Germany
8
Cooney CJ, Lynch E. Determining information flow through a network of simulated neurons. BMC Neurosci 2012. PMCID: PMC3403325; DOI: 10.1186/1471-2202-13-s1-p92.
9
Benish WA. The channel capacity of a diagnostic test as a function of test sensitivity and test specificity. Stat Methods Med Res 2012; 24:1044-52. PMID: 22368178; DOI: 10.1177/0962280212439742.
Abstract
We apply the information theory concept of "channel capacity" to diagnostic test performance and derive an expression for channel capacity in terms of test sensitivity and test specificity. The expected value of the amount of information a diagnostic test will provide is equal to the "mutual information" between the test result and the disease state. For the case in which only two test results and two disease states are considered, mutual information, I(D;R), is a function of sensitivity, specificity, and the pretest probability of disease. The channel capacity of the test is the maximal value of I(D;R) for a given sensitivity and specificity. After deriving an expression for I(D;R) in terms of sensitivity, specificity, and pretest probability, we solve for the value of pretest probability that maximizes I(D;R). Channel capacity is obtained by using this value of pretest probability to calculate I(D;R). Channel capacity provides a convenient and meaningful single parameter measure of diagnostic test performance. It quantifies the upper limit of the amount of information a diagnostic test can be expected to provide about a patient's disease state.
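The recipe in this abstract — write I(D;R) in terms of sensitivity, specificity, and pretest probability, then maximize over the pretest probability — can be reproduced numerically. The grid search below stands in for the paper's analytic maximization, and the function names are mine:

```python
import numpy as np

def diagnostic_mutual_info(sens, spec, prior):
    """I(D;R), in bits, for a binary test with given sensitivity,
    specificity, and pretest probability of disease."""
    # joint distribution over (disease state, test result):
    # rows = diseased / healthy, columns = positive / negative result
    p = np.array([[prior * sens,             prior * (1 - sens)],
                  [(1 - prior) * (1 - spec), (1 - prior) * spec]])
    pd = p.sum(axis=1, keepdims=True)   # marginal over disease state
    pr = p.sum(axis=0, keepdims=True)   # marginal over test result
    nz = p > 0                          # skip zero cells to avoid log(0)
    return float((p[nz] * np.log2(p[nz] / (pd @ pr)[nz])).sum())

def channel_capacity(sens, spec, grid=1001):
    """Maximize I(D;R) over the pretest probability on a grid."""
    priors = np.linspace(1e-6, 1 - 1e-6, grid)
    vals = [diagnostic_mutual_info(sens, spec, q) for q in priors]
    i = int(np.argmax(vals))
    return vals[i], float(priors[i])
```

A perfect test (sensitivity = specificity = 1) attains the full 1 bit of capacity at a pretest probability of 0.5; an uninformative test (both 0.5) has capacity 0.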
Affiliation(s)
- William A Benish
- Department of Internal Medicine, Louis Stokes Cleveland VA Medical Center and Case Western Reserve University, Cleveland, OH, USA