1. Wu N, Zhou B, Agrochao M, Clark DA. Broken time-reversal symmetry in visual motion detection. Proc Natl Acad Sci U S A 2025; 122:e2410768122. PMID: 40048271; PMCID: PMC11912477; DOI: 10.1073/pnas.2410768122.
Abstract
Our intuition suggests that when a movie is played in reverse, our perception of motion at each location in the reversed movie will be perfectly inverted compared to the original. This intuition is also reflected in classical theoretical and practical models of motion estimation, in which velocity flow fields invert when inputs are reversed in time. However, here we report that this symmetry of motion perception upon time reversal is broken in real visual systems. We designed a set of visual stimuli to investigate time reversal symmetry breaking in the fruit fly Drosophila's well-studied optomotor rotation behavior. We identified a suite of stimuli with a wide variety of properties that can uncover broken time reversal symmetry in fly behavioral responses. We then trained neural network models to predict the velocity of scenes with both natural and artificial contrast distributions. Training with naturalistic contrast distributions yielded models that broke time reversal symmetry, even when the training data themselves were time reversal symmetric. We show analytically and numerically that the breaking of time reversal symmetry in the model responses can arise from contrast asymmetry in the training data, but can also arise from other features of the contrast distribution. Furthermore, shallower neural network models can exhibit stronger symmetry breaking than deeper ones, suggesting that less flexible neural networks may be more prone to time reversal symmetry breaking. Overall, these results reveal a surprising feature of biological motion detectors and suggest that it could arise from constrained optimization in natural environments.
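For reference, the sketch below (Python; the stimulus, parameters, and the textbook Hassenstein-Reichardt correlator are all illustrative assumptions, not the models or stimuli from the paper) shows the classical symmetry being discussed: the correlator's time-averaged output flips sign when the same movie is played in reverse, which is exactly the symmetry the authors report is broken in fly behavior and in networks trained on naturalistic contrasts.

```python
import numpy as np

def lowpass(x, tau, dt):
    """First-order low-pass filter; plays the role of the correlator's delay arm."""
    y = np.zeros_like(x)
    a = dt / (tau + dt)
    for i in range(1, len(x)):
        y[i] = y[i - 1] + a * (x[i] - y[i - 1])
    return y

def hrc_mean_output(s1, s2, tau=0.05, dt=0.01):
    """Time-averaged output of a two-point Hassenstein-Reichardt correlator."""
    d1, d2 = lowpass(s1, tau, dt), lowpass(s2, tau, dt)
    return np.mean(d1 * s2 - d2 * s1)

# A rigidly translating 1D pattern with a skewed, roughly zero-mean contrast
# distribution (an illustrative stand-in for natural contrasts), sampled at two
# neighboring points on the retina.
rng = np.random.default_rng(0)
n_x, n_t, speed = 256, 20_000, 3                 # pattern size, time steps, pixels per step
pattern = np.exp(rng.normal(size=n_x))
pattern -= pattern.mean()
idx = (speed * np.arange(n_t)) % n_x
s1, s2 = pattern[idx], pattern[(idx + 1) % n_x]

forward = hrc_mean_output(s1, s2)
backward = hrc_mean_output(s1[::-1], s2[::-1])   # the same movie, played in reverse
print(f"forward: {forward:+.4f}   time-reversed: {backward:+.4f}")
# For this classical model the two averages are equal and opposite (up to edge
# effects); the paper's finding is that real fly responses need not obey this symmetry.
```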
Affiliation(s)
- Baohua Zhou: Department of Molecular, Cellular and Developmental Biology, Yale University, New Haven, CT 06511
- Margarida Agrochao: Department of Molecular, Cellular and Developmental Biology, Yale University, New Haven, CT 06511
- Damon A. Clark: Department of Molecular, Cellular and Developmental Biology, Yale University, New Haven, CT 06511; Department of Physics, Yale University, New Haven, CT 06511; Department of Neuroscience, Yale University, New Haven, CT 06511; Quantitative Biology Institute, Yale University, New Haven, CT 06511; Wu Tsai Institute, Yale University, New Haven, CT 06511
2. Pang MM, Chen F, Xie M, Druckmann S, Clandinin TR, Yang HH. A recurrent neural circuit in Drosophila temporally sharpens visual inputs. Curr Biol 2025; 35:333-346.e6. PMID: 39706173; PMCID: PMC11769683; DOI: 10.1016/j.cub.2024.11.064.
Abstract
A critical goal of vision is to detect changes in light intensity, even when these changes are blurred by the spatial resolution of the eye and the motion of the animal. Here, we describe a recurrent neural circuit in Drosophila that compensates for blur and thereby selectively enhances the perceived contrast of moving edges. Using in vivo, two-photon voltage imaging, we measured the temporal response properties of L1 and L2, two cell types that receive direct synaptic input from photoreceptors. These neurons have biphasic responses to brief flashes of light, a hallmark of cells that encode changes in stimulus intensity. However, the second phase was often much larger in area than the first, creating an unusual temporal filter. Genetic dissection revealed that recurrent neural circuitry strongly shapes the second phase of the response, informing the structure of a dynamical model. By applying this model to moving natural images, we demonstrate that rather than veridically representing stimulus changes, this temporal processing strategy systematically enhances them, amplifying and sharpening responses. Comparing the measured responses of L2 to model predictions across both artificial and natural stimuli revealed that L2 tunes its properties as the model predicts to temporally sharpen visual inputs. Since this strategy is tunable to behavioral context, generalizable to any time-varying sensory input, and implementable with a common circuit motif, we propose that it could be broadly used to selectively enhance sharp and salient changes.
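To make the filter shape concrete, the toy sketch below (Python; the kernel time constants and the 1.5x area ratio are invented for illustration and are neither fits to L1/L2 data nor the paper's recurrent dynamical model) shows how a biphasic impulse response with an enlarged second lobe responds transiently to a sustained step and rebounds below baseline, encoding changes rather than absolute intensity.

```python
import numpy as np

dt = 0.001                                  # s
kt = np.arange(0, 0.4, dt)                  # kernel time base

# Toy biphasic impulse response: a fast positive lobe followed by a slower
# negative lobe with ~1.5x the area (shapes are illustrative, not L1/L2 fits).
pos = np.exp(-kt / 0.02)
neg = np.exp(-kt / 0.08)
kernel = pos / pos.sum() - 1.5 * neg / neg.sum()

t = np.arange(0, 1.0, dt)
step = (t > 0.2).astype(float)              # sustained increase in light intensity
resp = np.convolve(step, kernel)[: len(t)]  # causal filtering of the step

print(f"peak response        : {resp.max():+.3f}")
print(f"minimum (rebound)    : {resp.min():+.3f}")
print(f"late steady response : {resp[-1]:+.3f}")
# The response is transient: it peaks just after the step, rebounds below
# baseline, and settles near -0.5 rather than tracking the sustained intensity,
# the signature of a change-encoding cell with an enlarged second phase.
```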
Affiliation(s)
- Michelle M Pang: Department of Neurobiology, Stanford University, Stanford, CA 94305, USA
- Feng Chen: Department of Neurobiology, Stanford University, Stanford, CA 94305, USA; Department of Applied Physics, Stanford University, Stanford, CA 94305, USA
- Marjorie Xie: Department of Neurobiology, Stanford University, Stanford, CA 94305, USA
- Shaul Druckmann: Department of Neurobiology, Stanford University, Stanford, CA 94305, USA; Department of Psychiatry and Behavioral Sciences, Stanford University, Stanford, CA 94305, USA
- Thomas R Clandinin: Department of Neurobiology, Stanford University, Stanford, CA 94305, USA; Chan Zuckerberg Biohub, San Francisco, CA, USA
- Helen H Yang: Department of Neurobiology, Stanford University, Stanford, CA 94305, USA
3. Gou T, Matulis CA, Clark DA. Adaptation to visual sparsity enhances responses to isolated stimuli. Curr Biol 2024; 34:5697-5713.e8. PMID: 39577424; PMCID: PMC11834764; DOI: 10.1016/j.cub.2024.10.053.
Abstract
Sensory systems adapt their response properties to the statistics of their inputs. For instance, visual systems adapt to low-order statistics like mean and variance to encode stimuli efficiently or to facilitate specific downstream computations. However, it remains unclear how other statistical features affect sensory adaptation. Here, we explore how Drosophila's visual motion circuits adapt to stimulus sparsity, a measure of the signal's intermittency not captured by low-order statistics alone. Early visual neurons in both ON and OFF pathways alter their responses dramatically with stimulus sparsity, responding positively to both light and dark sparse stimuli but linearly to dense stimuli. These changes extend to downstream ON and OFF direction-selective neurons, which are activated by sparse stimuli of both polarities but respond with opposite signs to light and dark regions of dense stimuli. Thus, sparse stimuli activate both ON and OFF pathways, recruiting a larger fraction of the circuit and potentially enhancing the salience of isolated stimuli. Overall, our results reveal visual response properties that increase the fraction of the circuit responding to sparse, isolated stimuli.
Affiliation(s)
- Tong Gou: Department of Electrical Engineering, Yale University, New Haven, CT 06511, USA
- Damon A Clark: Department of Physics, Yale University, New Haven, CT 06511, USA; Department of Molecular, Cellular and Developmental Biology, Yale University, New Haven, CT 06511, USA; Department of Neuroscience, Yale University, New Haven, CT 06511, USA; Quantitative Biology Institute, Yale University, New Haven, CT 06511, USA; Wu Tsai Institute, Yale University, New Haven, CT 06511, USA
4. Choi K, Rosenbluth W, Graf IR, Kadakia N, Emonet T. Bifurcation enhances temporal information encoding in the olfactory periphery. bioRxiv 2024:2024.05.27.596086. PMID: 38853849; PMCID: PMC11160621; DOI: 10.1101/2024.05.27.596086.
Abstract
Living systems continually respond to signals from the surrounding environment. Survival requires that their responses adapt quickly and robustly to the changes in the environment. One particularly challenging example is olfactory navigation in turbulent plumes, where animals experience highly intermittent odor signals while odor concentration varies over many length- and timescales. Here, we show theoretically that Drosophila olfactory receptor neurons (ORNs) can exploit proximity to a bifurcation point of their firing dynamics to reliably extract information about the timing and intensity of fluctuations in the odor signal, which have been shown to be critical for odor-guided navigation. Close to the bifurcation, the system is intrinsically invariant to signal variance, and information about the timing, duration, and intensity of odor fluctuations is transferred efficiently. Importantly, we find that proximity to the bifurcation is maintained by mean adaptation alone and therefore does not require any additional feedback mechanism or fine-tuning. Using a biophysical model with calcium-based feedback, we demonstrate that this mechanism can explain the measured adaptation characteristics of Drosophila ORNs.
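The role of mean adaptation can be illustrated with a generic rate model. The sketch below (Python; a textbook integral-feedback adaptation scheme with invented parameters, not the authors' calcium-based biophysical ORN model) shows how feedback driven by the mean response alone returns a neuron to the same operating point after steps to very different mean signal levels, the kind of re-centering the authors argue keeps ORNs poised near their bifurcation without fine-tuning.

```python
import numpy as np

def simulate(mean_signal, r0=0.2, g=4.0, tau_a=2.0, dt=1e-3, t_end=30.0):
    """Rate model with integral-feedback (mean) adaptation:
       r = sigmoid(g * (s - a)),  da/dt = (r - r0) / tau_a."""
    a = 0.0
    r = r0
    for _ in range(int(t_end / dt)):
        r = 1.0 / (1.0 + np.exp(-g * (mean_signal - a)))
        a += dt * (r - r0) / tau_a
    return r, a

for s in (0.5, 2.0, 8.0):
    r, a = simulate(s)
    print(f"mean signal {s:4.1f} -> steady-state rate {r:.3f}, adaptation variable {a:.2f}")
# The steady-state rate returns to ~0.2 for every mean level: feedback on the mean
# alone re-centers the operating point, without any separate variance-tracking signal.
```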
5. Choi K, Rosenbluth W, Graf IR, Kadakia N, Emonet T. Bifurcation enhances temporal information encoding in the olfactory periphery. arXiv 2024; arXiv:2405.20135v3. PMID: 38855541; PMCID: PMC11160886.
Abstract
Living systems continually respond to signals from the surrounding environment. Survival requires that their responses adapt quickly and robustly to the changes in the environment. One particularly challenging example is olfactory navigation in turbulent plumes, where animals experience highly intermittent odor signals while odor concentration varies over many length- and timescales. Here, we show theoretically that Drosophila olfactory receptor neurons (ORNs) can exploit proximity to a bifurcation point of their firing dynamics to reliably extract information about the timing and intensity of fluctuations in the odor signal, which have been shown to be critical for odor-guided navigation. Close to the bifurcation, the system is intrinsically invariant to signal variance, and information about the timing, duration, and intensity of odor fluctuations is transferred efficiently. Importantly, we find that proximity to the bifurcation is maintained by mean adaptation alone and therefore does not require any additional feedback mechanism or fine-tuning. Using a biophysical model with calcium-based feedback, we demonstrate that this mechanism can explain the measured adaptation characteristics of Drosophila ORNs.
Affiliation(s)
- Kiri Choi: Department of Molecular, Cellular, and Developmental Biology, Yale University, New Haven, Connecticut 06511, USA; Quantitative Biology Institute, Yale University, New Haven, Connecticut 06511, USA; Swartz Foundation for Theoretical Neuroscience, Yale University, New Haven, Connecticut 06511, USA
- Will Rosenbluth: Department of Molecular, Cellular, and Developmental Biology, Yale University, New Haven, Connecticut 06511, USA
- Isabella R. Graf: Quantitative Biology Institute, Yale University, New Haven, Connecticut 06511, USA; Department of Physics, Yale University, New Haven, Connecticut 06511, USA; Developmental Biology Unit, European Molecular Biology Laboratory, 69117 Heidelberg, Germany
- Nirag Kadakia: Department of Molecular, Cellular, and Developmental Biology, Yale University, New Haven, Connecticut 06511, USA; Quantitative Biology Institute, Yale University, New Haven, Connecticut 06511, USA; Swartz Foundation for Theoretical Neuroscience, Yale University, New Haven, Connecticut 06511, USA
- Thierry Emonet: Department of Molecular, Cellular, and Developmental Biology, Yale University, New Haven, Connecticut 06511, USA; Quantitative Biology Institute, Yale University, New Haven, Connecticut 06511, USA; Department of Physics, Yale University, New Haven, Connecticut 06511, USA; Interdepartmental Neuroscience Program, Yale University, New Haven, Connecticut 06511, USA
6. Gür B, Ramirez L, Cornean J, Thurn F, Molina-Obando S, Ramos-Traslosheros G, Silies M. Neural pathways and computations that achieve stable contrast processing tuned to natural scenes. Nat Commun 2024; 15:8580. PMID: 39362859; PMCID: PMC11450186; DOI: 10.1038/s41467-024-52724-5.
Abstract
Natural scenes are highly dynamic, challenging the reliability of visual processing. Yet, humans and many animals perform accurate visual behaviors, whereas computer vision devices struggle with rapidly changing background luminance. How does animal vision achieve this? Here, we reveal the algorithms and mechanisms of rapid luminance gain control in Drosophila, resulting in stable visual processing. We identify specific transmedullary neurons as the site of luminance gain control, which pass this property to direction-selective cells. The circuitry further involves wide-field neurons, matching computational predictions that local spatial pooling drives optimal contrast processing in natural scenes when light conditions change rapidly. Experiments and theory argue that a spatially pooled luminance signal achieves luminance gain control via divisive normalization. This process relies on shunting inhibition using the glutamate-gated chloride channel GluClα. Our work describes how the fly robustly processes visual information in dynamically changing natural scenes, a common challenge of all visual systems.
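The divisive-normalization idea can be stated compactly: a local signal is divided by a spatially pooled luminance estimate, so a proportional change in illumination cancels out. The sketch below (Python; a one-dimensional toy with invented numbers, not the identified circuit or its GluClα-dependent shunting mechanism) illustrates the resulting luminance invariance.

```python
import numpy as np

def normalized_contrast(image, pool_radius=4):
    """Weber-like contrast: local deviation divided by a spatially pooled luminance."""
    n = len(image)
    pooled = np.empty_like(image)
    for i in range(n):
        lo, hi = max(0, i - pool_radius), min(n, i + pool_radius + 1)
        pooled[i] = image[lo:hi].mean()
    return (image - pooled) / pooled

# A 1D "scene": an object 20% dimmer than its background, viewed under bright light
# and again after the overall luminance suddenly drops to one quarter.
scene_bright = np.full(64, 100.0)
scene_bright[30:34] *= 0.8
scene_dim = 0.25 * scene_bright

for name, scene in [("bright", scene_bright), ("dim", scene_dim)]:
    raw = scene[32] - scene.mean()                 # luminance difference, no gain control
    norm = normalized_contrast(scene)[32]
    print(f"{name:>6s}: raw difference {raw:8.2f}   normalized contrast {norm:+.3f}")
# The raw difference shrinks fourfold when luminance drops, but the divisively
# normalized contrast of the object is unchanged: luminance gain control.
```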
Affiliation(s)
- Burak Gür: Institute of Developmental Biology and Neurobiology, Johannes-Gutenberg University Mainz, Mainz, Germany; The Friedrich Miescher Institute for Biomedical Research (FMI), Basel, Switzerland
- Luisa Ramirez: Institute of Developmental Biology and Neurobiology, Johannes-Gutenberg University Mainz, Mainz, Germany
- Jacqueline Cornean: Institute of Developmental Biology and Neurobiology, Johannes-Gutenberg University Mainz, Mainz, Germany
- Freya Thurn: Institute of Developmental Biology and Neurobiology, Johannes-Gutenberg University Mainz, Mainz, Germany
- Sebastian Molina-Obando: Institute of Developmental Biology and Neurobiology, Johannes-Gutenberg University Mainz, Mainz, Germany
- Giordano Ramos-Traslosheros: Institute of Developmental Biology and Neurobiology, Johannes-Gutenberg University Mainz, Mainz, Germany; Department of Neurobiology, Harvard Medical School, Boston, MA, USA
- Marion Silies: Institute of Developmental Biology and Neurobiology, Johannes-Gutenberg University Mainz, Mainz, Germany
7. Clark DA, Fitzgerald JE. Optimization in Visual Motion Estimation. Annu Rev Vis Sci 2024; 10:23-46. PMID: 38663426; PMCID: PMC11998607; DOI: 10.1146/annurev-vision-101623-025432.
Abstract
Sighted animals use visual signals to discern directional motion in their environment. Motion is not directly detected by visual neurons, and it must instead be computed from light signals that vary over space and time. This makes visual motion estimation a near universal neural computation, and decades of research have revealed much about the algorithms and mechanisms that generate directional signals. The idea that sensory systems are optimized for performance in natural environments has deeply impacted this research. In this article, we review the many ways that optimization has been used to quantitatively model visual motion estimation and reveal its underlying principles. We emphasize that no single optimization theory has dominated the literature. Instead, researchers have adeptly incorporated different computational demands and biological constraints that are pertinent to the specific brain system and animal model under study. The successes and failures of the resulting optimization models have thereby provided insights into how computational demands and biological constraints together shape neural computation.
Affiliation(s)
- Damon A Clark: Department of Molecular, Cellular, and Developmental Biology, Yale University, New Haven, Connecticut, USA
- James E Fitzgerald: Janelia Research Campus, Howard Hughes Medical Institute, Ashburn, Virginia, USA; Department of Neurobiology, Northwestern University, Evanston, Illinois, USA
8. Wu N, Zhou B, Agrochao M, Clark DA. Broken time reversal symmetry in visual motion detection. bioRxiv 2024:2024.06.08.598068. PMID: 38915608; PMCID: PMC11195140; DOI: 10.1101/2024.06.08.598068.
Abstract
Our intuition suggests that when a movie is played in reverse, our perception of motion in the reversed movie will be perfectly inverted compared to the original. This intuition is also reflected in many classical theoretical and practical models of motion detection. However, here we demonstrate that this symmetry of motion perception upon time reversal is often broken in real visual systems. In this work, we designed a set of visual stimuli to investigate how stimulus symmetries affect time reversal symmetry breaking in the fruit fly Drosophila's well-studied optomotor rotation behavior. We discovered a suite of new stimuli with a wide variety of different properties that can lead to broken time reversal symmetries in fly behavioral responses. We then trained neural network models to predict the velocity of scenes with both natural and artificial contrast distributions. Training with naturalistic contrast distributions yielded models that break time reversal symmetry, even when the training data was time reversal symmetric. We show analytically and numerically that the breaking of time reversal symmetry in the model responses can arise from contrast asymmetry in the training data, but can also arise from other features of the contrast distribution. Furthermore, shallower neural network models can exhibit stronger symmetry breaking than deeper ones, suggesting that less flexible neural networks promote some forms of time reversal symmetry breaking. Overall, these results reveal a surprising feature of biological motion detectors and suggest that it could arise from constrained optimization in natural environments.
Affiliation(s)
- Nathan Wu: Yale College, New Haven, CT 06511, USA
- Baohua Zhou: Department of Molecular, Cellular and Developmental Biology, Yale University, New Haven, CT 06511, USA
- Margarida Agrochao: Department of Molecular, Cellular and Developmental Biology, Yale University, New Haven, CT 06511, USA
- Damon A. Clark: Department of Molecular, Cellular and Developmental Biology, Yale University, New Haven, CT 06511, USA; Department of Physics, Yale University, New Haven, CT 06511, USA; Department of Neuroscience, Yale University, New Haven, CT 06511, USA; Quantitative Biology Institute, Yale University, New Haven, CT 06511, USA; Wu Tsai Institute, Yale University, New Haven, CT 06511, USA
9. Pang MM, Chen F, Xie M, Druckmann S, Clandinin TR, Yang HH. A recurrent neural circuit in Drosophila deblurs visual inputs. bioRxiv 2024:2024.04.19.590352. PMID: 38712245; PMCID: PMC11071408; DOI: 10.1101/2024.04.19.590352.
Abstract
A critical goal of vision is to detect changes in light intensity, even when these changes are blurred by the spatial resolution of the eye and the motion of the animal. Here we describe a recurrent neural circuit in Drosophila that compensates for blur and thereby selectively enhances the perceived contrast of moving edges. Using in vivo, two-photon voltage imaging, we measured the temporal response properties of L1 and L2, two cell types that receive direct synaptic input from photoreceptors. These neurons have biphasic responses to brief flashes of light, a hallmark of cells that encode changes in stimulus intensity. However, the second phase was often much larger than the first, creating an unusual temporal filter. Genetic dissection revealed that recurrent neural circuitry strongly shapes the second phase of the response, informing the structure of a dynamical model. By applying this model to moving natural images, we demonstrate that rather than veridically representing stimulus changes, this temporal processing strategy systematically enhances them, amplifying and sharpening responses. Comparing the measured responses of L2 to model predictions across both artificial and natural stimuli revealed that L2 tunes its properties as the model predicts in order to deblur images. Since this strategy is tunable to behavioral context, generalizable to any time-varying sensory input, and implementable with a common circuit motif, we propose that it could be broadly used to selectively enhance sharp and salient changes.
Affiliation(s)
- Michelle M. Pang: Department of Neurobiology, Stanford University, Stanford, CA 94305, USA
- Feng Chen: Department of Neurobiology, Stanford University, Stanford, CA 94305, USA; Department of Applied Physics, Stanford University, Stanford, CA 94305, USA
- Marjorie Xie: Department of Neurobiology, Stanford University, Stanford, CA 94305, USA; Current affiliation: School for the Future of Innovation of Society, Arizona State University, Tempe, AZ 85281, USA
- Shaul Druckmann: Department of Neurobiology, Stanford University, Stanford, CA 94305, USA; Department of Psychiatry and Behavioral Sciences, Stanford University, Stanford, CA 94305, USA
- Helen H. Yang (lead contact): Department of Neurobiology, Stanford University, Stanford, CA 94305, USA; Current affiliation: Department of Neurobiology, Harvard Medical School, Boston, MA 02115, USA
10. Prech S, Groschner LN, Borst A. An open platform for visual stimulation of insects. PLoS One 2024; 19:e0301999. PMID: 38635686; PMCID: PMC11025907; DOI: 10.1371/journal.pone.0301999.
Abstract
To study how the nervous system processes visual information, experimenters must record neural activity while delivering visual stimuli in a controlled fashion. In animals with a nearly panoramic field of view, such as flies, precise stimulation of the entire visual field is challenging. We describe a projector-based device for stimulation of the insect visual system under a microscope. The device is based on a bowl-shaped screen that provides a wide and nearly distortion-free field of view. It is compact, cheap, easy to assemble, and easy to operate using the included open-source software for stimulus generation. We validate the virtual reality system technically and demonstrate its capabilities in a series of experiments at two levels: the cellular, by measuring the membrane potential responses of visual interneurons; and the organismal, by recording optomotor and fixation behavior of Drosophila melanogaster in tethered flight. Our experiments reveal the importance of stimulating the visual system of an insect with a wide field of view, and we provide a simple solution to do so.
Affiliation(s)
- Stefan Prech: Max Planck Institute for Biological Intelligence, Martinsried, Germany
- Lukas N. Groschner: Max Planck Institute for Biological Intelligence, Martinsried, Germany; Gottfried Schatz Research Center, Molecular Biology and Biochemistry, Medical University of Graz, Graz, Austria
- Alexander Borst: Max Planck Institute for Biological Intelligence, Martinsried, Germany
11. Tanaka R, Zhou B, Agrochao M, Badwan BA, Au B, Matos NCB, Clark DA. Neural mechanisms to incorporate visual counterevidence in self-movement estimation. Curr Biol 2023; 33:4960-4979.e7. PMID: 37918398; PMCID: PMC10848174; DOI: 10.1016/j.cub.2023.10.011.
Abstract
In selecting appropriate behaviors, animals should weigh sensory evidence both for and against specific beliefs about the world. For instance, animals measure optic flow to estimate and control their own rotation. However, existing models of flow detection can be spuriously triggered by visual motion created by objects moving in the world. Here, we show that stationary patterns on the retina, which constitute evidence against observer rotation, suppress inappropriate stabilizing rotational behavior in the fruit fly Drosophila. In silico experiments show that artificial neural networks (ANNs) that are optimized to distinguish observer movement from external object motion similarly detect stationarity and incorporate negative evidence. Employing neural measurements and genetic manipulations, we identified components of the circuitry for stationary pattern detection, which runs parallel to the fly's local motion and optic-flow detectors. Our results show how the fly brain incorporates negative evidence to improve heading stability, exemplifying how a compact brain exploits geometrical constraints of the visual world.
Affiliation(s)
- Ryosuke Tanaka: Interdepartmental Neuroscience Program, Yale University, New Haven, CT 06511, USA
- Baohua Zhou: Department of Molecular, Cellular and Developmental Biology, Yale University, New Haven, CT 06511, USA; Department of Statistics and Data Science, Yale University, New Haven, CT 06511, USA
- Margarida Agrochao: Department of Molecular, Cellular and Developmental Biology, Yale University, New Haven, CT 06511, USA
- Bara A Badwan: School of Engineering and Applied Science, Yale University, New Haven, CT 06511, USA
- Braedyn Au: Department of Physics, Yale University, New Haven, CT 06511, USA
- Natalia C B Matos: Interdepartmental Neuroscience Program, Yale University, New Haven, CT 06511, USA
- Damon A Clark: Department of Molecular, Cellular and Developmental Biology, Yale University, New Haven, CT 06511, USA; Department of Physics, Yale University, New Haven, CT 06511, USA; Department of Neuroscience, Yale University, New Haven, CT 06511, USA; Wu Tsai Institute, Yale University, New Haven, CT 06511, USA; Quantitative Biology Institute, Yale University, New Haven, CT 06511, USA
12. Chen J, Gish CM, Fransen JW, Salazar-Gatzimas E, Clark DA, Borghuis BG. Direct comparison reveals algorithmic similarities in fly and mouse visual motion detection. iScience 2023; 26:107928. PMID: 37810236; PMCID: PMC10550730; DOI: 10.1016/j.isci.2023.107928.
Abstract
Evolution has equipped vertebrates and invertebrates with neural circuits that selectively encode visual motion. While similarities in the computations performed by these circuits in mouse and fruit fly have been noted, direct experimental comparisons have been lacking. Because molecular mechanisms and neuronal morphology in the two species are distinct, we directly compared motion encoding in these two species at the algorithmic level, using matched stimuli and focusing on a pair of analogous neurons, the mouse ON starburst amacrine cell (ON SAC) and Drosophila T4 neurons. We find that the cells share similar spatiotemporal receptive field structures, sensitivity to spatiotemporal correlations, and tuning to sinusoidal drifting gratings, but differ in their responses to apparent motion stimuli. Both neuron types showed a response to summed sinusoids that deviates from models for motion processing in these cells, underscoring the similarities in their processing and identifying response features that remain to be explained.
Affiliation(s)
- Juyue Chen: Interdepartmental Neurosciences Program, Yale University, New Haven, CT 06511, USA
- Caitlin M Gish: Department of Physics, Yale University, New Haven, CT 06511, USA
- James W Fransen: Department of Anatomical Sciences and Neurobiology, University of Louisville, Louisville, KY 40202, USA
- Damon A Clark: Interdepartmental Neurosciences Program, Yale University, New Haven, CT 06511, USA; Department of Physics, Yale University, New Haven, CT 06511, USA; Department of Molecular, Cellular, and Developmental Biology, Yale University, New Haven, CT 06511, USA; Department of Neuroscience, Yale University, New Haven, CT 06511, USA
- Bart G Borghuis: Department of Anatomical Sciences and Neurobiology, University of Louisville, Louisville, KY 40202, USA
13. Mano O, Choi M, Tanaka R, Creamer MS, Matos NCB, Shomar JW, Badwan BA, Clandinin TR, Clark DA. Long-timescale anti-directional rotation in Drosophila optomotor behavior. eLife 2023; 12:e86076. PMID: 37751469; PMCID: PMC10522332; DOI: 10.7554/elife.86076.
Abstract
Locomotor movements cause visual images to be displaced across the eye, a retinal slip that is counteracted by stabilizing reflexes in many animals. In insects, optomotor turning causes the animal to turn in the direction of rotating visual stimuli, thereby reducing retinal slip and stabilizing trajectories through the world. This behavior has formed the basis for extensive dissections of motion vision. Here, we report that under certain stimulus conditions, two Drosophila species, including the widely studied Drosophila melanogaster, can suppress and even reverse the optomotor turning response over several seconds. Such 'anti-directional turning' is most strongly evoked by long-lasting, high-contrast, slow-moving visual stimuli that are distinct from those that promote syn-directional optomotor turning. Anti-directional turning, like the syn-directional optomotor response, requires the local motion detecting neurons T4 and T5. A subset of lobula plate tangential cells, CH cells, show involvement in these responses. Imaging from a variety of direction-selective cells in the lobula plate shows no evidence of dynamics that match the behavior, suggesting that the observed inversion in turning direction emerges downstream of the lobula plate. Further, anti-directional turning declines with age and exposure to light. These results show that Drosophila optomotor turning behaviors contain rich, stimulus-dependent dynamics that are inconsistent with simple reflexive stabilization responses.
Affiliation(s)
- Omer Mano: Department of Molecular, Cellular, and Developmental Biology, Yale University, New Haven, United States
- Minseung Choi: Department of Neurobiology, Stanford University, Stanford, United States
- Ryosuke Tanaka: Interdepartmental Neuroscience Program, Yale University, New Haven, United States
- Matthew S Creamer: Interdepartmental Neuroscience Program, Yale University, New Haven, United States
- Natalia CB Matos: Interdepartmental Neuroscience Program, Yale University, New Haven, United States
- Joseph W Shomar: Department of Physics, Yale University, New Haven, United States
- Bara A Badwan: Department of Chemical Engineering, Yale University, New Haven, United States
- Damon A Clark: Department of Molecular, Cellular, and Developmental Biology, Yale University, New Haven, United States; Interdepartmental Neuroscience Program, Yale University, New Haven, United States; Department of Physics, Yale University, New Haven, United States; Department of Neuroscience, Yale University, New Haven, United States
14. Tanaka R, Zhou B, Agrochao M, Badwan BA, Au B, Matos NCB, Clark DA. Neural mechanisms to incorporate visual counterevidence in self motion estimation. bioRxiv 2023:2023.01.04.522814. PMID: 36711843; PMCID: PMC9881891; DOI: 10.1101/2023.01.04.522814.
Abstract
In selecting appropriate behaviors, animals should weigh sensory evidence both for and against specific beliefs about the world. For instance, animals measure optic flow to estimate and control their own rotation. However, existing models of flow detection can confuse the movement of external objects with genuine self motion. Here, we show that stationary patterns on the retina, which constitute negative evidence against self rotation, are used by the fruit fly Drosophila to suppress inappropriate stabilizing rotational behavior. In silico experiments show that artificial neural networks optimized to distinguish self and world motion similarly detect stationarity and incorporate negative evidence. Employing neural measurements and genetic manipulations, we identified components of the circuitry for stationary pattern detection, which runs parallel to the fly's motion- and optic flow-detectors. Our results exemplify how the compact brain of the fly incorporates negative evidence to improve heading stability, exploiting geometrical constraints of the visual world.
Affiliation(s)
- Ryosuke Tanaka: Interdepartmental Neuroscience Program, Yale University, New Haven, CT 06511, USA; Present address: Institute of Neuroscience, Technical University of Munich, Munich 80802, Germany
- Baohua Zhou: Department of Molecular, Cellular and Developmental Biology, Yale University, New Haven, CT 06511, USA; Department of Statistics and Data Science, Yale University, New Haven, CT 06511, USA
- Margarida Agrochao: Department of Molecular, Cellular and Developmental Biology, Yale University, New Haven, CT 06511, USA
- Bara A. Badwan: School of Engineering and Applied Science, Yale University, New Haven, CT 06511, USA
- Braedyn Au: Department of Physics, Yale University, New Haven, CT 06511, USA
- Natalia C. B. Matos: Interdepartmental Neuroscience Program, Yale University, New Haven, CT 06511, USA
- Damon A. Clark: Interdepartmental Neuroscience Program, Yale University, New Haven, CT 06511, USA; Department of Molecular, Cellular and Developmental Biology, Yale University, New Haven, CT 06511, USA; Department of Physics, Yale University, New Haven, CT 06511, USA; Department of Neuroscience, Yale University, New Haven, CT 06511, USA; Wu Tsai Institute, Yale University, New Haven, CT 06511, USA
15.
Abstract
How neurons detect the direction of motion is a prime example of neural computation: Motion vision is found in the visual systems of virtually all sighted animals, it is important for survival, and it requires interesting computations with well-defined linear and nonlinear processing steps-yet the whole process is of moderate complexity. The genetic methods available in the fruit fly Drosophila and the charting of a connectome of its visual system have led to rapid progress and unprecedented detail in our understanding of how neurons compute the direction of motion in this organism. The picture that emerged incorporates not only the identity, morphology, and synaptic connectivity of each neuron involved but also its neurotransmitters, its receptors, and their subcellular localization. Together with the neurons' membrane potential responses to visual stimulation, this information provides the basis for a biophysically realistic model of the circuit that computes the direction of visual motion.
Affiliation(s)
- Alexander Borst: Max Planck Institute for Biological Intelligence, Martinsried, Germany
- Lukas N Groschner: Max Planck Institute for Biological Intelligence, Martinsried, Germany
16. Pirogova N, Borst A. Contrast normalization affects response time-course of visual interneurons. PLoS One 2023; 18:e0285686. PMID: 37294743; PMCID: PMC10256145; DOI: 10.1371/journal.pone.0285686.
Abstract
In natural environments, light intensities and visual contrasts vary widely, yet neurons have a limited response range for encoding them. Neurons accomplish that by flexibly adjusting their dynamic range to the statistics of the environment via contrast normalization. The effect of contrast normalization is usually measured as a reduction of neural signal amplitudes, but whether it influences response dynamics is unknown. Here, we show that contrast normalization in visual interneurons of Drosophila melanogaster not only suppresses the amplitude but also alters the dynamics of responses when a dynamic surround is present. We present a simple model that qualitatively reproduces the simultaneous effect of the visual surround on the response amplitude and temporal dynamics by altering the cells' input resistance and, thus, their membrane time constant. In conclusion, single-cell filtering properties as derived from artificial stimulus protocols like white-noise stimulation cannot be transferred one-to-one to predict responses under natural conditions.
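The proposed mechanism is easy to reproduce in a single-compartment passive membrane: lowering input resistance (shunting by the surround) scales down the response and shortens the membrane time constant at the same time. The sketch below (Python; a generic RC model with invented parameter values, not a fit to the recorded interneurons) shows both effects.

```python
import numpy as np

def membrane_response(R_in, C=50e-12, I=20e-12, dt=1e-5, t_on=(0.02, 0.04), t_end=0.15):
    """Passive membrane: C dV/dt = -V/R_in + I(t). Returns time and voltage."""
    t = np.arange(0.0, t_end, dt)
    I_t = np.where((t >= t_on[0]) & (t < t_on[1]), I, 0.0)
    V = np.zeros_like(t)
    for i in range(1, len(t)):
        dV = (-V[i - 1] / R_in + I_t[i - 1]) / C
        V[i] = V[i - 1] + dt * dV
    return t, V

for label, R in [("no surround (high R_in)", 500e6),
                 ("dynamic surround (shunted R_in)", 150e6)]:
    t, V = membrane_response(R)
    # read out the time constant from the exponential decay after current offset
    off = np.searchsorted(t, 0.04)
    tau_fit = -1.0 / np.polyfit(t[off:off + 2000], np.log(V[off:off + 2000]), 1)[0]
    print(f"{label:32s} peak {1e3 * V.max():5.2f} mV   decay tau {1e3 * tau_fit:5.1f} ms")
# Lowering input resistance reduces both the response amplitude and the membrane
# time constant, reshaping response dynamics as well as gain.
```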
Affiliation(s)
- Nadezhda Pirogova: Department Circuits-Computation-Models, Max Planck Institute for Biological Intelligence, Planegg, Martinsried, Germany; Graduate School of Systemic Neurosciences, LMU Munich, Planegg, Martinsried, Germany
- Alexander Borst: Department Circuits-Computation-Models, Max Planck Institute for Biological Intelligence, Planegg, Martinsried, Germany
17. Ketkar MD, Shao S, Gjorgjieva J, Silies M. Multifaceted luminance gain control beyond photoreceptors in Drosophila. Curr Biol 2023:S0960-9822(23)00619-X. PMID: 37285845; DOI: 10.1016/j.cub.2023.05.024.
Abstract
Animals navigating in natural environments must handle vast changes in their sensory input. Visual systems, for example, handle changes in luminance at many timescales, from slow changes across the day to rapid changes during active behavior. To maintain luminance-invariant perception, visual systems must adapt their sensitivity to changing luminance at different timescales. We demonstrate that luminance gain control in photoreceptors alone is insufficient to explain luminance invariance at both fast and slow timescales and reveal the algorithms that adjust gain past photoreceptors in the fly eye. We combined imaging and behavioral experiments with computational modeling to show that downstream of photoreceptors, circuitry taking input from the single luminance-sensitive neuron type L3 implements gain control at fast and slow timescales. This computation is bidirectional in that it prevents the underestimation of contrasts in low luminance and overestimation in high luminance. An algorithmic model disentangles these multifaceted contributions and shows that the bidirectional gain control occurs at both timescales. The model implements a nonlinear interaction of luminance and contrast to achieve gain correction at fast timescales and a dark-sensitive channel to improve the detection of dim stimuli at slow timescales. Together, our work demonstrates how a single neuronal channel performs diverse computations to implement gain control at multiple timescales that are together important for navigation in natural environments.
Affiliation(s)
- Madhura D Ketkar: Institute of Developmental Biology and Neurobiology, Johannes-Gutenberg University Mainz, Hanns-Dieter-Hüsch-Weg 15, 55128 Mainz, Germany
- Shuai Shao: Max Planck Institute for Brain Research, Max-von-Laue-Straße 4, 60438 Frankfurt am Main, Germany; Department of Neurophysiology, Radboud University, Heyendaalseweg 135, 6525 EN Nijmegen, the Netherlands
- Julijana Gjorgjieva: Max Planck Institute for Brain Research, Max-von-Laue-Straße 4, 60438 Frankfurt am Main, Germany; School of Life Sciences, Technical University Munich, Maximus-von-Imhof-Forum 3, 85354 Freising, Germany
- Marion Silies: Institute of Developmental Biology and Neurobiology, Johannes-Gutenberg University Mainz, Hanns-Dieter-Hüsch-Weg 15, 55128 Mainz, Germany
18. Currier TA, Pang MM, Clandinin TR. Visual processing in the fly, from photoreceptors to behavior. Genetics 2023; 224:iyad064. PMID: 37128740; PMCID: PMC10213501; DOI: 10.1093/genetics/iyad064.
Abstract
Originally a genetic model organism, the experimental use of Drosophila melanogaster has grown to include quantitative behavioral analyses, sophisticated perturbations of neuronal function, and detailed sensory physiology. A highlight of these developments can be seen in the context of vision, where pioneering studies have uncovered fundamental and generalizable principles of sensory processing. Here we begin with an overview of vision-guided behaviors and common methods for probing visual circuits. We then outline the anatomy and physiology of brain regions involved in visual processing, beginning at the sensory periphery and ending with descending motor control. Areas of focus include contrast and motion detection in the optic lobe, circuits for visual feature selectivity, computations in support of spatial navigation, and contextual associative learning. Finally, we look to the future of fly visual neuroscience and discuss promising topics for further study.
Affiliation(s)
- Timothy A Currier: Department of Neurobiology, Stanford University School of Medicine, Stanford, CA 94305, USA
- Michelle M Pang: Department of Neurobiology, Stanford University School of Medicine, Stanford, CA 94305, USA
- Thomas R Clandinin: Department of Neurobiology, Stanford University School of Medicine, Stanford, CA 94305, USA
19. Mano O, Choi M, Tanaka R, Creamer MS, Matos NC, Shomar J, Badwan BA, Clandinin TR, Clark DA. Long timescale anti-directional rotation in Drosophila optomotor behavior. bioRxiv 2023:2023.01.06.523055. PMID: 36711627; PMCID: PMC9882005; DOI: 10.1101/2023.01.06.523055.
Abstract
Locomotor movements cause visual images to be displaced across the eye, a retinal slip that is counteracted by stabilizing reflexes in many animals. In insects, optomotor turning causes the animal to turn in the direction of rotating visual stimuli, thereby reducing retinal slip and stabilizing trajectories through the world. This behavior has formed the basis for extensive dissections of motion vision. Here, we report that under certain stimulus conditions, two Drosophila species, including the widely studied D. melanogaster, can suppress and even reverse the optomotor turning response over several seconds. Such "anti-directional turning" is most strongly evoked by long-lasting, high-contrast, slow-moving visual stimuli that are distinct from those that promote syn-directional optomotor turning. Anti-directional turning, like the syn-directional optomotor response, requires the local motion detecting neurons T4 and T5. A subset of lobula plate tangential cells, CH cells, show involvement in these responses. Imaging from a variety of direction-selective cells in the lobula plate shows no evidence of dynamics that match the behavior, suggesting that the observed inversion in turning direction emerges downstream of the lobula plate. Further, anti-directional turning declines with age and exposure to light. These results show that Drosophila optomotor turning behaviors contain rich, stimulus-dependent dynamics that are inconsistent with simple reflexive stabilization responses.
Affiliation(s)
- Omer Mano: Department of Molecular, Cellular, and Developmental Biology, Yale University, New Haven, CT 06511, USA
- Minseung Choi: Department of Neurobiology, Stanford University, Stanford, CA 94305, USA
- Ryosuke Tanaka: Interdepartmental Neuroscience Program, Yale University, New Haven, CT 06511, USA
- Matthew S. Creamer: Interdepartmental Neuroscience Program, Yale University, New Haven, CT 06511, USA
- Natalia C.B. Matos: Interdepartmental Neuroscience Program, Yale University, New Haven, CT 06511, USA
- Joseph Shomar: Department of Physics, Yale University, New Haven, CT 06511, USA
- Bara A. Badwan: Department of Chemical Engineering, Yale University, New Haven, CT 06511, USA
- Damon A. Clark: Department of Molecular, Cellular, and Developmental Biology, Yale University, New Haven, CT 06511, USA; Interdepartmental Neuroscience Program, Yale University, New Haven, CT 06511, USA; Department of Physics, Yale University, New Haven, CT 06511, USA; Department of Neuroscience, Yale University, New Haven, CT 06511, USA
20. Skelton PSM, Finn A, Brinkworth RSA. Contrast independent biologically inspired translational optic flow estimation. Biol Cybern 2022; 116:635-660. PMID: 36303043; PMCID: PMC9691503; DOI: 10.1007/s00422-022-00948-3.
Abstract
The visual systems of insects are relatively simple compared to humans. However, they enable navigation through complex environments where insects perform exceptional levels of obstacle avoidance. Biology uses two separable modes of optic flow to achieve this: rapid gaze fixation (rotational motion known as saccades); and the inter-saccadic translational motion. While the fundamental process of insect optic flow has been known since the 1950's, so too has its dependence on contrast. The surrounding visual pathways used to overcome environmental dependencies are less well known. Previous work has shown promise for low-speed rotational motion estimation, but a gap remained in the estimation of translational motion, in particular the estimation of the time to impact. To consistently estimate the time to impact during inter-saccadic translatory motion, the fundamental limitation of contrast dependence must be overcome. By adapting an elaborated rotational velocity estimator from literature to work for translational motion, this paper proposes a novel algorithm for overcoming the contrast dependence of time to impact estimation using nonlinear spatio-temporal feedforward filtering. By applying bioinspired processes, approximately 15 points per decade of statistical discrimination were achieved when estimating the time to impact to a target across 360 background, distance, and velocity combinations: a 17-fold increase over the fundamental process. These results show the contrast dependence of time to impact estimation can be overcome in a biologically plausible manner. This, combined with previous results for low-speed rotational motion estimation, allows for contrast invariant computational models designed on the principles found in the biological visual system, paving the way for future visually guided systems.
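For orientation, the quantity being estimated has a simple geometric form: for an approaching object, the ratio of angular size to its rate of expansion (the classic tau) approximates the remaining time to impact, without knowledge of the object's physical size or distance. The sketch below (Python; a constant-velocity approach with invented numbers, showing only this geometric relation, not the paper's bioinspired spatio-temporal filtering) computes it numerically.

```python
import numpy as np

# Constant-velocity approach toward an object of radius r: the ratio of the angular
# size on the eye to its rate of expansion approximates the remaining time to impact.
r, d0, v, dt = 0.5, 20.0, 2.0, 1e-3        # radius (m), start distance (m), speed (m/s), step (s)
t = np.arange(0.0, 9.0, dt)                # impact would occur at t = 10 s
distance = d0 - v * t
theta = 2 * np.arctan(r / distance)        # angular size
theta_dot = np.gradient(theta, dt)         # rate of angular expansion
tau_est = theta / theta_dot                # estimated time to impact
true_remaining = d0 / v - t

for t_query in (1.0, 5.0, 8.5):
    i = int(t_query / dt)
    print(f"t = {t_query:3.1f} s   tau estimate {tau_est[i]:5.2f} s   true remaining {true_remaining[i]:5.2f} s")
# The geometric estimate closely tracks the true time remaining throughout the
# approach; the hard problem addressed in the paper is computing such a quantity
# robustly from real image motion, where contrast strongly affects the raw signals.
```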
Affiliation(s)
- Phillip S. M. Skelton: Centre for Defence Engineering Research and Training, College of Science and Engineering, Flinders University, 1284 South Road, Tonsley, South Australia 5042, Australia
- Anthony Finn: Science, Technology, Engineering, and Mathematics, University of South Australia, 1 Mawson Lakes Boulevard, Mawson Lakes, South Australia 5095, Australia
- Russell S. A. Brinkworth: Centre for Defence Engineering Research and Training, College of Science and Engineering, Flinders University, 1284 South Road, Tonsley, South Australia 5042, Australia
21. Gonzalez-Suarez AD, Zavatone-Veth JA, Chen J, Matulis CA, Badwan BA, Clark DA. Excitatory and inhibitory neural dynamics jointly tune motion detection. Curr Biol 2022; 32:3659-3675.e8. PMID: 35868321; PMCID: PMC9474608; DOI: 10.1016/j.cub.2022.06.075.
Abstract
Neurons integrate excitatory and inhibitory signals to produce their outputs, but the role of input timing in this integration remains poorly understood. Motion detection is a paradigmatic example of this integration, since theories of motion detection rely on different delays in visual signals. These delays allow circuits to compare scenes at different times to calculate the direction and speed of motion. Different motion detection circuits have different velocity sensitivity, but it remains untested how the response dynamics of individual cell types drive this tuning. Here, we sped up or slowed down specific neuron types in Drosophila's motion detection circuit by manipulating ion channel expression. Altering the dynamics of individual neuron types upstream of motion detectors increased their sensitivity to fast or slow visual motion, exposing distinct roles for excitatory and inhibitory dynamics in tuning directional signals, including a role for the amacrine cell CT1. A circuit model constrained by functional data and anatomy qualitatively reproduced the observed tuning changes. Overall, these results reveal how excitatory and inhibitory dynamics together tune a canonical circuit computation.
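The basic intuition, that the timescales of the inputs set a motion detector's speed tuning, can be seen in a generic correlator model. The sketch below (Python; a standard two-input Reichardt correlator with invented time constants, not the authors' circuit model or data) shows that speeding up the delay arm shifts the preferred temporal frequency toward faster stimuli.

```python
import numpy as np

def lowpass(x, tau, dt):
    """First-order low-pass filter used as the correlator's delay arm."""
    y = np.zeros_like(x)
    a = dt / (tau + dt)
    for i in range(1, len(x)):
        y[i] = y[i - 1] + a * (x[i] - y[i - 1])
    return y

def mean_response(freq_hz, tau, dt=1e-3, t_end=5.0, phase=np.pi / 4):
    """Mean correlator output to a drifting sinusoid sampled at two nearby points."""
    t = np.arange(0.0, t_end, dt)
    s1 = np.sin(2 * np.pi * freq_hz * t)
    s2 = np.sin(2 * np.pi * freq_hz * t - phase)   # spatial phase offset = motion
    return np.mean(lowpass(s1, tau, dt) * s2 - lowpass(s2, tau, dt) * s1)

freqs = np.linspace(0.2, 20.0, 60)
for label, tau in [("slow delay arm (tau = 100 ms)", 0.100),
                   ("fast delay arm (tau =  25 ms)", 0.025)]:
    responses = [mean_response(f, tau) for f in freqs]
    print(f"{label}: tuning peaks near {freqs[int(np.argmax(responses))]:4.1f} Hz")
# Speeding up the delay arm shifts the preferred temporal frequency upward; slowing
# it down favors slower stimuli, the direction of the tuning changes studied here.
```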
Affiliation(s)
- Jacob A Zavatone-Veth: Department of Physics, Harvard University, Cambridge, MA 02138, USA; Center for Brain Science, Harvard University, Cambridge, MA 02138, USA
- Juyue Chen: Interdepartmental Neuroscience Program, Yale University, New Haven, CT 06511, USA
- Bara A Badwan: School of Engineering and Applied Science, Yale University, New Haven, CT 06511, USA
- Damon A Clark: Interdepartmental Neuroscience Program, Yale University, New Haven, CT 06511, USA; Department of Physics, Yale University, New Haven, CT 06511, USA; Department of Molecular, Cellular, and Developmental Biology, Yale University, New Haven, CT 06511, USA; Department of Neuroscience, Yale University, New Haven, CT 06511, USA
22. Huang X, Kim AJ, Acarón Ledesma H, Ding J, Smith RG, Wei W. Visual Stimulation Induces Distinct Forms of Sensitization of On-Off Direction-Selective Ganglion Cell Responses in the Dorsal and Ventral Retina. J Neurosci 2022; 42:4449-4469. PMID: 35474276; PMCID: PMC9172291; DOI: 10.1523/jneurosci.1391-21.2022.
Abstract
Experience-dependent modulation of neuronal responses is a key attribute in sensory processing. In the mammalian retina, the On-Off direction-selective ganglion cell (DSGC) is well known for its robust direction selectivity. However, how the On-Off DSGC light responsiveness dynamically adjusts to the changing visual environment is underexplored. Here, we report that On-Off DSGCs tuned to posterior motion direction [i.e. posterior DSGCs (pDSGCs)] in mice of both sexes can be transiently sensitized by prior stimuli. Notably, distinct sensitization patterns are found in dorsal and ventral pDSGCs. Although responses of both dorsal and ventral pDSGCs to dark stimuli (Off responses) are sensitized, only dorsal cells show the sensitization of responses to bright stimuli (On responses). Visual stimulation to the dorsal retina potentiates a sustained excitatory input from Off bipolar cells, leading to tonic depolarization of pDSGCs. Such tonic depolarization propagates from the Off to the On dendritic arbor of the pDSGC to sensitize its On response. We also identified a previously overlooked feature of DSGC dendritic architecture that can support dendritic integration between On and Off dendritic layers bypassing the soma. By contrast, ventral pDSGCs lack a sensitized tonic depolarization and thus do not exhibit sensitization of their On responses. Our results highlight a topographic difference in Off bipolar cell inputs underlying divergent sensitization patterns of dorsal and ventral pDSGCs. Moreover, substantial crossovers between dendritic layers of On-Off DSGCs suggest an interactive dendritic algorithm for processing On and Off signals before they reach the soma.SIGNIFICANCE STATEMENT Visual neuronal responses are dynamically influenced by the prior visual experience. This form of plasticity reflects the efficient coding of the naturalistic environment by the visual system. We found that a class of retinal output neurons, On-Off direction-selective ganglion cells, transiently increase their responsiveness after visual stimulation. Cells located in dorsal and ventral retinas exhibit distinct sensitization patterns because of different adaptive properties of Off bipolar cell signaling. A previously overlooked dendritic morphologic feature of the On-Off direction-selective ganglion cell is implicated in the cross talk between On and Off pathways during sensitization. Together, these findings uncover a topographic difference in the adaptive encoding of upper and lower visual fields and the underlying neural mechanism in the dorsal and ventral retinas.
Affiliation(s)
- Xiaolin Huang: Department of Neurobiology, The University of Chicago, Chicago, Illinois 60637; The Committee on Neurobiology Graduate Program, The University of Chicago, Chicago, Illinois 60637
- Alan Jaehyun Kim: Department of Neurobiology, The University of Chicago, Chicago, Illinois 60637
- Héctor Acarón Ledesma: Department of Neurobiology, The University of Chicago, Chicago, Illinois 60637; Graduate Program in Biophysical Sciences, University of Chicago, Chicago, Illinois 60637
- Jennifer Ding: Department of Neurobiology, The University of Chicago, Chicago, Illinois 60637; The Committee on Neurobiology Graduate Program, The University of Chicago, Chicago, Illinois 60637
- Robert G Smith: Department of Neuroscience, Perelman School of Medicine, University of Pennsylvania, Philadelphia, Pennsylvania 19104
- Wei Wei: Department of Neurobiology, The University of Chicago, Chicago, Illinois 60637
Collapse
|
23
|
Ketkar MD, Gür B, Molina-Obando S, Ioannidou M, Martelli C, Silies M. First-order visual interneurons distribute distinct contrast and luminance information across ON and OFF pathways to achieve stable behavior. eLife 2022; 11:74937. [PMID: 35263247 PMCID: PMC8967382 DOI: 10.7554/elife.74937] [Citation(s) in RCA: 11] [Impact Index Per Article: 3.7] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/22/2021] [Accepted: 03/03/2022] [Indexed: 11/26/2022] Open
Abstract
The accurate processing of contrast is the basis for all visually guided behaviors. Visual scenes with rapidly changing illumination challenge contrast computation because photoreceptor adaptation is not fast enough to compensate for such changes. Yet, human perception of contrast is stable even when the visual environment is quickly changing, suggesting rapid post-receptor luminance gain control. Similarly, in the fruit fly Drosophila, such gain control leads to luminance-invariant behavior for moving OFF stimuli. Here, we show that behavioral responses to moving ON stimuli also utilize a luminance gain, and that ON-motion-guided behavior depends on inputs from three first-order interneurons, L1, L2, and L3. Each of these neurons encodes contrast and luminance differently and distributes information asymmetrically across both ON and OFF contrast-selective pathways. Behavioral responses to both ON and OFF stimuli rely on a luminance-based correction provided by L1 and L3, wherein L1 supports contrast computation linearly, and L3 non-linearly amplifies dim stimuli. Therefore, L1, L2, and L3 are not specific inputs to ON and OFF pathways; rather, the lamina serves as a separate processing layer that distributes distinct luminance and contrast information across ON and OFF pathways to support behavior in varying conditions.
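As a toy illustration of the luminance gain described in this abstract (a minimal sketch under assumed parameter values, not the authors' model or the fly circuit), dividing an intensity change by the background luminance amplifies dim stimuli and yields approximately luminance-invariant contrast responses:

```python
def raw_difference(intensity, background):
    """Uncorrected response: simply the change in light intensity."""
    return intensity - background

def gain_corrected(intensity, background, k=0.05):
    """Toy luminance gain: dividing by the background luminance amplifies dim
    stimuli and makes the response depend on contrast rather than luminance.
    The constant k (an assumption) keeps the gain finite in darkness."""
    return (intensity - background) / (background + k)

# The same 20% contrast increment on dim, moderate, and bright backgrounds:
# the raw difference varies 100-fold, the gain-corrected response barely changes.
for bg in (0.1, 1.0, 10.0):
    step = 1.2 * bg
    print(f"background={bg:5.1f}  raw={raw_difference(step, bg):6.2f}  "
          f"gain-corrected={gain_corrected(step, bg):5.2f}")
```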
Collapse
Affiliation(s)
- Madhura D Ketkar
- Institute of Developmental Biology and Neurobiology, Johannes Gutenberg University of Mainz, Mainz, Germany
| | - Burak Gür
- Institute of Developmental Biology and Neurobiology, Johannes Gutenberg University of Mainz, Mainz, Germany
| | - Sebastian Molina-Obando
- Institute of Developmental Biology and Neurobiology, Johannes Gutenberg University of Mainz, Mainz, Germany
| | - Maria Ioannidou
- Institute of Developmental Biology and Neurobiology, Johannes Gutenberg University of Mainz, Mainz, Germany
| | - Carlotta Martelli
- Institute of Developmental Biology and Neurobiology, Johannes Gutenberg University of Mainz, Mainz, Germany
| | - Marion Silies
- Institute of Developmental Biology and Neurobiology, Johannes Gutenberg University of Mainz, Mainz, Germany
| |
Collapse
|
24
|
Zhou B, Li Z, Kim S, Lafferty J, Clark DA. Shallow neural networks trained to detect collisions recover features of visual loom-selective neurons. eLife 2022; 11:72067. [PMID: 35023828 PMCID: PMC8849349 DOI: 10.7554/elife.72067] [Citation(s) in RCA: 8] [Impact Index Per Article: 2.7] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/08/2021] [Accepted: 01/11/2022] [Indexed: 11/13/2022] Open
Abstract
Animals have evolved sophisticated visual circuits to solve a vital inference problem: detecting whether or not a visual signal corresponds to an object on a collision course. Such events are detected by specific circuits sensitive to visual looming, or objects increasing in size. Various computational models have been developed for these circuits, but how the collision-detection inference problem itself shapes the computational structures of these circuits remains unknown. Here, inspired by the distinctive structures of LPLC2 neurons in the visual system of Drosophila, we build anatomically-constrained shallow neural network models and train them to identify visual signals that correspond to impending collisions. Surprisingly, the optimization arrives at two distinct, opposing solutions, only one of which matches the actual dendritic weighting of LPLC2 neurons. Both solutions can solve the inference problem with high accuracy when the population size is large enough. The LPLC2-like solution reproduces experimentally observed LPLC2 neuron responses for many stimuli, and reproduces canonical tuning of loom-sensitive neurons, even though the models are never trained on neural data. Thus, LPLC2 neuron properties and tuning are predicted by optimizing an anatomically-constrained neural network to detect impending collisions. More generally, these results illustrate how optimizing inference tasks that are important for an animal's perceptual goals can reveal and explain computational properties of specific sensory neurons.
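For context on the inference problem these networks are trained to solve, the sketch below shows the standard loom geometry: the angular size of an object on a collision course expands nonlinearly and diverges near the time of collision. This is generic kinematics with assumed parameter values, not the paper's stimulus-generation or training code:

```python
import numpy as np

def loom_angular_size(t, radius=0.1, speed=1.0, t_collision=1.0):
    """Angular size (radians) of an object of the given radius approaching at
    constant speed and colliding at t_collision. Standard loom kinematics;
    the parameter values are assumptions for illustration."""
    distance = speed * (t_collision - t)
    return 2.0 * np.arctan(radius / distance)

t = np.linspace(0.0, 0.95, 6)
theta_deg = np.degrees(loom_angular_size(t))
# The angular size (and its expansion rate) diverges as collision approaches
print(np.round(theta_deg, 1))
```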
Collapse
Affiliation(s)
- Baohua Zhou
- Department of Molecular, Cellular and Developmental Biology, Yale University, New Haven, United States
| | - Zifan Li
- Department of Statistics and Data Science, Yale University, New Haven, United States
| | - Sunnie Kim
- Department of Statistics and Data Science, Yale University, New Haven, United States
| | - John Lafferty
- Department of Statistics and Data Science, Yale University, New Haven, United States
| | - Damon A Clark
- Department of Molecular, Cellular and Developmental Biology, Yale University, New Haven, United States
| |
Collapse
|
25
|
Kohn JR, Portes JP, Christenson MP, Abbott LF, Behnia R. Flexible filtering by neural inputs supports motion computation across states and stimuli. Curr Biol 2021; 31:5249-5260.e5. [PMID: 34670114 DOI: 10.1016/j.cub.2021.09.061] [Citation(s) in RCA: 17] [Impact Index Per Article: 4.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/15/2021] [Revised: 08/10/2021] [Accepted: 09/22/2021] [Indexed: 01/05/2023]
Abstract
Sensory systems flexibly adapt their processing properties across a wide range of environmental and behavioral conditions. Such variable processing complicates attempts to extract a mechanistic understanding of sensory computations. This is evident in the highly constrained, canonical Drosophila motion detection circuit, where the core computation underlying direction selectivity is still debated despite extensive studies. Here we measured the filtering properties of neural inputs to the OFF motion-detecting T5 cell in Drosophila. We report state- and stimulus-dependent changes in the shape of these signals, which become more biphasic under specific conditions. Summing these inputs within the framework of a connectome-constrained model of the circuit demonstrates that these shapes are sufficient to explain T5 responses to various motion stimuli. Thus, our stimulus- and state-dependent measurements reconcile motion computation with the anatomy of the circuit. These findings provide a clear example of how a basic circuit supports flexible sensory computation.
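As an illustration of why a more biphasic input filter matters, the following sketch builds a biphasic filter as the difference of two exponential low-pass filters and shows that increasing the biphasic component makes the response to a contrast step more transient. The filter form, time constants, and 'balance' parameter are assumptions for illustration, not the measured T5 input filters:

```python
import numpy as np

def exp_lowpass_kernel(t, tau):
    """Causal exponential kernel with time constant tau, normalized to unit area."""
    h = np.exp(-t / tau)
    return h / h.sum()

def biphasic_kernel(t, tau_fast=0.02, tau_slow=0.08, balance=0.6):
    """Toy biphasic filter: a fast positive lobe minus a slower, weaker lobe.
    'balance' sets how biphasic the filter is (0 gives a monophasic filter).
    All values are assumptions for illustration."""
    return exp_lowpass_kernel(t, tau_fast) - balance * exp_lowpass_kernel(t, tau_slow)

dt = 0.001
t = np.arange(0.0, 0.5, dt)
stimulus = (t > 0.1).astype(float)               # contrast step at t = 0.1 s
for balance in (0.0, 0.6):
    h = biphasic_kernel(t, balance=balance)
    response = np.convolve(stimulus, h)[: len(t)]
    # A more biphasic filter makes the step response more transient
    print(f"balance={balance}: peak={response.max():.2f}, "
          f"sustained={response[-1]:.2f}")
```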
Collapse
Affiliation(s)
- Jessica R Kohn
- The Mortimer B. Zuckerman Mind Brain Behavior Institute, Department of Neuroscience, Columbia University, New York, NY 10027, USA
| | - Jacob P Portes
- The Mortimer B. Zuckerman Mind Brain Behavior Institute, Department of Neuroscience, Columbia University, New York, NY 10027, USA; Center for Theoretical Neuroscience, Columbia University, New York, NY, USA
| | - Matthias P Christenson
- The Mortimer B. Zuckerman Mind Brain Behavior Institute, Department of Neuroscience, Columbia University, New York, NY 10027, USA; Center for Theoretical Neuroscience, Columbia University, New York, NY, USA
| | - L F Abbott
- The Mortimer B. Zuckerman Mind Brain Behavior Institute, Department of Neuroscience, Columbia University, New York, NY 10027, USA; Center for Theoretical Neuroscience, Columbia University, New York, NY, USA
| | - Rudy Behnia
- The Mortimer B. Zuckerman Mind Brain Behavior Institute, Department of Neuroscience, Columbia University, New York, NY 10027, USA; Kavli Institute for Brain Science, Columbia University, New York, NY 10027, USA.
| |
Collapse
|
26
|
Nagel K. Motion vision: Pinning down motion computation in an ever-changing circuit. Curr Biol 2021; 31:R1523-R1525. [PMID: 34875241 DOI: 10.1016/j.cub.2021.10.002] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/16/2022]
Abstract
A new electrophysiological study of the Drosophila visual system, recording from columnar inputs to motion-detecting neurons, has provided new insights into the computations that underlie motion vision.
Collapse
Affiliation(s)
- Katherine Nagel
- Neuroscience Institute, NYU School of Medicine, 435 E. 30th Street, Room 1102, New York, NY 10016, USA.
| |
Collapse
|
27
|
James JV, Cazzolato BS, Grainger S, Wiederman SD. Nonlinear, neuronal adaptation in insect vision models improves target discrimination within repetitively moving backgrounds. BIOINSPIRATION & BIOMIMETICS 2021; 16:066015. [PMID: 34555824 DOI: 10.1088/1748-3190/ac2988] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 05/23/2021] [Accepted: 09/23/2021] [Indexed: 06/13/2023]
Abstract
Neurons which respond selectively to small moving targets, even against a cluttered background, have been identified in several insect species. To investigate what underlies these robust and highly selective responses, researchers have probed the neuronal circuitry in target-detecting, visual pathways. Observations in flies reveal nonlinear adaptation over time, composed of a fast onset and gradual decay. This adaptive processing is seen in both of the independent, parallel pathways encoding either luminance increments (ON channel) or decrements (OFF channel). The functional significance of this adaptive phenomenon has not been determined from physiological studies, though the asymmetrical time course suggests a role in suppressing responses to repetitive stimuli. We tested this possibility by comparing an implementation of fast adaptation against alternatives, using a model of insect 'elementary small target motion detectors'. We conducted target-detecting simulations on various natural backgrounds, that were shifted via several movement profiles (and target velocities). Using performance metrics, we confirmed that the fast adaptation observed in neuronal systems enhances target detection against a repetitively moving background. Such background movement would be encountered via natural ego-motion as the insect travels through the world. These findings show that this form of nonlinear, fast-adaptation (suitably implementable via cellular biophysics) plays a role analogous to background subtraction techniques in conventional computer vision.
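The fast-onset, gradual-decay adaptation described here can be sketched with a single state variable that charges quickly and relaxes slowly, then divisively suppresses the input; repeated stimuli are thereby attenuated, consistent with the background-subtraction analogy. The time constants and the divisive form below are assumptions for illustration, not the authors' implementation:

```python
import numpy as np

def fast_adaptation(signal, dt, tau_rise=0.02, tau_decay=0.5, eps=0.1):
    """Toy fast-onset / slow-decay adaptation: the state charges quickly when
    the input exceeds it and relaxes slowly otherwise, then divisively
    suppresses the input. Time constants and the divisive form are assumptions."""
    state = 0.0
    out = np.empty_like(signal)
    for i, x in enumerate(signal):
        tau = tau_rise if x > state else tau_decay
        state += dt * (x - state) / tau
        out[i] = x / (eps + state)
    return out

dt = 0.001
t = np.arange(0.0, 1.0, dt)
flashes = ((t % 0.5) < 0.05).astype(float)   # two identical 50 ms flashes
response = fast_adaptation(flashes, dt)
# The second, repeated flash is strongly attenuated relative to the first
print(f"first flash peak:  {response[:100].max():.2f}")
print(f"second flash peak: {response[500:600].max():.2f}")
```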
Collapse
Affiliation(s)
- John V James
- School of Mechanical Engineering, University of Adelaide, Adelaide SA, Australia
- Adelaide Medical School, University of Adelaide, Adelaide SA, Australia
| | - Benjamin S Cazzolato
- School of Mechanical Engineering, University of Adelaide, Adelaide SA, Australia
| | - Steven Grainger
- School of Mechanical Engineering, University of Adelaide, Adelaide SA, Australia
| | | |
Collapse
|
28
|
Mano O, Creamer MS, Badwan BA, Clark DA. Predicting individual neuron responses with anatomically constrained task optimization. Curr Biol 2021; 31:4062-4075.e4. [PMID: 34324832 DOI: 10.1016/j.cub.2021.06.090] [Citation(s) in RCA: 4] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/01/2021] [Revised: 05/24/2021] [Accepted: 06/29/2021] [Indexed: 01/28/2023]
Abstract
Artificial neural networks trained to solve sensory tasks can develop statistical representations that match those in biological circuits. However, it remains unclear whether they can reproduce properties of individual neurons. Here, we investigated how artificial networks predict individual neuron properties in the visual motion circuits of the fruit fly Drosophila. We trained anatomically constrained networks to predict movement in natural scenes, solving the same inference problem as fly motion detectors. Units in the artificial networks adopted many properties of analogous individual neurons, even though they were not explicitly trained to match these properties. Among these properties was the split into ON and OFF motion detectors, which is not predicted by classical motion detection models. The match between model and neurons was closest when models were trained to be robust to noise. These results demonstrate how anatomical, task, and noise constraints can explain properties of individual neurons in a small neural network.
Collapse
Affiliation(s)
- Omer Mano
- Department of Molecular Cellular and Developmental Biology, Yale University, New Haven, CT 06511, USA; Department of Neuroscience, Yale University, New Haven, CT 06511, USA
| | - Matthew S Creamer
- Interdepartmental Neuroscience Program, Yale University, New Haven, CT 06511, USA
| | - Bara A Badwan
- School of Engineering and Applied Science, Yale University, New Haven, CT 06511, USA
| | - Damon A Clark
- Department of Molecular Cellular and Developmental Biology, Yale University, New Haven, CT 06511, USA; Department of Neuroscience, Yale University, New Haven, CT 06511, USA; Interdepartmental Neuroscience Program, Yale University, New Haven, CT 06511, USA; Department of Physics, Yale University, New Haven, CT 06511, USA.
| |
Collapse
|
29
|
Ding J, Chen A, Chung J, Acaron Ledesma H, Wu M, Berson DM, Palmer SE, Wei W. Spatially displaced excitation contributes to the encoding of interrupted motion by a retinal direction-selective circuit. eLife 2021; 10:e68181. [PMID: 34096504 PMCID: PMC8211448 DOI: 10.7554/elife.68181] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/08/2021] [Accepted: 06/06/2021] [Indexed: 12/19/2022] Open
Abstract
Spatially distributed excitation and inhibition collectively shape a visual neuron's receptive field (RF) properties. In the direction-selective circuit of the mammalian retina, the role of strong null-direction inhibition of On-Off direction-selective ganglion cells (On-Off DSGCs) in their direction selectivity is well studied. However, how excitatory inputs influence the On-Off DSGC's visual response is underexplored. Here, we report that On-Off DSGCs have a spatially displaced glutamatergic receptive field along their horizontal preferred-null motion axes. This displaced receptive field contributes to DSGC null-direction spiking during interrupted motion trajectories. Theoretical analyses indicate that population responses during interrupted motion may help populations of On-Off DSGCs signal the spatial location of moving objects in complex, naturalistic visual environments. Our study highlights that the direction-selective circuit exploits separate sets of mechanisms under different stimulus conditions, and these mechanisms may help encode multiple visual features.
Collapse
Affiliation(s)
- Jennifer Ding
- Committee on Neurobiology Graduate Program, The University of Chicago, Chicago, United States
- Department of Neurobiology, The University of Chicago, Chicago, United States
| | - Albert Chen
- Department of Organismal Biology, The University of Chicago, Chicago, United States
| | - Janet Chung
- Department of Neurobiology, The University of Chicago, Chicago, United States
| | - Hector Acaron Ledesma
- Graduate Program in Biophysical Sciences, The University of Chicago, Chicago, United States
| | - Mofei Wu
- Department of Neurobiology, The University of Chicago, Chicago, United States
| | - David M Berson
- Department of Neuroscience and Carney Institute for Brain Science, Brown University, Providence, United States
| | - Stephanie E Palmer
- Committee on Neurobiology Graduate Program, The University of Chicago, Chicago, United States
- Department of Organismal Biology, The University of Chicago, Chicago, United States
- Grossman Institute for Neuroscience, Quantitative Biology and Human Behavior, The University of Chicago, Chicago, United States
| | - Wei Wei
- Committee on Neurobiology Graduate Program, The University of Chicago, Chicago, United States
- Department of Neurobiology, The University of Chicago, Chicago, United States
- Grossman Institute for Neuroscience, Quantitative Biology and Human Behavior, The University of Chicago, Chicago, United States
| |
Collapse
|
30
|
Maximally efficient prediction in the early fly visual system may support evasive flight maneuvers. PLoS Comput Biol 2021; 17:e1008965. [PMID: 34014926 PMCID: PMC8136689 DOI: 10.1371/journal.pcbi.1008965] [Citation(s) in RCA: 7] [Impact Index Per Article: 1.8] [Reference Citation Analysis] [Abstract] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/21/2020] [Accepted: 04/13/2021] [Indexed: 11/20/2022] Open
Abstract
The visual system must make predictions to compensate for inherent delays in its processing. Yet little is known, mechanistically, about how prediction aids natural behaviors. Here, we show that despite a 20-30ms intrinsic processing delay, the vertical motion sensitive (VS) network of the blowfly achieves maximally efficient prediction. This prediction enables the fly to fine-tune its complex, yet brief, evasive flight maneuvers according to its initial ego-rotation at the time of detection of the visual threat. Combining a rich database of behavioral recordings with detailed compartmental modeling of the VS network, we further show that the VS network has axonal gap junctions that are critical for optimal prediction. During evasive maneuvers, a VS subpopulation that directly innervates the neck motor center can convey predictive information about the fly’s future ego-rotation, potentially crucial for ongoing flight control. These results suggest a novel sensory-motor pathway that links sensory prediction to behavior. Survival-critical behaviors shape neural circuits to translate sensory information into strikingly fast predictions, e.g. in escaping from a predator faster than the system’s processing delay. We show that the fly visual system implements fast and accurate prediction of its visual experience. This provides crucial information for directing fast evasive maneuvers that unfold over just 40ms. Our work shows how this fast prediction is implemented, mechanistically, and suggests the existence of a novel sensory-motor pathway from the fly visual system to a wing steering motor neuron. Echoing and amplifying previous work in the retina, our work hypothesizes that the efficient encoding of predictive information is a universal design principle supporting fast, natural behaviors.
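For readers unfamiliar with the quantity being optimized, predictive information is the mutual information between the present neural response and the future stimulus. The sketch below estimates it with a simple histogram (plug-in) estimator on synthetic data in which a delayed, noisy response still predicts a temporally correlated stimulus; this is a generic illustration, not the authors' analysis pipeline, and all signal parameters are assumptions:

```python
import numpy as np

def mutual_information(x, y, bins=16):
    """Plug-in estimate of the mutual information (bits) between two scalar
    series, from a joint histogram. Crude and biased for small samples;
    used here only for illustration."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    return float((pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])).sum())

# Synthetic data: a slowly varying 'ego-rotation' stimulus and a response that
# is a delayed, noisy copy of it (mimicking an intrinsic processing delay).
rng = np.random.default_rng(0)
n, delay = 200_000, 20
stimulus = np.convolve(rng.standard_normal(n), np.ones(200) / 200, mode="same")
response = np.roll(stimulus, delay) + 0.01 * rng.standard_normal(n)

# Because the stimulus is temporally correlated, the delayed response still
# carries information about the *future* stimulus, despite the delay.
future = np.roll(stimulus, -delay)
print(mutual_information(response[delay:-delay], future[delay:-delay]))
```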
Collapse
|
31
|
Zavatone-Veth JA, Badwan BA, Clark DA. A minimal synaptic model for direction selective neurons in Drosophila. J Vis 2020; 20:2. [PMID: 32040161 PMCID: PMC7343402 DOI: 10.1167/jov.20.2.2] [Citation(s) in RCA: 20] [Impact Index Per Article: 4.0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 01/07/2023] Open
Abstract
Visual motion estimation is a canonical neural computation. In Drosophila, recent advances have identified anatomical and functional circuitry underlying direction-selective computations. Models with varying levels of abstraction have been proposed to explain specific experimental results but have rarely been compared across experiments. Here we use the wealth of available anatomical and physiological data to construct a minimal, biophysically inspired synaptic model for Drosophila’s first-order direction-selective T4 cells. We show how this model relates mathematically to classical models of motion detection, including the Hassenstein-Reichardt correlator model. We used numerical simulation to test how well this synaptic model could reproduce measurements of T4 cells across many datasets and stimulus modalities. These comparisons include responses to sinusoidal gratings, to apparent motion stimuli, to stochastic stimuli, and to natural scenes. Even without fine-tuning, this model sufficed to reproduce many, but not all, response properties of T4 cells. Since this model is flexible and based on straightforward biophysical properties, it provides an extensible framework for developing a mechanistic understanding of T4 neural response properties. Moreover, it can be used to assess the sufficiency of simple biophysical mechanisms to describe features of the direction-selective computation and identify where our understanding must be improved.
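For reference, the classical Hassenstein-Reichardt correlator mentioned in this abstract can be written in a few lines: each input is multiplied by a low-pass-filtered (delayed) copy of its neighbor, and the two mirror-symmetric arms are subtracted, giving opposite-signed mean responses to opposite motion directions. This is the textbook correlator with assumed time constants, not the synaptic T4 model developed in the paper:

```python
import numpy as np

def lowpass(signal, dt, tau):
    """First-order low-pass filter; serves as the correlator's delay line."""
    out = np.zeros_like(signal)
    for i in range(1, len(signal)):
        out[i] = out[i - 1] + dt * (signal[i - 1] - out[i - 1]) / tau
    return out

def hassenstein_reichardt(left, right, dt, tau=0.05):
    """Classical HRC: each input is correlated with a delayed (low-pass
    filtered) copy of its neighbor, and the two mirror-symmetric arms are
    subtracted. A positive mean output indicates motion from 'left' to 'right'."""
    return lowpass(left, dt, tau) * right - left * lowpass(right, dt, tau)

dt = 0.001
t = np.arange(0.0, 2.0, dt)
phase = np.pi / 4           # spatial phase offset between the two sample points
for direction in (+1, -1):
    left = np.sin(2 * np.pi * 2 * t)
    right = np.sin(2 * np.pi * 2 * t - direction * phase)
    # Opposite motion directions give opposite-signed mean responses
    print(direction, round(hassenstein_reichardt(left, right, dt).mean(), 3))
```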
Collapse
|
32
|
Tanaka R, Clark DA. Object-Displacement-Sensitive Visual Neurons Drive Freezing in Drosophila. Curr Biol 2020; 30:2532-2550.e8. [PMID: 32442466 PMCID: PMC8716191 DOI: 10.1016/j.cub.2020.04.068] [Citation(s) in RCA: 23] [Impact Index Per Article: 4.6] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/06/2020] [Revised: 04/22/2020] [Accepted: 04/24/2020] [Indexed: 11/26/2022]
Abstract
Visual systems are often equipped with neurons that detect small moving objects, which may represent prey, predators, or conspecifics. Although the processing properties of those neurons have been studied in diverse organisms, links between the proposed algorithms and animal behaviors or circuit mechanisms remain elusive. Here, we have investigated behavioral function, computational algorithm, and neurochemical mechanisms of an object-selective neuron, LC11, in Drosophila. With genetic silencing and optogenetic activation, we show that LC11 is necessary for a visual object-induced stopping behavior in walking flies, a form of short-term freezing, and its activity can promote stopping. We propose a new quantitative model for small object selectivity based on the physiology and anatomy of LC11 and its inputs. The model accurately reproduces LC11 responses by pooling fast-adapting, tightly size-tuned inputs. Direct visualization of neurotransmitter inputs to LC11 confirmed the model conjectures about upstream processing. Our results demonstrate how adaptation can enhance selectivity for behaviorally relevant, dynamic visual features.
Collapse
Affiliation(s)
- Ryosuke Tanaka
- Interdepartmental Neuroscience Program, Yale University, New Haven, CT 06511, USA
| | - Damon A Clark
- Interdepartmental Neuroscience Program, Yale University, New Haven, CT 06511, USA; Department of Molecular, Cellular, and Developmental Biology, Yale University, New Haven, CT 06511, USA; Department of Physics, Yale University, New Haven, CT 06511, USA; Department of Neuroscience, Yale University, New Haven, CT 06511, USA.
| |
Collapse
|
33
|
Schuetzenberger A, Borst A. Seeing Natural Images through the Eye of a Fly with Remote Focusing Two-Photon Microscopy. iScience 2020; 23:101170. [PMID: 32502966 PMCID: PMC7270611 DOI: 10.1016/j.isci.2020.101170] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/18/2020] [Revised: 04/02/2020] [Accepted: 05/12/2020] [Indexed: 11/30/2022] Open
Abstract
Visual systems of many animals, including the fruit fly Drosophila, represent the surrounding space as 2D maps, formed by populations of neurons. Advanced genetic tools make the fly visual system especially well accessible. However, in typical in vivo preparations for two-photon calcium imaging, relatively few neurons can be recorded at the same time. Here, we present an extension to a conventional two-photon microscope, based on remote focusing, which enables real-time rotation of the imaging plane, and thus flexible alignment to cellular structures, without resolution or speed trade-off. We simultaneously record from over 100 neighboring cells spanning the 2D retinotopic map. We characterize its representation of moving natural images, which we find is comparable to noise predictions. Our method increases throughput 10-fold and allows us to visualize a significant fraction of the fly's visual field. Furthermore, our system can be applied in general for a more flexible investigation of neural circuits.
Collapse
Affiliation(s)
- Anna Schuetzenberger
- Department Circuits - Computation - Models, Max-Planck-Institute of Neurobiology, 82152 Planegg, Germany; Graduate School of Systemic Neurosciences, Ludwig-Maximilians-Universität, 82152 Planegg, Germany.
| | - Alexander Borst
- Department Circuits - Computation - Models, Max-Planck-Institute of Neurobiology, 82152 Planegg, Germany; Graduate School of Systemic Neurosciences, Ludwig-Maximilians-Universität, 82152 Planegg, Germany.
| |
Collapse
|
34
|
Wienecke CFR, Clandinin TR. Drosophila Vision: An Eye for Change. Curr Biol 2020; 30:R66-R68. [DOI: 10.1016/j.cub.2019.11.069] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 10/25/2022]
|