1. Wu N, Zhou B, Agrochao M, Clark DA. Broken time-reversal symmetry in visual motion detection. Proc Natl Acad Sci U S A 2025; 122:e2410768122. PMID: 40048271; PMCID: PMC11912477; DOI: 10.1073/pnas.2410768122.
Abstract
Our intuition suggests that when a movie is played in reverse, our perception of motion at each location in the reversed movie will be perfectly inverted compared to the original. This intuition is also reflected in classical theoretical and practical models of motion estimation, in which velocity flow fields invert when inputs are reversed in time. However, here we report that this symmetry of motion perception upon time reversal is broken in real visual systems. We designed a set of visual stimuli to investigate time reversal symmetry breaking in the fruit fly Drosophila's well-studied optomotor rotation behavior. We identified a suite of stimuli with a wide variety of properties that can uncover broken time reversal symmetry in fly behavioral responses. We then trained neural network models to predict the velocity of scenes with both natural and artificial contrast distributions. Training with naturalistic contrast distributions yielded models that broke time reversal symmetry, even when the training data themselves were time reversal symmetric. We show analytically and numerically that the breaking of time reversal symmetry in the model responses can arise from contrast asymmetry in the training data, but can also arise from other features of the contrast distribution. Furthermore, shallower neural network models can exhibit stronger symmetry breaking than deeper ones, suggesting that less flexible neural networks may be more prone to time reversal symmetry breaking. Overall, these results reveal a surprising feature of biological motion detectors and suggest that it could arise from constrained optimization in natural environments.
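A compact way to see the symmetry this paper tests is the classical opponent correlator: for a Hassenstein-Reichardt-style detector, playing the stimulus backwards (approximately) flips the sign of the time-averaged motion signal. The Python sketch below is an editorial illustration of that baseline symmetry, not the authors' model; the low-pass time constant and the drifting white-noise stimulus are arbitrary choices.

```python
import numpy as np

def lowpass(x, tau, dt=1.0):
    """First-order low-pass filter along the time axis (axis 0) of x[t, space]."""
    y = np.zeros_like(x)
    a = dt / (tau + dt)
    for t in range(1, x.shape[0]):
        y[t] = y[t - 1] + a * (x[t] - y[t - 1])
    return y

def hrc_mean_output(stim):
    """Time-averaged output of an opponent Hassenstein-Reichardt correlator."""
    delayed = lowpass(stim, tau=5.0)
    rightward = delayed[:, :-1] * stim[:, 1:]   # delayed left point x undelayed right neighbor
    leftward = stim[:, :-1] * delayed[:, 1:]    # mirror-symmetric term
    return np.mean(rightward - leftward)

rng = np.random.default_rng(0)
profile = rng.standard_normal(64)
stim = np.stack([np.roll(profile, t) for t in range(200)])  # pattern drifting rightward

forward = hrc_mean_output(stim)
backward = hrc_mean_output(stim[::-1])          # the same movie played in reverse
print(forward, backward)  # approximately equal and opposite: the symmetry reported broken in flies
```

The paper's finding is that fly optomotor behavior and naturalistically trained network models deviate from this exact inversion.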
Affiliation(s)
- Baohua Zhou
- Department of Molecular Cellular and Developmental Biology, Yale University, New Haven, CT 06511
- Margarida Agrochao
- Department of Molecular Cellular and Developmental Biology, Yale University, New Haven, CT 06511
- Damon A. Clark
- Department of Molecular Cellular and Developmental Biology, Yale University, New Haven, CT 06511
- Department of Physics, Yale University, New Haven, CT 06511
- Department of Neuroscience, Yale University, New Haven, CT 06511
- Quantitative Biology Institute, Yale University, New Haven, CT 06511
- Wu Tsai Institute, Yale University, New Haven, CT 06511
2. Pang MM, Chen F, Xie M, Druckmann S, Clandinin TR, Yang HH. A recurrent neural circuit in Drosophila temporally sharpens visual inputs. Curr Biol 2025; 35:333-346.e6. PMID: 39706173; PMCID: PMC11769683; DOI: 10.1016/j.cub.2024.11.064.
Abstract
A critical goal of vision is to detect changes in light intensity, even when these changes are blurred by the spatial resolution of the eye and the motion of the animal. Here, we describe a recurrent neural circuit in Drosophila that compensates for blur and thereby selectively enhances the perceived contrast of moving edges. Using in vivo, two-photon voltage imaging, we measured the temporal response properties of L1 and L2, two cell types that receive direct synaptic input from photoreceptors. These neurons have biphasic responses to brief flashes of light, a hallmark of cells that encode changes in stimulus intensity. However, the second phase was often much larger in area than the first, creating an unusual temporal filter. Genetic dissection revealed that recurrent neural circuitry strongly shapes the second phase of the response, informing the structure of a dynamical model. By applying this model to moving natural images, we demonstrate that rather than veridically representing stimulus changes, this temporal processing strategy systematically enhances them, amplifying and sharpening responses. Comparing the measured responses of L2 to model predictions across both artificial and natural stimuli revealed that L2 tunes its properties as the model predicts to temporally sharpen visual inputs. Since this strategy is tunable to behavioral context, generalizable to any time-varying sensory input, and implementable with a common circuit motif, we propose that it could be broadly used to selectively enhance sharp and salient changes.
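A filter whose two phases have equal area acts as a veridical change detector; when the second phase is larger, the same filter overshoots and thereby enhances blurred transitions. The numpy sketch below is illustrative only: the gamma-shaped lobes, time constants, and the 1.6x second-lobe gain are invented, not the paper's fitted L1/L2 model.

```python
import numpy as np

t = np.arange(0, 200)

def biphasic(t, tau1, tau2, second_lobe_gain):
    """Difference of two normalized lobes; second_lobe_gain scales the slower, opposing lobe."""
    pos = (t / tau1) * np.exp(-t / tau1)
    neg = (t / tau2) * np.exp(-t / tau2)
    return pos / pos.sum() - second_lobe_gain * neg / neg.sum()

balanced = biphasic(t, tau1=8.0, tau2=24.0, second_lobe_gain=1.0)   # equal-area lobes
enhanced = biphasic(t, tau1=8.0, tau2=24.0, second_lobe_gain=1.6)   # larger second phase

# A contrast edge smeared in time by optics and self-motion (smooth dark-to-light transition).
blurred_edge = 1.0 / (1.0 + np.exp(-(t - 100) / 10.0))

r_balanced = np.convolve(blurred_edge, balanced)[: t.size]
r_enhanced = np.convolve(blurred_edge, enhanced)[: t.size]

# The larger second lobe yields a bigger peak-to-trough excursion for the same blurred edge,
# i.e., the change is enhanced rather than veridically reported.
print(np.ptp(r_balanced), np.ptp(r_enhanced))
```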
Affiliation(s)
- Michelle M Pang
- Department of Neurobiology, Stanford University, Stanford, CA 94305, USA
- Feng Chen
- Department of Neurobiology, Stanford University, Stanford, CA 94305, USA; Department of Applied Physics, Stanford University, Stanford, CA 94305, USA
- Marjorie Xie
- Department of Neurobiology, Stanford University, Stanford, CA 94305, USA
- Shaul Druckmann
- Department of Neurobiology, Stanford University, Stanford, CA 94305, USA; Department of Psychiatry and Behavioral Sciences, Stanford University, Stanford, CA 94305, USA
- Thomas R Clandinin
- Department of Neurobiology, Stanford University, Stanford, CA 94305, USA; Chan Zuckerberg Biohub, San Francisco, CA, USA
- Helen H Yang
- Department of Neurobiology, Stanford University, Stanford, CA 94305, USA
3. Zhang S, Li K, Luo Z, Xu M, Zheng S. A Bio-Inspired Visual Neural Model for Robustly and Steadily Detecting Motion Directions of Translating Objects Against Variable Contrast in the Figure-Ground and Noise Interference. Biomimetics (Basel) 2025; 10:51. PMID: 39851767; PMCID: PMC11761596; DOI: 10.3390/biomimetics10010051.
Abstract
(1) Background: Bio-inspired visual neural models have made significant progress in detecting the motion direction of translating objects. However, variable figure-ground contrast and environmental noise interference strongly degrade the performance of existing models. The responses of lobula plate tangential cell (LPTC) neurons in Drosophila remain robust and stable in the face of variable figure-ground contrast and environmental noise, providing an excellent paradigm for addressing these challenges. (2) Methods: To resolve these challenges, we propose a bio-inspired visual neural model consisting of four stages. Firstly, the photoreceptors (R1-R6) perceive changes in luminance. Secondly, the luminance change is split into parallel ON and OFF pathways based on the lamina monopolar cells (LMCs), where spatial denoising and spatio-temporal lateral inhibition (LI) mechanisms suppress environmental noise and sharpen motion boundaries, respectively. Thirdly, a non-linear instantaneous feedback mechanism implementing divisive contrast normalization reduces local contrast sensitivity, and parallel ON and OFF contrast pathways are activated. Finally, the parallel motion and contrast pathways converge on the LPTC in the lobula complex. (3) Results: Comparing extensive simulations against state-of-the-art (SotA) bio-inspired models supports four conclusions. Firstly, an ablation study verifies the effectiveness of the contrast neural computation and the spatial denoising mechanism. Secondly, the model robustly detects the motion direction of translating objects under variable figure-ground contrast and environmental noise interference: its average detection success rate on the pure and real-world complex noise datasets increased by 5.38% and 5.30%, respectively. Thirdly, the model reduces the fluctuation of its responses under variable figure-ground contrast and environmental noise, demonstrating its stability: the average inter-quartile range of the coefficient of variation on the pure and real-world complex noise datasets was reduced by 38.77% and 47.84%, respectively, and the average decline ratio of the sum of the coefficient of variation was 57.03% and 67.47%, respectively. Finally, the robustness and stability of the model are further verified by comparison with other early visual pre-processing mechanisms and engineering denoising methods. (4) Conclusions: This model can robustly and stably detect the motion direction of translating objects under variable figure-ground contrast and environmental noise interference.
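As a sketch of stages two and three of this pipeline, the snippet below splits a noisy luminance-change profile into ON and OFF channels and applies a feedforward stand-in for divisive contrast normalization. It is an editorial illustration with invented parameters, not the published model (which uses an instantaneous feedback normalization plus additional denoising and lateral-inhibition stages).

```python
import numpy as np

rng = np.random.default_rng(1)

# Luminance-change profile across space: a bright-to-dark edge region plus sensor noise.
signal = np.zeros(100)
signal[40:60] = np.linspace(1.0, -1.0, 20)

def normalize(x, pool_width=7, eps=0.1):
    """Divisive contrast normalization by pooled local activity (feedforward stand-in)."""
    kernel = np.ones(pool_width) / pool_width
    pooled = np.convolve(np.abs(x), kernel, mode="same")
    return x / (eps + pooled)

for contrast in (1.0, 0.25):
    noisy = contrast * signal + 0.02 * rng.standard_normal(signal.size)
    on, off = np.maximum(noisy, 0.0), np.maximum(-noisy, 0.0)   # LMC-like ON/OFF split
    print(contrast,
          round(on.max(), 2), round(normalize(on).max(), 2),
          round(off.max(), 2), round(normalize(off).max(), 2))
# Raw ON/OFF peaks scale with figure-ground contrast, while the normalized peaks vary much less.
```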
Affiliation(s)
- Sheng Zhang
- College of Information Science and Engineering, Hohai University, Nanjing 211100, China
- Ke Li
- School of Mechanical and Electrical Engineering, Nanchang Institute of Technology, Nanchang 330044, China
- Zhonghua Luo
- School of Mechanical and Electrical Engineering, Nanchang Institute of Technology, Nanchang 330044, China
- Mengxi Xu
- School of Computer Engineering, Nanjing Institute of Technology, Nanjing 211167, China
- Shengnan Zheng
- College of Information Science and Engineering, Hohai University, Nanjing 211100, China
- School of Computer Engineering, Nanjing Institute of Technology, Nanjing 211167, China
4. Gou T, Matulis CA, Clark DA. Adaptation to visual sparsity enhances responses to isolated stimuli. Curr Biol 2024; 34:5697-5713.e8. PMID: 39577424; PMCID: PMC11834764; DOI: 10.1016/j.cub.2024.10.053.
Abstract
Sensory systems adapt their response properties to the statistics of their inputs. For instance, visual systems adapt to low-order statistics like mean and variance to encode stimuli efficiently or to facilitate specific downstream computations. However, it remains unclear how other statistical features affect sensory adaptation. Here, we explore how Drosophila's visual motion circuits adapt to stimulus sparsity, a measure of the signal's intermittency not captured by low-order statistics alone. Early visual neurons in both ON and OFF pathways alter their responses dramatically with stimulus sparsity, responding positively to both light and dark sparse stimuli but linearly to dense stimuli. These changes extend to downstream ON and OFF direction-selective neurons, which are activated by sparse stimuli of both polarities but respond with opposite signs to light and dark regions of dense stimuli. Thus, sparse stimuli activate both ON and OFF pathways, recruiting a larger fraction of the circuit and potentially enhancing the salience of isolated stimuli. Overall, our results reveal visual response properties that increase the fraction of the circuit responding to sparse, isolated stimuli.
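Sparsity in this sense is a higher-order statistic: two stimuli can share the same mean and variance yet differ strongly in how intermittent they are. A minimal numpy illustration (editorial, with arbitrary numbers), using excess kurtosis as one common sparsity index:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 100_000

dense = rng.standard_normal(n)                       # contrast present at every sample
p = 0.05                                             # sparse: only 5% of samples carry signal
sparse = np.where(rng.random(n) < p, rng.standard_normal(n) / np.sqrt(p), 0.0)

for name, x in [("dense", dense), ("sparse", sparse)]:
    excess_kurtosis = np.mean(x**4) / np.var(x) ** 2 - 3.0
    print(name, round(x.mean(), 3), round(x.var(), 3), round(excess_kurtosis, 1))
# Both stimuli have mean ~0 and variance ~1, but the sparse one has far higher kurtosis.
```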
Affiliation(s)
- Tong Gou
- Department of Electrical Engineering, Yale University, New Haven, CT 06511, USA
- Damon A Clark
- Department of Physics, Yale University, New Haven, CT 06511, USA; Department of Molecular, Cellular and Developmental Biology, Yale University, New Haven, CT 06511, USA; Department of Neuroscience, Yale University, New Haven, CT 06511, USA; Quantitative Biology Institute, Yale University, New Haven, CT 06511, USA; Wu Tsai Institute, Yale University, New Haven, CT 06511, USA
5. Al-Shimaysawee LAH, Finn A, Weber D, Schebella MF, Brinkworth RSA. Evaluation of Automated Object-Detection Algorithms for Koala Detection in Infrared Aerial Imagery. Sensors (Basel) 2024; 24:7048. PMID: 39517943; PMCID: PMC11548612; DOI: 10.3390/s24217048.
Abstract
Effective detection techniques are important for wildlife monitoring and conservation applications and are especially helpful for species that live in complex environments, such as arboreal animals like koalas (Phascolarctos cinereus). The implementation of infrared cameras and drones has demonstrated encouraging outcomes, regardless of whether the detection was performed by human observers or automated algorithms. In the case of koala detection in eucalyptus plantations, there is a risk to spotters during forestry operations. In addition, fatigue and tedium associated with the difficult and repetitive task of checking every tree means automated detection options are particularly desirable. However, obtaining high detection rates with minimal false alarms remains a challenging task, particularly when there is low contrast between the animals and their surroundings. Koalas are also small and often partially or fully occluded by canopy, tree stems, or branches, or the background is highly complex. Biologically inspired vision systems are known for their superior ability in suppressing clutter and enhancing the contrast of dim objects of interest against their surroundings. This paper introduces a biologically inspired detection algorithm to locate koalas in eucalyptus plantations and evaluates its performance against ten other detection techniques, including both image processing and neural-network-based approaches. The nature of koala occlusion by canopy cover in these plantations was also examined using a combination of simulated and real data. The results show that the biologically inspired approach significantly outperformed the competing neural-network- and computer-vision-based approaches by over 27%. The analysis of simulated and real data shows that koala occlusion by tree stems and canopy can have a significant impact on the potential detection of koalas, with koalas being fully occluded in up to 40% of images in which koalas were known to be present. Our analysis shows the koala's heat signature is more likely to be occluded when it is close to the centre of the image (i.e., it is directly under a drone) and less likely to be occluded off the zenith. This has implications for flight considerations. This paper also describes a new accurate ground-truth dataset of aerial high-dynamic-range infrared imagery containing instances of koala heat signatures. This dataset is made publicly available to support the research community.
Affiliation(s)
- Anthony Finn
- UniSA STEM, University of South Australia, Mawson Lakes, SA 5095, Australia
- Delene Weber
- UniSA STEM, University of South Australia, Mawson Lakes, SA 5095, Australia
- Morgan F. Schebella
- UniSA STEM, University of South Australia, Mawson Lakes, SA 5095, Australia
6. Gür B, Ramirez L, Cornean J, Thurn F, Molina-Obando S, Ramos-Traslosheros G, Silies M. Neural pathways and computations that achieve stable contrast processing tuned to natural scenes. Nat Commun 2024; 15:8580. PMID: 39362859; PMCID: PMC11450186; DOI: 10.1038/s41467-024-52724-5.
Abstract
Natural scenes are highly dynamic, challenging the reliability of visual processing. Yet, humans and many animals perform accurate visual behaviors, whereas computer vision devices struggle with rapidly changing background luminance. How does animal vision achieve this? Here, we reveal the algorithms and mechanisms of rapid luminance gain control in Drosophila, resulting in stable visual processing. We identify specific transmedullary neurons as the site of luminance gain control, which pass this property to direction-selective cells. The circuitry further involves wide-field neurons, matching computational predictions that local spatial pooling drives optimal contrast processing in natural scenes when light conditions change rapidly. Experiments and theory argue that a spatially pooled luminance signal achieves luminance gain control via divisive normalization. This process relies on shunting inhibition using the glutamate-gated chloride channel GluClα. Our work describes how the fly robustly processes visual information in dynamically changing natural scenes, a common challenge of all visual systems.
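The proposed computation can be caricatured as dividing a local contrast signal by a spatially pooled luminance estimate, which makes the output largely invariant to sudden changes in overall illumination. The sketch below is a simplified feedforward rendering with invented parameters, not the paper's circuit model.

```python
import numpy as np

def pooled(x, width=15):
    """Spatial box average as a stand-in for wide-field luminance pooling."""
    return np.convolve(x, np.ones(width) / width, mode="same")

def response(luminance):
    """Local deviation divided by pooled luminance (shunting-like divisive normalization)."""
    return (luminance - pooled(luminance)) / (pooled(luminance) + 1e-3)

x = np.arange(200)
reflectance = 1.0 + 0.5 * np.sign(np.sin(2 * np.pi * x / 40))   # square-wave scene pattern

dim, bright = 1.0 * reflectance, 10.0 * reflectance             # same scene, 10x illumination change

print(np.ptp(dim - pooled(dim)), np.ptp(bright - pooled(bright)))   # unnormalized: scales 10x
print(np.ptp(response(dim)), np.ptp(response(bright)))              # normalized: nearly identical
```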
Affiliation(s)
- Burak Gür
- Institute of Developmental Biology and Neurobiology, Johannes-Gutenberg University Mainz, Mainz, Germany
- The Friedrich Miescher Institute for Biomedical Research (FMI), Basel, Switzerland
- Luisa Ramirez
- Institute of Developmental Biology and Neurobiology, Johannes-Gutenberg University Mainz, Mainz, Germany
- Jacqueline Cornean
- Institute of Developmental Biology and Neurobiology, Johannes-Gutenberg University Mainz, Mainz, Germany
- Freya Thurn
- Institute of Developmental Biology and Neurobiology, Johannes-Gutenberg University Mainz, Mainz, Germany
- Sebastian Molina-Obando
- Institute of Developmental Biology and Neurobiology, Johannes-Gutenberg University Mainz, Mainz, Germany
- Giordano Ramos-Traslosheros
- Institute of Developmental Biology and Neurobiology, Johannes-Gutenberg University Mainz, Mainz, Germany
- Department of Neurobiology, Harvard Medical School, Boston, MA, USA
- Marion Silies
- Institute of Developmental Biology and Neurobiology, Johannes-Gutenberg University Mainz, Mainz, Germany
7. Seung HS. Predicting visual function by interpreting a neuronal wiring diagram. Nature 2024; 634:113-123. PMID: 39358524; PMCID: PMC11446822; DOI: 10.1038/s41586-024-07953-5.
Abstract
As connectomics advances, it will become commonplace to know far more about the structure of a nervous system than about its function. The starting point for many investigations will become neuronal wiring diagrams, which will be interpreted to make theoretical predictions about function. Here I demonstrate this emerging approach with the Drosophila optic lobe, analysing its structure to predict that three Dm3 (refs. 1-4) and three TmY (refs. 2,4) cell types are part of a circuit that serves the function of form vision. Receptive fields are predicted from connectivity, and suggest that the cell types encode the local orientation of a visual stimulus. Extraclassical (refs. 5,6) receptive fields are also predicted, with implications for robust orientation tuning (ref. 7), position invariance (refs. 8,9) and completion of noisy or illusory contours (refs. 10,11). The TmY types synapse onto neurons that project from the optic lobe to the central brain (refs. 12,13), which are conjectured to compute conjunctions and disjunctions of oriented features. My predictions can be tested through neurophysiology, which would constrain the parameters and biophysical mechanisms in neural network models of fly vision (ref. 14).
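The logic of predicting receptive fields from a wiring diagram can be reduced to a weighted sum: each presynaptic cell contributes its own spatial receptive field, weighted by its synapse count onto the target. The toy below uses entirely hypothetical positions and synapse counts, not connectome data.

```python
import numpy as np

positions = np.array([-2, -1, 0, 1, 2])        # retinotopic offsets of presynaptic columnar cells
synapses = np.array([3, 12, 30, 12, 3])        # hypothetical synapse counts onto one target cell

axis = np.linspace(-5, 5, 201)
# Give each presynaptic cell a small Gaussian receptive field centered on its position.
input_rfs = np.exp(-0.5 * ((axis[None, :] - positions[:, None]) / 0.8) ** 2)

predicted_rf = synapses @ input_rfs            # linear, synapse-weighted prediction
predicted_rf /= predicted_rf.max()

print(axis[np.argmax(predicted_rf)])           # peak sensitivity sits at the most strongly connected position
```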
Affiliation(s)
- H Sebastian Seung
- Neuroscience Institute and Computer Science Department, Princeton University, Princeton, NJ, USA
8. Matsliah A, Yu SC, Kruk K, Bland D, Burke AT, Gager J, Hebditch J, Silverman B, Willie KP, Willie R, Sorek M, Sterling AR, Kind E, Garner D, Sancer G, Wernet MF, Kim SS, Murthy M, Seung HS. Neuronal parts list and wiring diagram for a visual system. Nature 2024; 634:166-180. PMID: 39358525; PMCID: PMC11446827; DOI: 10.1038/s41586-024-07981-1.
Abstract
A catalogue of neuronal cell types has often been called a 'parts list' of the brain (ref. 1), and regarded as a prerequisite for understanding brain function (refs. 2,3). In the optic lobe of Drosophila, rules of connectivity between cell types have already proven to be essential for understanding fly vision (refs. 4,5). Here we analyse the fly connectome to complete the list of cell types intrinsic to the optic lobe, as well as the rules governing their connectivity. Most new cell types contain 10 to 100 cells, and integrate information over medium distances in the visual field. Some existing type families (Tm, Li, and LPi) (refs. 6-10) at least double in number of types. A new serpentine medulla (Sm) interneuron family contains more types than any other. Three families of cross-neuropil types are revealed. The consistency of types is demonstrated by analysing the distances in high-dimensional feature space, and is further validated by algorithms that select small subsets of discriminative features. We use connectivity to hypothesize about the functional roles of cell types in motion, object and colour vision. Connectivity with 'boundary types' that straddle the optic lobe and central brain is also quantified. We showcase the advantages of connectomic cell typing: complete and unbiased sampling, a rich array of features based on connectivity and reduction of the connectome to a substantially simpler wiring diagram of cell types, with immediate relevance for brain function and development.
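The consistency argument, distances in a high-dimensional connectivity feature space, can be illustrated with synthetic data: cells of the same type should lie closer to each other than to cells of a different type. The numbers below are invented, not connectome measurements.

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical connectivity feature vectors (synapse counts onto 50 partner types)
# for 30 cells each of two cell types with different connectivity profiles.
type_a = rng.poisson(lam=np.linspace(1, 10, 50), size=(30, 50)).astype(float)
type_b = rng.poisson(lam=np.linspace(10, 1, 50), size=(30, 50)).astype(float)
cells = np.vstack([type_a, type_b])
labels = np.array([0] * 30 + [1] * 30)

# Pairwise Euclidean distances in feature space.
d = np.linalg.norm(cells[:, None, :] - cells[None, :, :], axis=-1)
same = d[labels[:, None] == labels[None, :]]
different = d[labels[:, None] != labels[None, :]]
print(same.mean(), different.mean())   # within-type distances are clearly smaller than between-type
```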
Affiliation(s)
- Arie Matsliah
- Princeton Neuroscience Institute, Princeton University, Princeton, NJ, USA
- Szi-Chieh Yu
- Princeton Neuroscience Institute, Princeton University, Princeton, NJ, USA
- Krzysztof Kruk
- Independent researcher, Kielce, Poland
- Eyewire, Boston, MA, USA
- Doug Bland
- Princeton Neuroscience Institute, Princeton University, Princeton, NJ, USA
- Austin T Burke
- Princeton Neuroscience Institute, Princeton University, Princeton, NJ, USA
- Jay Gager
- Princeton Neuroscience Institute, Princeton University, Princeton, NJ, USA
- James Hebditch
- Princeton Neuroscience Institute, Princeton University, Princeton, NJ, USA
- Ben Silverman
- Princeton Neuroscience Institute, Princeton University, Princeton, NJ, USA
- Ryan Willie
- Princeton Neuroscience Institute, Princeton University, Princeton, NJ, USA
- Marissa Sorek
- Princeton Neuroscience Institute, Princeton University, Princeton, NJ, USA
- Eyewire, Boston, MA, USA
- Amy R Sterling
- Princeton Neuroscience Institute, Princeton University, Princeton, NJ, USA
- Eyewire, Boston, MA, USA
- Emil Kind
- Institut für Biologie-Neurobiologie, Freie Universität Berlin, Berlin, Germany
- Dustin Garner
- Molecular, Cellular and Developmental Biology, University of California, Santa Barbara, Santa Barbara, CA, USA
- Gizem Sancer
- Department of Neuroscience, Yale University, New Haven, CT, USA
- Mathias F Wernet
- Institut für Biologie-Neurobiologie, Freie Universität Berlin, Berlin, Germany
- Sung Soo Kim
- Molecular, Cellular and Developmental Biology, University of California, Santa Barbara, Santa Barbara, CA, USA
- Mala Murthy
- Princeton Neuroscience Institute, Princeton University, Princeton, NJ, USA
- H Sebastian Seung
- Princeton Neuroscience Institute, Princeton University, Princeton, NJ, USA
- Computer Science Department, Princeton University, Princeton, NJ, USA
9. Clark DA, Fitzgerald JE. Optimization in Visual Motion Estimation. Annu Rev Vis Sci 2024; 10:23-46. PMID: 38663426; PMCID: PMC11998607; DOI: 10.1146/annurev-vision-101623-025432.
Abstract
Sighted animals use visual signals to discern directional motion in their environment. Motion is not directly detected by visual neurons, and it must instead be computed from light signals that vary over space and time. This makes visual motion estimation a near universal neural computation, and decades of research have revealed much about the algorithms and mechanisms that generate directional signals. The idea that sensory systems are optimized for performance in natural environments has deeply impacted this research. In this article, we review the many ways that optimization has been used to quantitatively model visual motion estimation and reveal its underlying principles. We emphasize that no single optimization theory has dominated the literature. Instead, researchers have adeptly incorporated different computational demands and biological constraints that are pertinent to the specific brain system and animal model under study. The successes and failures of the resulting optimization models have thereby provided insights into how computational demands and biological constraints together shape neural computation.
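A toy instance of this optimization framing: generate stimuli with known velocities, compute simple opponent space-time correlation features, and fit a readout that minimizes squared error against the true velocity. This is an editorial illustration restricted to white-noise patterns drifting at one pixel per frame or less, not a model from the review.

```python
import numpy as np

rng = np.random.default_rng(3)

def correlator_features(stim):
    """Two mirror-symmetric space-time correlations, averaged over stim[t, x]."""
    right = np.mean(stim[:-1, :-1] * stim[1:, 1:])   # pairs displaced by +1 pixel per frame
    left = np.mean(stim[:-1, 1:] * stim[1:, :-1])    # pairs displaced by -1 pixel per frame
    return np.array([right, left])

X, y = [], []
for _ in range(300):
    v = int(rng.integers(-1, 2))                     # true velocity: -1, 0, or +1 pixel/frame
    profile = rng.standard_normal(64)
    stim = np.stack([np.roll(profile, v * t) for t in range(30)])
    X.append(correlator_features(stim))
    y.append(float(v))
X, y = np.array(X), np.array(y)

# "Optimization": least-squares readout weights minimizing velocity estimation error.
w, *_ = np.linalg.lstsq(X, y, rcond=None)
print(w, np.corrcoef(X @ w, y)[0, 1])   # roughly opponent weights; estimate correlates highly with velocity
```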
Affiliation(s)
- Damon A Clark
- Department of Molecular, Cellular, and Developmental Biology, Yale University, New Haven, Connecticut, USA
- James E Fitzgerald
- Janelia Research Campus, Howard Hughes Medical Institute, Ashburn, Virginia, USA
- Department of Neurobiology, Northwestern University, Evanston, Illinois, USA
10. Schwarz MB, O'Carroll DC, Evans BJE, Fabian JM, Wiederman SD. Localized and Long-Lasting Adaptation in Dragonfly Target-Detecting Neurons. eNeuro 2024; 11:ENEURO.0036-24.2024. PMID: 39256041; PMCID: PMC11419696; DOI: 10.1523/eneuro.0036-24.2024.
Abstract
Some visual neurons in the dragonfly (Hemicordulia tau) optic lobe respond to small, moving targets, likely underlying their fast pursuit of prey and conspecifics. In response to repetitive targets presented at short intervals, the spiking activity of these "small target motion detector" (STMD) neurons diminishes over time. Previous experiments limited this adaptation by including intertrial rest periods of varying durations. However, the characteristics of this effect have never been quantified. Here, using extracellular recording techniques lasting for several hours, we quantified both the spatial and temporal properties of STMD adaptation. We found that the time course of adaptation was variable across STMD units. In any one STMD, a repeated series led to more rapid adaptation, a minor accumulative effect more akin to habituation. Following an adapting stimulus, responses recovered quickly, though the rate of recovery decreased nonlinearly over time. We found that the region of adaptation is highly localized, with targets displaced by ∼2.5° eliciting a naive response. Higher frequencies of target stimulation converged to lower levels of sustained response activity. We determined that adaptation itself is a target-tuned property, not elicited by moving bars or luminance flicker. As STMD adaptation is a localized phenomenon, dependent on recent history, it is likely to play an important role in closed-loop behavior where a target is foveated in a localized region for extended periods of the pursuit duration.
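The reported dynamics, responses that diminish with short inter-trial intervals but stay near naive levels with long rests, can be captured by a per-location gain that is depleted by each presentation and recovers exponentially between trials. The constants below are invented and purely for bookkeeping, not fits to the recordings.

```python
import numpy as np

def run_trials(n_trials, inter_trial_interval, tau_recover=20.0, depletion=0.3):
    """Response amplitude at one retinotopic location across repeated target presentations."""
    gain, responses = 1.0, []
    for _ in range(n_trials):
        responses.append(gain)                  # response tracks the current local gain
        gain *= 1.0 - depletion                 # each presentation depletes the gain
        gain = 1.0 - (1.0 - gain) * np.exp(-inter_trial_interval / tau_recover)  # partial recovery
    return np.array(responses)

print(run_trials(8, inter_trial_interval=2.0))    # short rests: responses settle to a low sustained level
print(run_trials(8, inter_trial_interval=60.0))   # long rests: responses stay near the naive level
```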
Affiliation(s)
- Matthew B Schwarz
- School of Biomedicine, The University of Adelaide, Adelaide, South Australia 5001, Australia
- Bernard J E Evans
- School of Biomedicine, The University of Adelaide, Adelaide, South Australia 5001, Australia
- Joseph M Fabian
- School of Biomedicine, The University of Adelaide, Adelaide, South Australia 5001, Australia
- Steven D Wiederman
- School of Biomedicine, The University of Adelaide, Adelaide, South Australia 5001, Australia
11. Wu N, Zhou B, Agrochao M, Clark DA. Broken time reversal symmetry in visual motion detection. bioRxiv [Preprint] 2024:2024.06.08.598068. PMID: 38915608; PMCID: PMC11195140; DOI: 10.1101/2024.06.08.598068.
Abstract
Our intuition suggests that when a movie is played in reverse, our perception of motion in the reversed movie will be perfectly inverted compared to the original. This intuition is also reflected in many classical theoretical and practical models of motion detection. However, here we demonstrate that this symmetry of motion perception upon time reversal is often broken in real visual systems. In this work, we designed a set of visual stimuli to investigate how stimulus symmetries affect time reversal symmetry breaking in the fruit fly Drosophila's well-studied optomotor rotation behavior. We discovered a suite of new stimuli with a wide variety of different properties that can lead to broken time reversal symmetries in fly behavioral responses. We then trained neural network models to predict the velocity of scenes with both natural and artificial contrast distributions. Training with naturalistic contrast distributions yielded models that break time reversal symmetry, even when the training data was time reversal symmetric. We show analytically and numerically that the breaking of time reversal symmetry in the model responses can arise from contrast asymmetry in the training data, but can also arise from other features of the contrast distribution. Furthermore, shallower neural network models can exhibit stronger symmetry breaking than deeper ones, suggesting that less flexible neural networks promote some forms of time reversal symmetry breaking. Overall, these results reveal a surprising feature of biological motion detectors and suggest that it could arise from constrained optimization in natural environments.
Affiliation(s)
- Nathan Wu
- Yale College, New Haven, CT 06511, USA
- Baohua Zhou
- Department of Molecular Cellular and Developmental Biology, Yale University, New Haven, CT 06511, USA
- Margarida Agrochao
- Department of Molecular Cellular and Developmental Biology, Yale University, New Haven, CT 06511, USA
- Damon A. Clark
- Department of Molecular Cellular and Developmental Biology, Yale University, New Haven, CT 06511, USA
- Department of Physics, Yale University, New Haven, CT 06511, USA
- Department of Neuroscience, Yale University, New Haven, CT 06511, USA
- Quantitative Biology Institute, Yale University, New Haven, CT 06511, USA
- Wu Tsai Institute, Yale University, New Haven, CT 06511, USA
12. Pang MM, Chen F, Xie M, Druckmann S, Clandinin TR, Yang HH. A recurrent neural circuit in Drosophila deblurs visual inputs. bioRxiv [Preprint] 2024:2024.04.19.590352. PMID: 38712245; PMCID: PMC11071408; DOI: 10.1101/2024.04.19.590352.
Abstract
A critical goal of vision is to detect changes in light intensity, even when these changes are blurred by the spatial resolution of the eye and the motion of the animal. Here we describe a recurrent neural circuit in Drosophila that compensates for blur and thereby selectively enhances the perceived contrast of moving edges. Using in vivo, two-photon voltage imaging, we measured the temporal response properties of L1 and L2, two cell types that receive direct synaptic input from photoreceptors. These neurons have biphasic responses to brief flashes of light, a hallmark of cells that encode changes in stimulus intensity. However, the second phase was often much larger than the first, creating an unusual temporal filter. Genetic dissection revealed that recurrent neural circuitry strongly shapes the second phase of the response, informing the structure of a dynamical model. By applying this model to moving natural images, we demonstrate that rather than veridically representing stimulus changes, this temporal processing strategy systematically enhances them, amplifying and sharpening responses. Comparing the measured responses of L2 to model predictions across both artificial and natural stimuli revealed that L2 tunes its properties as the model predicts in order to deblur images. Since this strategy is tunable to behavioral context, generalizable to any time-varying sensory input, and implementable with a common circuit motif, we propose that it could be broadly used to selectively enhance sharp and salient changes.
Affiliation(s)
- Michelle M. Pang
- Department of Neurobiology, Stanford University, Stanford, CA 94305, USA
- Feng Chen
- Department of Neurobiology, Stanford University, Stanford, CA 94305, USA
- Department of Applied Physics, Stanford University, Stanford, CA 94305, USA
- Marjorie Xie
- Department of Neurobiology, Stanford University, Stanford, CA 94305, USA
- Current affiliation: School for the Future of Innovation of Society, Arizona State University, Tempe, AZ 85281, USA
- Shaul Druckmann
- Department of Neurobiology, Stanford University, Stanford, CA 94305, USA
- Department of Psychiatry and Behavioral Sciences, Stanford University, Stanford, CA 94305, USA
- Helen H. Yang
- Department of Neurobiology, Stanford University, Stanford, CA 94305, USA
- Current affiliation: Department of Neurobiology, Harvard Medical School, Boston, MA 02115, USA
- Lead contact
13. Prech S, Groschner LN, Borst A. An open platform for visual stimulation of insects. PLoS One 2024; 19:e0301999. PMID: 38635686; PMCID: PMC11025907; DOI: 10.1371/journal.pone.0301999.
Abstract
To study how the nervous system processes visual information, experimenters must record neural activity while delivering visual stimuli in a controlled fashion. In animals with a nearly panoramic field of view, such as flies, precise stimulation of the entire visual field is challenging. We describe a projector-based device for stimulation of the insect visual system under a microscope. The device is based on a bowl-shaped screen that provides a wide and nearly distortion-free field of view. It is compact, cheap, easy to assemble, and easy to operate using the included open-source software for stimulus generation. We validate the virtual reality system technically and demonstrate its capabilities in a series of experiments at two levels: the cellular, by measuring the membrane potential responses of visual interneurons; and the organismal, by recording optomotor and fixation behavior of Drosophila melanogaster in tethered flight. Our experiments reveal the importance of stimulating the visual system of an insect with a wide field of view, and we provide a simple solution to do so.
Affiliation(s)
- Stefan Prech
- Max Planck Institute for Biological Intelligence, Martinsried, Germany
- Lukas N. Groschner
- Max Planck Institute for Biological Intelligence, Martinsried, Germany
- Gottfried Schatz Research Center, Molecular Biology and Biochemistry, Medical University of Graz, Graz, Austria
- Alexander Borst
- Max Planck Institute for Biological Intelligence, Martinsried, Germany
14. Schoepe T, Janotte E, Milde MB, Bertrand OJN, Egelhaaf M, Chicca E. Finding the gap: neuromorphic motion-vision in dense environments. Nat Commun 2024; 15:817. PMID: 38280859; PMCID: PMC10821932; DOI: 10.1038/s41467-024-45063-y.
Abstract
Animals have evolved mechanisms to travel safely and efficiently within different habitats. On a journey in dense terrains animals avoid collisions and cross narrow passages while controlling an overall course. Multiple hypotheses target how animals solve challenges faced during such travel. Here we show that a single mechanism enables safe and efficient travel. We developed a robot inspired by insects. It has remarkable capabilities to travel in dense terrain, avoiding collisions, crossing gaps and selecting safe passages. These capabilities are accomplished by a neuromorphic network steering the robot toward regions of low apparent motion. Our system leverages knowledge about vision processing and obstacle avoidance in insects. Our results demonstrate how insects might safely travel through diverse habitats. We anticipate our system to be a working hypothesis to study insects' travels in dense terrains. Furthermore, it illustrates that we can design novel hardware systems by understanding the underlying mechanisms driving behaviour.
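The steering rule, move toward regions of low apparent motion, can be written as a comparison of pooled optic-flow magnitude between the two visual hemifields. This is an editorial rendering of the idea in a few lines, not the authors' neuromorphic implementation.

```python
import numpy as np

def steering_command(flow_magnitude, gain=1.0):
    """Signed turn command (+ = turn right) that steers away from the hemifield with more flow."""
    half = flow_magnitude.size // 2
    left, right = flow_magnitude[:half].sum(), flow_magnitude[half:].sum()
    return gain * (left - right) / (left + right + 1e-9)

azimuth = np.linspace(-90, 90, 180)
# Nearby clutter on the left generates strong apparent motion; an open gap on the right does not.
flow = np.where(azimuth < 0, 2.0, 0.2)
print(steering_command(flow))   # positive: turn right, toward the low-motion gap
```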
Affiliation(s)
- Thorben Schoepe
- Peter Grünberg Institut 15, Forschungszentrum Jülich, Aachen, Germany
- Faculty of Technology and Cognitive Interaction Technology Center of Excellence (CITEC), Bielefeld University, Bielefeld, Germany
- Bio-Inspired Circuits and Systems (BICS) Lab, Zernike Institute for Advanced Materials, University of Groningen, Groningen, Netherlands
- CogniGron (Groningen Cognitive Systems and Materials Center), University of Groningen, Groningen, Netherlands
- Ella Janotte
- Event Driven Perception for Robotics, Italian Institute of Technology, iCub facility, Genoa, Italy
- Moritz B Milde
- International Centre for Neuromorphic Systems, MARCS Institute, Western Sydney University, Penrith, Australia
- Martin Egelhaaf
- Neurobiology, Faculty of Biology, Bielefeld University, Bielefeld, Germany
- Elisabetta Chicca
- Faculty of Technology and Cognitive Interaction Technology Center of Excellence (CITEC), Bielefeld University, Bielefeld, Germany
- Bio-Inspired Circuits and Systems (BICS) Lab, Zernike Institute for Advanced Materials, University of Groningen, Groningen, Netherlands
- CogniGron (Groningen Cognitive Systems and Materials Center), University of Groningen, Groningen, Netherlands
15. Tanaka R, Zhou B, Agrochao M, Badwan BA, Au B, Matos NCB, Clark DA. Neural mechanisms to incorporate visual counterevidence in self-movement estimation. Curr Biol 2023; 33:4960-4979.e7. PMID: 37918398; PMCID: PMC10848174; DOI: 10.1016/j.cub.2023.10.011.
Abstract
In selecting appropriate behaviors, animals should weigh sensory evidence both for and against specific beliefs about the world. For instance, animals measure optic flow to estimate and control their own rotation. However, existing models of flow detection can be spuriously triggered by visual motion created by objects moving in the world. Here, we show that stationary patterns on the retina, which constitute evidence against observer rotation, suppress inappropriate stabilizing rotational behavior in the fruit fly Drosophila. In silico experiments show that artificial neural networks (ANNs) that are optimized to distinguish observer movement from external object motion similarly detect stationarity and incorporate negative evidence. Employing neural measurements and genetic manipulations, we identified components of the circuitry for stationary pattern detection, which runs parallel to the fly's local motion and optic-flow detectors. Our results show how the fly brain incorporates negative evidence to improve heading stability, exemplifying how a compact brain exploits geometrical constraints of the visual world.
Affiliation(s)
- Ryosuke Tanaka
- Interdepartmental Neuroscience Program, Yale University, New Haven, CT 06511, USA
- Baohua Zhou
- Department of Molecular Cellular and Developmental Biology, Yale University, New Haven, CT 06511, USA; Department of Statistics and Data Science, Yale University, New Haven, CT 06511, USA
- Margarida Agrochao
- Department of Molecular Cellular and Developmental Biology, Yale University, New Haven, CT 06511, USA
- Bara A Badwan
- School of Engineering and Applied Science, Yale University, New Haven, CT 06511, USA
- Braedyn Au
- Department of Physics, Yale University, New Haven, CT 06511, USA
- Natalia C B Matos
- Interdepartmental Neuroscience Program, Yale University, New Haven, CT 06511, USA
- Damon A Clark
- Department of Molecular Cellular and Developmental Biology, Yale University, New Haven, CT 06511, USA; Department of Physics, Yale University, New Haven, CT 06511, USA; Department of Neuroscience, Yale University, New Haven, CT 06511, USA; Wu Tsai Institute, Yale University, New Haven, CT 06511, USA; Quantitative Biology Institute, Yale University, New Haven, CT 06511, USA
16. Ammer G, Serbe-Kamp E, Mauss AS, Richter FG, Fendl S, Borst A. Multilevel visual motion opponency in Drosophila. Nat Neurosci 2023; 26:1894-1905. PMID: 37783895; PMCID: PMC10620086; DOI: 10.1038/s41593-023-01443-z.
Abstract
Inhibitory interactions between opponent neuronal pathways constitute a common circuit motif across brain areas and species. However, in most cases, synaptic wiring and biophysical, cellular and network mechanisms generating opponency are unknown. Here, we combine optogenetics, voltage and calcium imaging, connectomics, electrophysiology and modeling to reveal multilevel opponent inhibition in the fly visual system. We uncover a circuit architecture in which a single cell type implements direction-selective, motion-opponent inhibition at all three network levels. This inhibition, mediated by GluClα receptors, is balanced with excitation in strength, despite tenfold fewer synapses. The different opponent network levels constitute a nested, hierarchical structure operating at increasing spatiotemporal scales. Electrophysiology and modeling suggest that distributing this computation over consecutive network levels counteracts a reduction in gain, which would result from integrating large opposing conductances at a single instance. We propose that this neural architecture provides resilience to noise while enabling high selectivity for relevant sensory information.
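Balance in strength with roughly tenfold fewer synapses implies a correspondingly larger unitary inhibitory conductance. A back-of-the-envelope steady-state conductance calculation (all numbers illustrative) makes the point:

```python
# Steady-state voltage of a single passive compartment with excitation and chloride-mediated
# (GluClα-like) inhibition. All values are illustrative, in arbitrary conductance units and mV.
E_LEAK, E_EXC, E_INH = -55.0, 0.0, -65.0
G_LEAK = 1.0

def v_steady(n_exc, g_per_exc, n_inh, g_per_inh):
    g_exc, g_inh = n_exc * g_per_exc, n_inh * g_per_inh
    return (G_LEAK * E_LEAK + g_exc * E_EXC + g_inh * E_INH) / (G_LEAK + g_exc + g_inh)

# Tenfold fewer inhibitory synapses balance the same excitation if each synapse is tenfold
# stronger, because only the total inhibitory conductance enters the balance.
print(v_steady(n_exc=100, g_per_exc=0.01, n_inh=100, g_per_inh=0.01))
print(v_steady(n_exc=100, g_per_exc=0.01, n_inh=10, g_per_inh=0.10))
```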
Affiliation(s)
- Georg Ammer
- Max Planck Institute for Biological Intelligence, Martinsried, Germany
- Etienne Serbe-Kamp
- Max Planck Institute for Biological Intelligence, Martinsried, Germany
- Ludwig Maximilian University of Munich, Munich, Germany
- Alex S Mauss
- Max Planck Institute for Biological Intelligence, Martinsried, Germany
- Florian G Richter
- Max Planck Institute for Biological Intelligence, Martinsried, Germany
- Sandra Fendl
- Max Planck Institute for Biological Intelligence, Martinsried, Germany
- Alexander Borst
- Max Planck Institute for Biological Intelligence, Martinsried, Germany
17. Chen J, Gish CM, Fransen JW, Salazar-Gatzimas E, Clark DA, Borghuis BG. Direct comparison reveals algorithmic similarities in fly and mouse visual motion detection. iScience 2023; 26:107928. PMID: 37810236; PMCID: PMC10550730; DOI: 10.1016/j.isci.2023.107928.
Abstract
Evolution has equipped vertebrates and invertebrates with neural circuits that selectively encode visual motion. While similarities in the computations performed by these circuits in mouse and fruit fly have been noted, direct experimental comparisons have been lacking. Because molecular mechanisms and neuronal morphology in the two species are distinct, we directly compared motion encoding in these two species at the algorithmic level, using matched stimuli and focusing on a pair of analogous neurons, the mouse ON starburst amacrine cell (ON SAC) and Drosophila T4 neurons. We find that the cells share similar spatiotemporal receptive field structures, sensitivity to spatiotemporal correlations, and tuning to sinusoidal drifting gratings, but differ in their responses to apparent motion stimuli. Both neuron types showed a response to summed sinusoids that deviates from models for motion processing in these cells, underscoring the similarities in their processing and identifying response features that remain to be explained.
Affiliation(s)
- Juyue Chen
- Interdepartmental Neurosciences Program, Yale University, New Haven, CT 06511, USA
- Caitlin M Gish
- Department of Physics, Yale University, New Haven, CT 06511, USA
- James W Fransen
- Department of Anatomical Sciences and Neurobiology, University of Louisville, Louisville, KY 40202, USA
- Damon A Clark
- Interdepartmental Neurosciences Program, Yale University, New Haven, CT 06511, USA
- Department of Physics, Yale University, New Haven, CT 06511, USA
- Department of Molecular, Cellular, Developmental Biology, Yale University, New Haven, CT 06511, USA
- Department of Neuroscience, Yale University, New Haven, CT 06511, USA
- Bart G Borghuis
- Department of Anatomical Sciences and Neurobiology, University of Louisville, Louisville, KY 40202, USA
18. Mano O, Choi M, Tanaka R, Creamer MS, Matos NCB, Shomar JW, Badwan BA, Clandinin TR, Clark DA. Long-timescale anti-directional rotation in Drosophila optomotor behavior. eLife 2023; 12:e86076. PMID: 37751469; PMCID: PMC10522332; DOI: 10.7554/elife.86076.
Abstract
Locomotor movements cause visual images to be displaced across the eye, a retinal slip that is counteracted by stabilizing reflexes in many animals. In insects, optomotor turning causes the animal to turn in the direction of rotating visual stimuli, thereby reducing retinal slip and stabilizing trajectories through the world. This behavior has formed the basis for extensive dissections of motion vision. Here, we report that under certain stimulus conditions, two Drosophila species, including the widely studied Drosophila melanogaster, can suppress and even reverse the optomotor turning response over several seconds. Such 'anti-directional turning' is most strongly evoked by long-lasting, high-contrast, slow-moving visual stimuli that are distinct from those that promote syn-directional optomotor turning. Anti-directional turning, like the syn-directional optomotor response, requires the local motion detecting neurons T4 and T5. A subset of lobula plate tangential cells, CH cells, show involvement in these responses. Imaging from a variety of direction-selective cells in the lobula plate shows no evidence of dynamics that match the behavior, suggesting that the observed inversion in turning direction emerges downstream of the lobula plate. Further, anti-directional turning declines with age and exposure to light. These results show that Drosophila optomotor turning behaviors contain rich, stimulus-dependent dynamics that are inconsistent with simple reflexive stabilization responses.
Affiliation(s)
- Omer Mano
- Department of Molecular, Cellular, and Developmental Biology, Yale University, New Haven, United States
- Minseung Choi
- Department of Neurobiology, Stanford University, Stanford, United States
- Ryosuke Tanaka
- Interdepartmental Neuroscience Program, Yale University, New Haven, United States
- Matthew S Creamer
- Interdepartmental Neuroscience Program, Yale University, New Haven, United States
- Natalia CB Matos
- Interdepartmental Neuroscience Program, Yale University, New Haven, United States
- Joseph W Shomar
- Department of Physics, Yale University, New Haven, United States
- Bara A Badwan
- Department of Chemical Engineering, Yale University, New Haven, United States
- Damon A Clark
- Department of Molecular, Cellular, and Developmental Biology, Yale University, New Haven, United States
- Interdepartmental Neuroscience Program, Yale University, New Haven, United States
- Department of Physics, Yale University, New Haven, United States
- Department of Neuroscience, Yale University, New Haven, United States
19. Fu Q. Motion perception based on ON/OFF channels: A survey. Neural Netw 2023; 165:1-18. PMID: 37263088; DOI: 10.1016/j.neunet.2023.05.031.
Abstract
Motion perception is an essential ability for animals and artificially intelligent systems to interact effectively and safely with surrounding objects and environments. Biological visual systems, which have evolved over hundreds of millions of years, are efficient and robust at motion perception, whereas artificial vision systems remain far from such capability. This paper argues that the gap can be significantly reduced by formulating ON/OFF channels in motion perception models, which encode luminance increment (ON) and decrement (OFF) responses within the receptive field separately. Such a signal-bifurcating structure has been found in the neural systems of many animal species, indicating that early motion signals are split and processed in segregated pathways. However, the corresponding biological substrates and their necessity for artificial vision systems have never been elucidated together, leaving open questions about the uniqueness and advantages of ON/OFF channels for building dynamic vision systems that address real-world challenges. This paper highlights the importance of ON/OFF channels in motion perception by surveying current progress in both neuroscience and computational modelling, together with applications. Compared to the related literature, it provides for the first time insights into how different selectivities to directional motion of looming, translating, and small-sized targets can be implemented on the basis of ON/OFF channels, in keeping with the soundness and robustness of biological principles. Finally, existing challenges and future trends of this biologically plausible computational structure for visual perception are discussed in connection with hotspots in machine learning and advanced vision sensors such as event-driven cameras.
Affiliation(s)
- Qinbing Fu
- Machine Life and Intelligence Research Centre, School of Mathematics and Information Science, Guangzhou University, Guangzhou, 510006, China
20. Tanaka R, Zhou B, Agrochao M, Badwan BA, Au B, Matos NCB, Clark DA. Neural mechanisms to incorporate visual counterevidence in self motion estimation. bioRxiv [Preprint] 2023:2023.01.04.522814. PMID: 36711843; PMCID: PMC9881891; DOI: 10.1101/2023.01.04.522814.
Abstract
In selecting appropriate behaviors, animals should weigh sensory evidence both for and against specific beliefs about the world. For instance, animals measure optic flow to estimate and control their own rotation. However, existing models of flow detection can confuse the movement of external objects with genuine self motion. Here, we show that stationary patterns on the retina, which constitute negative evidence against self rotation, are used by the fruit fly Drosophila to suppress inappropriate stabilizing rotational behavior. In silico experiments show that artificial neural networks optimized to distinguish self and world motion similarly detect stationarity and incorporate negative evidence. Employing neural measurements and genetic manipulations, we identified components of the circuitry for stationary pattern detection, which runs parallel to the fly's motion- and optic flow-detectors. Our results exemplify how the compact brain of the fly incorporates negative evidence to improve heading stability, exploiting geometrical constraints of the visual world.
Affiliation(s)
- Ryosuke Tanaka
- Interdepartmental Neuroscience Program, Yale University, New Haven, CT 06511, USA
- Present Address: Institute of Neuroscience, Technical University of Munich, Munich 80802, Germany
- Baohua Zhou
- Department of Molecular Cellular and Developmental Biology, Yale University, New Haven, CT 06511, USA
- Department of Statistics and Data Science, Yale University, New Haven, CT 06511, USA
- Margarida Agrochao
- Department of Molecular Cellular and Developmental Biology, Yale University, New Haven, CT 06511, USA
- Bara A. Badwan
- School of Engineering and Applied Science, Yale University, New Haven, CT 06511, USA
- Braedyn Au
- Department of Physics, Yale University, New Haven, CT 06511, USA
- Natalia C. B. Matos
- Interdepartmental Neuroscience Program, Yale University, New Haven, CT 06511, USA
- Damon A. Clark
- Interdepartmental Neuroscience Program, Yale University, New Haven, CT 06511, USA
- Department of Molecular Cellular and Developmental Biology, Yale University, New Haven, CT 06511, USA
- Department of Physics, Yale University, New Haven, CT 06511, USA
- Department of Neuroscience, Yale University, New Haven, CT 06511, USA
- Wu Tsai Institute, Yale University, New Haven, CT 06511, USA
21. Borst A, Groschner LN. How Flies See Motion. Annu Rev Neurosci 2023; 46:17-37.
Abstract
How neurons detect the direction of motion is a prime example of neural computation: Motion vision is found in the visual systems of virtually all sighted animals, it is important for survival, and it requires interesting computations with well-defined linear and nonlinear processing steps-yet the whole process is of moderate complexity. The genetic methods available in the fruit fly Drosophila and the charting of a connectome of its visual system have led to rapid progress and unprecedented detail in our understanding of how neurons compute the direction of motion in this organism. The picture that emerged incorporates not only the identity, morphology, and synaptic connectivity of each neuron involved but also its neurotransmitters, its receptors, and their subcellular localization. Together with the neurons' membrane potential responses to visual stimulation, this information provides the basis for a biophysically realistic model of the circuit that computes the direction of visual motion.
Affiliation(s)
- Alexander Borst
- Max Planck Institute for Biological Intelligence, Martinsried, Germany
- Lukas N Groschner
- Max Planck Institute for Biological Intelligence, Martinsried, Germany
22. Pirogova N, Borst A. Contrast normalization affects response time-course of visual interneurons. PLoS One 2023; 18:e0285686. PMID: 37294743; PMCID: PMC10256145; DOI: 10.1371/journal.pone.0285686.
Abstract
In natural environments, light intensities and visual contrasts vary widely, yet neurons have a limited response range for encoding them. Neurons accomplish that by flexibly adjusting their dynamic range to the statistics of the environment via contrast normalization. The effect of contrast normalization is usually measured as a reduction of neural signal amplitudes, but whether it influences response dynamics is unknown. Here, we show that contrast normalization in visual interneurons of Drosophila melanogaster not only suppresses the amplitude but also alters the dynamics of responses when a dynamic surround is present. We present a simple model that qualitatively reproduces the simultaneous effect of the visual surround on the response amplitude and temporal dynamics by altering the cells' input resistance and, thus, their membrane time constant. In conclusion, single-cell filtering properties as derived from artificial stimulus protocols like white-noise stimulation cannot be transferred one-to-one to predict responses under natural conditions.
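The proposed mechanism, a surround-driven conductance that simultaneously scales down the response and shortens the membrane time constant, can be illustrated with a toy passive-membrane simulation. All parameter values below are assumptions for illustration, not fits to the recordings.

```python
# Toy passive membrane: surround contrast adds a shunting conductance, which lowers both
# the steady-state amplitude (1/g_total) and the membrane time constant (C/g_total).
import numpy as np

def membrane_response(current, g_leak=1.0, g_surround=0.0, C=1.0, dt=1e-3):
    """Euler integration of C dV/dt = -g_total * V + I(t)."""
    g_total = g_leak + g_surround
    V = np.zeros_like(current)
    for i in range(1, len(current)):
        V[i] = V[i - 1] + dt * (-g_total * V[i - 1] + current[i]) / C
    return V

step = np.r_[np.zeros(200), np.ones(4800)]                 # step of synaptic current, dt = 1 ms
no_surround = membrane_response(step, g_surround=0.0)      # tau = 1.0 s, steady state ~1.0
with_surround = membrane_response(step, g_surround=3.0)    # tau = 0.25 s, steady state 0.25
print(no_surround[-1], with_surround[-1])                  # smaller but faster response with surround
```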
Collapse
Affiliation(s)
- Nadezhda Pirogova
- Department Circuits-Computation-Models, Max Planck Institute for Biological Intelligence, Planegg, Martinsried, Germany
- Graduate School of Systemic Neurosciences, LMU Munich, Planegg, Martinsried, Germany
| | - Alexander Borst
- Department Circuits-Computation-Models, Max Planck Institute for Biological Intelligence, Planegg, Martinsried, Germany
| |
Collapse
|
23
|
Ketkar MD, Shao S, Gjorgjieva J, Silies M. Multifaceted luminance gain control beyond photoreceptors in Drosophila. Curr Biol 2023:S0960-9822(23)00619-X. [PMID: 37285845 DOI: 10.1016/j.cub.2023.05.024] [Citation(s) in RCA: 9] [Impact Index Per Article: 4.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/03/2023] [Revised: 05/10/2023] [Accepted: 05/11/2023] [Indexed: 06/09/2023]
Abstract
Animals navigating in natural environments must handle vast changes in their sensory input. Visual systems, for example, handle changes in luminance at many timescales, from slow changes across the day to rapid changes during active behavior. To maintain luminance-invariant perception, visual systems must adapt their sensitivity to changing luminance at different timescales. We demonstrate that luminance gain control in photoreceptors alone is insufficient to explain luminance invariance at both fast and slow timescales and reveal the algorithms that adjust gain past photoreceptors in the fly eye. We combined imaging and behavioral experiments with computational modeling to show that downstream of photoreceptors, circuitry taking input from the single luminance-sensitive neuron type L3 implements gain control at fast and slow timescales. This computation is bidirectional in that it prevents the underestimation of contrasts in low luminance and overestimation in high luminance. An algorithmic model disentangles these multifaceted contributions and shows that the bidirectional gain control occurs at both timescales. The model implements a nonlinear interaction of luminance and contrast to achieve gain correction at fast timescales and a dark-sensitive channel to improve the detection of dim stimuli at slow timescales. Together, our work demonstrates how a single neuronal channel performs diverse computations to implement gain control at multiple timescales that are together important for navigation in natural environments.
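A deliberately simplified sketch of the kind of correction described above is given below. The functional forms and constants are invented for illustration and are not the paper's fitted algorithmic model: a front end whose contrast response still depends on background luminance is followed by a luminance-driven gain stage, standing in for the L3-based circuitry, that restores an invariant contrast estimate.

```python
# Illustrative luminance gain control downstream of an incompletely adapted front end.
# All functional forms and constants are assumptions for illustration only.

L_HALF = 0.5   # assumed half-saturation constant of the residual luminance dependence

def front_end(contrast, luminance):
    """Front-end contrast response whose gain still grows with background luminance."""
    return contrast * luminance / (luminance + L_HALF)

def corrected(contrast, luminance):
    """A luminance-sensitive channel divides out the residual luminance dependence."""
    return front_end(contrast, luminance) * (luminance + L_HALF) / luminance

for L in (0.1, 1.0, 10.0):
    print(L, round(front_end(0.5, L), 3), round(corrected(0.5, L), 3))
# The front-end output varies severalfold across luminances; the corrected output stays at 0.5.
```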
Collapse
Affiliation(s)
- Madhura D Ketkar
- Institute of Developmental and Neurobiology, Johannes-Gutenberg University Mainz, Hanns-Dieter-Hüsch-Weg 15, 55128 Mainz, Germany
| | - Shuai Shao
- Max Planck Institute for Brain Research, Max-von-Laue-Straße 4, 60438 Frankfurt am Main, Germany; Department of Neurophysiology, Radboud University, Heyendaalseweg 135, 6525 EN Nijmegen, the Netherlands
| | - Julijana Gjorgjieva
- Max Planck Institute for Brain Research, Max-von-Laue-Straße 4, 60438 Frankfurt am Main, Germany; School of Life Sciences, Technical University Munich, Maximus-von-Imhof-Forum 3, 85354 Freising, Germany.
| | - Marion Silies
- Institute of Developmental and Neurobiology, Johannes-Gutenberg University Mainz, Hanns-Dieter-Hüsch-Weg 15, 55128 Mainz, Germany.
| |
Collapse
|
24
|
Currier TA, Pang MM, Clandinin TR. Visual processing in the fly, from photoreceptors to behavior. Genetics 2023; 224:iyad064. [PMID: 37128740 PMCID: PMC10213501 DOI: 10.1093/genetics/iyad064] [Citation(s) in RCA: 23] [Impact Index Per Article: 11.5] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/08/2022] [Accepted: 03/22/2023] [Indexed: 05/03/2023] Open
Abstract
Originally a genetic model organism, the experimental use of Drosophila melanogaster has grown to include quantitative behavioral analyses, sophisticated perturbations of neuronal function, and detailed sensory physiology. A highlight of these developments can be seen in the context of vision, where pioneering studies have uncovered fundamental and generalizable principles of sensory processing. Here we begin with an overview of vision-guided behaviors and common methods for probing visual circuits. We then outline the anatomy and physiology of brain regions involved in visual processing, beginning at the sensory periphery and ending with descending motor control. Areas of focus include contrast and motion detection in the optic lobe, circuits for visual feature selectivity, computations in support of spatial navigation, and contextual associative learning. Finally, we look to the future of fly visual neuroscience and discuss promising topics for further study.
Collapse
Affiliation(s)
- Timothy A Currier
- Department of Neurobiology, Stanford University School of Medicine, Stanford, CA 94305, USA
| | - Michelle M Pang
- Department of Neurobiology, Stanford University School of Medicine, Stanford, CA 94305, USA
| | - Thomas R Clandinin
- Department of Neurobiology, Stanford University School of Medicine, Stanford, CA 94305, USA
| |
Collapse
|
25
|
Braun A, Borst A, Meier M. Disynaptic inhibition shapes tuning of OFF-motion detectors in Drosophila. Curr Biol 2023:S0960-9822(23)00601-2. [PMID: 37236181 DOI: 10.1016/j.cub.2023.05.007] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/10/2023] [Revised: 04/02/2023] [Accepted: 05/03/2023] [Indexed: 05/28/2023]
Abstract
The circuitry underlying the detection of visual motion in Drosophila melanogaster is one of the best-studied networks in neuroscience. Lately, electron microscopy reconstructions, algorithmic models, and functional studies have proposed a common motif for the cellular circuitry of an elementary motion detector based on both supralinear enhancement for preferred-direction motion and sublinear suppression for null-direction motion. In T5 cells, however, all columnar input neurons (Tm1, Tm2, Tm4, and Tm9) are excitatory. So, how is null-direction suppression realized there? Using two-photon calcium imaging in combination with thermogenetics, optogenetics, apoptotics, and pharmacology, we discovered that it is achieved via CT1, the GABAergic large-field amacrine cell whose individual processes have previously been shown to act in an electrically isolated way. Within each column, CT1 receives excitatory input from Tm9 and Tm1 and provides the sign-inverted, now inhibitory input signal onto T5. Ablating CT1 or knocking down the GABA-receptor subunit Rdl significantly broadened the directional tuning of T5 cells. It thus appears that the signal of Tm1 and Tm9 is used both as an excitatory input for preferred-direction enhancement and, through a sign inversion within the Tm1/Tm9-CT1 microcircuit, as an inhibitory input for null-direction suppression.
Collapse
Affiliation(s)
- Amalia Braun
- Max Planck Institute for Biological Intelligence, Department of Circuits - Computation - Models, Am Klopferspitz 18, 82152 Martinsried, Germany.
| | - Alexander Borst
- Max Planck Institute for Biological Intelligence, Department of Circuits - Computation - Models, Am Klopferspitz 18, 82152 Martinsried, Germany
| | - Matthias Meier
- Max Planck Institute for Biological Intelligence, Department of Circuits - Computation - Models, Am Klopferspitz 18, 82152 Martinsried, Germany.
| |
Collapse
|
26
|
Mano O, Choi M, Tanaka R, Creamer MS, Matos NC, Shomar J, Badwan BA, Clandinin TR, Clark DA. Long timescale anti-directional rotation in Drosophila optomotor behavior. BIORXIV : THE PREPRINT SERVER FOR BIOLOGY 2023:2023.01.06.523055. [PMID: 36711627 PMCID: PMC9882005 DOI: 10.1101/2023.01.06.523055] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Grants] [Track Full Text] [Download PDF] [Figures] [Subscribe] [Scholar Register] [Indexed: 01/09/2023]
Abstract
Locomotor movements cause visual images to be displaced across the eye, a retinal slip that is counteracted by stabilizing reflexes in many animals. In insects, optomotor turning causes the animal to turn in the direction of rotating visual stimuli, thereby reducing retinal slip and stabilizing trajectories through the world. This behavior has formed the basis for extensive dissections of motion vision. Here, we report that under certain stimulus conditions, two Drosophila species, including the widely studied D. melanogaster, can suppress and even reverse the optomotor turning response over several seconds. Such "anti-directional turning" is most strongly evoked by long-lasting, high-contrast, slow-moving visual stimuli that are distinct from those that promote syn-directional optomotor turning. Anti-directional turning, like the syn-directional optomotor response, requires the local motion detecting neurons T4 and T5. A subset of lobula plate tangential cells, CH cells, show involvement in these responses. Imaging from a variety of direction-selective cells in the lobula plate shows no evidence of dynamics that match the behavior, suggesting that the observed inversion in turning direction emerges downstream of the lobula plate. Further, anti-directional turning declines with age and exposure to light. These results show that Drosophila optomotor turning behaviors contain rich, stimulus-dependent dynamics that are inconsistent with simple reflexive stabilization responses.
Collapse
Affiliation(s)
- Omer Mano
- Department of Molecular, Cellular, and Developmental Biology, Yale University, New Haven, CT 06511, USA
| | - Minseung Choi
- Department of Neurobiology, Stanford University, Stanford, CA 94305, USA
| | - Ryosuke Tanaka
- Interdepartmental Neuroscience Program, Yale University, New Haven, CT 06511, USA
| | - Matthew S. Creamer
- Interdepartmental Neuroscience Program, Yale University, New Haven, CT 06511, USA
| | - Natalia C.B. Matos
- Interdepartmental Neuroscience Program, Yale University, New Haven, CT 06511, USA
| | - Joseph Shomar
- Department of Physics, Yale University, New Haven, CT 06511, USA
| | - Bara A. Badwan
- Department of Chemical Engineering, Yale University, New Haven, CT 06511, USA
| | | | - Damon A. Clark
- Department of Molecular, Cellular, and Developmental Biology, Yale University, New Haven, CT 06511, USA
- Interdepartmental Neuroscience Program, Yale University, New Haven, CT 06511, USA
- Department of Physics, Yale University, New Haven, CT 06511, USA
- Department of Neuroscience, Yale University, New Haven, CT 06511, USA
| |
Collapse
|
27
|
Fu Q, Li Z, Peng J. Harmonizing motion and contrast vision for robust looming detection. ARRAY 2023. [DOI: 10.1016/j.array.2022.100272] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 12/23/2022] Open
|
28
|
Skelton PSM, Finn A, Brinkworth RSA. Contrast independent biologically inspired translational optic flow estimation. BIOLOGICAL CYBERNETICS 2022; 116:635-660. [PMID: 36303043 PMCID: PMC9691503 DOI: 10.1007/s00422-022-00948-3] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Figures] [Subscribe] [Scholar Register] [Received: 10/19/2021] [Accepted: 10/11/2022] [Indexed: 06/16/2023]
Abstract
The visual systems of insects are relatively simple compared to humans. However, they enable navigation through complex environments where insects perform exceptional levels of obstacle avoidance. Biology uses two separable modes of optic flow to achieve this: rapid gaze fixation (rotational motion known as saccades); and the inter-saccadic translational motion. While the fundamental process of insect optic flow has been known since the 1950s, so too has its dependence on contrast. The surrounding visual pathways used to overcome environmental dependencies are less well known. Previous work has shown promise for low-speed rotational motion estimation, but a gap remained in the estimation of translational motion, in particular the estimation of the time to impact. To consistently estimate the time to impact during inter-saccadic translatory motion, the fundamental limitation of contrast dependence must be overcome. By adapting an elaborated rotational velocity estimator from the literature to work for translational motion, this paper proposes a novel algorithm for overcoming the contrast dependence of time to impact estimation using nonlinear spatio-temporal feedforward filtering. By applying bioinspired processes, approximately 15 points per decade of statistical discrimination were achieved when estimating the time to impact to a target across 360 background, distance, and velocity combinations: a 17-fold increase over the fundamental process. These results show the contrast dependence of time to impact estimation can be overcome in a biologically plausible manner. This, combined with previous results for low-speed rotational motion estimation, allows for contrast invariant computational models designed on the principles found in the biological visual system, paving the way for future visually guided systems.
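For readers unfamiliar with time-to-impact estimation, the underlying geometry can be stated in a few lines: for an approaching object, the ratio of angular size to its rate of expansion approximates the time remaining until contact. The worked example below uses made-up object size and speed; it illustrates the quantity being estimated, not the bioinspired filtering algorithm proposed in the paper.

```python
# Time-to-impact (tau) from angular size and its rate of expansion; toy values.
import numpy as np

R, v, dt = 0.05, 2.0, 1e-3           # object half-size (m), approach speed (m/s), time step (s)
t = np.arange(0.0, 0.9, dt)
d = 2.0 - v * t                       # distance to the object; contact would occur at t = 1.0 s
theta = 2 * np.arctan(R / d)          # angular size on the retina
theta_dot = np.gradient(theta, dt)    # rate of angular expansion
tau = theta / theta_dot               # estimated time remaining until impact

for i in (100, 400, 800):
    print(f"t = {t[i]:.1f} s   true time to contact = {1.0 - t[i]:.2f} s   estimate = {tau[i]:.2f} s")
```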
Collapse
Affiliation(s)
- Phillip S. M. Skelton
- Centre for Defence Engineering Research and Training, College of Science and Engineering, Flinders University, 1284 South Road, Tonsley, South Australia 5042, Australia
| | - Anthony Finn
- Science, Technology, Engineering, and Mathematics, University of South Australia, 1 Mawson Lakes Boulevard, Mawson Lakes, South Australia 5095, Australia
| | - Russell S. A. Brinkworth
- Centre for Defence Engineering Research and Training, College of Science and Engineering, Flinders University, 1284 South Road, Tonsley, South Australia 5042, Australia
| |
Collapse
|
29
|
Gonzalez-Suarez AD, Zavatone-Veth JA, Chen J, Matulis CA, Badwan BA, Clark DA. Excitatory and inhibitory neural dynamics jointly tune motion detection. Curr Biol 2022; 32:3659-3675.e8. [PMID: 35868321 PMCID: PMC9474608 DOI: 10.1016/j.cub.2022.06.075] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/09/2022] [Revised: 05/03/2022] [Accepted: 06/24/2022] [Indexed: 11/26/2022]
Abstract
Neurons integrate excitatory and inhibitory signals to produce their outputs, but the role of input timing in this integration remains poorly understood. Motion detection is a paradigmatic example of this integration, since theories of motion detection rely on different delays in visual signals. These delays allow circuits to compare scenes at different times to calculate the direction and speed of motion. Different motion detection circuits have different velocity sensitivity, but it remains untested how the response dynamics of individual cell types drive this tuning. Here, we sped up or slowed down specific neuron types in Drosophila's motion detection circuit by manipulating ion channel expression. Altering the dynamics of individual neuron types upstream of motion detectors increased their sensitivity to fast or slow visual motion, exposing distinct roles for excitatory and inhibitory dynamics in tuning directional signals, including a role for the amacrine cell CT1. A circuit model constrained by functional data and anatomy qualitatively reproduced the observed tuning changes. Overall, these results reveal how excitatory and inhibitory dynamics together tune a canonical circuit computation.
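The link between input-filter dynamics and velocity tuning can be made concrete with the time-averaged response of a correlator-type detector to a drifting sinusoid, which peaks at a temporal frequency of 1/(2*pi*tau) for a first-order delay filter with time constant tau. The sketch below is this textbook calculation, not the data-constrained circuit model used in the paper.

```python
# Velocity (temporal-frequency) tuning of a correlator shifts when the delay filter
# is sped up or slowed down. Analytic mean response to a drifting sinusoid; illustrative.
import numpy as np

def mean_response(f_hz, tau, phase_offset=np.pi / 4):
    """Time-averaged Hassenstein-Reichardt output with a first-order low-pass delay."""
    w = 2 * np.pi * f_hz
    return np.sin(phase_offset) * w * tau / (1 + (w * tau) ** 2)

freqs = np.logspace(-1, 2, 301)                       # 0.1 to 100 Hz
for tau in (0.15, 0.05):                              # slower vs faster input dynamics
    peak = freqs[np.argmax(mean_response(freqs, tau))]
    print(f"tau = {tau:.2f} s -> tuning peak near {peak:.2f} Hz "
          f"(analytic optimum 1/(2*pi*tau) = {1 / (2 * np.pi * tau):.2f} Hz)")
```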
Collapse
Affiliation(s)
| | - Jacob A Zavatone-Veth
- Department of Physics, Harvard University, Cambridge, MA 02138, USA; Center for Brain Science, Harvard University, Cambridge, MA 02138, USA
| | - Juyue Chen
- Interdepartmental Neuroscience Program, Yale University, New Haven, CT 06511, USA
| | | | - Bara A Badwan
- School of Engineering and Applied Science, Yale University, New Haven, CT 06511, USA
| | - Damon A Clark
- Interdepartmental Neuroscience Program, Yale University, New Haven, CT 06511, USA; Department of Physics, Yale University, New Haven, CT 06511, USA; Department of Molecular, Cellular, and Developmental Biology, Yale University, New Haven, CT 06511, USA; Department of Neuroscience, Yale University, New Haven, CT 06511, USA.
| |
Collapse
|
30
|
Chen PJ, Li Y, Lee CH. Calcium Imaging of Neural Activity in Fly Photoreceptors. Cold Spring Harb Protoc 2022; 2022:Pdb.top107800. [PMID: 35641092 DOI: 10.1101/pdb.top107800] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/25/2022]
Abstract
Functional imaging methodologies allow researchers to simultaneously monitor the neural activities of all single neurons in a population, and this ability has led to great advances in neuroscience research. Taking advantage of a genetically tractable model organism, functional imaging in Drosophila provides opportunities to probe scientific questions that were previously unanswerable by electrophysiological recordings. Here, we introduce comprehensive protocols for two-photon calcium imaging in fly visual neurons. We also discuss some challenges in applying optical imaging techniques to study visual systems and consider the best practices for making comparisons between different neuron groups.
Collapse
Affiliation(s)
- Pei-Ju Chen
- Institute of Cellular and Organismic Biology, Academia Sinica, Taipei 11529, Taiwan, Republic of China
| | - Yan Li
- Institute of Cellular and Organismic Biology, Academia Sinica, Taipei 11529, Taiwan, Republic of China
| | - Chi-Hon Lee
- Institute of Cellular and Organismic Biology, Academia Sinica, Taipei 11529, Taiwan, Republic of China
| |
Collapse
|
31
|
Zhou B, Li Z, Kim S, Lafferty J, Clark DA. Shallow neural networks trained to detect collisions recover features of visual loom-selective neurons. eLife 2022; 11:72067. [PMID: 35023828 PMCID: PMC8849349 DOI: 10.7554/elife.72067] [Citation(s) in RCA: 8] [Impact Index Per Article: 2.7] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/08/2021] [Accepted: 01/11/2022] [Indexed: 11/13/2022] Open
Abstract
Animals have evolved sophisticated visual circuits to solve a vital inference problem: detecting whether or not a visual signal corresponds to an object on a collision course. Such events are detected by specific circuits sensitive to visual looming, or objects increasing in size. Various computational models have been developed for these circuits, but how the collision-detection inference problem itself shapes the computational structures of these circuits remains unknown. Here, inspired by the distinctive structures of LPLC2 neurons in the visual system of Drosophila, we build anatomically constrained shallow neural network models and train them to identify visual signals that correspond to impending collisions. Surprisingly, the optimization arrives at two distinct, opposing solutions, only one of which matches the actual dendritic weighting of LPLC2 neurons. Both solutions can solve the inference problem with high accuracy when the population size is large enough. The LPLC2-like solution reproduces experimentally observed LPLC2 neuron responses for many stimuli, and reproduces canonical tuning of loom-sensitive neurons, even though the models are never trained on neural data. Thus, LPLC2 neuron properties and tuning are predicted by optimizing an anatomically constrained neural network to detect impending collisions. More generally, these results illustrate how optimizing inference tasks that are important for an animal's perceptual goals can reveal and explain computational properties of specific sensory neurons.
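The flavor of the anatomically constrained units described above can be conveyed with a hand-built caricature: a unit that is excited by motion directed away from its receptive-field center and suppressed by motion directed toward it responds to looming expansion but not to wide-field translation. The weights below are set by hand, not learned as in the paper.

```python
# Hand-built loom-selective unit: pools the radial (outward) component of a local flow field.
import numpy as np

def loom_unit(flow_x, flow_y, xs, ys):
    """flow_x, flow_y: local motion estimates on a grid; xs, ys: receptive-field coordinates."""
    r = np.sqrt(xs**2 + ys**2) + 1e-6
    radial = (flow_x * xs + flow_y * ys) / r      # positive for outward, negative for inward motion
    return max(np.sum(radial), 0.0)               # outward motion excites, inward motion suppresses

n = 21
ys, xs = np.mgrid[-1:1:n * 1j, -1:1:n * 1j]
expansion = (xs, ys)                                        # flow pointing away from the center
translation = (np.ones_like(xs), np.zeros_like(ys))         # uniform rightward flow
print(loom_unit(*expansion, xs, ys))      # large response to looming expansion
print(loom_unit(*translation, xs, ys))    # zero response to wide-field translation
```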
Collapse
Affiliation(s)
- Baohua Zhou
- Department of Molecular, Cellular and Developmental Biology, Yale University, New Haven, United States
| | - Zifan Li
- Department of Statistics and Data Science, Yale University, New Haven, United States
| | - Sunnie Kim
- Department of Statistics and Data Science, Yale University, New Haven, United States
| | - John Lafferty
- Department of Statistics and Data Science, Yale University, New Haven, United States
| | - Damon A Clark
- Department of Molecular, Cellular and Developmental Biology, Yale University, New Haven, United States
| |
Collapse
|
32
|
Kohn JR, Portes JP, Christenson MP, Abbott LF, Behnia R. Flexible filtering by neural inputs supports motion computation across states and stimuli. Curr Biol 2021; 31:5249-5260.e5. [PMID: 34670114 DOI: 10.1016/j.cub.2021.09.061] [Citation(s) in RCA: 17] [Impact Index Per Article: 4.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/15/2021] [Revised: 08/10/2021] [Accepted: 09/22/2021] [Indexed: 01/05/2023]
Abstract
Sensory systems flexibly adapt their processing properties across a wide range of environmental and behavioral conditions. Such variable processing complicates attempts to extract a mechanistic understanding of sensory computations. This is evident in the highly constrained, canonical Drosophila motion detection circuit, where the core computation underlying direction selectivity is still debated despite extensive studies. Here we measured the filtering properties of neural inputs to the OFF motion-detecting T5 cell in Drosophila. We report state- and stimulus-dependent changes in the shape of these signals, which become more biphasic under specific conditions. Summing these inputs within the framework of a connectomic-constrained model of the circuit demonstrates that these shapes are sufficient to explain T5 responses to various motion stimuli. Thus, our stimulus- and state-dependent measurements reconcile motion computation with the anatomy of the circuit. These findings provide a clear example of how a basic circuit supports flexible sensory computation.
Collapse
Affiliation(s)
- Jessica R Kohn
- The Mortimer B. Zuckerman Mind Brain Behavior Institute, Department of Neuroscience, Columbia University, New York, NY 10027, USA
| | - Jacob P Portes
- The Mortimer B. Zuckerman Mind Brain Behavior Institute, Department of Neuroscience, Columbia University, New York, NY 10027, USA; Center for Theoretical Neuroscience, Columbia University, New York, NY, USA
| | - Matthias P Christenson
- The Mortimer B. Zuckerman Mind Brain Behavior Institute, Department of Neuroscience, Columbia University, New York, NY 10027, USA; Center for Theoretical Neuroscience, Columbia University, New York, NY, USA
| | - L F Abbott
- The Mortimer B. Zuckerman Mind Brain Behavior Institute, Department of Neuroscience, Columbia University, New York, NY 10027, USA; Center for Theoretical Neuroscience, Columbia University, New York, NY, USA
| | - Rudy Behnia
- The Mortimer B. Zuckerman Mind Brain Behavior Institute, Department of Neuroscience, Columbia University, New York, NY 10027, USA; Kavli Institute for Brain Science, Columbia University, New York, NY 10027, USA.
| |
Collapse
|
33
|
Nagel K. Motion vision: Pinning down motion computation in an ever-changing circuit. Curr Biol 2021; 31:R1523-R1525. [PMID: 34875241 DOI: 10.1016/j.cub.2021.10.002] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/16/2022]
Abstract
A new electrophysiological study of the Drosophila visual system, recording from columnar inputs to motion-detecting neurons, has provided new insights into the computations that underlie motion vision.
Collapse
Affiliation(s)
- Katherine Nagel
- Neuroscience Institute, NYU School of Medicine, 435 E. 30th Street, Room 1102, New York, NY 10016, USA.
| |
Collapse
|
34
|
Zhang Z, Xiao T, Qin X. Fly visual evolutionary neural network solving large-scale global optimization. INT J INTELL SYST 2021. [DOI: 10.1002/int.22564] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 12/23/2022]
Affiliation(s)
- Zhuhong Zhang
- Department of Big Data Science and Engineering, College of Big Data and Information Engineering, Guizhou University, Guiyang, Guizhou, China
| | - Tianyu Xiao
- Guizhou Provincial Characteristic Key Laboratory of System Optimization and Scientific Computation, Guizhou University, Guiyang, Guizhou, China
| | - Xiuchang Qin
- Guizhou Provincial Characteristic Key Laboratory of System Optimization and Scientific Computation, Guizhou University, Guiyang, Guizhou, China
| |
Collapse
|
35
|
James JV, Cazzolato BS, Grainger S, Wiederman SD. Nonlinear, neuronal adaptation in insect vision models improves target discrimination within repetitively moving backgrounds. BIOINSPIRATION & BIOMIMETICS 2021; 16:066015. [PMID: 34555824 DOI: 10.1088/1748-3190/ac2988] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 05/23/2021] [Accepted: 09/23/2021] [Indexed: 06/13/2023]
Abstract
Neurons which respond selectively to small moving targets, even against a cluttered background, have been identified in several insect species. To investigate what underlies these robust and highly selective responses, researchers have probed the neuronal circuitry in target-detecting, visual pathways. Observations in flies reveal nonlinear adaptation over time, composed of a fast onset and gradual decay. This adaptive processing is seen in both of the independent, parallel pathways encoding either luminance increments (ON channel) or decrements (OFF channel). The functional significance of this adaptive phenomenon has not been determined from physiological studies, though the asymmetrical time course suggests a role in suppressing responses to repetitive stimuli. We tested this possibility by comparing an implementation of fast adaptation against alternatives, using a model of insect 'elementary small target motion detectors'. We conducted target-detecting simulations on various natural backgrounds, that were shifted via several movement profiles (and target velocities). Using performance metrics, we confirmed that the fast adaptation observed in neuronal systems enhances target detection against a repetitively moving background. Such background movement would be encountered via natural ego-motion as the insect travels through the world. These findings show that this form of nonlinear, fast-adaptation (suitably implementable via cellular biophysics) plays a role analogous to background subtraction techniques in conventional computer vision.
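The fast-onset, slow-decay adaptation described above can be sketched as a state variable that tracks its input with a short time constant when the input rises and a long one when it falls; the output is the rectified difference between input and state. Time constants and stimulus timing below are assumptions chosen only to show the suppression of a repeated stimulus.

```python
# Asymmetric (fast-onset, slow-decay) adaptation suppressing a repeated flash. Illustrative.
import numpy as np

def adapt(signal, tau_rise=0.02, tau_decay=1.0, dt=1e-3):
    state = np.zeros_like(signal)
    for i in range(1, len(signal)):
        tau = tau_rise if signal[i] > state[i - 1] else tau_decay
        state[i] = state[i - 1] + dt / tau * (signal[i] - state[i - 1])
    return np.maximum(signal - state, 0.0)        # subtractive adaptation, rectified output

dt = 1e-3
flash = np.zeros(3000)
flash[500:700] = 1.0       # first presentation (200 ms)
flash[1500:1700] = 1.0     # repeat 1 s later, before the adaptation state has decayed
out = adapt(flash, dt=dt)
print(out[500:700].sum() * dt, out[1500:1700].sum() * dt)   # the second response is smaller
```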
Collapse
Affiliation(s)
- John V James
- School of Mechanical Engineering, University of Adelaide, Adelaide SA, Australia
- Adelaide Medical School, University of Adelaide, Adelaide SA, Australia
| | - Benjamin S Cazzolato
- School of Mechanical Engineering, University of Adelaide, Adelaide SA, Australia
| | - Steven Grainger
- School of Mechanical Engineering, University of Adelaide, Adelaide SA, Australia
| | | |
Collapse
|
36
|
Mano O, Creamer MS, Badwan BA, Clark DA. Predicting individual neuron responses with anatomically constrained task optimization. Curr Biol 2021; 31:4062-4075.e4. [PMID: 34324832 DOI: 10.1016/j.cub.2021.06.090] [Citation(s) in RCA: 4] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/01/2021] [Revised: 05/24/2021] [Accepted: 06/29/2021] [Indexed: 01/28/2023]
Abstract
Artificial neural networks trained to solve sensory tasks can develop statistical representations that match those in biological circuits. However, it remains unclear whether they can reproduce properties of individual neurons. Here, we investigated how artificial networks predict individual neuron properties in the visual motion circuits of the fruit fly Drosophila. We trained anatomically constrained networks to predict movement in natural scenes, solving the same inference problem as fly motion detectors. Units in the artificial networks adopted many properties of analogous individual neurons, even though they were not explicitly trained to match these properties. Among these properties was the split into ON and OFF motion detectors, which is not predicted by classical motion detection models. The match between model and neurons was closest when models were trained to be robust to noise. These results demonstrate how anatomical, task, and noise constraints can explain properties of individual neurons in a small neural network.
Collapse
Affiliation(s)
- Omer Mano
- Department of Molecular Cellular and Developmental Biology, Yale University, New Haven, CT 06511, USA; Department of Neuroscience, Yale University, New Haven, CT 06511, USA
| | - Matthew S Creamer
- Interdepartmental Neuroscience Program, Yale University, New Haven, CT 06511, USA
| | - Bara A Badwan
- School of Engineering and Applied Science, Yale University, New Haven, CT 06511, USA
| | - Damon A Clark
- Department of Molecular Cellular and Developmental Biology, Yale University, New Haven, CT 06511, USA; Department of Neuroscience, Yale University, New Haven, CT 06511, USA; Interdepartmental Neuroscience Program, Yale University, New Haven, CT 06511, USA; Department of Physics, Yale University, New Haven, CT 06511, USA.
| |
Collapse
|
37
|
Li Y, Chen PJ, Lin TY, Ting CY, Muthuirulan P, Pursley R, Ilić M, Pirih P, Drews MS, Menon KP, Zinn KG, Pohida T, Borst A, Lee CH. Neural mechanism of spatio-chromatic opponency in the Drosophila amacrine neurons. Curr Biol 2021; 31:3040-3052.e9. [PMID: 34033749 DOI: 10.1016/j.cub.2021.04.068] [Citation(s) in RCA: 19] [Impact Index Per Article: 4.8] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/23/2021] [Revised: 04/26/2021] [Accepted: 04/27/2021] [Indexed: 12/18/2022]
Abstract
Visual animals detect spatial variations of light intensity and wavelength composition. Opponent coding is a common strategy for reducing information redundancy. Neurons equipped with both spatial and spectral opponency have been identified in vertebrates but not yet in insects. The Drosophila amacrine neuron Dm8 was recently reported to show color opponency. Here, we demonstrate Dm8 exhibits spatio-chromatic opponency. Antagonistic convergence of the direct input from the UV-sensing R7s and indirect input from the broadband receptors R1-R6 through Tm3 and Mi1 is sufficient to confer Dm8's UV/Vis (ultraviolet/visible light) opponency. Using high resolution monochromatic stimuli, we show the pale and yellow subtypes of Dm8s, inheriting retinal mosaic characteristics, have distinct spectral tuning properties. Using 2D white-noise stimulus and reverse correlation analysis, we found that the UV receptive field (RF) of Dm8 has a center-inhibition/surround-excitation structure. In the absence of UV-sensing R7 inputs, the polarity of the RF is inverted owing to the excitatory input from the broadband photoreceptors R1-R6. Using a new synGRASP method based on endogenous neurotransmitter receptors, we show that neighboring Dm8s form mutual inhibitory connections mediated by the glutamate-gated chloride channel GluClα, which is essential for both Dm8's spatial opponency and animals' phototactic behavior. Our study shows spatio-chromatic opponency could arise in the early visual stage, suggesting a common information processing strategy in both invertebrates and vertebrates.
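For readers unfamiliar with the reverse-correlation analysis mentioned above, the essential computation is a response-weighted average of white-noise frames, which recovers the linear receptive field. The sketch below runs it on a simulated center-surround unit; the "true" filter and noise level are assumptions, not the measured Dm8 receptive field.

```python
# Reverse correlation (response-weighted average) recovering a simulated receptive field.
import numpy as np

rng = np.random.default_rng(0)
n_frames, size = 20000, 9
ys, xs = np.mgrid[-4:5, -4:5]
true_rf = np.exp(-(xs**2 + ys**2) / 4.0) - 0.5 * np.exp(-(xs**2 + ys**2) / 16.0)  # center-surround

stimuli = rng.standard_normal((n_frames, size, size))              # 2D white-noise frames
responses = stimuli.reshape(n_frames, -1) @ true_rf.ravel()        # linear model responses
responses += 0.5 * rng.standard_normal(n_frames)                   # measurement noise

estimated_rf = (responses[:, None] * stimuli.reshape(n_frames, -1)).mean(axis=0).reshape(size, size)
corr = np.corrcoef(estimated_rf.ravel(), true_rf.ravel())[0, 1]
print(f"correlation between estimated and true receptive field: {corr:.2f}")
```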
Collapse
Affiliation(s)
- Yan Li
- Institute of Cellular and Organismic Biology, Academia Sinica, Taipei, Taiwan, Republic of China
| | - Pei-Ju Chen
- Institute of Cellular and Organismic Biology, Academia Sinica, Taipei, Taiwan, Republic of China
| | - Tzu-Yang Lin
- Institute of Cellular and Organismic Biology, Academia Sinica, Taipei, Taiwan, Republic of China
| | - Chun-Yuan Ting
- Section on Neuronal Connectivity, Eunice Kennedy Shriver National Institute of Child Health and Human Development, National Institutes of Health, Bethesda, MD 20892, USA
| | - Pushpanathan Muthuirulan
- Section on Neuronal Connectivity, Eunice Kennedy Shriver National Institute of Child Health and Human Development, National Institutes of Health, Bethesda, MD 20892, USA
| | - Randall Pursley
- Signal Processing and Instrumentation Section, Division of Computational Bioscience, Center for Information Technology, National Institutes of Health, Bethesda, MD 20892, USA
| | - Marko Ilić
- Department of Biology, Biotechnical Faculty, University of Ljubljana, 1000 Ljubljana, Slovenia
| | - Primož Pirih
- Department of Biology, Biotechnical Faculty, University of Ljubljana, 1000 Ljubljana, Slovenia
| | - Michael S Drews
- Department Circuits-Computation-Models, Max-Planck-Institute of Neurobiology, 82152 Martinsried, Germany
| | - Kaushiki P Menon
- Division of Biology and Biological Engineering, California Institute of Technology, Pasadena, CA 91125, USA
| | - Kai G Zinn
- Division of Biology and Biological Engineering, California Institute of Technology, Pasadena, CA 91125, USA
| | - Thomas Pohida
- Signal Processing and Instrumentation Section, Division of Computational Bioscience, Center for Information Technology, National Institutes of Health, Bethesda, MD 20892, USA
| | - Alexander Borst
- Department Circuits-Computation-Models, Max-Planck-Institute of Neurobiology, 82152 Martinsried, Germany
| | - Chi-Hon Lee
- Institute of Cellular and Organismic Biology, Academia Sinica, Taipei, Taiwan, Republic of China.
| |
Collapse
|
38
|
Ding J, Chen A, Chung J, Acaron Ledesma H, Wu M, Berson DM, Palmer SE, Wei W. Spatially displaced excitation contributes to the encoding of interrupted motion by a retinal direction-selective circuit. eLife 2021; 10:e68181. [PMID: 34096504 PMCID: PMC8211448 DOI: 10.7554/elife.68181] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/08/2021] [Accepted: 06/06/2021] [Indexed: 12/19/2022] Open
Abstract
Spatially distributed excitation and inhibition collectively shape a visual neuron's receptive field (RF) properties. In the direction-selective circuit of the mammalian retina, the role of strong null-direction inhibition of On-Off direction-selective ganglion cells (On-Off DSGCs) on their direction selectivity is well-studied. However, how excitatory inputs influence the On-Off DSGC's visual response is underexplored. Here, we report that On-Off DSGCs have a spatially displaced glutamatergic receptive field along their horizontal preferred-null motion axes. This displaced receptive field contributes to DSGC null-direction spiking during interrupted motion trajectories. Theoretical analyses indicate that population responses during interrupted motion may help populations of On-Off DSGCs signal the spatial location of moving objects in complex, naturalistic visual environments. Our study highlights that the direction-selective circuit exploits separate sets of mechanisms under different stimulus conditions, and these mechanisms may help encode multiple visual features.
Collapse
Affiliation(s)
- Jennifer Ding
- Committee on Neurobiology Graduate Program, The University of Chicago, Chicago, United States
- Department of Neurobiology, The University of Chicago, Chicago, United States
| | - Albert Chen
- Department of Organismal Biology, The University of Chicago, Chicago, United States
| | - Janet Chung
- Department of Neurobiology, The University of Chicago, Chicago, United States
| | - Hector Acaron Ledesma
- Graduate Program in Biophysical Sciences, The University of Chicago, Chicago, United States
| | - Mofei Wu
- Department of Neurobiology, The University of Chicago, Chicago, United States
| | - David M Berson
- Department of Neuroscience and Carney Institute for Brain Science, Brown University, Providence, United States
| | - Stephanie E Palmer
- Committee on Neurobiology Graduate Program, The University of Chicago, Chicago, United States
- Department of Organismal Biology, The University of Chicago, Chicago, United States
- Grossman Institute for Neuroscience, Quantitative Biology and Human Behavior, The University of Chicago, Chicago, United States
| | - Wei Wei
- Committee on Neurobiology Graduate Program, The University of Chicago, Chicago, United States
- Department of Neurobiology, The University of Chicago, Chicago, United States
- Grossman Institute for Neuroscience, Quantitative Biology and Human Behavior, The University of Chicago, Chicago, United States
| |
Collapse
|
39
|
Maximally efficient prediction in the early fly visual system may support evasive flight maneuvers. PLoS Comput Biol 2021; 17:e1008965. [PMID: 34014926 PMCID: PMC8136689 DOI: 10.1371/journal.pcbi.1008965] [Citation(s) in RCA: 7] [Impact Index Per Article: 1.8] [Reference Citation Analysis] [Abstract] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/21/2020] [Accepted: 04/13/2021] [Indexed: 11/20/2022] Open
Abstract
The visual system must make predictions to compensate for inherent delays in its processing. Yet little is known, mechanistically, about how prediction aids natural behaviors. Here, we show that despite a 20-30ms intrinsic processing delay, the vertical motion sensitive (VS) network of the blowfly achieves maximally efficient prediction. This prediction enables the fly to fine-tune its complex, yet brief, evasive flight maneuvers according to its initial ego-rotation at the time of detection of the visual threat. Combining a rich database of behavioral recordings with detailed compartmental modeling of the VS network, we further show that the VS network has axonal gap junctions that are critical for optimal prediction. During evasive maneuvers, a VS subpopulation that directly innervates the neck motor center can convey predictive information about the fly’s future ego-rotation, potentially crucial for ongoing flight control. These results suggest a novel sensory-motor pathway that links sensory prediction to behavior. Survival-critical behaviors shape neural circuits to translate sensory information into strikingly fast predictions, e.g. in escaping from a predator faster than the system’s processing delay. We show that the fly visual system implements fast and accurate prediction of its visual experience. This provides crucial information for directing fast evasive maneuvers that unfold over just 40ms. Our work shows how this fast prediction is implemented, mechanistically, and suggests the existence of a novel sensory-motor pathway from the fly visual system to a wing steering motor neuron. Echoing and amplifying previous work in the retina, our work hypothesizes that the efficient encoding of predictive information is a universal design principle supporting fast, natural behaviors.
Collapse
|
40
|
Pagni M, Haikala V, Oberhauser V, Meyer PB, Reiff DF, Schnaitmann C. Interaction of “chromatic” and “achromatic” circuits in Drosophila color opponent processing. Curr Biol 2021; 31:1687-1698.e4. [DOI: 10.1016/j.cub.2021.01.105] [Citation(s) in RCA: 15] [Impact Index Per Article: 3.8] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/23/2020] [Revised: 01/22/2021] [Accepted: 01/28/2021] [Indexed: 02/07/2023]
|
41
|
Fendl S, Vieira RM, Borst A. Conditional protein tagging methods reveal highly specific subcellular distribution of ion channels in motion-sensing neurons. eLife 2020; 9:62953. [PMID: 33079061 PMCID: PMC7655108 DOI: 10.7554/elife.62953] [Citation(s) in RCA: 23] [Impact Index Per Article: 4.6] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/10/2020] [Accepted: 10/14/2020] [Indexed: 11/25/2022] Open
Abstract
Neurotransmitter receptors and ion channels shape the biophysical properties of neurons, from the sign of the response mediated by neurotransmitter receptors to the dynamics shaped by voltage-gated ion channels. Therefore, knowing the localizations and types of receptors and channels present in neurons is fundamental to our understanding of neural computation. Here, we developed two approaches to visualize the subcellular localization of specific proteins in Drosophila: The flippase-dependent expression of GFP-tagged receptor subunits in single neurons and ‘FlpTag’, a versatile new tool for the conditional labelling of endogenous proteins. Using these methods, we investigated the subcellular distribution of the receptors GluClα, Rdl, and Dα7 and the ion channels para and Ih in motion-sensing T4/T5 neurons of the Drosophila visual system. We discovered a strictly segregated subcellular distribution of these proteins and a sequential spatial arrangement of glutamate, acetylcholine, and GABA receptors along the dendrite that matched the previously reported EM-reconstructed synapse distributions.
Collapse
Affiliation(s)
- Sandra Fendl
- Max Planck Institute of Neurobiology, Martinsried, Germany; Graduate School of Systemic Neurosciences, LMU Munich, Martinsried, Germany
| | | | - Alexander Borst
- Max Planck Institute of Neurobiology, Martinsried, Germany; Graduate School of Systemic Neurosciences, LMU Munich, Martinsried, Germany
| |
Collapse
|
42
|
Fu Q, Yue S. Modelling Drosophila motion vision pathways for decoding the direction of translating objects against cluttered moving backgrounds. BIOLOGICAL CYBERNETICS 2020; 114:443-460. [PMID: 32623517 PMCID: PMC7554016 DOI: 10.1007/s00422-020-00841-x] [Citation(s) in RCA: 10] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Subscribe] [Scholar Register] [Received: 03/31/2020] [Accepted: 06/19/2020] [Indexed: 06/03/2023]
Abstract
Decoding the direction of translating objects in front of cluttered moving backgrounds, accurately and efficiently, is still a challenging problem. In nature, lightweight and low-powered flying insects apply motion vision to detect a moving target in highly variable environments during flight, which makes them excellent paradigms for learning motion perception strategies. This paper investigates the fruit fly Drosophila motion vision pathways and presents computational modelling based on cutting-edge physiological research. The proposed visual system model features bio-plausible ON and OFF pathways and wide-field horizontal-sensitive (HS) and vertical-sensitive (VS) systems. The main contributions of this research are twofold: (1) the proposed model articulates the formation of both direction-selective and direction-opponent responses, revealed as principal features of motion perception neural circuits, in a feed-forward manner; (2) it also shows robust direction selectivity to translating objects in front of cluttered moving backgrounds, via the modelling of spatiotemporal dynamics, including a combination of motion pre-filtering mechanisms and ensembles of local correlators inside both the ON and OFF pathways, which works effectively to suppress irrelevant background motion or distractors, and to improve the dynamic response. Accordingly, the direction of translating objects is decoded as global responses of both the HS and VS systems, with positive or negative output indicating preferred-direction or null-direction translation. The experiments have verified the effectiveness of the proposed neural system model and demonstrated its responsive preference to faster-moving, higher-contrast, and larger-size targets embedded in cluttered moving backgrounds.
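The ON/OFF organization summarized above can be miniaturized into a few lines: brightness increments and decrements are rectified into separate channels, each channel drives its own correlators, and their outputs are pooled into a wide-field, opponent signal whose sign reports the direction of translation. This is an illustrative toy with arbitrary filters, not the authors' full model.

```python
# Toy ON/OFF motion pathway feeding a wide-field, opponent (HS-like) readout. Illustrative.
import numpy as np

def lowpass(x, tau, dt=1e-3):
    y = np.zeros_like(x)
    for i in range(1, len(x)):
        y[i] = y[i - 1] + dt / tau * (x[i] - y[i - 1])
    return y

def highpass(x, tau=0.1, dt=1e-3):
    return x - lowpass(x, tau, dt)

def correlator(a, b, tau=0.05, dt=1e-3):
    return np.mean(lowpass(a, tau, dt) * b - lowpass(b, tau, dt) * a)

def hs_like(left, right):
    on_l, on_r = np.maximum(highpass(left), 0), np.maximum(highpass(right), 0)     # ON channel
    off_l, off_r = np.maximum(-highpass(left), 0), np.maximum(-highpass(right), 0)  # OFF channel
    return correlator(on_l, on_r) + correlator(off_l, off_r)

dt = 1e-3
t = np.arange(0.0, 2.0, dt)
rightward = hs_like(np.sin(2 * np.pi * t), np.sin(2 * np.pi * t - np.pi / 4))
leftward = hs_like(np.sin(2 * np.pi * t - np.pi / 4), np.sin(2 * np.pi * t))
print(rightward, leftward)    # opposite signs for the two directions of translation
```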
Collapse
Affiliation(s)
- Qinbing Fu
- Machine Life and Intelligence Research Centre, Guangzhou University, Guangzhou, China.
- Computational Intelligence Lab/Lincoln Centre for Autonomous Systems, University of Lincoln, Lincoln, UK.
| | - Shigang Yue
- Machine Life and Intelligence Research Centre, Guangzhou University, Guangzhou, China.
- Computational Intelligence Lab/Lincoln Centre for Autonomous Systems, University of Lincoln, Lincoln, UK.
| |
Collapse
|
43
|
Zavatone-Veth JA, Badwan BA, Clark DA. A minimal synaptic model for direction selective neurons in Drosophila. J Vis 2020; 20:2. [PMID: 32040161 PMCID: PMC7343402 DOI: 10.1167/jov.20.2.2] [Citation(s) in RCA: 20] [Impact Index Per Article: 4.0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 01/07/2023] Open
Abstract
Visual motion estimation is a canonical neural computation. In Drosophila, recent advances have identified anatomic and functional circuitry underlying direction-selective computations. Models with varying levels of abstraction have been proposed to explain specific experimental results but have rarely been compared across experiments. Here we use the wealth of available anatomical and physiological data to construct a minimal, biophysically inspired synaptic model for Drosophila’s first-order direction-selective T4 cells. We show how this model relates mathematically to classical models of motion detection, including the Hassenstein-Reichardt correlator model. We used numerical simulation to test how well this synaptic model could reproduce measurements of T4 cells across many datasets and stimulus modalities. These comparisons include responses to sinusoid gratings, to apparent motion stimuli, to stochastic stimuli, and to natural scenes. Without fine-tuning, this model sufficed to reproduce many, but not all, response properties of T4 cells. Since this model is flexible and based on straightforward biophysical properties, it provides an extensible framework for developing a mechanistic understanding of T4 neural response properties. Moreover, it can be used to assess the sufficiency of simple biophysical mechanisms to describe features of the direction-selective computation and identify where our understanding must be improved.
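One generic way to see how a conductance-based synaptic model can relate to correlator-type models (a standard weak-conductance expansion, offered as an illustration rather than the paper's specific derivation; the symbols g_k, E_k, and g_leak are generic synaptic conductances, reversal potentials, and leak) is:

$$
V \;=\; \frac{\sum_k g_k E_k}{g_{\mathrm{leak}} + \sum_k g_k}
\;\approx\; \frac{1}{g_{\mathrm{leak}}} \sum_k g_k E_k
\;-\; \frac{1}{g_{\mathrm{leak}}^{2}} \Big(\sum_k g_k E_k\Big)\Big(\sum_j g_j\Big) \;+\; \cdots
$$

The second-order term contains products of conductances driven by spatially and temporally offset inputs; these are the multiplicative interactions that classical motion detection models, such as the Hassenstein-Reichardt correlator, posit explicitly.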
Collapse
|
44
|
|
45
|
Schuetzenberger A, Borst A. Seeing Natural Images through the Eye of a Fly with Remote Focusing Two-Photon Microscopy. iScience 2020; 23:101170. [PMID: 32502966 PMCID: PMC7270611 DOI: 10.1016/j.isci.2020.101170] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/18/2020] [Revised: 04/02/2020] [Accepted: 05/12/2020] [Indexed: 11/30/2022] Open
Abstract
Visual systems of many animals, including the fruit fly Drosophila, represent the surrounding space as 2D maps, formed by populations of neurons. Advanced genetic tools make the fly visual system especially well accessible. However, in typical in vivo preparations for two-photon calcium imaging, relatively few neurons can be recorded at the same time. Here, we present an extension to a conventional two-photon microscope, based on remote focusing, which enables real-time rotation of the imaging plane, and thus flexible alignment to cellular structures, without resolution or speed trade-off. We simultaneously record from over 100 neighboring cells spanning the 2D retinotopic map. We characterize its representation of moving natural images, which we find is comparable to noise predictions. Our method increases throughput 10-fold and allows us to visualize a significant fraction of the fly's visual field. Furthermore, our system can be applied in general for a more flexible investigation of neural circuits.
Collapse
Affiliation(s)
- Anna Schuetzenberger
- Department Circuits - Computation - Models, Max-Planck-Institute of Neurobiology, 82152 Planegg, Germany; Graduate School of Systemic Neurosciences, Ludwig-Maximilians-Universität, 82152 Planegg, Germany.
| | - Alexander Borst
- Department Circuits - Computation - Models, Max-Planck-Institute of Neurobiology, 82152 Planegg, Germany; Graduate School of Systemic Neurosciences, Ludwig-Maximilians-Universität, 82152 Planegg, Germany.
| |
Collapse
|
46
|
Wienecke CFR, Clandinin TR. Drosophila Vision: An Eye for Change. Curr Biol 2020; 30:R66-R68. [DOI: 10.1016/j.cub.2019.11.069] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 10/25/2022]
|