1
Rucci M, Ahissar E, Burr DC, Kagan I, Poletti M, Victor JD. The visual system does not operate like a camera. J Vis 2025; 25:2. PMID: 40035715. DOI: 10.1167/jov.25.3.2.
Affiliation(s)
- Michele Rucci
- Department of Brain and Cognitive Sciences, University of Rochester, Rochester, NY, USA
- Center for Visual Science, University of Rochester, Rochester, NY, USA
- Ehud Ahissar
- Department of Brain Sciences, Weizmann Institute of Science, Rehovot, Israel
- David C Burr
- University of Florence, Italy
- Institute of Neuroscience, National Research Council, Pisa, Italy
- Igor Kagan
- Decision and Awareness Group, Cognitive Neuroscience Laboratory, German Primate Center, Goettingen, Germany
- Martina Poletti
- Department of Brain and Cognitive Sciences, University of Rochester, Rochester, NY, USA
- Center for Visual Science, University of Rochester, Rochester, NY, USA
- Jonathan D Victor
- Feil Family Brain and Mind Research Institute, Weill Cornell Medical College, New York, NY, USA
2
Nyström M, Hooge ITC, Hessels RS, Andersson R, Hansen DW, Johansson R, Niehorster DC. The fundamentals of eye tracking part 3: How to choose an eye tracker. Behav Res Methods 2025; 57:67. PMID: 39843609. PMCID: PMC11754381. DOI: 10.3758/s13428-024-02587-x.
Abstract
There is an abundance of commercial and open-source eye trackers available for researchers interested in gaze and eye movements. Which aspects should be considered when choosing an eye tracker? This paper describes what distinguishes different types of eye trackers, discusses their suitability for different research questions, and highlights the questions researchers should ask themselves to make an informed choice.
Affiliation(s)
- Marcus Nyström
- Lund University Humanities Lab, Box 201, SE-221 00 Lund, Sweden
- Ignace T C Hooge
- Experimental Psychology, Helmholtz Institute, Utrecht University, Utrecht, The Netherlands
- Roy S Hessels
- Experimental Psychology, Helmholtz Institute, Utrecht University, Utrecht, The Netherlands
- Diederick C Niehorster
- Lund University Humanities Lab, Box 201, SE-221 00 Lund, Sweden
- Department of Psychology, Lund University, Lund, Sweden
3
Niehorster DC, Nyström M, Hessels RS, Andersson R, Benjamins JS, Hansen DW, Hooge ITC. The fundamentals of eye tracking part 4: Tools for conducting an eye tracking study. Behav Res Methods 2025; 57:46. PMID: 39762687. PMCID: PMC11703944. DOI: 10.3758/s13428-024-02529-7.
Abstract
Researchers using eye tracking are heavily dependent on software and hardware tools to perform their studies, from recording eye tracking data and visualizing it, to processing and analyzing it. This article provides an overview of available tools for research using eye trackers and discusses considerations to make when choosing which tools to adopt for one's study.
Affiliation(s)
- Diederick C Niehorster
- Lund University Humanities Lab and Department of Psychology, Lund University, Lund, Sweden
- Marcus Nyström
- Lund University Humanities Lab, Lund University, Lund, Sweden
- Roy S Hessels
- Experimental Psychology, Helmholtz Institute, Utrecht University, Utrecht, The Netherlands
- Jeroen S Benjamins
- Experimental Psychology, Helmholtz Institute & Social, Health and Organizational Psychology, Utrecht University, Utrecht, The Netherlands
- Dan Witzner Hansen
- Eye Information Laboratory, IT University of Copenhagen, Copenhagen, Denmark
- Ignace T C Hooge
- Experimental Psychology, Helmholtz Institute, Utrecht University, Utrecht, The Netherlands
4
Jenks SK, Carrasco M, Poletti M. Asymmetries in foveal vision. bioRxiv 2024:2024.12.20.629715 [Preprint]. PMID: 39763996. PMCID: PMC11702834. DOI: 10.1101/2024.12.20.629715.
Abstract
Visual perception is characterized by known asymmetries in the visual field; humans' visual sensitivity is higher along the horizontal than the vertical meridian, and along the lower than the upper vertical meridian. These asymmetries decrease with decreasing eccentricity from the periphery to the center of gaze, suggesting that they may be absent in the 1-deg foveola, the retinal region used to explore scenes at high resolution. Using high-precision eye tracking and gaze-contingent display, which allow for accurate control over the stimulated foveolar location despite the continuous eye motion at fixation, we investigated fine visual discrimination at different isoeccentric locations across the foveola and parafovea. Although the tested foveolar locations were only 0.3 deg away from the center of gaze, we show that, similar to more eccentric locations, humans are more sensitive to stimuli presented along the horizontal than the vertical meridian. Whereas the magnitude of this asymmetry is reduced in the foveola, the magnitude of the vertical meridian asymmetry is comparable but, interestingly, reversed: objects presented slightly above the center of gaze are more easily discerned than when presented at the same eccentricity below the center of gaze. Therefore, far from being uniform, as often assumed, foveolar vision is characterized by perceptual asymmetries. Further, these asymmetries differ not only in magnitude but also in direction compared to those present just ~4 deg away from the center of gaze, resulting in overall different foveal and extrafoveal perceptual fields.
Affiliation(s)
- Samantha K. Jenks
- Department of Brain and Cognitive Sciences, University of Rochester
- Center for Visual Science, University of Rochester
- Marisa Carrasco
- Department of Psychology, New York University
- Center for Neural Science, New York University
- Martina Poletti
- Department of Brain and Cognitive Sciences, University of Rochester
- Department of Neuroscience, University of Rochester
- Center for Visual Science, University of Rochester
5
Moon B, Linebach G, Yang A, Jenks SK, Rucci M, Poletti M, Rolland JP. High refresh rate display for natural monocular viewing in AOSLO psychophysics experiments. Opt Express 2024; 32:31142-31161. PMID: 39573257. PMCID: PMC11595291. DOI: 10.1364/oe.529199.
Abstract
By combining an external display operating at 360 frames per second with an adaptive optics scanning laser ophthalmoscope (AOSLO) for human foveal imaging, we demonstrate color stimulus delivery at high spatial and temporal resolution in AOSLO psychophysics experiments. A custom pupil relay enables viewing of the stimulus through a 3-mm effective pupil diameter and provides refractive error correction from -8 to +4 diopters. Performance of the assembled and aligned pupil relay was validated by measuring the wavefront error across the field of view and correction range, and the as-built Strehl ratio was 0.64 or better. High-acuity stimuli were rendered on the external display and imaged through the pupil relay to demonstrate that spatial frequencies up to 54 cycles per degree, corresponding to 20/11 visual acuity, are resolved. The completed external display was then used to render fixation markers across the field of view of the monitor, and a continuous retinal montage spanning 9.4 by 5.4 degrees of visual angle was acquired with the AOSLO. We conducted eye-tracking experiments during free-viewing and high-acuity tasks with polychromatic images presented on the external display. Sub-arcminute eye position uncertainty was achieved over a 1.5 by 1.5-degree trackable range, enabling precise localization of the line of sight on the stimulus while simultaneously imaging the fine structure of the human central fovea. This high refresh rate display overcomes the temporal, spectral, and field of view limitations of AOSLO-based stimulus presentation, enabling natural monocular viewing of stimuli in psychophysics experiments conducted with AOSLO.
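The correspondence the authors draw between 54 cycles per degree and 20/11 visual acuity follows from the standard convention that 20/20 Snellen acuity corresponds to resolving 30 cycles per degree. A minimal sketch of that conversion (the function name is illustrative, not from the paper):

```python
def cpd_to_snellen_denominator(cpd, cpd_at_20_20=30.0):
    """Convert a resolvable spatial frequency (cycles per degree) to a
    Snellen denominator, assuming 20/20 vision resolves 30 cpd."""
    return 20.0 * cpd_at_20_20 / cpd

# 54 cpd -> 600/54, i.e. roughly 20/11 acuity as stated in the abstract
print(round(cpd_to_snellen_denominator(54), 1))
```

Running this prints 11.1, matching the 20/11 figure quoted for the 54 cpd stimuli.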
Affiliation(s)
- Benjamin Moon
- The Institute of Optics, University of Rochester, Rochester, NY 14627, USA
- Center for Visual Science, University of Rochester, Rochester, NY 14627, USA
- Glory Linebach
- The Institute of Optics, University of Rochester, Rochester, NY 14627, USA
- Center for Visual Science, University of Rochester, Rochester, NY 14627, USA
- Angelina Yang
- The Institute of Optics, University of Rochester, Rochester, NY 14627, USA
- Center for Visual Science, University of Rochester, Rochester, NY 14627, USA
- Samantha K. Jenks
- Center for Visual Science, University of Rochester, Rochester, NY 14627, USA
- Department of Brain and Cognitive Sciences, University of Rochester, Rochester, NY 14627, USA
- Michele Rucci
- Center for Visual Science, University of Rochester, Rochester, NY 14627, USA
- Department of Brain and Cognitive Sciences, University of Rochester, Rochester, NY 14627, USA
- Martina Poletti
- Center for Visual Science, University of Rochester, Rochester, NY 14627, USA
- Department of Brain and Cognitive Sciences, University of Rochester, Rochester, NY 14627, USA
- Department of Neuroscience, University of Rochester, Rochester, NY 14627, USA
- Jannick P. Rolland
- The Institute of Optics, University of Rochester, Rochester, NY 14627, USA
- Center for Visual Science, University of Rochester, Rochester, NY 14627, USA
- Department of Biomedical Engineering, University of Rochester, Rochester, NY 14627, USA
6
Sadeghi R, Ressmeyer R, Yates J, Otero-Millan J. OpenIris - An Open Source Framework for Video-Based Eye-Tracking Research and Development. bioRxiv 2024:2024.02.27.582401 [Preprint]. PMID: 38463977. PMCID: PMC10925248. DOI: 10.1101/2024.02.27.582401.
Abstract
Eye tracking is an essential tool in many fields, yet existing solutions are often limited for customized applications due to cost or lack of flexibility. We present OpenIris, an adaptable and user-friendly open-source framework for video-based eye tracking. OpenIris is developed in C# with a modular design that allows further extension and customization through plugins for different hardware systems, tracking, and calibration pipelines. It can be remotely controlled via a network interface from other devices or programs. Eye movements can be recorded online from a camera stream or offline by post-processing recorded videos. Example plugins have been developed to track eye motion in 3D, including torsion. Currently implemented binocular pupil-tracking pipelines can achieve frame rates of more than 500 Hz. With the OpenIris framework, we aim to fill a gap in the research tools available for high-precision and high-speed eye tracking, especially in environments that require custom solutions not currently well served by commercial eye trackers.
Affiliation(s)
- Roksana Sadeghi
- Herbert Wertheim School of Optometry and Vision Science, University of California, Berkeley, California, USA
- Ryan Ressmeyer
- Bioengineering, University of Washington, Seattle, Washington, USA
- Jacob Yates
- Herbert Wertheim School of Optometry and Vision Science, University of California, Berkeley, California, USA
- Jorge Otero-Millan
- Herbert Wertheim School of Optometry and Vision Science, University of California, Berkeley, California, USA
- Department of Neurology, Johns Hopkins University, Baltimore, Maryland, USA
7
Yang B, Intoy J, Rucci M. Eye blinks as a visual processing stage. Proc Natl Acad Sci U S A 2024; 121:e2310291121. PMID: 38564641. PMCID: PMC11009678. DOI: 10.1073/pnas.2310291121.
Abstract
Humans blink their eyes frequently during normal viewing, more often than it seems necessary for keeping the cornea well lubricated. Since the closure of the eyelid disrupts the image on the retina, eye blinks are commonly assumed to be detrimental to visual processing. However, blinks also provide luminance transients rich in spatial information to neural pathways highly sensitive to temporal changes. Here, we report that the luminance modulations from blinks enhance visual sensitivity. By coupling high-resolution eye tracking in human observers with modeling of blink transients and spectral analysis of visual input signals, we show that blinking increases the power of retinal stimulation and that this effect significantly enhances visibility despite the time lost in exposure to the external scene. We further show that, as predicted from the spectral content of input signals, this enhancement is selective for stimuli at low spatial frequencies and occurs irrespective of whether the luminance transients are actively generated or passively experienced. These findings indicate that, like eye movements, blinking acts as a computational component of a visual processing strategy that uses motor behavior to reformat spatial information into the temporal domain.
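The core spectral argument — a blink's luminance transient injects power into temporal-frequency channels that a perfectly steady image does not drive — can be illustrated with a toy simulation. The blink profile and all numbers below are illustrative, not the authors' model:

```python
import numpy as np

fs = 1000                     # sampling rate, Hz (illustrative)
t = np.arange(0, 2.0, 1 / fs)  # 2 s of viewing
steady = np.ones_like(t)      # steady retinal illuminance, arbitrary units

# Toy blink: a smooth Gaussian eyelid closure centered at t = 1 s
blink = np.exp(-0.5 * ((t - 1.0) / 0.03) ** 2)
with_blink = steady * (1 - blink)

def ac_power(x):
    """Total power at nonzero temporal frequencies (mean removed)."""
    return np.sum(np.abs(np.fft.rfft(x - x.mean())) ** 2) / len(x)

print(ac_power(steady))           # 0.0: a static scene has no temporal modulation
print(ac_power(with_blink) > 0)   # True: the blink adds broadband temporal power
```

The point is only qualitative: any luminance transient, actively generated or not, converts a static input into one with temporal power that transient-sensitive neural pathways can use.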
Affiliation(s)
- Bin Yang
- Department of Brain and Cognitive Sciences, University of Rochester, Rochester, NY 14627
- Center for Visual Science, University of Rochester, Rochester, NY 14627
- Janis Intoy
- Department of Brain and Cognitive Sciences, University of Rochester, Rochester, NY 14627
- Center for Visual Science, University of Rochester, Rochester, NY 14627
- Michele Rucci
- Department of Brain and Cognitive Sciences, University of Rochester, Rochester, NY 14627
- Center for Visual Science, University of Rochester, Rochester, NY 14627
8
Moon B, Poletti M, Roorda A, Tiruveedhula P, Liu SH, Linebach G, Rucci M, Rolland JP. Alignment, calibration, and validation of an adaptive optics scanning laser ophthalmoscope for high-resolution human foveal imaging. Appl Opt 2024; 63:730-742. PMID: 38294386. PMCID: PMC11062499. DOI: 10.1364/ao.504283.
Abstract
Advances in adaptive optics scanning laser ophthalmoscope (AOSLO) technology have enabled cones in the human fovea to be resolved in healthy eyes with normal vision and low to moderate refractive errors, providing new insight into human foveal anatomy, visual perception, and retinal degenerative diseases. These high-resolution ophthalmoscopes require careful alignment of each optical subsystem to ensure diffraction-limited imaging performance, which is necessary for resolving the smallest foveal cones. This paper presents a systematic and rigorous methodology for building, aligning, calibrating, and testing an AOSLO designed for imaging the cone mosaic of the central fovea in humans with cellular resolution. The methodology uses a two-stage alignment procedure and thorough system testing to achieve diffraction-limited performance. Results from retinal imaging of healthy human subjects under 30 years of age with refractive errors of less than 3.5 diopters, using either 680 nm or 840 nm light, show that the system can resolve cones at the very center of the fovea, the region where cones are smallest and most densely packed.
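"Diffraction-limited" is conventionally taken to mean a Strehl ratio of at least 0.8, which the Maréchal approximation ties to an RMS wavefront error of about λ/14. A hedged sketch of that standard relationship (the numbers are illustrative and not taken from this paper's tolerancing):

```python
import math

def strehl_marechal(rms_error_nm, wavelength_nm):
    """Maréchal approximation: estimate the Strehl ratio from the RMS
    wavefront error, both expressed in the same units (here nm)."""
    return math.exp(-(2 * math.pi * rms_error_nm / wavelength_nm) ** 2)

# At 840 nm imaging light, an RMS error of lambda/14 (~60 nm) sits right at
# the conventional diffraction-limited criterion of Strehl >= 0.8
print(strehl_marechal(840 / 14, 840) > 0.8)
```

This is why AOSLO alignment budgets are usually stated as RMS wavefront error: once the residual error is pushed below roughly λ/14, the system is considered diffraction-limited.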
Affiliation(s)
- Benjamin Moon
- The Institute of Optics, University of Rochester, Rochester, NY 14627, USA
- Center for Visual Science, University of Rochester, Rochester, NY 14627, USA
- Martina Poletti
- Center for Visual Science, University of Rochester, Rochester, NY 14627, USA
- Department of Brain and Cognitive Sciences, University of Rochester, Rochester, NY 14627, USA
- Department of Neuroscience, University of Rochester, Rochester, NY 14627, USA
- Austin Roorda
- Herbert Wertheim School of Optometry and Vision Science, University of California Berkeley, Berkeley, CA 94720, USA
- Pavan Tiruveedhula
- Herbert Wertheim School of Optometry and Vision Science, University of California Berkeley, Berkeley, CA 94720, USA
- Soh Hang Liu
- The Institute of Optics, University of Rochester, Rochester, NY 14627, USA
- Center for Visual Science, University of Rochester, Rochester, NY 14627, USA
- Glory Linebach
- The Institute of Optics, University of Rochester, Rochester, NY 14627, USA
- Center for Visual Science, University of Rochester, Rochester, NY 14627, USA
- Michele Rucci
- Center for Visual Science, University of Rochester, Rochester, NY 14627, USA
- Department of Brain and Cognitive Sciences, University of Rochester, Rochester, NY 14627, USA
- Jannick P. Rolland
- The Institute of Optics, University of Rochester, Rochester, NY 14627, USA
- Center for Visual Science, University of Rochester, Rochester, NY 14627, USA
- Department of Biomedical Engineering, University of Rochester, Rochester, NY 14627, USA
9
Poletti M. An eye for detail: Eye movements and attention at the foveal scale. Vision Res 2023; 211:108277. PMID: 37379763. PMCID: PMC10528557. DOI: 10.1016/j.visres.2023.108277.
Abstract
Human vision relies on a tiny region of the retina, the 1-deg foveola, to achieve high spatial resolution. Foveal vision is of paramount importance in daily activities, yet its study is challenging, as eye movements incessantly displace stimuli across this region. Here I will review work that, building on recent advances in eye-tracking and gaze-contingent display, examines how attention and eye movements operate at the foveal level. This research highlights how exploration of fine spatial detail unfolds following visuomotor strategies reminiscent of those occurring at larger scales. It shows that, together with highly precise control of attention, this motor activity is linked to non-homogenous processing within the foveola and selectively modulates sensitivity both in space and time. Overall, the picture emerges of a highly dynamic foveal perception in which fine spatial vision, rather than simply being the result of placing a stimulus at the center of gaze, is the result of a finely tuned and orchestrated synergy of motor, cognitive, and attentional processes.
Affiliation(s)
- Martina Poletti
- Department of Brain and Cognitive Sciences, University of Rochester, United States
- Center for Visual Science, University of Rochester, United States
- Department of Neuroscience, University of Rochester, United States
10
Meermeier A, Lappe M, Li YH, Rifai K, Wahl S, Rucci M. Fine-scale measurement of the blind spot borders. Vision Res 2023; 211:108208. PMID: 37454560. PMCID: PMC10494866. DOI: 10.1016/j.visres.2023.108208.
Abstract
The blind spot is both a necessity and a nuisance for seeing. It is the portion of the visual field projecting to where the optic nerve crosses the retina, a region devoid of photoreceptors and hence visual input. The precise way in which vision transitions into blindness at the blind spot border is to date unknown. A chief challenge in mapping this transition is the incessant movement of the eye, which unavoidably smears measurements across space. In this study, we used high-resolution eye tracking and state-of-the-art retinal stabilization to finely map the blind spot borders. Participants reported the onset of tiny high-contrast probes that were briefly flashed at precise positions around the blind spot. This method has sufficient resolution to enable mapping of blood vessels from psychophysical measurements. Our data show that, even after accounting for eye movements, the transition zones at the edges of the blind spot are considerable. On the horizontal meridian, the regions with detection rates between 80% and 20% span approximately 25% of the overall width of the blind spot. These borders also vary considerably in size across different axes. These data show that the transition from full visibility to blindness at the blind spot border is not abrupt but occurs over a broad area.
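The 80%-to-20% span used here is the standard way of reading a transition-zone width off a psychometric curve: invert the fitted detection function at both criterion levels and take the distance between the crossings. A sketch with a logistic psychometric function (the shape and slope are illustrative, not fitted to the paper's data):

```python
import numpy as np

def detection_rate(x, center=0.0, slope=1.5):
    """Illustrative logistic psychometric function: probability of detecting
    a probe at position x (deg) relative to the border; falls with x."""
    return 1.0 / (1.0 + np.exp((x - center) / slope))

def transition_width(p_hi=0.8, p_lo=0.2, center=0.0, slope=1.5):
    """Distance between the positions where detection crosses p_hi and p_lo,
    obtained by inverting the logistic analytically."""
    x_at = lambda p: center + slope * np.log((1 - p) / p)
    return x_at(p_lo) - x_at(p_hi)

# For a logistic, the 80%-20% span is 2 * slope * ln(4), independent of center
print(round(transition_width(), 3))
```

The same criterion-crossing readout applies whatever function is fitted; only the inversion step changes.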
Affiliation(s)
- Annegret Meermeier
- Institute for Psychology, University of Muenster, Muenster, Germany
- Otto Creutzfeldt Center for Cognitive and Behavioral Neuroscience, University of Muenster, Muenster, Germany
- Markus Lappe
- Institute for Psychology, University of Muenster, Muenster, Germany
- Otto Creutzfeldt Center for Cognitive and Behavioral Neuroscience, University of Muenster, Muenster, Germany
- Yuanhao H Li
- Department of Brain & Cognitive Sciences, University of Rochester, New York, USA
- Center for Visual Science, University of Rochester, New York, USA
- Siegfried Wahl
- Carl Zeiss Vision International GmbH, Aalen, Germany
- Institute for Ophthalmic Research, University Tübingen, Tübingen, Germany
- Michele Rucci
- Department of Brain & Cognitive Sciences, University of Rochester, New York, USA
- Center for Visual Science, University of Rochester, New York, USA
11
Yates JL, Coop SH, Sarch GH, Wu RJ, Butts DA, Rucci M, Mitchell JF. Detailed characterization of neural selectivity in free viewing primates. Nat Commun 2023; 14:3656. PMID: 37339973. PMCID: PMC10282080. DOI: 10.1038/s41467-023-38564-9.
Abstract
Fixation constraints in visual tasks are ubiquitous in visual and cognitive neuroscience. Despite its widespread use, fixation requires trained subjects, is limited by the accuracy of fixational eye movements, and ignores the role of eye movements in shaping visual input. To overcome these limitations, we developed a suite of hardware and software tools to study vision during natural behavior in untrained subjects. We measured visual receptive fields and tuning properties from multiple cortical areas of marmoset monkeys that freely viewed full-field noise stimuli. The resulting receptive fields and tuning curves from primary visual cortex (V1) and area MT match the selectivity reported in the literature using conventional approaches. We then combined free viewing with high-resolution eye tracking to make the first detailed 2D spatiotemporal measurements of foveal receptive fields in V1. These findings demonstrate the power of free viewing to characterize neural responses in untrained animals while simultaneously studying the dynamics of natural behavior.
Affiliation(s)
- Jacob L Yates
- Brain and Cognitive Sciences, University of Rochester, Rochester, NY, USA
- Center for Visual Science, University of Rochester, Rochester, NY, USA
- Department of Biology and Program in Neuroscience and Cognitive Science, University of Maryland, College Park, MD, USA
- Herbert Wertheim School of Optometry and Vision Science, UC Berkeley, Berkeley, CA, USA
- Shanna H Coop
- Brain and Cognitive Sciences, University of Rochester, Rochester, NY, USA
- Center for Visual Science, University of Rochester, Rochester, NY, USA
- Neurobiology, Stanford University, Stanford, CA, USA
- Gabriel H Sarch
- Brain and Cognitive Sciences, University of Rochester, Rochester, NY, USA
- Neuroscience Institute, Carnegie Mellon University, Pittsburgh, PA, USA
- Ruei-Jr Wu
- Center for Visual Science, University of Rochester, Rochester, NY, USA
- Institute of Optics, University of Rochester, Rochester, NY, USA
- Daniel A Butts
- Department of Biology and Program in Neuroscience and Cognitive Science, University of Maryland, College Park, MD, USA
- Michele Rucci
- Brain and Cognitive Sciences, University of Rochester, Rochester, NY, USA
- Center for Visual Science, University of Rochester, Rochester, NY, USA
- Jude F Mitchell
- Brain and Cognitive Sciences, University of Rochester, Rochester, NY, USA
- Center for Visual Science, University of Rochester, Rochester, NY, USA