1. Kreyenmeier P, Bhuiyan I, Gian M, Chow HM, Spering M. Smooth pursuit inhibition reveals audiovisual enhancement of fast movement control. J Vis 2024;24(4):3. [PMID: 38558158; PMCID: PMC10996987; DOI: 10.1167/jov.24.4.3]
Abstract
The sudden onset of a visual object or event elicits an inhibition of eye movements at latencies approaching the minimum delay of visuomotor conductance in the brain. Typically, information presented via multiple sensory modalities, such as sound and vision, evokes stronger and more robust responses than unisensory information. Whether and how multisensory information affects ultra-short latency oculomotor inhibition is unknown. In two experiments, we investigate smooth pursuit and saccadic inhibition in response to multisensory distractors. Observers tracked a horizontally moving dot and were interrupted by an unpredictable visual, auditory, or audiovisual distractor. Distractors elicited a transient inhibition of pursuit eye velocity and catch-up saccade rate within ∼100 ms of their onset. Audiovisual distractors evoked stronger oculomotor inhibition than visual- or auditory-only distractors, indicating multisensory response enhancement. Multisensory response enhancement magnitudes were equal to the linear sum of responses to component stimuli. These results demonstrate that multisensory information affects eye movements even at ultra-short latencies, establishing a lower time boundary for multisensory-guided behavior. We conclude that oculomotor circuits must have privileged access to sensory information from multiple modalities, presumably via a fast, subcortical pathway.
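The linear-summation result above can be sketched as a simple check: does the audiovisual (AV) response match the sum of the auditory (A) and visual (V) responses? The response magnitudes below are hypothetical illustrations, not the paper's data.

```python
# Sketch: comparing a multisensory (AV) inhibition response against the
# linear sum of its unisensory (A, V) components. Values are invented.

def multisensory_enhancement(av_resp, a_resp, v_resp):
    """Return the enhancement of the AV response over the stronger
    unisensory response, the linear-summation prediction, and whether
    the AV response matches that prediction."""
    best_unisensory = max(a_resp, v_resp)
    enhancement = av_resp - best_unisensory
    linear_sum_prediction = a_resp + v_resp
    # Under linear summation, av_resp should equal a_resp + v_resp.
    is_linear = abs(av_resp - linear_sum_prediction) < 1e-9
    return enhancement, linear_sum_prediction, is_linear

# Hypothetical pursuit-velocity inhibition magnitudes (deg/s reduction):
enh, pred, linear = multisensory_enhancement(av_resp=3.0, a_resp=1.0, v_resp=2.0)
```

Here the AV response exceeds the best unisensory response by exactly the size of the weaker component, the signature of linear summation reported in the abstract.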
Affiliation(s)
- Philipp Kreyenmeier
- Department of Ophthalmology & Visual Sciences, University of British Columbia, Vancouver, British Columbia, Canada
- Graduate Program in Neuroscience, University of British Columbia, Vancouver, British Columbia, Canada
- Ishmam Bhuiyan
- Department of Ophthalmology & Visual Sciences, University of British Columbia, Vancouver, British Columbia, Canada
- Mathew Gian
- Department of Ophthalmology & Visual Sciences, University of British Columbia, Vancouver, British Columbia, Canada
- Hiu Mei Chow
- Department of Ophthalmology & Visual Sciences, University of British Columbia, Vancouver, British Columbia, Canada
- Department of Psychology, St. Thomas University, Fredericton, New Brunswick, Canada
- Miriam Spering
- Department of Ophthalmology & Visual Sciences, University of British Columbia, Vancouver, British Columbia, Canada
- Graduate Program in Neuroscience, University of British Columbia, Vancouver, British Columbia, Canada
- Djavad Mowafaghian Centre for Brain Health, University of British Columbia, Vancouver, BC, Canada
- Institute for Computing, Information, and Cognitive Systems, University of British Columbia, Vancouver, BC, Canada
2. Wang X, Wu Y, Xing Z, Cui X, Gao M, Tang X. Modal-based attention modulates the redundant-signals effect: Role of unimodal target probability. Perception 2023;52:97-115. [PMID: 36415087; DOI: 10.1177/03010066221136675]
Abstract
Multisensory integration has two behavioral manifestations: the modality dominance effect and the redundant-signals effect (RSE). The RSE is a multisensory improvement effect in which individuals respond more quickly and accurately to bimodal audiovisual (AV) targets than to unimodal auditory (A) or visual (V) targets. Previous studies have confirmed that the RSE is the product of interactions between modalities. The goal of this study was to systematically investigate how modality dominance, manipulated via modal-based attention, and unimodal target probability affect the RSE. The results showed that when participants attended to both the A and V modalities (Exp. 1), the RSE did not differ significantly across unimodal target probabilities. When they selectively attended to the A modality (Exp. 2A), the RSE likewise did not differ across unimodal target probabilities. However, when they selectively attended to the V modality (Exp. 2B), the magnitude of the RSE decreased significantly as the probability of V targets increased. Our study is the first to reveal that unimodal target probability significantly modulates the RSE under visual selective attention, and that this modulatory effect is opposite to its effect on the modality dominance effect.
Affiliation(s)
- Min Gao
- Liaoning Normal University, China
3. Portengen BL, Porro GL, Imhof SM, Naber M. The Trade-Off Between Luminance and Color Contrast Assessed With Pupil Responses. Transl Vis Sci Technol 2023;12(1):15. [PMID: 36622687; PMCID: PMC9838585; DOI: 10.1167/tvst.12.1.15]
Abstract
Purpose: A scene consisting of a white stimulus on a black background incorporates strong luminance contrast. When both stimulus and background receive different colors, luminance contrast decreases but color contrast increases. Here, we sought to characterize the pattern of stimulus salience across varying trade-offs of color and luminance contrast by using the pupil light response.
Methods: Three experiments were conducted with 17, 16, and 17 healthy adults. In all experiments, a flickering stimulus (2 Hz; alternating color to black) was superimposed on a background with a color complementary to the stimulus (i.e., opponent colors in human color perception: blue and yellow in Experiment 1, red and green in Experiment 2, and equiluminant red and green in Experiment 3). Background luminance varied between 0% and 45% to trade off luminance and color contrast with the stimulus. By comparing the locus of the optimal trade-off between color and luminance across different color axes, we explored the generality of the trade-off.
Results: The strongest pupil responses were found when a substantial amount of color contrast was present (at the expense of luminance contrast). Pupil response amplitudes increased by 15% to 30% after the addition of color contrast. Pupillary responsiveness peaked at background luminance settings of 20% to 35% across several color axes.
Conclusions: These findings suggest that a substantial component of pupil light responses incorporates color processing. More sensitive pupil responses and more salient stimulus designs can be achieved by adding subtle levels of color contrast between stimulus and background.
Translational Relevance: More robust pupil responses will enhance tests of the visual field with pupil perimetry.
Affiliation(s)
- Brendan L. Portengen
- Department of Ophthalmology, University Medical Center Utrecht, Utrecht, The Netherlands
- Department of Experimental Psychology, Helmholtz Institute, Utrecht University, Utrecht, The Netherlands
- Giorgio L. Porro
- Department of Ophthalmology, University Medical Center Utrecht, Utrecht, The Netherlands
- Saskia M. Imhof
- Department of Ophthalmology, University Medical Center Utrecht, Utrecht, The Netherlands
- Marnix Naber
- Department of Experimental Psychology, Helmholtz Institute, Utrecht University, Utrecht, The Netherlands
4. Zhang Y, Malaval F, Lehmann A, Deroche MLD. Luminance effects on pupil dilation in speech-in-noise recognition. PLoS One 2022;17:e0278506. [PMID: 36459511; PMCID: PMC9718387; DOI: 10.1371/journal.pone.0278506]
Abstract
There is increasing interest in the fields of audiology and speech communication in measuring the effort that it takes to listen in noisy environments, with obvious implications for populations suffering from hearing loss. Pupillometry offers one avenue to make progress in this enterprise, but important methodological questions remain to be addressed before such tools can serve practical applications. Typically, cocktail-party situations occur in less-than-ideal lighting conditions, e.g. a pub or a restaurant, and it is unclear how robust pupil dynamics are to luminance changes. In this study, we first used a well-known paradigm in which sentences were presented at different signal-to-noise ratios (SNR), all conducive to good intelligibility. This enabled us to replicate known findings, e.g. a larger and later peak pupil dilation (PPD) at adverse SNR, or when sentences were misunderstood, and to investigate the dependency of the PPD on sentence duration. A second experiment repeated two of the SNR levels, 0 and +14 dB, under ambient illuminations of 0, 75, and 220 lux. The results showed that the impact of luminance on the SNR effect was non-monotonic (sub-optimal in darkness or in bright light); as such, there is no trivial way to derive pupillary metrics that are robust to differences in background light, posing considerable constraints for applications of pupillometry in daily life. Our findings raise an under-examined but crucial issue in designing and interpreting listening-effort studies using pupillometry, and offer important insights for future clinical applications of pupillometry across sites.
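The peak pupil dilation (PPD) metric used above can be sketched as a baseline-corrected peak search over a pupil trace. The trace, sampling rate, and baseline window below are invented for illustration, not the study's parameters.

```python
# Sketch: extracting peak pupil dilation (PPD) and its latency from a
# single-trial pupil trace. Synthetic data; fs and baseline are assumptions.

def peak_pupil_dilation(trace, fs, baseline_s=1.0):
    """Baseline-correct a pupil trace and return (PPD, peak latency in s).

    trace      : list of pupil diameters (mm), baseline period first
    fs         : sampling rate in Hz
    baseline_s : duration of the pre-stimulus baseline window in seconds
    """
    n_base = int(baseline_s * fs)
    baseline = sum(trace[:n_base]) / n_base
    corrected = [x - baseline for x in trace[n_base:]]
    ppd = max(corrected)
    latency = corrected.index(ppd) / fs  # time of peak after stimulus onset
    return ppd, latency

# Synthetic trace at 60 Hz: 1 s flat baseline at 4.0 mm, then a dilation
# ramping up and plateauing at 4.5 mm.
fs = 60
trace = [4.0] * fs + [4.0 + 0.5 * min(t / 30, 1.0) for t in range(60)]
ppd, latency = peak_pupil_dilation(trace, fs)
```

A larger and later peak, as the abstract reports for adverse SNR, would show up here as a higher `ppd` and longer `latency`.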
Affiliation(s)
- Yue Zhang
- Department of Otolaryngology, McGill University, Montreal, Canada
- Centre for Research on Brain, Language and Music, Montreal, Canada
- Centre for Interdisciplinary Research in Music Media and Technology, Montreal, Canada
- Florian Malaval
- Department of Otolaryngology, McGill University, Montreal, Canada
- Alexandre Lehmann
- Department of Otolaryngology, McGill University, Montreal, Canada
- Centre for Research on Brain, Language and Music, Montreal, Canada
- Centre for Interdisciplinary Research in Music Media and Technology, Montreal, Canada
- Mickael L. D. Deroche
- Department of Otolaryngology, McGill University, Montreal, Canada
- Centre for Research on Brain, Language and Music, Montreal, Canada
- Centre for Interdisciplinary Research in Music Media and Technology, Montreal, Canada
- Department of Psychology, Concordia University, Montreal, Canada
5. Strauch C, Wang CA, Einhäuser W, Van der Stigchel S, Naber M. Pupillometry as an integrated readout of distinct attentional networks. Trends Neurosci 2022;45:635-647. [PMID: 35662511; DOI: 10.1016/j.tins.2022.05.003]
Abstract
The course of pupillary constriction and dilation provides an easy-to-access, inexpensive, and noninvasive readout of brain activity. We propose a new taxonomy of factors affecting the pupil and link these to associated neural underpinnings in an ascending hierarchy. In addition to two well-established low-level factors (light level and focal distance), we suggest two further intermediate-level factors, alerting and orienting, and a higher-level factor, executive functioning. Alerting, orienting, and executive functioning - including their respective underlying neural circuitries - overlap with the three principal attentional networks, making pupil size an integrated readout of distinct states of attention. As a now widespread technique, pupillometry is ready to provide meaningful applications and constitutes a viable part of the psychophysiological toolbox.
Affiliation(s)
- Christoph Strauch
- Experimental Psychology, Helmholtz Institute, Utrecht University, Utrecht, The Netherlands
- Chin-An Wang
- Institute of Cognitive Neuroscience, National Central University, Taoyuan City, Taiwan
- Cognitive Intelligence and Precision Healthcare Center, National Central University, Taoyuan City, Taiwan
- Wolfgang Einhäuser
- Physics of Cognition Group, Chemnitz University of Technology, Chemnitz, Germany
- Marnix Naber
- Experimental Psychology, Helmholtz Institute, Utrecht University, Utrecht, The Netherlands
6. Oster J, Huang J, White BJ, Radach R, Itti L, Munoz DP, Wang CA. Pupillary responses to differences in luminance, color and set size. Exp Brain Res 2022;240:1873-1885. [PMID: 35445861; DOI: 10.1007/s00221-022-06367-x]
Abstract
The pupil responds to a salient stimulus appearing in the environment, in addition to its modulation by global luminance. These pupillary responses can be evoked by visual or auditory stimuli, scale with stimulus salience, and are enhanced by multisensory presentation. In addition, pupil size is modulated by various visual stimulus attributes, such as color, area, and motion. However, research that concurrently examines the influence of different factors on pupillary responses is limited. To explore how the presentation of multiple visual stimuli influences human pupillary responses, we presented arrays of visual stimuli and systematically varied their luminance, color, and set size. Saliency level, computed with a saliency model, systematically changed with set size across all conditions, with higher saliency levels at larger set sizes. Pupillary constriction responses were evoked by the appearance of the visual stimuli, with larger responses observed at larger set sizes. These effects persisted even though the global luminance level was kept unchanged by using isoluminant chromatic stimuli. Furthermore, larger pupillary constriction responses were obtained in the blue condition, compared to the other color conditions. Together, we argue that both cortical and subcortical areas contribute to the observed pupillary constriction modulated by set size and color.
Affiliation(s)
- Julia Oster
- Department of General and Biological Psychology, University of Wuppertal, Wuppertal, Germany
- Jeff Huang
- Centre for Neuroscience Studies, Queen's University, Room 234, Botterell Hall, 18 Stuart Street, Kingston, ON, K7L 3N6, Canada
- Brian J White
- Centre for Neuroscience Studies, Queen's University, Room 234, Botterell Hall, 18 Stuart Street, Kingston, ON, K7L 3N6, Canada
- Ralph Radach
- Department of General and Biological Psychology, University of Wuppertal, Wuppertal, Germany
- Laurent Itti
- Department of Computer Science, University of Southern California, Los Angeles, CA, USA
- Douglas P Munoz
- Centre for Neuroscience Studies, Queen's University, Room 234, Botterell Hall, 18 Stuart Street, Kingston, ON, K7L 3N6, Canada
- Chin-An Wang
- Institute of Cognitive Neuroscience, College of Health Science and Technology, National Central University, Taoyuan City, Taiwan
- Cognitive Intelligence and Precision Healthcare Research Center, National Central University, Taoyuan City, Taiwan
7. Ball F, Nentwich A, Noesselt T. Cross-modal perceptual enhancement of unisensory targets is uni-directional and does not affect temporal expectations. Vision Res 2021;190:107962. [PMID: 34757275; DOI: 10.1016/j.visres.2021.107962]
Abstract
Temporal structures in the environment can shape temporal expectations (TEs), and previous studies demonstrated that TEs interact with multisensory interplay (MSI) when multisensory stimuli are presented synchronously. Here, we tested whether other types of MSI - evoked by asynchronous yet temporally flanking irrelevant stimuli - result in similar performance patterns. To this end, we presented sequences of 12 stimuli (10 Hz) which consisted of auditory (A), visual (V) or alternating auditory-visual stimuli (e.g. A-V-A-V-...) with either auditory or visual targets (Exp. 1). Participants discriminated target frequencies (auditory pitch or visual spatial frequency) embedded in these sequences. To test effects of TE, the proportion of early and late temporal target positions was manipulated run-wise. Performance for unisensory targets was affected by temporally flanking distractors, with auditory temporal flankers selectively improving visual target perception (Exp. 1). However, no effect of temporal expectation was observed. Control experiments (Exp. 2-3) tested whether this lack of a TE effect was due to the higher presentation frequency in Exp. 1 relative to previous experiments. Importantly, even at higher stimulation frequencies, redundant multisensory targets (Exp. 2-3) reliably modulated TEs. Together, our results indicate that visual target detection was enhanced by MSI. However, this cross-modal enhancement - in contrast to the redundant-target effect - was still insufficient to generate TEs. We posit that unisensory target representations were either unstable or insufficient for the generation of TEs while less demanding MSI still occurred, highlighting the need for robust stimulus representations when generating temporal expectations.
Affiliation(s)
- Felix Ball
- Department of Biological Psychology, Faculty of Natural Science, Otto-von-Guericke-University Magdeburg, Germany
- Center for Behavioral Brain Sciences, Otto-von-Guericke-University Magdeburg, Germany
- Annika Nentwich
- Department of Biological Psychology, Faculty of Natural Science, Otto-von-Guericke-University Magdeburg, Germany
- Toemme Noesselt
- Department of Biological Psychology, Faculty of Natural Science, Otto-von-Guericke-University Magdeburg, Germany
- Center for Behavioral Brain Sciences, Otto-von-Guericke-University Magdeburg, Germany
8. Strauch C, Hirzle T, Van der Stigchel S, Bulling A. Decoding binary decisions under differential target probabilities from pupil dilation: A random forest approach. J Vis 2021;21(7):6. [PMID: 34259827; PMCID: PMC8288052; DOI: 10.1167/jov.21.7.6]
Abstract
Although our pupils slightly dilate when we look at an intended target, they do not when we look at irrelevant distractors. This finding suggests that it may be possible to decode the intention of an observer, understood as the outcome of implicit covert binary decisions, from pupillary dynamics over time. However, few previous works have investigated the feasibility of this approach, and the few that did failed to control for possible confounds such as motor execution, changes in brightness, or target and distractor probability. We report on our efforts to decode intentions from pupil dilation obtained under strict experimental control on a single-trial basis using a machine learning approach. Our analyses are based on data from 69 participants who looked at letters that needed to be selected, with stimulus probabilities that varied systematically in a blockwise manner (n = 19,417 trials). We confirm earlier findings that pupil dilation is indicative of intentions and show that these can be decoded with a classification performance of up to 76% area under the receiver operating characteristic curve if targets are rarer than distractors. To better understand which characteristics of the pupillary signal are most informative, we finally compare relative feature importances. The first derivative of pupil size changes was found to be most relevant, allowing us to decode intention within only about 800 ms of trial onset. Taken together, our results provide credible insights into the potential of decoding intentions from pupil dilation and may soon form the basis for new applications in visual search, gaze-based interaction, or human-robot interaction.
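Two ingredients of the abstract above can be sketched in a toy form: the first derivative of pupil size as a feature, and ROC AUC as the classification metric. The trials and the simple threshold-free scoring below are invented for illustration; the study itself trained a random forest over many features.

```python
# Sketch: first-derivative pupil features scored with ROC AUC. Toy data only.

def first_derivative(trace):
    """Sample-to-sample change in pupil size, the feature the abstract
    identifies as most informative."""
    return [b - a for a, b in zip(trace, trace[1:])]

def roc_auc(scores_pos, scores_neg):
    """Probability that a random positive outscores a random negative
    (ties count 0.5), equivalent to the area under the ROC curve."""
    wins = 0.0
    for p in scores_pos:
        for n in scores_neg:
            wins += 1.0 if p > n else (0.5 if p == n else 0.0)
    return wins / (len(scores_pos) * len(scores_neg))

# Toy "trials" (pupil diameter in mm): targets dilate slightly faster than
# distractors. Score each trial by its mean first derivative.
targets = [[4.0, 4.02, 4.05, 4.09], [4.0, 4.03, 4.06, 4.10]]
distractors = [[4.0, 4.01, 4.02, 4.03], [4.0, 4.00, 4.01, 4.02]]
score = lambda tr: sum(first_derivative(tr)) / (len(tr) - 1)
auc = roc_auc([score(t) for t in targets], [score(d) for d in distractors])
```

With the toy data the classes separate perfectly; real single-trial pupil data would land well below that, in the ~0.76 range the abstract reports for rare targets.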
Affiliation(s)
- Christoph Strauch
- Experimental Psychology, Helmholtz Institute, Utrecht University, The Netherlands
- Teresa Hirzle
- Institute of Media Informatics, Ulm University, Germany
- Andreas Bulling
- Institute for Visualisation and Interactive Systems, University of Stuttgart, Germany
9. Zandi B, Lode M, Herzog A, Sakas G, Khanh TQ. PupilEXT: Flexible Open-Source Platform for High-Resolution Pupillometry in Vision Research. Front Neurosci 2021;15:676220. [PMID: 34220432; PMCID: PMC8249868; DOI: 10.3389/fnins.2021.676220]
Abstract
Human pupil behavior has gained increased attention due to the discovery of the intrinsically photosensitive retinal ganglion cells and the role of the afferent pupil control pathway as a biomarker for cognitive processes. Diameter changes on the order of 10⁻² mm are of interest, requiring reliable and well-characterized measurement equipment to accurately detect neurocognitive effects on the pupil. Mostly commercial solutions are used as measurement devices in pupillometry, which entails high investment costs. Moreover, commercial systems rely on closed software, restricting conclusions about the pupil-tracking algorithms used. Here, we developed an open-source pupillometry platform consisting of hardware and software competitive with high-end commercial stereo eye-tracking systems. Our goal was to make a professional remote pupil-measurement pipeline for laboratory conditions accessible to everyone. This work's core outcome is an integrated cross-platform (macOS, Windows and Linux) pupillometry software called PupilEXT, featuring a user-friendly graphical interface covering the relevant requirements of professional pupil-response research. We offer a selection of six state-of-the-art open-source pupil detection algorithms (Starburst, Swirski, ExCuSe, ElSe, PuRe and PuReST) to perform the pupil measurement. A 120-fps pupillometry demo system achieved a calibration accuracy of 0.003 mm and an average temporal pupil measurement detection accuracy of 0.0059 mm in stereo mode. The PupilEXT software has extended features in pupil detection, measurement validation, image acquisition, data acquisition, offline pupil measurement, camera calibration, stereo vision, data visualization and system independence, all combined in a single open-source interface, available at https://github.com/openPupil/Open-PupilEXT.
Affiliation(s)
- Babak Zandi
- Laboratory of Lighting Technology, Department of Electrical Engineering and Information Technology, Technical University of Darmstadt, Darmstadt, Germany
- Moritz Lode
- Laboratory of Lighting Technology, Department of Electrical Engineering and Information Technology, Technical University of Darmstadt, Darmstadt, Germany
- Alexander Herzog
- Laboratory of Lighting Technology, Department of Electrical Engineering and Information Technology, Technical University of Darmstadt, Darmstadt, Germany
- Georgios Sakas
- Interactive Graphic Systems, Department of Computer Science, Technical University of Darmstadt, Darmstadt, Germany
- Tran Quoc Khanh
- Laboratory of Lighting Technology, Department of Electrical Engineering and Information Technology, Technical University of Darmstadt, Darmstadt, Germany
10. Yuan X, Cheng Y, Jiang Y. Multisensory signals inhibit pupillary light reflex: Evidence from pupil oscillation. Psychophysiology 2021;58:e13848. [PMID: 34002397; DOI: 10.1111/psyp.13848]
Abstract
Multisensory integration, which enhances stimulus saliency at the early stage of the processing hierarchy, has recently been shown to produce a larger pupil size than its unisensory constituents. Theoretically, any modulation of pupil size ought to be associated with the sympathetic and parasympathetic pathways that are sensitive to light, but it remains poorly understood how the pupillary light reflex changes in a multisensory context. The present study evoked an oscillation of the pupillary light reflex by periodically changing the luminance of a visual stimulus at 1.25 Hz. This induced pupil-size oscillation was substantially attenuated when the bright, but not the dark, phase of the visual flicker was periodically and synchronously presented with a burst of tones. This inhibition effect persisted when the visual flicker was task-irrelevant and outside the focus of attention, but disappeared when the visual flicker was moved from the central field to the periphery. These findings not only offer a comprehensive characterization of the multisensory impact on the pupil response to light, but also provide valuable clues about the individual contributions of the sympathetic and parasympathetic pathways to multisensory modulation of pupil size.
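The oscillation attenuation described above is the kind of effect quantified by frequency tagging: estimate the amplitude of the pupil trace at the 1.25 Hz stimulation frequency and compare conditions. The single-bin DFT below is a generic method; the traces, sampling rate, and attenuation factor are invented for illustration.

```python
# Sketch: single-bin DFT amplitude at the luminance-flicker frequency.
# Synthetic pupil traces; fs and the 40% attenuation are assumptions.
import math

def amplitude_at(trace, fs, freq):
    """Amplitude of the sinusoidal component of `trace` at `freq` Hz,
    estimated from one discrete Fourier bin over the whole window."""
    n = len(trace)
    re = sum(x * math.cos(2 * math.pi * freq * i / fs) for i, x in enumerate(trace))
    im = sum(x * math.sin(2 * math.pi * freq * i / fs) for i, x in enumerate(trace))
    return 2 * math.sqrt(re**2 + im**2) / n

fs = 50               # assumed eye-tracker sampling rate (Hz)
f_tag = 1.25          # luminance flicker frequency from the abstract (Hz)
t = [i / fs for i in range(8 * fs)]  # 8 s window = 10 full flicker cycles

# Baseline condition: pupil follows the flicker with 0.3 mm amplitude.
baseline = [0.3 * math.sin(2 * math.pi * f_tag * ti) for ti in t]
# Tones on the bright phase: oscillation attenuated (factor invented).
with_tones = [0.18 * math.sin(2 * math.pi * f_tag * ti) for ti in t]

a_base = amplitude_at(baseline, fs, f_tag)
a_tone = amplitude_at(with_tones, fs, f_tag)
```

Using a window spanning an integer number of flicker cycles keeps the tagged frequency on an exact DFT bin, so the amplitude estimate is unbiased by spectral leakage.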
Affiliation(s)
- Xiangyong Yuan
- State Key Laboratory of Brain and Cognitive Science, CAS Center for Excellence in Brain Science and Intelligence Technology, Institute of Psychology, Chinese Academy of Sciences, Beijing, China
- Department of Psychology, University of Chinese Academy of Sciences, Beijing, China
- Chinese Institute for Brain Research, Beijing, China
- Yuhui Cheng
- State Key Laboratory of Brain and Cognitive Science, CAS Center for Excellence in Brain Science and Intelligence Technology, Institute of Psychology, Chinese Academy of Sciences, Beijing, China
- Department of Psychology, University of Chinese Academy of Sciences, Beijing, China
- Chinese Institute for Brain Research, Beijing, China
- Yi Jiang
- State Key Laboratory of Brain and Cognitive Science, CAS Center for Excellence in Brain Science and Intelligence Technology, Institute of Psychology, Chinese Academy of Sciences, Beijing, China
- Department of Psychology, University of Chinese Academy of Sciences, Beijing, China
- Chinese Institute for Brain Research, Beijing, China
- Institute of Artificial Intelligence, Hefei Comprehensive National Science Center, Hefei, China