1
Zhu S, Li Y, Fu Y, Yin J, Shen M, Chen H. The object as the unit for state switching in visual working memory. Cognition 2024; 249:105808. [PMID: 38776622] [DOI: 10.1016/j.cognition.2024.105808]
Abstract
This study aimed to determine the unit for switching representational states in visual working memory (VWM). Two opposing hypotheses were investigated: (a) the unit of switching is a feature (feature-based hypothesis), and (b) the unit of switching is an object (object-based hypothesis). Participants (N = 180) were instructed to hold two features from either one or two objects in VWM. The memory-driven attentional capture effect, in which actively held information in VWM draws attention towards matching distractors, was used to assess the representational states of the first and second probed colors (indicated by a retro-cue). The results showed that when the two features belonged to separate objects, only the feature cued to be probed first elicited memory-related capture. Importantly, features from an integrated object guided attention regardless of probe order. These findings held across three experiments involving features of different dimensions, features of the same dimension, and perceptual objects defined by Gestalt principles. They provide convergent evidence for the object-based hypothesis by indicating that features within a single object cannot exist in different representational states.
Affiliation(s)
- Shengnan Zhu
- Department of Psychology and Behavioral Sciences, Zhejiang University, PR China.
- Yongqi Li
- Department of Psychology and Behavioral Sciences, Zhejiang University, PR China.
- Yingtao Fu
- Department of Psychology and Behavioral Sciences, Zhejiang University, PR China.
- Jun Yin
- Department of Psychology, Ningbo University, Ningbo, PR China.
- Mowei Shen
- Department of Psychology and Behavioral Sciences, Zhejiang University, PR China.
- Hui Chen
- Department of Psychology and Behavioral Sciences, Zhejiang University, PR China.
2
Moriya J. Visual mental imagery of atypical color objects attracts attention to an imagery-matching object. Atten Percept Psychophys 2024; 86:49-61. [PMID: 37872433] [DOI: 10.3758/s13414-023-02804-3]
Abstract
Mental imagery attracts attention to imagery-matching stimuli. However, it remains unknown whether a voluntarily imagined atypical color also attracts attention to imagery-matching stimuli when the imagined items are color-diagnostic objects, which are strongly associated with a typical color. This study investigated whether people can voluntarily imagine atypical colors of such objects and attend to imagery-matching stimuli. Participants in the imagery group were instructed to imagine an atypical color for black-and-white objects, either following an instructed color or choosing a color themselves, whereas participants in the control group attended to the objects without any imagery instruction. Thereafter, they detected a color target in a visual search task. Results revealed that participants in the imagery group directed attention to the imagery-matching atypical color rather than to the object's original color during search, whereas participants in the control group showed no attentional guidance. These results suggest that voluntarily imagining an atypical color can attenuate the mental representation of the original color and redirect attention to imagery-matching stimuli.
Affiliation(s)
- Jun Moriya
- Faculty of Sociology, Kansai University, 3-3-35 Yamate-cho, Suita-shi, Osaka, Japan.
3
Park HB, Zhang W. The dynamics of attentional guidance by working memory contents. Cognition 2024; 242:105638. [PMID: 37839251] [PMCID: PMC10843273] [DOI: 10.1016/j.cognition.2023.105638]
Abstract
Working memory (WM) contents can guide attention toward matching sensory information in the environment, but findings are mixed as to whether only a single prioritized item or multiple items held in WM can guide attention. The present study examines the limit of WM-guided attention with a novel task procedure and mouse trajectory analysis. Specifically, we introduced a perceptual-matching task using a continuous estimation procedure within the maintenance interval of a WM task for one or two colors. Overall, perceptual-matching mouse trajectories were robustly biased toward the location of the WM-matching color on the color wheel (i.e., an attraction bias), but only at memory set size one. However, hierarchical Bayesian modeling of the circular mouse trajectory distributions revealed two separable central peaks at both memory set sizes. Furthermore, a model-free analysis demonstrated that perceptual-matching mouse trajectory patterns were similar regardless of memory set size. Together, these results support the single-item account and highlight the utility of mouse trajectory analyses for hypothesis testing in experimental psychology.
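To make the attraction-bias measure described above concrete, here is a minimal sketch of how a signed angular deviation toward the WM-matching color could be computed on a 360-degree color wheel. The variable names and simulated data are assumptions for illustration; this is not the authors' analysis pipeline.

```python
import numpy as np

def signed_attraction_bias(resp_deg, target_deg, wm_match_deg):
    """Signed deviation of responses from the target on a 360-degree color
    wheel; positive values indicate attraction toward the WM-matching color."""
    def wrap(d):
        # wrap angular differences into [-180, 180)
        return (d + 180.0) % 360.0 - 180.0

    deviation = wrap(resp_deg - target_deg)            # error relative to target
    toward = np.sign(wrap(wm_match_deg - target_deg))  # side of the WM-matching color
    return deviation * toward

# Simulated (hypothetical) trials: responses slightly pulled toward the WM color
rng = np.random.default_rng(0)
target = rng.uniform(0.0, 360.0, 200)
wm_match = (target + rng.uniform(60.0, 120.0, 200)) % 360.0   # WM color 60-120 deg away
pull = np.sign((wm_match - target + 180.0) % 360.0 - 180.0)   # direction of the pull
responses = (target + pull * rng.normal(5.0, 15.0, 200)) % 360.0

bias = signed_attraction_bias(responses, target, wm_match)
print(f"mean attraction bias: {bias.mean():.1f} deg (positive = toward WM color)")
```

Under this convention, a mean bias reliably above zero at set size one, but not at set size two, would mirror the pattern reported above.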
Affiliation(s)
- Hyung-Bum Park
- Institute for Mind and Biology, The University of Chicago, Biopsychological Sciences Building (BPSB), 940 E 57th St., Chicago, IL 60637, USA; Department of Psychology, University of California, 900 University Ave., Riverside, CA 92521, USA.
- Weiwei Zhang
- Department of Psychology, University of California, 900 University Ave., Riverside, CA 92521, USA.
4
Zhu P, Yang Q, Chen L, Guan C, Zhou J, Shen M, Chen H. Working-Memory-Guided Attention Competes with Exogenous Attention but Not with Endogenous Attention. Behav Sci (Basel) 2023; 13:bs13050426. [PMID: 37232663] [DOI: 10.3390/bs13050426]
Abstract
Recent research has extensively investigated working memory (WM)-guided attention, the phenomenon whereby attention is directed towards information in the external environment that matches the content stored in WM. While prior studies have focused on factors that influence WM-guided attention, little is known about its nature. This form of attention exhibits characteristics of two classical, distinct attention systems: it can operate automatically like exogenous attention, yet it persists over time and is modulated by cognitive resources like endogenous attention. Thus, the current study explored the mechanism of WM-guided attention by testing whether it competes with exogenous attention, endogenous attention, or both. Two experiments were conducted within a classic WM-guided attention paradigm. Experiment 1 included an exogenous cue and revealed an interaction between WM-guided attention and exogenous attention. Experiment 2 replaced the exogenous cue with an endogenous cue and showed that endogenous attention had no impact on WM-guided attention. These findings indicate that WM-guided attention shares mechanisms with exogenous attention to some extent while operating in parallel with endogenous attention.
Affiliation(s)
- Ping Zhu
- Department of Psychology and Behavioral Sciences, Zhejiang University, Hangzhou 310030, China
- Qingqing Yang
- Department of Psychology, New York University, New York, NY 10003, USA
- Luo Chen
- Department of Psychology and Behavioral Sciences, Zhejiang University, Hangzhou 310030, China
- Chenxiao Guan
- Department of Psychology and Behavioral Sciences, Zhejiang University, Hangzhou 310030, China
- Jifan Zhou
- Department of Psychology and Behavioral Sciences, Zhejiang University, Hangzhou 310030, China
- Mowei Shen
- Department of Psychology and Behavioral Sciences, Zhejiang University, Hangzhou 310030, China
- Hui Chen
- Department of Psychology and Behavioral Sciences, Zhejiang University, Hangzhou 310030, China
5
Long-term memory and working memory compete and cooperate to guide attention. Atten Percept Psychophys 2022. [PMID: 36303020] [DOI: 10.3758/s13414-022-02593-1]
Abstract
Multiple types of memory guide attention: Both long-term memory (LTM) and working memory (WM) effectively guide visual search. Furthermore, both types of memories can capture attention automatically, even when detrimental to performance. It is less clear, however, how LTM and WM cooperate or compete to guide attention in the same task. In a series of behavioral experiments, we show that LTM and WM reliably cooperate to guide attention: Visual search is faster when both memories cue attention to the same spatial location (relative to when only one memory can guide attention). LTM and WM competed to guide attention in more limited circumstances: Competition only occurred when these memories were in different dimensions - particularly when participants searched for a shape and held an accessory color in mind. Finally, we found no evidence for asymmetry in either cooperation or competition: There was no evidence that WM helped (or hindered) LTM-guided search more than the other way around. This lack of asymmetry was found despite differences in LTM-guided and WM-guided search overall, and differences in how two LTMs and two WMs compete or cooperate with each other to guide attention. This work suggests that, even if only one memory is currently task-relevant, WM and LTM can cooperate to guide attention; they can also compete when distracting features are salient enough. This work elucidates interactions between WM and LTM during attentional guidance, adding to the literature on costs and benefits to attention from multiple active memories.
6
Fan L, Diao L, Xu M, Zhang X. Multiple representations in visual working memory can simultaneously guide attention. Curr Psychol 2022. [DOI: 10.1007/s12144-022-03332-3]
7
Trial-by-trial mouse trajectory predicts variance in precision across working memory representations: A critical reanalysis of Hao et al. (2021). Psychon Bull Rev 2022; 29:2181-2191. [PMID: 35668294] [DOI: 10.3758/s13423-022-02128-7]
Abstract
Multiple representations in visual working memory (VWM) can vary in mnemonic precision. This inhomogeneity of VWM precision has received some support from recent studies with the whole-report procedure, in which all memory items are recalled in free or forced orders. Recently, Hao et al. (2021, Cognition, 214, 104739) added a novel item-selection stage before each memory recall and found smaller between-trial variance in mouse trajectory during the selection stage in free-recall condition as compared with forced recall, which was taken as evidence for less between-item interference and the resulting precision benefit under free recall. Here, we reanalyzed the original dataset with a different analytic approach and attempted independent hypothesis testing focusing on within-trial trajectory deviations. We found that the direction of trial-by-trial trajectory bias for the first to-be-recalled item was predictive of the relative mnemonic precision of the remaining items. Critically, this relationship was only present for forced recall but not for free recall. Hierarchical Bayesian modeling of recall errors further identified that this relationship was selectively driven by VWM precision. Together, our reanalysis provides evidence for the source of between-item interference and its direct association with variable precision of VWM representations, and further highlights the novel methodological benefits of probing memory decisional processes using mouse trajectory data.
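For readers unfamiliar with how the precision of circular recall errors is typically summarized, the sketch below estimates a von Mises concentration (kappa) from the mean resultant length using the standard Fisher (1993) approximation. It is a generic, non-hierarchical illustration on simulated data, not the hierarchical Bayesian model used in the reanalysis.

```python
import numpy as np

def estimate_kappa(errors_deg):
    """Approximate the von Mises concentration (kappa) of circular recall
    errors from the mean resultant length R (Fisher, 1993 approximation).
    Higher kappa indicates more precise memory."""
    errors = np.deg2rad(np.asarray(errors_deg))
    r = np.abs(np.mean(np.exp(1j * errors)))   # mean resultant length
    if r < 0.53:
        return 2 * r + r**3 + 5 * r**5 / 6
    if r < 0.85:
        return -0.4 + 1.39 * r + 0.43 / (1 - r)
    return 1 / (r**3 - 4 * r**2 + 3 * r)

# Hypothetical recall errors (degrees) for a precise and an imprecise condition
rng = np.random.default_rng(0)
precise = np.rad2deg(rng.vonmises(0.0, 10.0, 300))
imprecise = np.rad2deg(rng.vonmises(0.0, 2.0, 300))
print(estimate_kappa(precise), estimate_kappa(imprecise))
```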
8
Interactions Between Visual Working Memory, Attention, and Color Categories: A Pupillometry Study. J Cogn 2022; 5:16. [PMID: 36072094] [PMCID: PMC9400663] [DOI: 10.5334/joc.208]
Abstract
Recent studies have found that visual working memory (VWM) for color shows a categorical bias: observers typically remember colors as closer to their category prototype than they actually were. Here, we further examine color-category effects on VWM using pupillometry. Participants remembered a color for later reproduction on a color wheel. During the retention interval, a colored probe was presented, and we measured pupil constriction in response to this probe, assuming that the strength of constriction reflects the visual saliency of the probe. We found that the pupil initially constricted most strongly for non-matching colors that were maximally different from the memorized color; this likely reflects a lack of visual adaptation for these colors, which renders them more salient than memory-matching colors (which were shown before). Strikingly, this effect reversed later in time, such that pupil constriction was more prolonged for memory-matching colors than for non-matching colors; this likely reflects that memory-matching colors capture attention more strongly, and perhaps for longer, than non-matching colors do. We found no effects of color categories on pupil constriction: after controlling for color distance, (non-matching) colors from the same category as the memory color did not produce a different pupil response than colors from a different category; however, behavioral responses were biased by color categories. In summary, pupil constriction to colored probes reflects both visual adaptation and VWM content but, unlike behavioral measures, is not notably affected by color categories.
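To make the dependent measure concrete, the sketch below shows one common way to quantify probe-evoked pupil constriction: baseline-correct each trial's pupil trace and take the peak constriction in a post-probe window. The array shapes, window choices, and data are illustrative assumptions, not the authors' exact preprocessing.

```python
import numpy as np

def peak_constriction(pupil, baseline, window):
    """Baseline-corrected peak pupil constriction per trial.

    pupil: (n_trials, n_samples) pupil-size traces aligned to probe onset.
    baseline, window: slices into the sample axis.
    Returns the most negative baseline-corrected value per trial
    (more negative = stronger constriction).
    """
    base = pupil[:, baseline].mean(axis=1, keepdims=True)
    return (pupil[:, window] - base).min(axis=1)

# Hypothetical data: 100 trials, 500 samples (e.g., 2 s at 250 Hz)
rng = np.random.default_rng(0)
pupil = rng.normal(5.0, 0.05, size=(100, 500))
pupil[:, 100:300] -= 0.3                               # simulated probe-evoked constriction
is_match = rng.integers(0, 2, size=100).astype(bool)   # memory-matching vs. non-matching probe

c = peak_constriction(pupil, baseline=slice(0, 50), window=slice(50, 500))
print(f"match: {c[is_match].mean():.3f}, non-match: {c[~is_match].mean():.3f}")
```

Comparing such a measure across time windows for memory-matching versus non-matching probes corresponds to the adaptation and attention effects described above.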
9
Allocation of resources in working memory: Theoretical and empirical implications for visual search. Psychon Bull Rev 2021; 28:1093-1111. [PMID: 33733298] [PMCID: PMC8367923] [DOI: 10.3758/s13423-021-01881-5]
Abstract
Recently, working memory (WM) has been conceptualized as a limited resource, distributed flexibly and strategically between an unlimited number of representations. In addition to improving the precision of representations in WM, the allocation of resources may also shape how these representations act as attentional templates to guide visual search. Here, we reviewed recent evidence in favor of this assumption and proposed three main principles that govern the relationship between WM resources and template-guided visual search. First, the allocation of resources to an attentional template has an effect on visual search, as it may improve the guidance of visual attention, facilitate target recognition, and/or protect the attentional template against interference. Second, the allocation of the largest amount of resources to a representation in WM is not sufficient to give this representation the status of attentional template and thus, the ability to guide visual search. Third, the representation obtaining the status of attentional template, whether at encoding or during maintenance, receives an amount of WM resources proportional to its relevance for visual search. Thus defined, the resource hypothesis of visual search constitutes a parsimonious and powerful framework, which provides new perspectives on previous debates and complements existing models of template-guided visual search.
10
Abstract
The pupil responds reflexively to changes in brightness and focal distance to maintain the smallest pupil (and thus the highest visual acuity) that still allows sufficient light to reach the retina. The pupil also responds to a wide variety of cognitive processes, but the functions of these cognitive responses are still poorly understood. In this review, I propose that cognitive pupil responses, like their reflexive counterparts, serve to optimize vision. Specifically, an emphasis on central vision over peripheral vision results in pupil constriction, and this likely reflects the fact that central vision benefits most from the increased visual acuity provided by small pupils. Furthermore, an intention to act with a bright stimulus results in preparatory pupil constriction, which allows the pupil to respond quickly when that bright stimulus is subsequently brought into view. More generally, cognitively driven pupil responses are likely a form of sensory tuning: a subtle adjustment of the eyes to optimize their properties for the current situation and the immediate future.
Affiliation(s)
- Sebastiaan Mathôt
- Department of Psychology, University of Groningen, 9712TS Groningen, The Netherlands.