1
Wohltjen S, Wheatley T. Interpersonal eye-tracking reveals the dynamics of interacting minds. Front Hum Neurosci 2024;18:1356680. PMID: 38532792; PMCID: PMC10963423; DOI: 10.3389/fnhum.2024.1356680.
Abstract
The human eye is a rich source of information about where, when, and how we attend. Our gaze paths indicate where and what captures our attention, while changes in pupil size can signal surprise, revealing our expectations. Similarly, the pattern of our blinks suggests levels of alertness and when our attention shifts between external engagement and internal thought. During interactions with others, these cues reveal how we coordinate and share our mental states. To leverage these insights effectively, we need accurate, timely methods to observe these cues as they naturally unfold. Advances in eye-tracking technology now enable real-time observation of these cues, shedding light on mutual cognitive processes that foster shared understanding, collaborative thought, and social connection. This brief review highlights these advances and the new opportunities they present for future research.
Affiliations
- Sophie Wohltjen: Department of Psychology, University of Wisconsin–Madison, Madison, WI, United States
- Thalia Wheatley: Department of Psychological and Brain Sciences, Consortium for Interacting Minds, Dartmouth College, Hanover, NH, United States; Santa Fe Institute, Santa Fe, NM, United States
2
Mayrand F, Capozzi F, Ristic J. A dual mobile eye tracking study on natural eye contact during live interactions. Sci Rep 2023;13:11385. PMID: 37452135; PMCID: PMC10349108; DOI: 10.1038/s41598-023-38346-9.
Abstract
Human eyes convey a wealth of social information, with mutual looks representing one of the hallmark gaze communication behaviors. However, it remains relatively unknown whether such reciprocal communication requires eye-to-eye contact or whether general face-to-face looking is sufficient. To address this question, while recording looking behavior in live interacting dyads using dual mobile eye trackers, we analyzed how often participants engaged in mutual looks as a function of looking towards the top (i.e., the Eye region) and bottom half of the face (i.e., the Mouth region). We further examined how these different types of mutual looks during an interaction connected with later gaze-following behavior elicited in an individual experimental task. The results indicated that dyads engaged in mutual looks in various combinations (Eye-to-eye, Eye-to-mouth, and Mouth-to-mouth) but proportionately spent little time in direct eye-to-eye contact. However, the time spent in eye-to-eye contact significantly predicted the magnitude of the gaze-following response later elicited by the partner's gaze direction. Thus, humans engage in looking patterns toward different face parts during interactions, with direct eye-to-eye looks occurring relatively infrequently; nevertheless, the social messages relayed during eye-to-eye contact appear to carry key information that propagates to affect subsequent individual social behavior.
Affiliations
- Florence Mayrand: Department of Psychology, McGill University, 1205 Dr Penfield Avenue, Montreal, QC, H3A 1B1, Canada
- Francesca Capozzi: Department of Psychology, Université du Québec à Montréal (UQAM), Montreal, Canada
- Jelena Ristic: Department of Psychology, McGill University, 1205 Dr Penfield Avenue, Montreal, QC, H3A 1B1, Canada
3
Thorsson M, Galazka MA, Åsberg Johnels J, Hadjikhani N. A novel end-to-end dual-camera system for eye gaze synchrony assessment in face-to-face interaction. Atten Percept Psychophys 2023. PMID: 37099200; DOI: 10.3758/s13414-023-02679-4.
Abstract
Quantification of face-to-face interaction can provide highly relevant information in cognitive and psychological science research. Current commercial glint-dependent solutions suffer from several disadvantages and limitations when applied in face-to-face interaction, including data loss, parallax errors, the inconvenience and distracting effect of wearables, and/or the need for several cameras to capture each person. Here we present a novel eye-tracking solution, consisting of a dual-camera system used in conjunction with an individually optimized deep learning approach that aims to overcome some of these limitations. Our data show that this system can accurately classify gaze location within different areas of the face of two interlocutors, and capture subtle differences in interpersonal gaze synchrony between two individuals during a (semi-)naturalistic face-to-face interaction.
Affiliations
- Max Thorsson: Gillberg Neuropsychiatry Centre, Institute of Neuroscience and Physiology, Sahlgrenska Academy, University of Gothenburg, Gothenburg, Sweden
- Martyna A Galazka: Gillberg Neuropsychiatry Centre, Institute of Neuroscience and Physiology, Sahlgrenska Academy, University of Gothenburg, Gothenburg, Sweden
- Jakob Åsberg Johnels: Gillberg Neuropsychiatry Centre and Section of Speech and Language Pathology, Institute of Neuroscience and Physiology, Sahlgrenska Academy, University of Gothenburg, Gothenburg, Sweden
- Nouchine Hadjikhani: Gillberg Neuropsychiatry Centre, Institute of Neuroscience and Physiology, Sahlgrenska Academy, University of Gothenburg, Gothenburg, Sweden; Athinoula A. Martinos Center for Biomedical Imaging, Massachusetts General Hospital, Harvard Medical School, Boston, MA, USA
4
Robotic Gaze Responsiveness in Multiparty Teamwork. Int J Soc Robot 2022. DOI: 10.1007/s12369-022-00955-1.
5
Correia F, Melo FS, Paiva A. When a Robot Is Your Teammate. Top Cogn Sci 2022. PMID: 36573665; DOI: 10.1111/tops.12634.
Abstract
Creating effective teamwork between humans and robots involves not only addressing their performance as a team but also sustaining the quality and sense of unity among teammates, also known as cohesion. This paper explores the research problem of how to endow robotic teammates with social capabilities that improve their cohesive alliance with humans. By defining the concept of a human-robot cohesive alliance in light of the multidimensional construct of cohesion from the social sciences, we propose to address this problem through the idea of multifaceted human-robot cohesion. We present our preliminary efforts from previous work to examine each of the five dimensions of cohesion: social, collective, emotional, structural, and task. We finish the paper with a discussion of how human-robot cohesion contributes to the key questions and ongoing challenges of creating robotic teammates. Overall, cohesion in human-robot teams may be a key factor in propelling team performance, and it should be considered in the design, development, and evaluation of robotic teammates.
Affiliations
- Filipa Correia: INESC-ID, Instituto Superior Técnico, Universidade de Lisboa; ITI, LARSyS, Instituto Superior Técnico, Universidade de Lisboa
- Ana Paiva: INESC-ID, Instituto Superior Técnico, Universidade de Lisboa
6
Boukarras S, Ferri D, Frisanco A, Farnese ML, Consiglio C, Alvino I, Bianchi F, D’Acunto A, Borgogni L, Aglioti SM. Bringing social interaction at the core of organizational neuroscience. Front Psychol 2022;13:1034454. PMID: 36467198; PMCID: PMC9714489; DOI: 10.3389/fpsyg.2022.1034454.
Abstract
Organizations are composed of individuals working together to achieve specific goals, and interpersonal dynamics exert a strong influence on workplace behaviour. Nevertheless, the dual and multiple perspective of interactions has so far been scarcely considered by Organizational Neuroscience (ON), the emerging field of study that aims at incorporating findings from cognitive and brain sciences into the investigation of organizational behaviour. This perspective article aims to highlight the potential benefits of adopting experimental settings involving two or more participants (the so-called "second person" approach) for studying the neural bases of organizational behaviour. Specifically, we stress the idea that moving beyond the individual perspective and capturing the dynamical relationships occurring within dyads or groups (e.g., leaders and followers, salespersons and clients, teams) might bring novel insights into the rising field of ON. In addition, designing research paradigms that reliably recreate real work and life situations might increase the generalizability and ecological validity of its results. We start with a brief overview of the current state of ON research and continue by describing the second-person approach to social neuroscience. Finally, we outline how this approach could be extended to ON. To this end, we focus on leadership, group processes, and emotional contagion as potential targets of interpersonal ON research.
Affiliations
- Sarah Boukarras: Department of Psychology, Sapienza University of Rome, Rome, Italy; Santa Lucia Foundation, IRCCS, Rome, Italy
- Donato Ferri: Department of Psychology, Sapienza University of Rome, Rome, Italy; EY, Rome, Italy
- Althea Frisanco: Santa Lucia Foundation, IRCCS, Rome, Italy; Sapienza University of Rome and CLNS@Sapienza, Italian Institute of Technology, Rome, Italy
- Chiara Consiglio: Department of Psychology, Sapienza University of Rome, Rome, Italy
- Ilario Alvino: Department of Legal Sciences, Sapienza University of Rome, Rome, Italy
- Francesco Bianchi: Department of Psychology, Sapienza University of Rome, Rome, Italy; EY, Rome, Italy
- Laura Borgogni: Department of Psychology, Sapienza University of Rome, Rome, Italy
- Salvatore Maria Aglioti: Santa Lucia Foundation, IRCCS, Rome, Italy; Sapienza University of Rome and CLNS@Sapienza, Italian Institute of Technology, Rome, Italy
7
Eye gaze and visual attention as a window into leadership and followership: A review of empirical insights and future directions. The Leadership Quarterly 2022. DOI: 10.1016/j.leaqua.2022.101654.
8
Wang X, Lu K, He Y, Gao Z, Hao N. Close spatial distance and direct gaze bring better communication outcomes and more intertwined neural networks. Neuroimage 2022;261:119515. PMID: 35932994; DOI: 10.1016/j.neuroimage.2022.119515.
Abstract
Non-verbal cues tone our communication. Previous studies found that non-verbal factors, such as spatial distance and gaze direction, significantly impact interpersonal communication. However, little is known about the underlying multi-brain neural correlates, or about whether these factors affect high-level creative group communication. Here, we provide a new, scalable, neuro-based approach to explore the effects of non-verbal factors on different communication tasks, and reveal the underlying multi-brain neural correlates using an fNIRS-based hyperscanning technique. Across two experiments, we found that closer spatial distance and a more direct gaze angle could promote collaborative behaviors, improve both creative and non-creative communication outcomes, and enhance inter-brain neural synchronization. Moreover, compared to the non-creative communication task, participants' inter-brain network was more intertwined when performing the creative communication task. These findings suggest that close spatial distance and direct gaze serve as positive social cues, bringing interacting brains into alignment and optimizing inter-brain information transfer, thus improving communication outcomes.
Affiliations
- Xinyue Wang, Kelong Lu, Yingyao He, Zhenni Gao, Ning Hao: Shanghai Key Laboratory of Mental Health and Psychological Crisis Intervention, School of Psychology and Cognitive Science, East China Normal University, Shanghai 200062, China
9
Wang X, Zhang J, Zhang H, Zhao S, Liu H. Vision-Based Gaze Estimation: A Review. IEEE Trans Cogn Dev Syst 2022. DOI: 10.1109/tcds.2021.3066465.
Affiliations
- Xinming Wang: School of Mechanical and Automation, State Key Laboratory of Robotics and Systems, Harbin Institute of Technology (Shenzhen), Shenzhen, China
- Jianhua Zhang: School of Computer Science and Engineering, Tianjin University of Technology, Tianjin, China
- Hanlin Zhang: School of Mechanical and Automation, State Key Laboratory of Robotics and Systems, Harbin Institute of Technology (Shenzhen), Shenzhen, China
- Shuwen Zhao: School of Computing, University of Portsmouth, Portsmouth, UK
- Honghai Liu: School of Mechanical and Automation, State Key Laboratory of Robotics and Systems, Harbin Institute of Technology (Shenzhen), Shenzhen, China
10
Parra E, García Delgado A, Carrasco-Ribelles LA, Chicchi Giglioli IA, Marín-Morales J, Giglio C, Alcañiz Raya M. Combining Virtual Reality and Machine Learning for Leadership Styles Recognition. Front Psychol 2022;13:864266. PMID: 35712148; PMCID: PMC9197484; DOI: 10.3389/fpsyg.2022.864266.
Abstract
The aim of this study was to evaluate the viability of a new selection procedure based on machine learning (ML) and virtual reality (VR). Specifically, decision-making behaviours and eye-gaze patterns were used to classify individuals based on their leadership styles while immersed in virtual environments that represented social workplace situations. The virtual environments were designed using an evidence-centred design approach. Interaction and gaze patterns were recorded in 83 subjects, who were classified as having either a high or low leadership style, assessed using the Multifactor Leadership Questionnaire. An ML model that combined behaviour outputs and eye-gaze patterns was developed to predict subjects' leadership styles (high vs. low). The results indicated that the different styles could be differentiated by eye-gaze patterns and behaviours carried out during immersive VR. Eye-tracking measures contributed more significantly to this differentiation than behavioural metrics. Although the results should be taken with caution, as the small sample does not allow generalization, this study illustrates the potential for a future research roadmap that combines VR, implicit measures, and ML for personnel selection.
Affiliations
- Elena Parra: Institute for Research and Innovation in Bioengineering, Polytechnic University of Valencia, Valencia, Spain
- Aitana García Delgado: Institute for Research and Innovation in Bioengineering, Polytechnic University of Valencia, Valencia, Spain
- Lucía Amalia Carrasco-Ribelles: Institute for Research and Innovation in Bioengineering, Polytechnic University of Valencia, Valencia, Spain; Fundació Institut Universitari per a la Recerca a l’Atenció Primària de Salut Jordi Gol i Gurina, Cornellà de Llobregat, Spain
- Javier Marín-Morales: Institute for Research and Innovation in Bioengineering, Polytechnic University of Valencia, Valencia, Spain
- Cristina Giglio: Institute for Research and Innovation in Bioengineering, Polytechnic University of Valencia, Valencia, Spain
- Mariano Alcañiz Raya: Institute for Research and Innovation in Bioengineering, Polytechnic University of Valencia, Valencia, Spain
11
Xue P, Wang C, Huang W, Zhou G. ORB Features and Isophotes Curvature Information for Eye Center Accurate Localization. Int J Pattern Recogn 2022. DOI: 10.1142/s0218001422560055.
Abstract
Pupil center recognition and localization is an essential branch of ergonomics, with applications in emotion analysis and attention judgment. How to obtain the position of the pupil center from eye photos is the core problem of this field. Previous studies provided a helpful method, using the scale-invariant feature transform (SIFT) to extract relevant features and combining them with a K-Nearest Neighbor (KNN) classifier. However, this method's accuracy is not satisfactory, and under some conditions it suffers from position drift and other problems. We put forward a new approach using Oriented FAST and Rotated BRIEF (ORB) features and a Random Forest (RF) classifier. Experiments show that our method improves the robustness of localization, and the use of isophotes yields low computational cost, allowing real-time processing. Meanwhile, ORB features and RF perform comparably well, yielding an accuracy of 92.88% (BioID database).
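The classification stage described in this abstract can be illustrated schematically. The sketch below is not the authors' implementation: it uses synthetic 32-byte ORB-like descriptors (a real pipeline would extract them from eye images with an ORB detector, e.g. OpenCV's) and scikit-learn's RandomForestClassifier to label candidate patches as pupil center vs. not.

```python
# Illustrative sketch only -- not the paper's implementation.
# ORB descriptors are 32-byte binary vectors; here we synthesize two
# well-separated "non-pupil" and "pupil-center" descriptor clusters
# instead of running a real ORB extractor on eye images.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

def synthetic_descriptors(n, offset):
    # 32 values per descriptor, shifted so the two classes separate
    return rng.integers(0, 128, size=(n, 32)) + offset

X = np.vstack([synthetic_descriptors(200, 0),     # non-pupil patches
               synthetic_descriptors(200, 100)])  # pupil-center patches
y = np.array([0] * 200 + [1] * 200)

# Random Forest replaces the KNN classifier used in earlier SIFT-based work
clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)

# Classify a new candidate patch descriptor drawn from the pupil cluster
probe = synthetic_descriptors(1, 100)
print(clf.predict(probe)[0])
```

On such cleanly separated synthetic data the forest labels the probe as class 1 (pupil center); on real BioID patches the classes overlap, which is where the ensemble's robustness matters.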
Affiliations
- Pengxiang Xue: College of Optoelectronic Engineering, Xi’an Technological University, Xi’an, P. R. China
- Changyuan Wang: College of Computer Science, Xi’an Technological University, Xi’an, P. R. China
- Wenbo Huang: College of Optoelectronic Engineering, Xi’an Technological University, Xi’an, P. R. China
- Guanghao Zhou: College of Computer Science, Xi’an Technological University, Xi’an, P. R. China
12
Ristic J, Capozzi F. Interactive Cognition: An introduction. Visual Cognition 2022. DOI: 10.1080/13506285.2021.2013146.
Affiliations
- Jelena Ristic: Department of Psychology, McGill University, Montreal, Canada
13
Capozzi F, Bayliss AP, Ristic J. Standing out from the crowd: Both cue numerosity and social information affect attention in multi-agent contexts. Q J Exp Psychol (Hove) 2021;74:1737-1746. PMID: 33845707; PMCID: PMC8392755; DOI: 10.1177/17470218211013028.
Abstract
Groups of people offer abundant opportunities for social interactions. We used a two-phase task to investigate how social cue numerosity and social information about an individual affected attentional allocation in such multi-agent settings. The learning phase was a standard gaze-cuing procedure in which a stimulus face could be either uninformative or informative about the upcoming target. The test phase was a group-cuing procedure in which the stimulus faces from the learning phase were presented in groups of three. The target could either be cued by the group minority (i.e., one face) or majority (i.e., two faces) or by uninformative or informative stimulus faces. Results showed an effect of cue numerosity, whereby responses were faster to targets cued by the group majority than the group minority. However, responses to targets cued by informative identities included in the group minority were as fast as responses to targets cued by the group majority. Thus, previously learned social information about an individual was able to offset the general enhancement of cue numerosity, revealing a complex interplay between cue numerosity and social information in guiding attention in multi-agent settings.
Affiliations
- Jelena Ristic: Department of Psychology, McGill University, Montreal, QC, Canada
14
Affiliations
- Jelena Ristic: Department of Psychology, McGill University, Montreal, Canada
15
Gobel MS, Giesbrecht B. Social information rapidly prioritizes overt but not covert attention in a joint spatial cueing task. Acta Psychol (Amst) 2020;211:103188. PMID: 33080443; DOI: 10.1016/j.actpsy.2020.103188.
Abstract
Coordinating actions with others is crucial for our survival. Our ability to see what others are seeing and to align our visual attention with them facilitates these joint actions. In the present research, we set out to increase our understanding of such joint attention by investigating the extent to which social information can prioritize overt (moving the eyes to attend) and covert (shifting attention without eye movements) attention in a joint spatial cueing task. Participants saw a cue and detected a target at the same or a different location alongside an unseen partner of either higher or lower social rank. In a novel twist, participants were led to believe that the cue was connected to the gaze location of their partner. In Experiment 1, where participants were told not to move their eyes (covert attention), the partner's social rank did not change how quickly participants detected targets. But in Experiment 2, where participants were free to move their eyes naturally (overt attention), inhibition of return effects (slower responses to cued than uncued targets) were modulated by their partner's social rank. These social top-down effects occurred even at a short SOA of 150 ms. Our findings suggest that overt attention might provide a key tool for joint action, as it is penetrable by social information at the early stages of information processing.
16
Schepisi M, Porciello G, Aglioti SM, Panasiti MS. Oculomotor behavior tracks the effect of ideological priming on deception. Sci Rep 2020;10:9555. PMID: 32533078; PMCID: PMC7293254; DOI: 10.1038/s41598-020-66151-1.
Abstract
The decision to lie to another person involves a conflict between one's own and others' interests. Political ideology may foster self-promoting or self-transcending values and may thus balance or fuel self- vs. other-related conflicts. Here, we explored in politically non-aligned participants whether oculomotor behavior may index the influence of prime stimuli related to left- and right-wing ideologies on moral decision-making. We presented pictures of Italian politicians and ideological words in a paradigm in which participants could lie to opponents of high vs. low socio-economic status to obtain a monetary reward. Results showed that left-wing words decreased self-gain lies and increased other-gain ones. Oculomotor behavior revealed that gazing longer at politicians' pictures led participants to look longer at the opponent's status-related information than at the game's outcome-related information before the decision. This, in turn, caused participants to lie less to low-status opponents. Moreover, after lying, participants averted their gaze from high-status opponents and maintained it towards low-status ones. Our results offer novel evidence that ideological priming influences moral decision-making and suggest that oculomotor behavior may provide crucial insights into how this process takes place.
Affiliations
- Michael Schepisi: Department of Psychology, "Sapienza" University of Rome, Rome, Italy; IRCCS, Santa Lucia Foundation, Rome, Italy
- Giuseppina Porciello: Department of Psychology, "Sapienza" University of Rome, Rome, Italy; IRCCS, Santa Lucia Foundation, Rome, Italy
- Salvatore Maria Aglioti: IRCCS, Santa Lucia Foundation, Rome, Italy; Department of Psychology, "Sapienza" University of Rome and CNLS@Sapienza Istituto Italiano di Tecnologia, Rome, Italy
- Maria Serena Panasiti: Department of Psychology, "Sapienza" University of Rome, Rome, Italy; IRCCS, Santa Lucia Foundation, Rome, Italy