1
Zhang Y, Martinez-Cedillo AP, Mason HT, Vuong QC, Garcia-de-Soria MC, Mullineaux D, Knight MI, Geangu E. An automatic sustained attention prediction (ASAP) method for infants and toddlers using wearable device signals. Sci Rep 2025; 15:13298. [PMID: 40247023] [PMCID: PMC12006380] [DOI: 10.1038/s41598-025-96794-x]
Abstract
Sustained attention (SA) is a critical cognitive ability that emerges in infancy and affects various aspects of development. Research on SA typically occurs in lab settings, which may not reflect infants' real-world experiences. Infant wearable technology can collect multimodal data in natural environments, including physiological signals for measuring SA. Here we introduce an automatic sustained attention prediction (ASAP) method that harnesses electrocardiogram (ECG) and accelerometer (Acc) signals. Data from 75 infants (6 to 36 months of age) were recorded during different activities, with some activities emulating those occurring in the natural environment (i.e., free play). Human coders annotated the ECG data for SA periods, validated by fixation data. ASAP was trained on temporal and spectral features from the ECG and Acc signals to detect SA, performing consistently across age groups. To demonstrate ASAP's applicability, we investigated the relationship between SA and perceptual features (saliency and clutter) measured from egocentric free-play videos. Results showed that saliency in infants' and toddlers' views increased during attention periods and decreased with age for attention but not inattention. We observed no differences between ASAP attention detection and human-coded SA periods, demonstrating that ASAP effectively detects SA in infants during free play. Coupled with wearable sensors, ASAP provides unprecedented opportunities for studying infant development in real-world settings.
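The abstract describes training ASAP on temporal and spectral features extracted from ECG and accelerometer signals. As an illustration only (this is not the authors' pipeline; the feature choices, frequency bands, and function names below are assumptions), a minimal sketch of such feature extraction from an inter-beat-interval series might look like:

```python
import numpy as np

def temporal_features(rr_ms):
    """Basic time-domain heart-rate-variability features from RR intervals (ms)."""
    rr = np.asarray(rr_ms, dtype=float)
    diff = np.diff(rr)
    return {
        "mean_rr": rr.mean(),                 # average inter-beat interval
        "sdnn": rr.std(ddof=1),               # overall variability
        "rmssd": np.sqrt(np.mean(diff ** 2)), # beat-to-beat variability
    }

def band_powers(signal, fs, bands=((0.04, 0.15), (0.15, 0.4))):
    """Band power from a plain periodogram; the default bands are the LF/HF
    ranges conventionally used in HRV analysis (an assumption here)."""
    x = np.asarray(signal, dtype=float)
    x = x - x.mean()
    psd = np.abs(np.fft.rfft(x)) ** 2 / (fs * len(x))
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    return [psd[(freqs >= lo) & (freqs < hi)].sum() for lo, hi in bands]

# Toy usage: a jittered RR series standing in for an evenly resampled tachogram.
rng = np.random.default_rng(0)
rr = 500 + rng.normal(0, 20, size=120)  # ~120 bpm, plausible for an infant
feats = temporal_features(rr)
lf, hf = band_powers(rr, fs=4.0)        # assume a 4 Hz resampled series
```

The same temporal and band-power statistics could be computed over accelerometer magnitude to capture movement, and the combined feature vector fed to any standard classifier.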
Affiliation(s)
- Yisi Zhang
- Department of Psychological and Cognitive Sciences, Tsinghua University, Beijing, 100084, People's Republic of China
- A Priscilla Martinez-Cedillo
- Department of Psychology, University of York, York, YO10 5DD, England
- Department of Psychology, University of Essex, Wivenhoe Park, Colchester, Essex, CO4 3SQ, England
- Harry T Mason
- School of Physics, Engineering and Technology, University of York, York, YO10 5DD, England
- Bristol Medical School, University of Bristol, Oakfield House, Bristol, BS8 2BN, England
- Quoc C Vuong
- Bioscience Institute, Newcastle University, Newcastle Upon Tyne, NE1 7RU, England
- School of Psychology, Newcastle University, Newcastle Upon Tyne, NE1 7RU, England
- M Carmen Garcia-de-Soria
- Department of Psychology, University of York, York, YO10 5DD, England
- Department of Psychology, University of Aberdeen, Aberdeen, UK
- David Mullineaux
- Department of Mathematics, University of York, York, YO10 5DD, England
- Marina I Knight
- Department of Mathematics, University of York, York, YO10 5DD, England
- Elena Geangu
- Department of Psychology, University of York, York, YO10 5DD, England
2
Geangu E, Vuong QC. Seven-months-old infants show increased arousal to static emotion body expressions: Evidence from pupil dilation. Infancy 2023. [PMID: 36917082] [DOI: 10.1111/infa.12535]
Abstract
Human body postures provide perceptual cues that can be used to discriminate and recognize emotions. It was previously found that 7-month-olds' fixation patterns discriminated fear from other emotion body expressions, but it is not clear whether they also process the emotional content of those expressions. The emotional content of visual stimuli can increase arousal level, resulting in pupil dilation. To provide evidence that infants also process the emotional content of expressions, we analyzed variations in pupil size in response to emotion stimuli. Forty-eight 7-month-old infants viewed adult body postures expressing anger, fear, happiness, and neutral expressions while their pupil size was measured. There was a significant emotion effect between 1040 and 1640 ms after image onset, when fear elicited larger pupil dilations than neutral expressions. A similar trend was found for anger expressions. Our results suggest that infants have increased arousal to negative-valence body expressions. Thus, in combination with previous fixation results, the pupil data show that infants as young as 7 months can perceptually discriminate static body expressions and process the emotional content of those expressions. The results extend information about infant processing of emotion expressions conveyed through other means (e.g., faces).
Affiliation(s)
- Elena Geangu
- Department of Psychology, University of York, York, UK
- Quoc C Vuong
- Biosciences Institute and School of Psychology, Newcastle University, Newcastle upon Tyne, UK
3
Ke H, Vuong QC, Geangu E. Three- and six-year-old children are sensitive to natural body expressions of emotion: An event-related potential emotional priming study. J Exp Child Psychol 2022; 224:105497. [PMID: 35850023] [DOI: 10.1016/j.jecp.2022.105497]
Abstract
Body movements provide a rich source of emotional information during social interactions. Although the ability to perceive biological motion cues related to those movements begins to develop during infancy, processing those cues to identify emotions likely continues to develop into childhood. Previous studies used posed or exaggerated body movements, which might not reflect the kind of body expressions children experience. The current study used an event-related potential (ERP) priming paradigm to investigate the development of emotion recognition from more naturalistic body movements. Point-light displays (PLDs) of male adult bodies expressing happy or angry emotional movements while narrating a story were used as prime stimuli, whereas audio recordings of the words "happy" and "angry" spoken with an emotionally neutral prosody were used as targets. We recorded the ERPs time-locked to the onset of the auditory target from 3- and 6-year-old children, and we compared amplitude and latency of the N300 and N400 responses between the two age groups in the different prime-target conditions. There was an overall effect of prime for the N300 amplitude, with more negative-going responses for happy PLDs compared with angry PLDs. There was also an interaction between prime and target for the N300 latency, suggesting that all children were sensitive to the emotional congruency between body movements and words. For the N400 component, there was only an interaction among age, prime, and target for latency, suggesting an age-dependent modulation of this component when prime and target did not match in emotional information. Overall, our results suggest that the emergence of more complex emotion processing of body expressions occurs around 6 years of age, but it is not fully developed at this point in ontogeny.
Affiliation(s)
- Han Ke
- Department of Psychology, Lancaster University, Lancaster LA1 4YF, UK
- Quoc C Vuong
- Biosciences Institute, Newcastle University, Newcastle upon Tyne NE2 4HH, UK
- Elena Geangu
- Department of Psychology, University of York, York YO10 5DD, UK
4
Ogren M, Johnson SP. Nonverbal emotion perception and vocabulary in late infancy. Infant Behav Dev 2022; 68:101743. [PMID: 35763939] [PMCID: PMC10251432] [DOI: 10.1016/j.infbeh.2022.101743]
Abstract
Language has been proposed as a potential mechanism for young children's developing understanding of emotion. However, much remains unknown about this relation at the level of individual differences. The present study investigated 15- to 18-month-old infants' perception of emotions across multiple pairs of faces. Parents reported their child's productive vocabulary, and infants participated in a non-linguistic emotion perception task via an eye tracker. Infant vocabulary did not predict nonverbal emotion perception when accounting for infant age, gender, and general object perception ability (β = -0.15, p = .300). However, we observed a gender difference: Only girls' vocabulary scores were related to nonverbal emotion perception when controlling for age and general object perception ability (β = 0.42, p = .024). Further, boys showed a stronger preference for the novel emotion face than girls did (t(48) = 2.35, p = .023, d = 0.67). These data suggest that pathways of processing emotional information (e.g., relying on language vs. visual information) may differ for girls and boys in late infancy.
5
Do infants represent human actions cross-modally? An ERP visual-auditory priming study. Biol Psychol 2021; 160:108047. [PMID: 33596461] [DOI: 10.1016/j.biopsycho.2021.108047]
Abstract
Recent findings indicate that 7-month-old infants perceive and represent the sounds inherent to moving human bodies. However, it is not known whether infants integrate auditory and visual information in representations of specific human actions. To address this issue, we used ERPs to investigate infants' neural sensitivity to the correspondence between sounds and images of human actions. In a cross-modal priming paradigm, 7-month-olds were presented with the sounds generated by two types of human body movement, walking and handclapping, after watching the kinematics of those actions in either a congruent or an incongruent manner. ERPs recorded from frontal, central, and parietal electrodes in response to action sounds indicate that 7-month-old infants perceptually link the visual and auditory cues of human actions. However, at this age these percepts do not seem to be integrated into cognitive multimodal representations of human actions.