1. Schirm J, Gómez-Vargas AR, Perusquía-Hernández M, Skarbez RT, Isoyama N, Uchiyama H, Kiyokawa K. Identification of Language-Induced Mental Load from Eye Behaviors in Virtual Reality. Sensors (Basel). 2023;23:6667. PMID: 37571449; PMCID: PMC10422404; DOI: 10.3390/s23156667.
Abstract
Experiences of virtual reality (VR) can easily break if the method of evaluating subjective user states is intrusive. Behavioral measures are increasingly used to avoid this problem. One such measure is eye tracking, which has recently become more standard in VR and is often used for content-dependent analyses. This research is an endeavor to utilize content-independent eye metrics, such as pupil size and blinks, for identifying mental load in VR users. We generated mental load independently of the visuals through auditory stimuli. We also defined and measured a new eye metric, focus offset, which seeks to capture the phenomenon of "staring into the distance" without focusing on a specific surface. In the experiment, VR-experienced participants listened to two native-language and two foreign-language stimuli inside a virtual phone booth. The results show that with increasing mental load, relative pupil size increased on average by 0.512 SDs (0.118 mm), with 57% reduced variance. To a lesser extent, mental load led to fewer fixations, less voluntary gazing at distracting content, and a larger focus offset, as if looking through surfaces (about 0.343 SDs, 5.10 cm). These results agree with previous studies. Overall, we encourage further research on content-independent eye metrics, and we hope that future hardware and algorithms will further increase tracking stability.
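The two metrics named in this abstract lend themselves to concrete computation: "relative pupil size" is commonly a baseline-standardized pupil diameter, and a "focus offset" can be read as the difference between the distance at which the two gaze rays converge and the distance of the surface the gaze ray actually hits. The Python sketch below illustrates that reading; the function names, baseline window, and geometry are assumptions for illustration, not the authors' implementation.

```python
# Minimal sketch of two content-independent eye metrics: (1) baseline-
# standardized pupil size and (2) a "focus offset" computed as vergence
# distance minus fixated-surface distance. All names and formulas are
# illustrative assumptions.
import numpy as np

def relative_pupil_size(pupil_mm: np.ndarray, baseline_samples: int = 300) -> np.ndarray:
    """Standardize pupil diameter against an assumed rest-period baseline."""
    baseline = pupil_mm[:baseline_samples]
    return (pupil_mm - baseline.mean()) / baseline.std()

def vergence_distance(left_origin, left_dir, right_origin, right_dir) -> float:
    """Distance from the eyes' midpoint to the closest point between the
    two gaze rays (the vergence point)."""
    w0 = left_origin - right_origin
    a = left_dir @ left_dir
    b = left_dir @ right_dir
    c = right_dir @ right_dir
    d = left_dir @ w0
    e = right_dir @ w0
    denom = a * c - b * b
    t_left = (b * e - c * d) / denom if denom > 1e-9 else 0.0
    vergence_point = left_origin + t_left * left_dir
    eye_center = 0.5 * (left_origin + right_origin)
    return float(np.linalg.norm(vergence_point - eye_center))

def focus_offset(vergence_dist: float, surface_dist: float) -> float:
    """Positive when the eyes converge behind the fixated surface,
    i.e., "looking through" it."""
    return vergence_dist - surface_dist
```

Under this reading, a sustained positive focus offset would correspond to the "staring into the distance" behavior the abstract describes.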
Affiliation(s)
- Johannes Schirm
- Graduate School of Science and Technology, Nara Institute of Science and Technology, Ikoma 630-0192, Japan
- Andrés Roberto Gómez-Vargas
- Graduate School of Science and Technology, Nara Institute of Science and Technology, Ikoma 630-0192, Japan
- Monica Perusquía-Hernández
- Graduate School of Science and Technology, Nara Institute of Science and Technology, Ikoma 630-0192, Japan
- Richard T. Skarbez
- Department of Computer Science and Information Technology, School of Computing, Engineering and Mathematical Sciences, La Trobe University, Melbourne Campus, Melbourne, VIC 3086, Australia
- Naoya Isoyama
- Faculty of Social Information Studies, Otsuma Women’s University, Tokyo 102-8357, Japan
- Hideaki Uchiyama
- Graduate School of Science and Technology, Nara Institute of Science and Technology, Ikoma 630-0192, Japan
- Kiyoshi Kiyokawa
- Graduate School of Science and Technology, Nara Institute of Science and Technology, Ikoma 630-0192, Japan
2. Li C, Du C, Ge S, Tong T. An eye-tracking study on visual perception of vegetation permeability in virtual reality forest exposure. Front Public Health. 2023;11:1089423. PMID: 36761146; PMCID: PMC9902884; DOI: 10.3389/fpubh.2023.1089423.
Abstract
Previous studies have confirmed significant effects of single forest stand attributes, such as forest type (FT), understory vegetation cover (UVC), and understory vegetation height (UVH), on visitors' visual perception. However, few studies have clearly determined the relationship between vegetation permeability and visual perception, even though the former is formed by the interaction of multiple stand attributes (i.e., FT, UVC, UVH). Based on a mixed factor matrix of FT (coniferous and broadleaf forests), UVC level (10, 60, and 100%), and UVH level (0.1, 1, and 3 m), the study created 18 immersive virtual forest videos with different stand attributes. Virtual reality eye-tracking technology and questionnaires were used to collect visual perception data while participants viewed the videos. The study finds that vegetation permeability, formed by the interaction of canopy density (i.e., FT) and understory density (i.e., UVC, UVH), significantly affects participants' visual perception. In terms of visual physiological characteristics, pupil size is significantly negatively correlated with vegetation permeability while participants view the virtual forest. In terms of visual psychological characteristics, the understory density formed by the interaction of UVC and UVH has a significant impact on visual attractiveness and perceived safety, with understory density significantly negatively correlated with perceived safety. In addition, the study finds a significant negative correlation between average pupil diameter and perceived safety when participants view the virtual forests. These findings may benefit the maintenance and management of forest parks and provide insights for similar studies of urban public green spaces.
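The key quantitative claims here are correlational, relating a per-video physiological mean to a per-video rating. A minimal Python sketch of that kind of check follows; the 18 values correspond to the 18 stimulus videos, but all numbers and variable names are hypothetical, not the study's data.

```python
# Minimal sketch of a correlational check like the one reported above: does
# mean pupil diameter co-vary (negatively) with perceived-safety ratings
# across the 18 stimulus videos? All data values are hypothetical.
import numpy as np
from scipy import stats

# One entry per video: mean pupil diameter (mm) and mean safety rating (1-5).
pupil_mm = np.array([3.9, 3.7, 3.6, 3.8, 3.5, 3.4, 3.7, 3.6, 3.3,
                     4.0, 3.8, 3.6, 3.9, 3.7, 3.5, 3.8, 3.5, 3.4])
safety   = np.array([2.1, 2.8, 3.4, 2.3, 3.1, 3.9, 2.5, 3.2, 4.1,
                     1.9, 2.6, 3.3, 2.2, 2.9, 3.8, 2.4, 3.0, 4.0])

r, p = stats.pearsonr(pupil_mm, safety)
print(f"Pearson r = {r:.2f}, p = {p:.4f}")  # a negative r would match the finding
```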
Affiliation(s)
- Chang Li
- School of Architecture and Urban Planning, Suzhou University of Science and Technology, Suzhou, China
- Chunlan Du
- Key Laboratory of New Technology for Construction of Cities in Mountain Area, Ministry of Education, Chongqing University, Chongqing, China
- School of Architecture and Urban Planning, Chongqing University, Chongqing, China
- Shutong Ge
- School of Architecture and Urban Planning, Suzhou University of Science and Technology, Suzhou, China
- Tong Tong
- School of Architecture and Urban Planning, Suzhou University of Science and Technology, Suzhou, China
3. Ban S, Lee YJ, Kim KR, Kim JH, Yeo WH. Advances in Materials, Sensors, and Integrated Systems for Monitoring Eye Movements. Biosensors (Basel). 2022;12:1039. PMID: 36421157; PMCID: PMC9688058; DOI: 10.3390/bios12111039.
Abstract
Eye movements are primary responses that reflect humans' voluntary intention and conscious selection. Because visual perception is one of the fundamental sensory interactions in the brain, eye movements contain critical information regarding physical/psychological health, perception, intention, and preference. With the advancement of wearable device technologies, the performance of eye-movement monitoring has improved significantly, leading to myriad applications for assisting and augmenting human activities. Among these, electrooculograms, measured by skin-mounted electrodes, have been widely used to track eye motions accurately. In addition, eye trackers that detect reflected optical signals offer an alternative that requires no wearable sensors. This paper provides a systematic summary of the latest research on materials, sensors, and integrated systems for monitoring eye movements and enabling human-machine interfaces. Specifically, we summarize recent developments in soft materials, biocompatible materials, manufacturing methods, sensor functions, system performance, and applications in eye tracking. Finally, we discuss the remaining challenges and suggest research directions for future studies.
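As context for the EOG approach this review covers, a common generic pipeline is to band-limit the skin-electrode signal and threshold its derivative to flag rapid deflections such as saccades and blinks. The Python sketch below illustrates that generic pipeline only; the sampling rate, filter cutoff, and threshold are assumptions, not values taken from the review.

```python
# Minimal sketch of turning a skin-electrode EOG signal into eye-movement
# events: low-pass filter, then threshold the signal's slope to flag rapid
# deflections (saccades/blinks). All parameter values are illustrative.
import numpy as np
from scipy.signal import butter, filtfilt

FS = 250  # Hz, assumed EOG sampling rate

def detect_saccades(eog_uv: np.ndarray, fs: int = FS,
                    slope_threshold_uv_per_s: float = 2000.0) -> np.ndarray:
    """Return sample indices where the filtered EOG slope exceeds threshold."""
    # Low-pass at 30 Hz to suppress mains and EMG noise while keeping saccades.
    b, a = butter(4, 30 / (fs / 2), btype="low")
    filtered = filtfilt(b, a, eog_uv)
    slope = np.gradient(filtered) * fs  # convert per-sample diff to uV/s
    return np.flatnonzero(np.abs(slope) > slope_threshold_uv_per_s)
```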
Affiliation(s)
- Seunghyeb Ban
- School of Engineering and Computer Science, Washington State University, Vancouver, WA 98686, USA
- IEN Center for Human-Centric Interfaces and Engineering, Institute for Electronics and Nanotechnology, Georgia Institute of Technology, Atlanta, GA 30332, USA
- Yoon Jae Lee
- IEN Center for Human-Centric Interfaces and Engineering, Institute for Electronics and Nanotechnology, Georgia Institute of Technology, Atlanta, GA 30332, USA
- School of Electrical and Computer Engineering, Georgia Institute of Technology, Atlanta, GA 30332, USA
- Ka Ram Kim
- IEN Center for Human-Centric Interfaces and Engineering, Institute for Electronics and Nanotechnology, Georgia Institute of Technology, Atlanta, GA 30332, USA
- George W. Woodruff School of Mechanical Engineering, Georgia Institute of Technology, Atlanta, GA 30332, USA
- Jong-Hoon Kim
- School of Engineering and Computer Science, Washington State University, Vancouver, WA 98686, USA
- Department of Mechanical Engineering, University of Washington, Seattle, WA 98195, USA
- Woon-Hong Yeo
- IEN Center for Human-Centric Interfaces and Engineering, Institute for Electronics and Nanotechnology, Georgia Institute of Technology, Atlanta, GA 30332, USA
- George W. Woodruff School of Mechanical Engineering, Georgia Institute of Technology, Atlanta, GA 30332, USA
- Wallace H. Coulter Department of Biomedical Engineering, Georgia Tech and Emory University School of Medicine, Atlanta, GA 30332, USA
- Neural Engineering Center, Institute for Materials, Institute for Robotics and Intelligent Machines, Georgia Institute of Technology, Atlanta, GA 30332, USA