1. Roßkopf S, Mühlberger A, Stärz F, van de Par S, Blau M, Kroczek LOH. Impact of Visual Virtual Scene and Localization Task on Auditory Distance Perception in Virtual Reality. IEEE Trans Vis Comput Graph 2025;31:2464-2474. PMID: 40085450. DOI: 10.1109/TVCG.2025.3549855.
Abstract
Virtual reality (VR) makes it possible to investigate auditory perception and cognition in realistic, controlled environments. However, when visual information is presented, sound localization results from multimodal integration. Additionally, using head-mounted displays (HMDs) distorts visual egocentric distances. With two different paradigms, we investigated the extent to which different visual scenes influence auditory distance perception and, secondarily, presence and realism. Specifically, different room models were displayed via HMD while participants localized sounds emanating from real loudspeakers. In the first paradigm, we manipulated whether the displayed room was congruent or incongruent with the physical room. In the second paradigm, we manipulated both room visibility (displaying either an audiovisually congruent room or a scene containing almost no spatial information) and localization task: participants indicated distances by placing a virtual loudspeaker, by walking, or by verbal report. While audiovisual room incongruence had a detrimental effect on distance perception, room visibility showed no main effect but did interact with task: overestimation of distances was greater with the placement task in the non-spatial scene. The results suggest that the visual scene affects auditory perception in VR and should therefore be considered, e.g., in virtual acoustics research.
2. Banerjee P, Montiel MP, Tomita L, Means O, Kutch J, Culbertson H. The Impact of Airflow and Multisensory Feedback on Immersion and Cybersickness in a VR Surfing Simulation. IEEE Trans Vis Comput Graph 2025;31:2445-2454. PMID: 40053658. DOI: 10.1109/TVCG.2025.3549125.
Abstract
Virtual Reality (VR) systems increasingly leverage multisensory feedback to enrich user experience and mitigate cybersickness. With a similar goal, this paper presents an in-depth exploration of integrating airflow with visual and kinesthetic cues in a VR surfing simulation. Using a custom-designed airflow system and a physical surfboard mounted on a 6-degree-of-freedom (DoF) motion platform, we present two studies that evaluate the effects of the different feedback modalities. The first study assesses the impact of variable airflow, which dynamically adjusts to the user's speed in VR (wind speed), compared with constant airflow, under both active and passive user engagement. Results demonstrate that variable airflow significantly enhances immersion and reduces cybersickness, particularly when users are actively engaged in the simulation. The second study evaluates the individual and combined effects of vision, motion, and airflow on acceleration perception, user immersion, and cybersickness, revealing that integrating all feedback modalities yields the most immersive and comfortable VR experience. This work underscores the importance of synchronized multisensory feedback in dynamic VR environments and provides valuable insights for the design of more immersive and realistic virtual simulations, particularly in aquatic, interactive, and motion-intensive scenarios.
3. Brunetti R, Ferrante S, Avella AM, Indraccolo A, Del Gatto C. Turning stories into learning journeys: the principles and methods of Immersive Education. Front Psychol 2024;15:1471459. PMID: 39712545. PMCID: PMC11659684. DOI: 10.3389/fpsyg.2024.1471459.
Abstract
This paper describes the theoretical and practical aspects of Immersive Education, an educational methodology based on interactive narratives, articulated as emotional journeys, to develop competencies. It was developed over three school years (2021-2024) with more than 400 students (8-12 years old) in public schools in Italy and Spain. Immersive Education can be integrated with curricular school activities and can target both curricular and transversal learning objectives, specifically those connected with the Personal, Social and Learning to Learn Key Competence (LifeComp European framework). The paper describes the inspirations that led to the creation of the methodology, including similar experiential learning approaches. It then analyses the theoretical principles of the methodology, dividing them into four key concepts, along with the psychological evidence supporting them. The four key concepts describe how Immersive Education aims to act as a motivation trigger, how it features a dramatic structure, how it is based on the involvement of the self, and how it focuses on fostering continuous engagement. A detailed analysis of implementation strategies follows, specifically concerning the management of emotional triggers and reactions, enriched by numerous examples taken from the projects implemented with the students. The conclusions open the way to future research directions to measure the impact of this approach on the development of transversal and specific competences.
Affiliation(s)
- Riccardo Brunetti: Experimental and Applied Psychology Laboratory, Department of Human Sciences, Università Europea di Roma, Rome, Italy; Project xx1, Rome, Italy
- Silvia Ferrante: Project xx1, Rome, Italy; Department of Developmental Psychology and Educational Research, ‘Sapienza’ University of Rome, Rome, Italy
- Allegra Indraccolo: Experimental and Applied Psychology Laboratory, Department of Human Sciences, Università Europea di Roma, Rome, Italy
- Claudia Del Gatto: Experimental and Applied Psychology Laboratory, Department of Human Sciences, Università Europea di Roma, Rome, Italy
4. Li G, Luo H, Yin X, Zhang Y, Li Z. Affording Social Experience for Adolescents Using Immersive Virtual Reality: A Moderated Mediation Analysis. Children (Basel) 2024;11:1362. PMID: 39594937. PMCID: PMC11593269. DOI: 10.3390/children11111362.
Abstract
BACKGROUND Immersive virtual reality (IVR) is a promising tool for providing adolescents with enriched social experience owing to its high-fidelity simulations and multimodal interaction. This study aimed to design and develop a multi-user IVR collaborative game, using a simultaneous localization and mapping (SLAM)-based inside-out tracking technique, to foster social experience among students. The study also explored the mechanism by which technology acceptance affected social experience in the IVR collaborative game, focusing on the mediating effects of presence, collective efficacy, and group effectiveness, as well as the moderating effect of social-emotional competence (SEC). METHODS A total of 104 seventh graders from a middle school in Central China participated in this study and completed the questionnaire; 87 valid responses were retrieved. RESULTS Technology acceptance influenced social experience both directly and indirectly. The mediation analysis revealed a key pathway influencing social experience: technology acceptance → presence → collective efficacy → group effectiveness → social experience. However, SEC had no moderating effect on either the relationship between technology acceptance and social experience or that between group effectiveness and social experience. CONCLUSIONS Based on these results, more appropriate IVR interventions could be developed for social-emotional learning among children and adolescents.
Affiliation(s)
- Gege Li: Faculty of Artificial Intelligence in Education, Central China Normal University, Wuhan 430079, China
- Heng Luo: Faculty of Artificial Intelligence in Education, Central China Normal University, Wuhan 430079, China
- Xin Yin: Faculty of Artificial Intelligence in Education, Central China Normal University, Wuhan 430079, China
- Yan Zhang: Faculty of Artificial Intelligence in Education, Central China Normal University, Wuhan 430079, China
- Zijian Li: School of Fine Arts, Central China Normal University, Wuhan 430079, China
5. Zhang J, Huang M, Chen Y, Liao KL, Shi J, Liang HN, Yang R. TouchMark: Partial Tactile Feedback Design for Upper Limb Rehabilitation in Virtual Reality. IEEE Trans Vis Comput Graph 2024;30:7430-7440. PMID: 39255139. DOI: 10.1109/TVCG.2024.3456173.
Abstract
The use of Virtual Reality (VR) technology, especially in medical rehabilitation, has expanded to include tactile cues along with visual stimuli. For patients with upper limb hemiplegia, tangible handles with haptic stimuli could improve their ability to perform daily activities. Traditional VR controllers are unsuitable for patient rehabilitation in VR, necessitating the design of specialized tangible handles with integrated tracking devices. Moreover, matching tactile stimulation with the corresponding virtual visuals could strengthen users' embodiment (i.e., owning and controlling virtual bodies) in VR, which is crucial for patients' training with virtual hands. Haptic stimuli have been shown to amplify embodiment in VR, whereas the effect of partial tactile stimulation from tangible handles on embodiment remains to be clarified. This research, comprising three experiments, investigates how partial tactile feedback from tangible handles impacts users' embodiment, and we propose a design concept called TouchMark for partial tactile stimuli that helps users quickly connect the physical and virtual worlds. In Study 1, handles with three partial tactile factors were manipulated to evaluate users' tactile and comfort perceptions when grasping tangible handles in a non-VR setting. In Study 2, we explored the effects of partial feedback using three forms of TouchMark on the embodiment of healthy users in VR with various tangible handles, while Study 3 made similar investigations with patients. These handles were used to complete virtual food-preparation tasks. Tactile and comfort perceptions of the tangible handles and users' embodiment were evaluated using questionnaires and interviews. The results indicate that TouchMark in its haptic line and ring forms significantly enhances users' embodiment relative to no stimulation, especially for patients. The low-cost and innovative TouchMark approach may assist users, particularly those with limited VR experience, in achieving embodiment and enhancing their virtual interactive experience.
6. Hu K, Wang R, Zhao S, Yin E, Wu H. The association between social rewards and anxiety: Links from neurophysiological analysis in virtual reality and social interaction game. Neuroimage 2024;299:120846. PMID: 39260780. DOI: 10.1016/j.neuroimage.2024.120846.
Abstract
Individuals' affective experience during social interaction can be intricate, influenced by various factors including monetary rewards and social cues. However, the evidence is divergent as to which of these factors contribute to social anxiety. To better understand the specific factors associated with anxiety during social interaction, we combined a social interaction task with neurophysiological recordings obtained through an anxiety-elicitation task conducted in a Virtual Reality (VR) environment. Employing inter-subject representational similarity analysis (ISRSA), we explored the potential linkage between individuals' anxiety-related neural patterns and their affective experiences during social interaction. Our findings suggest that, after controlling for other factors, the influence of the partner's emotional cues on individuals' affective experiences is specifically linked to their neural pattern of anxiety. This indicates that the emergence of anxiety during social interaction may be particularly associated with the emotional cues provided by the social partner, rather than with individuals' own rewards or prediction errors. These results provide further support for the cognitive theory of social anxiety and extend the application of VR to future cognitive and affective studies.
Affiliation(s)
- Keyu Hu: Centre for Cognitive and Brain Sciences and Department of Psychology, University of Macau, Macau, China
- Ruien Wang: Centre for Cognitive and Brain Sciences and Department of Psychology, University of Macau, Macau, China
- Shaokai Zhao: Defense Innovation Institute, Academy of Military Sciences, Beijing, China
- Erwei Yin: Defense Innovation Institute, Academy of Military Sciences, Beijing, China
- Haiyan Wu: Centre for Cognitive and Brain Sciences and Department of Psychology, University of Macau, Macau, China
7. Göktepe N, Drewing K, Schütz AC. Spatiotemporal Congruency Modulates Weighting of Visuotactile Information in Displacement Judgments. IEEE Trans Haptics 2024;17:860-869. PMID: 38980771. DOI: 10.1109/TOH.2024.3421953.
Abstract
Combining or integrating information from multiple senses often provides richer and more reliable estimates for the perception of objects and events. In daily life, sensory information from the same source is often in close spatiotemporal proximity, which can be an important determinant of whether and how multisensory signals are combined. Advanced technical display systems make it possible to present multisensory information in virtual environments. However, technical displays can lack the spatiotemporal fidelity of the real world because of rendering delays, so any spatiotemporal incongruency could alter how information is combined. In the current study, we tested this by investigating whether and how spatially and temporally discrepant tactile displacement cues can supplement imprecise visual displacement cues. Participants performed a visual displacement task with visual and tactile displacement cues under spatial and temporal incongruency conditions. We modelled how participants combined visual and tactile information in the visuotactile condition using their performance in the visual-only condition. We found that temporal incongruency led to an increase in tactile weights, although these weights were correlated with those in the congruent condition. In contrast, spatial incongruency led to individual differences in cue combination strategies. Our results illustrate the importance of spatiotemporal congruency when combining tactile and visual cues for visual displacement judgments. Given the altered cue combination strategies and the individual differences, we recommend that developers adopt individual spatiotemporal calibration procedures to improve the efficiency of sensory augmentation.
8. Meore A, Ganesh N, Sun S, Singer A, Byma L, Lorenzetti B, Feder A, Adams T, Galfalvy H, Boyer J, Haghighi F. Pilot study of telehealth delivery of horticultural therapy (TeleHT) as an acceptable intervention and in reducing suicide risk factors in veterans. Complement Ther Med 2024;85:103075. PMID: 39147286. DOI: 10.1016/j.ctim.2024.103075.
Abstract
OBJECTIVES Converging evidence indicates that Horticultural Therapy (HT) contributes to significant reductions in stress, loneliness, and depression, notable risk factors for suicidality. This pilot study aimed to assess the initial feasibility and acceptability of HT when administered virtually. INTERVENTION Telehealth-delivered horticultural therapy (TeleHT) was administered to groups of Veterans, including those with elevated suicide risk, over the course of four weeks. Participants were each mailed a package of at-home gardening supplies, which was used to facilitate multisensory nature experiences during weekly HT sessions administered via Zoom. OUTCOME MEASURES Participants completed thermometer-based scales for the suicide risk factors of stress, loneliness, depression, and pain before and after each TeleHT session. Qualitative assessments were completed upon conclusion of the four-week intervention. RESULTS Significant reductions in stress, depression, and loneliness were observed from weekly pre- to post-session measures (p < 0.05), with an 89.1% HT completion rate. Stress, pain, depression, and loneliness indices also showed small to medium-sized symptom reductions among Veterans with no history of suicidality (Cohen's d = -0.70, d = -0.49, d = -0.62, d = -0.71), while those with elevated suicide risk at baseline also showed reductions in these risk factors with small to medium effect sizes (d = -0.58, d = -.018, d = -0.46, d = -0.41). Qualitative post-intervention assessments indicated a high degree of acceptability and pointed to the mailed gardening packages as particularly relevant to positive experiences. CONCLUSIONS While future work is needed to fully assess efficacy, findings from this pilot study demonstrate initial feasibility and acceptability of TeleHT, with a high retention rate and positive qualitative assessments mirroring those of the in-person intervention.
Affiliation(s)
- Anne Meore: New York Botanical Garden, Bronx, NY, USA
- Shengnan Sun: James J. Peters VA Medical Center, Bronx, NY, USA; Icahn School of Medicine at Mount Sinai, New York, NY, USA
- Akiva Singer: James J. Peters VA Medical Center, Bronx, NY, USA
- Lauren Byma: James J. Peters VA Medical Center, Bronx, NY, USA
- Ann Feder: New York Botanical Garden, Bronx, NY, USA
- Toby Adams: New York Botanical Garden, Bronx, NY, USA
- Fatemeh Haghighi: James J. Peters VA Medical Center, Bronx, NY, USA; Icahn School of Medicine at Mount Sinai, New York, NY, USA
9. Gong Y, Zhang K, Lei IM, Wang Y, Zhong J. Advances in Piezoelectret Materials-Based Bidirectional Haptic Communication Devices. Adv Mater 2024;36:e2405308. PMID: 38895922. DOI: 10.1002/adma.202405308.
Abstract
Bidirectional haptic communication devices are accelerating the revolution in virtual/augmented reality and flexible/wearable electronics. As an emerging class of flexible piezoelectric materials, piezoelectret materials can effortlessly convert mechanical force into electrical signals and respond to electric fields by deforming, exhibiting enormous potential for the construction of bidirectional haptic communication devices. Existing reviews of piezoelectret materials focus primarily on flexible energy harvesters and sensors; the recent development of piezoelectret-based bidirectional haptic communication devices has not been comprehensively reviewed. Herein, a comprehensive overview of the construction of these materials, along with recent advances in bidirectional haptic communication devices, is provided. First, the development timeline, key characteristics, and various fabrication methods of piezoelectret materials are introduced. Subsequently, following the underlying mechanisms of bidirectional electromechanical signal conversion in piezoelectrets, strategies to improve the d33 coefficients of the materials are proposed. The principles of haptic perception and feedback are also highlighted, and representative works and progress in this area are summarized. Finally, the challenges and opportunities associated with improving the overall practicability of piezoelectret-based bidirectional haptic communication devices are discussed.
Affiliation(s)
- Yanting Gong: Department of Electromechanical Engineering and Centre for Artificial Intelligence and Robotics, University of Macau, Macau SAR, 999078, China
- Kaijun Zhang: Department of Electromechanical Engineering and Centre for Artificial Intelligence and Robotics, University of Macau, Macau SAR, 999078, China
- Iek Man Lei: Department of Electromechanical Engineering and Centre for Artificial Intelligence and Robotics, University of Macau, Macau SAR, 999078, China
- Yan Wang: Department of Chemical Engineering, Guangdong Technion-Israel Institute of Technology (GTIIT), Shantou, Guangdong, 515063, China
- Junwen Zhong: Department of Electromechanical Engineering and Centre for Artificial Intelligence and Robotics, University of Macau, Macau SAR, 999078, China
10. Narciso D, Melo M, Rodrigues S, Cunha JP, Vasconcelos-Raposo J, Bessa M. Studying the Influence of Multisensory Stimuli on a Firefighting Training Virtual Environment. IEEE Trans Vis Comput Graph 2024;30:4122-4136. PMID: 37028005. DOI: 10.1109/TVCG.2023.3251188.
Abstract
How we perceive and experience the world around us is inherently multisensory. Most of the Virtual Reality (VR) literature is based on the senses of sight and hearing. However, there is considerable potential for integrating additional stimuli into Virtual Environments (VEs), especially in a training context. Identifying the stimuli needed to obtain a virtual experience that is perceptually equivalent to a real one would lead users to behave the same across environments, which adds substantial value for several training areas, such as firefighting. In this article, we present an experiment assessing the impact of different sensory stimuli on users' stress, fatigue, cybersickness, Presence, and knowledge transfer in a firefighter training VE. The results suggest that the conditions that significantly impacted users' responses were wearing a firefighter's uniform and combining all the sensory stimuli under study: heat, weight, uniform, and mask. The results also showed that the VE did not induce cybersickness and was successful in transferring knowledge.
11. Chitra E, Mubin SA, Nadarajah VD, Se WP, Sow CF, Er HM, Mitra NK, Thiruchelvam V, Davamani F. A 3-D interactive microbiology laboratory via virtual reality for enhancing practical skills. Sci Rep 2024;14:12809. PMID: 38834815. DOI: 10.1038/s41598-024-63601-y.
Abstract
Virtual Reality (VR) laboratories are a new pedagogical approach to support psychomotor skills development in undergraduate programmes and achieve practical competency. VR laboratories are successfully used for virtual experiments in science courses and for clinical skills training in professional courses. This paper describes the development and evaluation of a VR-based microbiology laboratory on a Head-Mounted Display (HMD) for undergraduate students. Student and faculty perceptions and expectations were collected and incorporated into the laboratory design. An interactive, 3-dimensional VR laboratory with a 360° view was developed in Unity with the necessary assets and 3D models, replicating our physical laboratory setup as suggested by the students and faculty. Six microbiology experiments, on Gram staining, bacterial streaking, bacterial motility, the catalase test, the oxidase test, and biochemical tests, were placed on the virtual platform. First-year biomedical science students were recruited to evaluate the VR laboratory. Students' perception of the virtual laboratory was positive and encouraging. About 70% of the students expressed that they felt safe using the VR laboratory and found it engaging. They felt that it provided an immersive learning experience, and they appreciated that they could repeat each experiment multiple times without worrying about mistakes or mishaps and could personalise their learning by concentrating on specific experiments. Our in-house VR-based microbiology laboratory was later extended to other health professions programmes teaching microbiology.
Affiliation(s)
- Ebenezer Chitra: School of Health Sciences, International Medical University, Kuala Lumpur, Malaysia
- Siti Azreena Mubin: Asia Pacific University of Technology and Innovation, Kuala Lumpur, Malaysia
- Wong Pei Se: School of Pharmacy, International Medical University, Kuala Lumpur, Malaysia
- Chew Fei Sow: School of Medicine, International Medical University, Kuala Lumpur, Malaysia
- Hui Meng Er: School of Pharmacy, International Medical University, Kuala Lumpur, Malaysia
- Nilesh Kumar Mitra: School of Medicine, International Medical University, Kuala Lumpur, Malaysia
- Vinesh Thiruchelvam: Asia Pacific University of Technology and Innovation, Kuala Lumpur, Malaysia
- Fabian Davamani: School of Health Sciences, International Medical University, Kuala Lumpur, Malaysia
12. Brookman R, Hulm Z, Hearn L, Siette J, Mathew N, Deodhar S, Cass A, Smith J, Kenny B, Liu KPY, Harris CB. Evaluation of an exercise program incorporating an international cycling competition: a multimodal intervention model for physical, psychological, and social wellbeing in residential aged care. BMC Geriatr 2024;24:435. PMID: 38755554. PMCID: PMC11100139. DOI: 10.1186/s12877-024-05033-x.
Abstract
BACKGROUND The transition into residential aged care is frequently associated with reduced physical activity, social engagement, and emotional wellbeing. Our aim was to evaluate the impact of a 26-day international cycling competition (Road Worlds Competition for Seniors), incorporating elements of exercise, audiovisual cycling footage, social engagement, and gamification, on the physical, psychological, and social wellbeing of aged care residents, and to use the findings to inform a multimodal intervention model for maximising wellbeing in older adults. METHODS Residents (N = 32) participated in a mixed-methods, single-group intervention pilot study comparing pre- and post-competition measures across three wellbeing domains: physical, psychological, and social. In addition, interviews were conducted with residents (n = 27) and staff (n = 6) to explore their experiences. RESULTS Measures identified significant improvements across multiple wellbeing domains, including functional fitness, depression, self-efficacy, and social network size. The interview data indicated that the multimodal components of the program were valued by staff and residents, who enjoyed the gamification, audiovisual cycling footage, social engagement, opportunities for reminiscence, and camaraderie between peers, staff, and volunteers. CONCLUSIONS The findings highlight a constellation of benefits across the physical, psychological, and social domains of wellbeing and inform a model for innovative multidimensional programs in residential aged care. The benefits for residents with varying physical and cognitive abilities support the use of creative strategies that maximise inclusion and engagement.
Affiliation(s)
- Ruth Brookman: The MARCS Institute for Brain, Behaviour and Development, Western Sydney University, Locked Bag 1797, Penrith, NSW, 2751, Australia
- Zac Hulm: Harbison, 2 Charlotte St, Burradoo, NSW, 2576, Australia
- Leigh Hearn: School of Health Sciences, Western Sydney University, Locked Bag 1797, Penrith, NSW, 2751, Australia
- Joyce Siette: The MARCS Institute for Brain, Behaviour and Development, Western Sydney University, Locked Bag 1797, Penrith, NSW, 2751, Australia
- Nitish Mathew: Harbison, 2 Charlotte St, Burradoo, NSW, 2576, Australia
- Saili Deodhar: Harbison, 2 Charlotte St, Burradoo, NSW, 2576, Australia
- Angela Cass: The MARCS Institute for Brain, Behaviour and Development, Western Sydney University, Locked Bag 1797, Penrith, NSW, 2751, Australia
- Jamilla Smith: School of Health Sciences, Western Sydney University, Locked Bag 1797, Penrith, NSW, 2751, Australia
- Belinda Kenny: School of Health Sciences, Western Sydney University, Locked Bag 1797, Penrith, NSW, 2751, Australia
- Karen P Y Liu: School of Health Sciences, Western Sydney University, Locked Bag 1797, Penrith, NSW, 2751, Australia; Department of Rehabilitation Sciences, The Hong Kong Polytechnic University, Hung Hom, Hong Kong
- Celia B Harris: The MARCS Institute for Brain, Behaviour and Development, Western Sydney University, Locked Bag 1797, Penrith, NSW, 2751, Australia
13. Bernal-Berdun E, Vallejo M, Sun Q, Serrano A, Gutierrez D. Modeling the Impact of Head-Body Rotations on Audio-Visual Spatial Perception for Virtual Reality Applications. IEEE Trans Vis Comput Graph 2024;30:2624-2632. PMID: 38446650. DOI: 10.1109/TVCG.2024.3372112.
Abstract
Humans perceive the world by integrating multimodal sensory feedback, including visual and auditory stimuli, which holds true in virtual reality (VR) environments. Proper synchronization of these stimuli is crucial for perceiving a coherent and immersive VR experience. In this work, we focus on the interplay between audio and vision during localization tasks involving natural head-body rotations. We explore the impact of audio-visual offsets and rotation velocities on users' directional localization acuity for various viewing modes. Using psychometric functions, we model perceptual disparities between visual and auditory cues and determine offset detection thresholds. Our findings reveal that target localization accuracy is affected by perceptual audio-visual disparities during head-body rotations, but remains consistent in the absence of stimuli-head relative motion. We then showcase the effectiveness of our approach in predicting and enhancing users' localization accuracy within realistic VR gaming applications. To provide additional support for our findings, we implement a natural VR game wherein we apply a compensatory audio-visual offset derived from our measured psychometric functions. As a result, we demonstrate a substantial improvement of up to 40% in participants' target localization accuracy. We additionally provide guidelines for content creation to ensure coherent and seamless VR experiences.
14
Girishan Prabhu V, Stanley L, Morgan R, Shirley B. Designing and developing a nature-based virtual reality with heart rate variability biofeedback for surgical anxiety and pain management: evidence from total knee arthroplasty patients. Aging Ment Health 2024; 28:738-753. [PMID: 37850735 DOI: 10.1080/13607863.2023.2270442] [Citation(s) in RCA: 6] [Impact Index Per Article: 6.0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 04/18/2023] [Accepted: 10/06/2023] [Indexed: 10/19/2023]
Abstract
OBJECTIVES Total knee arthroplasty (TKA) is one of the most common joint surgeries, with over a million procedures performed annually in the US. Over 70% of patients report moderate to high pain and anxiety surrounding TKA surgery, and 96% are discharged with an opioid prescription. This population requires special attention, as approximately 90% of TKA patients are older adults, one of the groups most at risk of misusing opioids. This study aimed to develop and compare the efficacy of nature-based virtual reality (VR) with heart rate variability biofeedback (HRVBF) in mitigating surgical pain and anxiety. METHODS This randomized controlled trial recruited 30 patients (mean age = 66.3 ± 8.2 years, 23 F, 7 M) undergoing TKA surgery and randomly assigned them to a control group, a 2D video with HRVBF group, or a VR with HRVBF group. A visual analog scale (VAS) was used to measure pain levels before and after the intervention. In addition, a second VAS and the State-Trait Anxiety Inventory (STAI) were used to measure anxiety before and after the intervention. Electrocardiogram (ECG) was used to continuously measure HRV and respiration rate in preoperative and postoperative settings. RESULTS Both VR and 2D video with HRVBF decreased pain and anxiety post-intervention compared with the control group, ps < .01. Analysis of physiological signals showed greater parasympathetic activity in both treatment groups, and VR with HRVBF reduced pain more than the 2D video, p < .01. CONCLUSIONS Nature-based VR and 2D video with HRVBF can mitigate surgical pain and anxiety. However, VR may be more efficacious than 2D video in reducing pain.
Affiliation(s)
- Laura Stanley
- Gianforte School of Computing, Montana State University, Bozeman, MT, USA
- Robert Morgan
- Department of Anesthesiology, Prisma Health, Greenville, SC, USA
- Brayton Shirley
- Department of Orthopaedics, Prisma Health, Greenville, SC, USA
15
Kleygrewe L, Hutter RIV, Oudejans RRD. No pain, no gain? The effects of adding a pain stimulus in virtual training for police officers. ERGONOMICS 2023; 66:1608-1621. [PMID: 36620999 DOI: 10.1080/00140139.2022.2157496] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 10/25/2022] [Accepted: 12/05/2022] [Indexed: 06/17/2023]
Abstract
Virtual training systems provide highly realistic training environments for police. This study assesses whether a pain stimulus can enhance the training responses and sense of presence these systems elicit. Police officers (n = 219) were trained either with or without a pain stimulus in a 2D simulator (VirTra V-300) and a 3D virtual reality (VR) system. Two (training simulator) × 2 (pain stimulus) ANOVAs revealed a significant interaction effect for perceived stress (p = .010, ηp2 = .039). Post-hoc pairwise comparisons showed that VR provokes significantly higher levels of perceived stress than VirTra when no pain stimulus is used (p = .009). With a pain stimulus, VirTra training provokes significantly higher levels of perceived stress than VirTra training without one (p < .001). Sense of presence was unaffected by the pain stimulus in both training systems. Our results indicate that VR training appears sufficiently realistic without an added pain stimulus. Practitioner summary: Virtual police training benefits from highly realistic training environments. This study found that adding a pain stimulus heightened perceived stress in a 2D simulator, whereas it influenced neither training responses nor sense of presence in a VR system. VR training appears sufficiently realistic without adding a pain stimulus.
Affiliation(s)
- Lisanne Kleygrewe
- Department of Human Movement Sciences, Faculty of Behavioural and Movement Sciences, Vrije Universiteit Amsterdam, Amsterdam Movement Sciences, Amsterdam, Netherlands
- Institute of Brain and Behaviour Amsterdam, Amsterdam, Netherlands
- R I Vana Hutter
- Department of Human Movement Sciences, Faculty of Behavioural and Movement Sciences, Vrije Universiteit Amsterdam, Amsterdam Movement Sciences, Amsterdam, Netherlands
- Institute of Brain and Behaviour Amsterdam, Amsterdam, Netherlands
- Netherlands Institute for the Study of Crime and Law Enforcement (Nederlands Studiecentrum Criminaliteit en Rechtshandhaving; NSCR), Amsterdam, Netherlands
- Raôul R D Oudejans
- Department of Human Movement Sciences, Faculty of Behavioural and Movement Sciences, Vrije Universiteit Amsterdam, Amsterdam Movement Sciences, Amsterdam, Netherlands
- Institute of Brain and Behaviour Amsterdam, Amsterdam, Netherlands
- Faculty of Sports and Nutrition, Amsterdam University of Applied Sciences, Amsterdam, Netherlands
16
Fougère M, Greco-Vuilloud J, Arnous C, Abel F, Lowe C, Elie V, Marchand S. Sensory stimulations potentializing digital therapeutics pain control. FRONTIERS IN PAIN RESEARCH 2023; 4:1168377. [PMID: 37745799 PMCID: PMC10511651 DOI: 10.3389/fpain.2023.1168377] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/17/2023] [Accepted: 08/14/2023] [Indexed: 09/26/2023] Open
Abstract
For the past two decades, using Digital Therapeutics (DTx) to counter painful symptoms has emerged as a novel pain relief strategy. Several studies report that DTx significantly diminish pain while compensating for the limitations of pharmacological analgesics (e.g., addiction, side effects). Virtual reality (VR) is a major component of the most effective DTx for pain reduction. Notably, various stimuli (e.g., auditory, visual) appear to be frequently associated with VR in DTx. This review aims to compare the hypoalgesic power of specific stimuli with or without a VR environment. First, this review will briefly describe VR technology and known elements related to its hypoalgesic effect. Second, it will non-exhaustively list various stimuli known to have a hypoalgesic effect on pain independent of the immersive environment. Finally, this review will focus on studies that investigate a possible potentialized effect on pain reduction of these stimuli in a VR environment.
Affiliation(s)
- Serge Marchand
- Lucine, Bordeaux, France
- Faculté de Médecine et des Sciences de la Santé, Centre de Recherche Clinique du Centre Hospitalier Universitaire de Sherbrooke, Université de Sherbrooke, Sherbrooke, QC, Canada
17
Liu ZM, Liu CY, Chen CQ, Ye XD. 360° Digital Travel to Improve Emotional State and Well-Being During the COVID-19 Pandemic: The Role of Presence and Sense of Place. CYBERPSYCHOLOGY, BEHAVIOR AND SOCIAL NETWORKING 2023; 26:690-697. [PMID: 37335922 DOI: 10.1089/cyber.2022.0248] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 06/21/2023]
Abstract
The continuation of the COVID-19 pandemic has caused a decline in people's subjective well-being and emotional states. Digital travel based on 360° videos provides an alternative way for people to improve their mental health at home during this specific period. Yet how to construct digital travel content that effectively improves emotions remains an open issue. This investigation assessed the impact of people's perceived presence and sense of place (SOP) on emotional improvement during a 360° digital travel experience. A total of 156 undergraduate students volunteered to participate; anxiety, emotion levels, and life satisfaction were measured before and after the digital travel experience, and presence and SOP ratings were collected after the experience. A Latent Change Score model was then developed, and the results indicated that the greater the presence and SOP individuals experienced during digital travel, the better their digital travel experience and emotional improvement. Furthermore, the current data highlight that SOP has a greater impact on emotional improvement than presence. This result suggests that how SOP is generated may be more critical to digital travel than presence, an understanding that should help improve applications in this field, for example by providing meaningful narrative context in a virtual environment to induce SOP more effectively and thereby improve the digital travel experience. Overall, the findings of this study expand our understanding of the digital travel experience and lay the groundwork for future research on SOP and digital travel.
Affiliation(s)
- Ze-Min Liu
- Department of Educational Technology, Wenzhou University, Wenzhou, China
- Cheng-Ye Liu
- Department of Educational Technology, Wenzhou University, Wenzhou, China
- Chuang-Qi Chen
- Department of Educational Technology, Wenzhou University, Wenzhou, China
- Xin-Dong Ye
- Department of Educational Technology, Wenzhou University, Wenzhou, China
| |
Collapse
|
18
|
Liu J, Burkhardt JM, Lubart T. Boosting Creativity through Users' Avatars and Contexts in Virtual Environments-A Systematic Review of Recent Research. J Intell 2023; 11:144. [PMID: 37504787 PMCID: PMC10382062 DOI: 10.3390/jintelligence11070144] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/21/2023] [Revised: 06/19/2023] [Accepted: 07/01/2023] [Indexed: 07/29/2023] Open
Abstract
As an artificial space extending the physical environment, the virtual environment (VE) offers more possibilities for humans to work and be entertained with fewer physical restrictions. Benefiting from anonymity, one of the important features of VEs, users can receive visual stimuli that differ from the physical environment through the digital representations presented in VEs. Avatars and contextual cues in VEs can be considered digital representations of users and contexts, respectively. In this article, we analyzed 21 articles that examined the creativity-boosting effects of different digital user and contextual representations. We summarized the main effects induced by these two kinds of digital representation, notably the effects of self-similar avatars, the Proteus effect, avatars with social identity cues, priming effects induced by contextual representations, and embodied metaphorical effects. In addition, we examined the influence of immersion on creativity by comparing non-immersive and immersive VEs (i.e., desktop and headset VEs, respectively). Last, we discussed the roles of embodiment and presence in creativity in VEs, which were overlooked in past research.
Affiliation(s)
- Jiayin Liu
- LaPEA, Université Paris Cité and Univ Gustave Eiffel, 92100 Boulogne-Billancourt, France
- Todd Lubart
- LaPEA, Université Paris Cité and Univ Gustave Eiffel, 92100 Boulogne-Billancourt, France
19
Andonova V, Reinoso-Carvalho F, Jimenez Ramirez MA, Carrasquilla D. Does multisensory stimulation with virtual reality (VR) and smell improve learning? An educational experience in recall and creativity. Front Psychol 2023; 14:1176697. [PMID: 37397289 PMCID: PMC10308939 DOI: 10.3389/fpsyg.2023.1176697] [Citation(s) in RCA: 7] [Impact Index Per Article: 3.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/28/2023] [Accepted: 05/16/2023] [Indexed: 07/04/2023] Open
Abstract
Purpose The purpose of this paper is to derive practical recommendations from multisensory stimulation with virtual reality (VR) and scent to help educators develop effective teaching strategies geared toward aspects of the learning experience, recall, and creativity in a stereotypical learning context. Design/methodology/approach The paper is based on a randomized experiment in which student participants were subdivided into three treatment groups and one control group. Each group was stimulated by a different combination of visual, auditory, and olfactory stimuli (2D SMELL, VR, and VR SMELL), and the outcomes were compared against those of the control group (2D). Consistent with the Cognitive Theory of Multimedia Learning, hypotheses were constructed to study the effect of different combinations of stimuli on the learning experience and on learning outcomes related to recall and creativity in a stereotypical learning context. Findings Traditional video content, alone and bundled with a coherent olfactory stimulus, prompted higher self-reported ratings of perceived quality of the sensory experience. An olfactory stimulus combined with either VR or a traditional video prompted higher self-reported ratings of perceived immersion. In a stereotypical learning context, the highest recall scores were achieved with traditional video alone. Both VR alone and VR bundled with an olfactory stimulus resulted in enhanced creativity. Research limitations/implications The findings of this study should be interpreted in the context of adopting multisensory stimulation combined with VR technology in stereotypical learning contexts. Most professional educators do not have robust knowledge or experience in using purpose-built multisensory stimuli but are increasingly using multisensory tools such as VR in their teaching practice. In relation to recall, the results are consistent with the hypothesis that, in a stereotypical learning context, a multisensory experience involving VR and olfactory stimuli can impose an undesired cognitive load on learners. The low-tech version of the VR goggles used, as well as the contents of the instructional video, may also have influenced the learning outcomes in terms of recall. Hence, future research should consider such aspects and focus on richer learning contexts. Originality/value This work offers practical recommendations for instructional design strategies aiming to create multisensory stimulation with VR and olfactory components to foster a richer learning experience and enhanced learning outcomes, under the assumptions of a stereotypical learning context.
20
Valentine C. Health Implications of Virtual Architecture: An Interdisciplinary Exploration of the Transferability of Findings from Neuroarchitecture. INTERNATIONAL JOURNAL OF ENVIRONMENTAL RESEARCH AND PUBLIC HEALTH 2023; 20:2735. [PMID: 36768106 PMCID: PMC9915076 DOI: 10.3390/ijerph20032735] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Figures] [Subscribe] [Scholar Register] [Received: 12/09/2022] [Revised: 01/26/2023] [Accepted: 02/01/2023] [Indexed: 06/18/2023]
Abstract
Virtual architecture has been increasingly relied on to evaluate the health impacts of physical architecture. In this health research, exposure to virtual architecture has been used as a proxy for exposure to physical architecture. Despite the growing body of research on the health implications of physical architecture, there is a paucity of research examining the long-term health impacts of prolonged exposure to virtual architecture. In response, this paper considers: what can proxy studies, which use virtual architecture to assess the physiological response to physical architecture, tell us about the impact of extended exposure to virtual architecture on human health? The paper goes on to suggest that the applicability of these findings to virtual architecture may be limited by certain confounding variables when virtual architecture is experienced for a prolonged period of time. This paper explores the potential impact of two of these confounding variables: multisensory integration and gravitational perception. This paper advises that these confounding variables are unique to extended virtual architecture exposure and may not be captured by proxy studies that aim to capture the impact of physical architecture on human health through acute exposure to virtual architecture. While proxy studies may be suitable for measuring some aspects of the impact of both physical and virtual architecture on human health, this paper argues that they may be insufficient to fully capture the unintended consequences of extended exposure to virtual architecture on human health. Therefore, in the face of the increasing use of virtual architectural environments, the author calls for the establishment of a subfield of neuroarchitectural health research that empirically examines the physiological impacts of extended exposure to virtual architecture in its own right.
Affiliation(s)
- Cleo Valentine
- Department of Architecture, University of Cambridge, Cambridge CB2 1PX, UK
21
Kim H, Lee IK. Studying the Effects of Congruence of Auditory and Visual Stimuli on Virtual Reality Experiences. IEEE TRANSACTIONS ON VISUALIZATION AND COMPUTER GRAPHICS 2022; 28:2080-2090. [PMID: 35167477 DOI: 10.1109/tvcg.2022.3150514] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 06/14/2023]
Abstract
Studies in virtual reality (VR) have introduced numerous multisensory simulation techniques for more immersive VR experiences. However, although they primarily focus on expanding sensory types or increasing individual sensory quality, they lack consensus in designing appropriate interactions between different sensory stimuli. This paper explores how the congruence between auditory and visual (AV) stimuli, which are the sensory stimuli typically provided by VR devices, affects the cognition and experience of VR users as a critical interaction factor in promoting multisensory integration. We defined the types of (in)congruence between AV stimuli, and then designed 12 virtual spaces with different types or degrees of congruence between AV stimuli. We then evaluated the presence, immersion, motion sickness, and cognition changes in each space. We observed the following key findings: 1) there is a limit to the degree of temporal or spatial incongruence that can be tolerated, with few negative effects on user experience until that point is exceeded; 2) users are tolerant of semantic incongruence; 3) a simulation that considers synesthetic congruence contributes to the user's sense of immersion and presence. Based on these insights, we identified the essential considerations for designing sensory simulations in VR and proposed future research directions.
22
Li H, Zhang X, Wang H, Yang Z, Liu H, Cao Y, Zhang G. Access to Nature via Virtual Reality: A Mini-Review. Front Psychol 2021; 12:725288. [PMID: 34675840 PMCID: PMC8523668 DOI: 10.3389/fpsyg.2021.725288] [Citation(s) in RCA: 26] [Impact Index Per Article: 6.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/15/2021] [Accepted: 09/03/2021] [Indexed: 01/05/2023] Open
Abstract
Nature exposure is known to promote physical and mental health. However, actual nature exposure may be difficult to achieve for people with physical disabilities or chronic conditions. Many attempts have therefore been made to duplicate nature exposure via media devices, and virtual reality (VR) is deemed a promising technology due to its advantage in creating a sense of immersion. Overall, current studies suggest that exposure to virtual nature may contribute to psychological and physiological relaxation. In addition, some evidence indicates that virtual nature may improve attentional resources, cognitive performance, and the experience of pain. Although VR is deemed an advanced medium, insufficient evidence was found concerning the advantages of VR over traditional two-dimensional media for simulated nature exposure. On the other hand, computer-generated (CG) scenarios were found to be more beneficial than 360° videos, and mini-games may be useful in creating an interactive VR format for simulated nature exposure. Further research is needed given the limited number of relevant studies.
Affiliation(s)
- Hansen Li
- Key Laboratory of Physical Fitness Evaluation and Motor Function Monitoring of General Administration of Sports of China, Institute of Sports Science, College of Physical Education, Southwest University, Chongqing, China
- Xing Zhang
- Department of Basketball and Volleyball, Chengdu Sport University, Chengdu, China
- Hongying Wang
- College of Physical Education, JiMei University, Xiamen, China
- Zongqian Yang
- Key Laboratory of Physical Fitness Evaluation and Motor Function Monitoring of General Administration of Sports of China, Institute of Sports Science, College of Physical Education, Southwest University, Chongqing, China
- Haowei Liu
- Key Laboratory of Physical Fitness Evaluation and Motor Function Monitoring of General Administration of Sports of China, Institute of Sports Science, College of Physical Education, Southwest University, Chongqing, China
- Yang Cao
- Clinical Epidemiology and Biostatistics, School of Medical Sciences, Örebro University, Örebro, Sweden
- Unit of Integrative Epidemiology, Institute of Environmental Medicine, Karolinska Institutet, Stockholm, Sweden
- Guodong Zhang
- Key Laboratory of Physical Fitness Evaluation and Motor Function Monitoring of General Administration of Sports of China, Institute of Sports Science, College of Physical Education, Southwest University, Chongqing, China
23
Takeo Y, Hara M, Shirakawa Y, Ikeda T, Sugata H. Sequential motor learning transfers from real to virtual environment. J Neuroeng Rehabil 2021; 18:107. [PMID: 34193177 PMCID: PMC8247210 DOI: 10.1186/s12984-021-00903-6] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/14/2020] [Accepted: 06/24/2021] [Indexed: 11/24/2022] Open
Abstract
Background Skill acquisition in motor learning may transfer between virtual environments (VEs) and real environments (REs). Although previous studies have examined the transfer of motor learning between VEs and REs using the same task, only a small number have done so using different tasks; thus, the detailed effects of the transfer of motor skills between VEs and REs remain controversial. Here, we investigated the transfer of sequential motor learning between VE and RE conditions. Methods Twenty-seven healthy volunteers performed two types of sequential motor learning task: a visually cued button-press task in an RE (RE task) and a virtual reaching task in a VE (VE task). Participants were randomly assigned to one of two task orders: RE task followed by VE task, or VE task followed by RE task. Response times in the RE and VE tasks were then compared between the two groups. Results The sequential reaching task in the VE was facilitated after the sequential finger task in the RE. Conclusions These findings suggest that a sequential reaching task in a VE can be facilitated by a motor learning task based on the same sequence performed with the fingers in an RE, even though the two tasks differ.
Affiliation(s)
- Yuhi Takeo
- Department of Rehabilitation, Oita University Hospital, Oita, Japan
- Graduate School of Welfare and Health Science, Oita University, Oita, Japan
- Masayuki Hara
- Graduate School of Science and Engineering, Saitama University, 255 Shimo-Okubo, Sakura-ku, 338-8570, Saitama City, Saitama, Japan
- Yuna Shirakawa
- Faculty of Welfare and Health Science, Oita University, 700, Dannoharu, 870-1192, Oita, Japan
- Takashi Ikeda
- Research Center for Child Mental Development, Kanazawa University, Kanazawa, Japan
- Hisato Sugata
- Graduate School of Welfare and Health Science, Oita University, Oita, Japan
- Faculty of Welfare and Health Science, Oita University, 700, Dannoharu, 870-1192, Oita, Japan
24
Monteiro P, Goncalves G, Coelho H, Melo M, Bessa M. Hands-free interaction in immersive virtual reality: A systematic review. IEEE TRANSACTIONS ON VISUALIZATION AND COMPUTER GRAPHICS 2021; 27:2702-2713. [PMID: 33750693 DOI: 10.1109/tvcg.2021.3067687] [Citation(s) in RCA: 7] [Impact Index Per Article: 1.8] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 06/12/2023]
Abstract
Hands are the most important tool for interacting with virtual environments, and they should remain available for the most critical tasks. For example, a surgeon in VR should keep his or her hands on the instruments and be able to carry out secondary tasks without disrupting the operative task. In such common scenarios, the hands are not available for interaction. The goal of this systematic review is to survey the literature and identify which hands-free interfaces are used, the interaction tasks performed, the metrics used for interface evaluation, and the results of such evaluations. Across the 79 studies that met the eligibility criteria, voice is the most studied interface, followed by eye and head gaze; novel interfaces included brain interfaces and facial expressions. System control and selection account for most of the interaction tasks studied, and most studies evaluate interfaces for usability. Although the best interface depends on the task and study, voice was found to be versatile and showed good results across studies. More research is recommended to improve the practical use of these interfaces and to evaluate them more formally.