1. Martin-Rodriguez R, Ratschat AL, Marchal-Crespo L, Vardar Y. Tactile Weight Rendering: A Review for Researchers and Developers. IEEE Transactions on Haptics 2025; 18:93-109. PMID: 39226192. DOI: 10.1109/toh.2024.3453894.
Abstract
Haptic rendering of weight plays an essential role in naturalistic object interaction in virtual environments. While kinesthetic devices have traditionally been used for this aim by applying forces on the limbs, tactile interfaces acting on the skin have recently offered potential solutions to enhance or substitute kinesthetic ones. Here, we aim to provide an in-depth overview and comparison of existing tactile weight rendering approaches. We categorized these approaches based on their type of stimulation into asymmetric vibration and skin stretch, further divided according to the working mechanism of the devices. Then, we compared these approaches using various criteria, including physical, mechanical, and perceptual characteristics of the reported devices. We found that asymmetric vibration devices have the smallest form factor, while skin stretch devices relying on the motion of flat surfaces, belts, or tactors present numerous mechanical and perceptual advantages for scenarios requiring more accurate weight rendering. Finally, we discuss the rationale behind the proposed categorization of devices, along with limitations and opportunities for future research. We hope this study guides the development and use of tactile interfaces to achieve a more naturalistic object interaction and manipulation in virtual environments.
2. Arezoo K, Tarvirdizadeh B, Alipour K, Hadi A, Arezoo J. A Novel Ungrounded Haptic Device for Generation and Orientation of Force and Torque Feedbacks. IEEE Transactions on Haptics 2025; 18:151-163. PMID: 39509317. DOI: 10.1109/toh.2024.3493377.
Abstract
To provide deeper immersion in virtual environments, the user needs both force and torque feedback rather than visual and auditory cues alone. In this paper, we develop a novel propeller-based Ungrounded Handheld Haptic Device (UHHD) that delivers both force and torque feedback in one device, helping the user experience a realistic sensation of immersion in three-dimensional (3-D) space. The proposed UHHD uses only a pair of propellers and a set of sliders to continuously generate the desired force and torque feedback, up to 15 N and 1 N·m in magnitude respectively, in less than 370 ms. The produced force and torque are oriented in a desired direction using a gimbal mechanism in which the propellers are mounted, yielding a simple structure. These features facilitate the control of the proposed UHHD and enhance its practicality in various applications. To demonstrate the capability of the system, we model it and elaborate on the force and torque analyses. Next, we develop a robust parallel force/position controller to handle structured and unstructured uncertainties. Finally, a measurement setup is manufactured to experimentally evaluate the performance of the UHHD and the controller. Implementing the controller on the developed UHHD prototype shows that satisfactory control performance is achievable in delivering the desired force and torque feedback.
3. Achberger A, Gebhardt P, Sedlmair M. An Exploratory Expert-Study for Multi-Type Haptic Feedback for Automotive Virtual Reality Tasks. IEEE Transactions on Visualization and Computer Graphics 2024; 30:7255-7265. PMID: 39255123. DOI: 10.1109/tvcg.2024.3456203.
Abstract
Previous research has shown that integrating haptic feedback can improve immersion and realism in automotive VR applications. However, current haptic feedback approaches primarily focus on a single feedback type. This means users must switch between devices to experience haptic stimuli for different feedback types, such as grabbing, collision, or weight simulation. This restriction limits the ability to simulate haptics realistically for complex tasks such as maintenance. To address this issue, we evaluated existing feedback devices based on our requirements analysis to determine which devices are most suitable for simulating these three feedback types. Since no suitable haptic feedback system can simulate all three feedback types simultaneously, we evaluated which devices can be combined. Based on that, we devised a new multi-type haptic feedback system combining three haptic feedback devices. We evaluated the system with different feedback-type combinations through a qualitative expert study involving twelve automotive VR experts. The results showed that combining weight and collision feedback yielded the best and most realistic experience. The study also highlighted technical limitations in current grabbing devices. Our findings provide insights into the effectiveness of haptic device combinations and practical boundaries for automotive virtual reality tasks.
4. Watkins A, Ghosh R, Ullal A, Sarkar N. Instilling the perception of weight in augmented reality using minimal haptic feedback. Sci Rep 2024; 14:24894. PMID: 39438622. PMCID: PMC11496738. DOI: 10.1038/s41598-024-75596-7.
Abstract
Humans perceive gravitational forces on their surroundings through a mix of visual and sensorimotor cues. The accurate presentation of such cues is a difficult task in Mixed/Augmented Reality (MR/AR), technological paradigms that blend physical and virtual elements to enhance the way we interact with our environment. Realistically perceiving the weight of virtual elements within an MR/AR scenario aids in the embodiment of those elements within the user's reality, further blurring the lines between what is real and virtual. Unfortunately, current force feedback devices are either not designed for, or not fully compatible with, MR/AR experiences. To address this need, we explore minimal haptic feedback for weight perception in MR/AR, aiming to simplify the rendering of gravitational cues that are crucial to an immersive experience. Our benchtop device, focused on wrist feedback, showed improved user experience even within an implicit weight feedback task, i.e., a task where weight perception was not required for task completion. However, challenges arose in mixed real-virtual environments, a cornerstone of MR/AR interaction, where weight discrimination was observed to be less accurate. To address this, we developed a compensation scheme for virtual weights, leading to performance on par with a purely virtual environment. Our work demonstrates the viability of minimal haptic feedback in MR/AR applications and highlights the importance of integrating weight perception for increased realism. It also fills a research gap in MR/AR development, providing insights for designing future MR/AR systems that integrate with human sensory mechanisms to create virtual interactions that more closely mirror the physical world.
Affiliation(s)
- Alexandra Watkins, Mechanical Engineering, Vanderbilt University, 2400 Highland Ave, Nashville, TN 37212, USA
- Ritam Ghosh, Electrical and Computer Engineering, Vanderbilt University, 400 24th Ave S, Nashville, TN 37212, USA
- Akshith Ullal, Electrical and Computer Engineering, Vanderbilt University, 400 24th Ave S, Nashville, TN 37212, USA
- Nilanjan Sarkar, Mechanical Engineering; Electrical and Computer Engineering; and Computer Science, Vanderbilt University, Nashville, TN 37212, USA
5. Berna Moya JL, van Oosterhout A, Marshall MT, Martinez Plasencia D. HapticWhirl, a Flywheel-Gimbal Handheld Haptic Controller for Exploring Multimodal Haptic Feedback. Sensors (Basel) 2024; 24:935. PMID: 38339652. PMCID: PMC10857638. DOI: 10.3390/s24030935.
Abstract
Most haptic actuators available on the market today can generate only a single modality of stimuli. This ultimately limits the capacity of a kinaesthetic haptic controller to deliver more expressive feedback, requiring a haptic controller to integrate multiple actuators to generate complex haptic stimuli, with a corresponding complexity of construction and control. To address this, we designed a haptic controller to deliver several modalities of kinaesthetic haptic feedback using a single actuator: a flywheel, the orientation of which is controlled by two gimbals capable of rotating over 360 degrees, in combination with a flywheel brake. This enables the controller to generate multiple haptic feedback modalities, such as torque feedback, impact simulation, low-frequency high-amplitude vibrations, inertial effects (the sensation of momentum), and complex haptic output effects such as the experience of vortex-like forces (whirl effects). By combining these diverse haptic effects, the controller enriches the haptic dimension of VR environments. This paper presents the device's design, implementation, and characterization, and proposes potential applications for future work.
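The torque feedback described above comes from gyroscopic precession: reorienting a spinning flywheel with the gimbals produces a reaction torque proportional to the flywheel's angular momentum. A minimal sketch of that relationship, with inertia and rate values that are illustrative assumptions, not the paper's specifications:

```python
import numpy as np

# Illustrative flywheel spin-axis moment of inertia (assumed, not from the paper).
I_FLYWHEEL = 2.0e-4  # kg*m^2

def gyroscopic_torque(spin_axis, spin_rate, gimbal_rate_vec, inertia=I_FLYWHEEL):
    """Reaction torque from reorienting a spinning flywheel.

    tau = dL/dt = omega_gimbal x L, with angular momentum L = I * omega_spin.
    """
    L = inertia * spin_rate * np.asarray(spin_axis, dtype=float)
    return np.cross(np.asarray(gimbal_rate_vec, dtype=float), L)

# Flywheel spinning about +z at 1000 rad/s, gimbal rotating about +x at 5 rad/s:
tau = gyroscopic_torque([0, 0, 1], 1000.0, [5.0, 0, 0])
# The resulting torque lies along the y axis, perpendicular to both rotations.
```

Steering the gimbals therefore both aims the torque direction and, together with the brake, modulates its magnitude over time.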
Affiliation(s)
- Jose Luis Berna Moya, Creative Technology Group, Department of Informatics, University of Sussex, Brighton BN1 9PH, UK
- Anke van Oosterhout, Industrial Design, Eindhoven University of Technology, 5612 AE Eindhoven, The Netherlands
- Mark T. Marshall, Interaction Design Centre, University of Limerick, V94 T9PX Limerick, Ireland
6. Zhang H, Zhou K, Shi K, Wang Y, Song A, Zhu L. SmartSpring: A Low-Cost Wearable Haptic VR Display with Controllable Passive Feedback. IEEE Transactions on Visualization and Computer Graphics 2023; 29:4460-4471. PMID: 37782602. DOI: 10.1109/tvcg.2023.3320249.
Abstract
With the development of virtual reality, practical requirements for wearable haptic interfaces have been greatly emphasized. While passive haptic devices are commonly used in virtual reality, they lack generality and cannot easily deliver precise, continuous force feedback to users. In this work, we present SmartSpring, a new solution for passive haptics that is inexpensive, lightweight, and capable of providing controllable force feedback in virtual reality. We propose a hybrid spring-linkage structure as the proxy and flexibly control the mechanism to adjust system stiffness. By analyzing the structure and its force model, we enable a smart transformation of the structure to produce continuous force signals. We quantitatively examine the real-world performance of SmartSpring to verify our model. By asymmetrically moving or actively pressing the end-effector, we show that our design can further support rendering torque and stiffness. Finally, we demonstrate SmartSpring in a series of scenarios with user studies and a just-noticeable-difference analysis. Experimental results show the potential of the developed haptic display in virtual reality.
7. Howard T, Gicquel G, Pacchierotti C, Marchal M. Can we Effectively Combine Tangibles and Ultrasound Mid-Air Haptics? A Study of Acoustically Transparent Tangible Surfaces. IEEE Transactions on Haptics 2023; 16:477-483. PMID: 37058388. DOI: 10.1109/toh.2023.3267096.
Abstract
We propose to study the combination of acoustically transparent tangible objects (ATTs) and ultrasound mid-air haptic (UMH) feedback to support haptic interactions with digital content. Both these haptic feedback methods have the advantage of leaving users unencumbered, and present uniquely complementary strengths and weaknesses. In this article, we provide an overview of the design space for haptic interactions covered by this combination, as well as requirements for their technical implementation. Indeed, when imagining the concurrent manipulation of physical objects and delivery of mid-air haptic stimuli, reflection and absorption of sound by the tangibles may impede delivery of the UMH stimuli. To demonstrate the viability of our approach, we study the combination of single ATT surfaces, i.e. the basic building blocks for any tangible object, and UMH stimuli. We investigate attenuation of a focal point focused through various plates of acoustically transparent materials, and run three human subject experiments investigating the impact of acoustically transparent materials on detection thresholds, discrimination of motion, and localization of ultrasound haptic stimuli. Results show that tangible surfaces which do not significantly attenuate ultrasound can be fabricated with relative ease. The perception studies confirm that ATT surfaces do not impede perception of UMH stimulus properties, and thus that both may viably be combined in haptics applications.
8. Shigeyama J, Hashimoto T, Yoshida S, Narumi T, Tanikawa T, Hirose M. Presenting Morphing Shape Illusion: Enhanced Sense of Morphing Virtual Object With Weight Shifting VR Controller by Computational Perception Model. IEEE Computer Graphics and Applications 2023; 43:81-89. PMID: 37015674. DOI: 10.1109/mcg.2022.3229018.
Abstract
Haptic sensation is crucial for virtual reality, as it conveys the presence of objects in the virtual world and thus gives a greater sense of immersion. To provide a sense of the shape of handheld objects, a haptic device that changes its weight distribution has been proposed. Visual feedback is known to enhance the haptic sensation of shape, and this also holds for morphing shapes. Our previous publication presented a perception model for the static shape of a virtual object. In this article, we extend the model to produce a plausible sense of a morphing handheld object. Our stochastic model predicts the weight actuation for the weight-shifting haptic device that users will plausibly feel, while reducing hardware effort. We evaluated our perception model, obtaining an average model error of 8.1%. Using this perception model, the amount of weight actuation at a 75% probability of plausibility is reduced by up to 37%.
9. Bouzbib E, Pacchierotti C, Lecuyer A. When Tangibles Become Deformable: Studying Pseudo-Stiffness Perceptual Thresholds in a VR Grasping Task. IEEE Transactions on Visualization and Computer Graphics 2023; 29:2743-2752. PMID: 37028356. DOI: 10.1109/tvcg.2023.3247083.
Abstract
Pseudo-haptic techniques, or visuo-haptic illusions, leverage users' visual dominance over haptics to alter their perception. As they create a discrepancy between virtual and physical interactions, these illusions are bounded by a perceptual threshold. Many haptic properties have been studied using pseudo-haptic techniques, such as weight, shape, or size. In this paper, we focus on estimating the perceptual thresholds for pseudo-stiffness in a virtual reality grasping task. We conducted a user study (n = 15) to estimate whether compliance can be induced on a non-compressible tangible object, and to what extent. Our results show that (1) compliance can be induced in a rigid tangible object and that (2) pseudo-haptics can simulate stiffness beyond 24 N/cm (k > 24 N/cm, between a gummy bear and a raisin, up to rigid objects). Pseudo-stiffness efficiency is (3) enhanced by the objects' scales, but mostly (4) correlated with the user's input force. Taken together, our results offer novel opportunities to simplify the design of future haptic interfaces and extend the haptic properties of passive props in VR.
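The stiffness figure above can be read through the basic spring relation: a pseudo-haptic system shows the virtual object compressing by x = F/k under the measured input force F, even though the prop itself stays rigid. A minimal sketch; the function name and values are illustrative assumptions, not the study's implementation:

```python
# Pseudo-stiffness rendering sketch: the rigid prop does not deform, but the
# virtual object is displayed compressing by x = F/k under input force F,
# simulating a spring of stiffness k. All values are illustrative.

def visual_compression_cm(force_n, stiffness_n_per_cm):
    """Displayed compression (cm) for a simulated spring under a given force."""
    return force_n / stiffness_n_per_cm

# 12 N of grip force on a simulated k = 24 N/cm object shows 0.5 cm of
# visual compression; a stiffer k shows less, approaching a rigid object.
compression = visual_compression_cm(12.0, 24.0)
```

The perceptual threshold then bounds how large this visual-physical discrepancy can grow before users notice the prop is not actually compliant.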
10. Zhou Y, Popescu V. Dynamic Redirection for VR Haptics with a Handheld Stick. IEEE Transactions on Visualization and Computer Graphics 2023; 29:2753-2762. PMID: 37027709. DOI: 10.1109/tvcg.2023.3247047.
Abstract
This paper proposes a general handheld-stick haptic redirection method that allows the user to experience complex shapes with haptic feedback through both tapping and extended contact, such as contour tracing. As the user extends the stick to make contact with a virtual object, the contact point on the virtual object and the targeted contact point on the physical object are continually updated, and the virtual stick is redirected to synchronize the virtual and real contacts. Redirection is applied either just to the virtual stick, or to both the virtual stick and the hand. A user study (N = 26) confirms the effectiveness of the proposed redirection method. A first experiment following a two-interval forced-choice design reveals that the offset detection thresholds are [-15 cm, +15 cm]. A second experiment asks participants to guess the shape of an invisible virtual object by tapping it and by tracing its contour with the handheld stick, using a real-world disk as a source of passive haptic feedback. It reveals that, using our haptic redirection method, participants can identify the invisible object with 78% accuracy.
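The continual contact-point update described above amounts to progressively blending the real-to-virtual offset into the rendered stick tip so that both contacts coincide at touch time. A minimal sketch of such linear offset blending; the names and the linear ramp are our assumptions, not the paper's implementation:

```python
# Sketch of stick-tip redirection by progressive offset blending (assumed
# linear ramp for illustration; real systems may use smoother warp profiles).

def redirect_tip(real_tip, virtual_target, physical_target, progress):
    """Blend the real-vs-virtual contact offset into the rendered tip.

    progress: 0.0 at reach onset, 1.0 at contact. By contact time the
    virtual tip has absorbed the full offset, so the virtual contact and
    the physical contact coincide.
    """
    offset = tuple(v - p for v, p in zip(virtual_target, physical_target))
    alpha = max(0.0, min(1.0, progress))  # clamp to [0, 1]
    return tuple(r + alpha * o for r, o in zip(real_tip, offset))

# At contact (progress=1), a +10 cm x-offset is fully applied to the tip.
tip = redirect_tip((0.0, 0.0, 0.0), (0.1, 0.0, 0.0), (0.0, 0.0, 0.0), 1.0)
```

Applying the warp to the stick only, or to stick and hand together, corresponds to choosing which rendered elements receive this offset.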
11. Kourtesis P, Vizcay S, Marchal M, Pacchierotti C, Argelaguet F. Action-Specific Perception & Performance on a Fitts's Law Task in Virtual Reality: The Role of Haptic Feedback. IEEE Transactions on Visualization and Computer Graphics 2022; 28:3715-3726. PMID: 36048989. DOI: 10.1109/tvcg.2022.3203003.
Abstract
While users' perception and performance are predominantly examined independently in virtual reality, the Action-Specific Perception (ASP) theory postulates that an individual's performance on a task modulates their spatial and time perception of the task's components and procedures. This paper examines the association between performance and perception and the potential effects that tactile feedback modalities could generate. It reports a user study (N=24) in which participants performed a standardized Fitts's law target-acquisition task using three feedback modalities: visual, visuo-electrotactile, and visuo-vibrotactile. The users completed 3 target sizes × 2 distances × 3 feedback modalities = 18 trials. Size perception, distance perception, and (movement) time perception were assessed at the end of each trial. Performance-wise, the results showed that electrotactile feedback enabled significantly better accuracy than vibrotactile and visual feedback, while vibrotactile feedback provided the worst accuracy. Electrotactile and visual feedback enabled comparable reaction times, while vibrotactile feedback was substantially slower than visual feedback. Although the pattern of differences in perceptual aspects across feedback types was comparable to the performance differences, none of them was statistically significant. However, performance did modulate perception: significant action-specific effects on spatial and time perception were detected. Changes in accuracy modulate both size perception and time perception, while changes in movement speed modulate distance perception. The index of difficulty was also found to modulate all three perceptual aspects, though individual differences appear to affect the magnitude of action-specific effects. These outcomes highlight the importance of haptic feedback for performance and, importantly, the significance of action-specific effects on spatial and time perception in VR, which should be considered in future VR studies.
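The difficulty of the target-acquisition conditions above is conventionally quantified by Fitts's index of difficulty; a common choice is the Shannon formulation ID = log2(D/W + 1). A short sketch using assumed sizes and distances (illustrative, not the study's values):

```python
import math

# Shannon formulation of Fitts's index of difficulty (ID), in bits.

def index_of_difficulty(distance, width):
    """ID = log2(D/W + 1): harder trials have higher ID."""
    return math.log2(distance / width + 1.0)

# 3 target sizes x 2 distances = 6 unique movement conditions, mirroring the
# task design; the specific values below are assumptions for illustration.
sizes = [0.02, 0.04, 0.08]   # target widths in meters (assumed)
distances = [0.20, 0.40]     # movement amplitudes in meters (assumed)
ids = {(d, w): index_of_difficulty(d, w) for d in distances for w in sizes}
# Smaller targets and larger distances yield higher ID, i.e. harder trials.
```

Fitts's law then predicts movement time as a linear function of ID, which is why ID serves as the natural covariate when relating performance to perception.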
12. Tsao CA, Wu TC, Tsai HR, Wei TY, Liao FY, Chapman S, Chen BY. FrictShoes: Providing Multilevel Nonuniform Friction Feedback on Shoes in VR. IEEE Transactions on Visualization and Computer Graphics 2022; 28:2026-2036. PMID: 35167465. DOI: 10.1109/tvcg.2022.3150492.
Abstract
Many haptic feedback methods have been proposed to enhance realism in virtual reality (VR). However, friction on the feet in VR, which renders feedback as if walking on different terrains or ground textures or stepping on objects, is still underexplored. Herein, we propose a wearable device, FrictShoes, a pair of foot accessories that provide multilevel nonuniform friction feedback to the feet. This is achieved by independently operating six brakes on six wheels underneath each FrictShoe, which allows the friction levels of the wheels to be either matched or varied. We conducted a magnitude estimation study to understand users' ability to distinguish friction force magnitudes (or levels). Based on the results, we performed an exploratory study on how users adjust and map multilevel nonuniform friction patterns to common VR terrains or ground textures. Finally, a VR experience study was conducted to evaluate the performance of the proposed multilevel nonuniform friction feedback to the feet in VR experiences.
13. Bstick: Handheld Virtual Reality Haptic Controller for Hand Rehabilitation. Systems 2022. DOI: 10.3390/systems10030054.
Abstract
This study proposes Bstick, the first handheld haptic controller that can monitor and control the placement of five fingers in real time using linear motors attached to the fingers. As a handheld device, it can be used with both hands, and it was designed to be moved freely and used with other virtual reality (VR) devices via Bluetooth. Bstick also provides stiffness that can withstand the pressing force of an adult man's fingers, providing a realistic sense of grabbing and controlling virtual objects with rigidity and softness. By changing the locations of the finger buttons, the device can render virtual objects of various shapes and sizes. A component for the Unity game engine was developed to ease content development with the controller's ability to move five fingers independently, and this was applied to hand rehabilitation content. Bstick includes five linear motors that can each sustain approximately 22 N of force per finger, and the hardware and circuitry are compact enough to be held in the user's hand. Bstick can thus be used to create VR services and content based on five-finger force feedback, combining a haptic controller that independently manages the motion of each finger with Unity game engine software that interfaces with the hardware.
14. Noguchi Y, Kamide H, Tanaka F. Weight Shift Movements of a Social Mediator Robot Make It Being Recognized as Serious and Suppress Anger, Revenge and Avoidance Motivation of the User. Front Robot AI 2022; 9:790209. PMID: 35295616. PMCID: PMC8918567. DOI: 10.3389/frobt.2022.790209.
Abstract
Humans can become aggressive during text messaging. To maintain a healthy interpersonal relationship through text messaging, our negative mental states, such as anger, have to be well-controlled. This paper discusses the use of a handheld social robot deployed as a mediator in text messaging between humans. The robot is equipped with a movable weight inside its body. By controlling the movement of the internal weight during the time when the robot speaks out messages received from a human sender, we hypothesize that the psychological state of a receiver who holds the robot can be affected (for example, he/she will listen to the messages more seriously). In a controlled study (n = 94), in which participants were manipulated to be frustrated by using a context scenario, we studied the effect of three dialogue scripts with/without weight shifts. Results showed that introducing weight shifts together with the robot speech suppressed on average 23% of the user’s anger. However, only 3.5% of the anger was suppressed when the weight shifts were not applied. Additionally, in cases where the robot showed empathy to the user in words with weight shifts, the user’s revenge urge was successfully reduced by 22%. There was almost no effect confirmed when the weight shifts were not applied. A similar effect was also found in avoidance motivation: 15% of the avoidance motivation was reduced if weight shifts were applied. The reductions in revenge and avoidance motivation are considered important factors for human forgiveness. Therefore, our findings provide experimental evidence that weight shifts can be an effective expression modality for mediator robots, from the perspective of not only suppressing the user’s anger but also by inducing forgiveness during messaging.
Affiliation(s)
- Yohei Noguchi, Department of Intelligent Interaction Technologies, University of Tsukuba, Ibaraki, Japan
- Hiroko Kamide, Institute of Innovation for Future Society, Nagoya University, Aichi, Japan
- Fumihide Tanaka (correspondence), Faculty of Engineering, Information and Systems, University of Tsukuba, Ibaraki, Japan
15. Saito H, Horie A, Maekawa A, Matsubara S, Wakisaka S, Kashino Z, Kasahara S, Inami M. Transparency in Human-Machine Mutual Action. Journal of Robotics and Mechatronics 2021. DOI: 10.20965/jrm.2021.p0987.
Abstract
Recent advances in human-computer integration (HInt) have focused on the development of human-machine systems, where both human and machine autonomously act upon each other. However, a key challenge in designing such systems is augmenting the user’s physical abilities while maintaining their sense of self-attribution. This challenge is particularly prevalent when both human and machine are capable of acting upon each other, thereby creating a human-machine mutual action (HMMA) system. To address this challenge, we present a design framework that is based on the concept of transparency. We define transparency in HInt as the degree to which users can self-attribute an experience when machines intervene in the users’ action. Using this framework, we form a set of design guidelines and an approach for designing HMMA systems. By using transparency as our focus, we aim to provide a design approach for not only achieving human-machine fusion into a single agent, but also controlling the degrees of fusion at will. This study also highlights the effectiveness of our design approach through an analysis of existing studies that developed HMMA systems. Further development of our design approach is discussed, and future prospects for HInt and HMMA system designs are presented.
16. Nilsson NC, Zenner A, Simeone AL, Johnsen K, Sandor C, Billinghurst M. Propping Up Virtual Reality With Haptic Proxies. IEEE Computer Graphics and Applications 2021; 41:104-112. PMID: 34506272. DOI: 10.1109/mcg.2021.3097671.
Abstract
Physical props serving as proxies for virtual objects (haptic proxies) offer a cheap, convenient, and compelling way of delivering a sense of touch in virtual reality (VR). To successfully use haptic proxies for VR, they have to be both similar to and colocated with their virtual counterparts. In this article, we introduce a taxonomy organizing techniques using haptic proxies for VR into eight categories based on when the techniques are deployed (offline or real-time), what reality is being manipulated (physical or virtual reality), and the purpose of the techniques (to affect object perception or the mapping between real and virtual objects). Finally, we discuss key advantages and limitations of the different categories of techniques.
17. Weber S, Weibel D, Mast FW. How to Get There When You Are There Already? Defining Presence in Virtual Reality and the Importance of Perceived Realism. Front Psychol 2021; 12:628298. PMID: 34025504. PMCID: PMC8136250. DOI: 10.3389/fpsyg.2021.628298.
Affiliation(s)
- Stefan Weber, Department of Psychology, University of Bern, Bern, Switzerland; Faculty of Psychology, Swiss Distance University Institute, Brig, Switzerland
- David Weibel, Department of Psychology, University of Bern, Bern, Switzerland
- Fred W Mast, Department of Psychology, University of Bern, Bern, Switzerland
18. Zenner A, Ullmann K, Kruger A. Combining Dynamic Passive Haptics and Haptic Retargeting for Enhanced Haptic Feedback in Virtual Reality. IEEE Transactions on Visualization and Computer Graphics 2021; 27:2627-2637. PMID: 33750705. DOI: 10.1109/tvcg.2021.3067777.
Abstract
To provide immersive haptic experiences, proxy-based haptic feedback systems for virtual reality (VR) face two central challenges: (1) similarity and (2) colocation. To solve challenge (1), physical proxy objects need to be sufficiently similar to their virtual counterparts in terms of haptic properties; for challenge (2), proxies and virtual counterparts need to be sufficiently colocated to allow for seamless interactions. To address these challenges, past research introduced, among others, two successful techniques: (a) Dynamic Passive Haptic Feedback (DPHF), a hardware-based technique that leverages actuated props adapting their physical state during the VR experience, and (b) Haptic Retargeting (HR), a software-based technique leveraging hand redirection to bridge spatial offsets between real and virtual objects. The two concepts had not previously been studied in combination. This paper proposes to combine both techniques and reports the results of a perceptual and a psychophysical experiment situated in a proof-of-concept scenario focused on the perception of virtual weight distribution. We show that users in VR overestimate weight shifts and that, when DPHF and HR are combined, significantly greater shifts can be rendered than with a weight-shifting prop or unnoticeable hand redirection alone. Moreover, we find that the combination of DPHF and HR lets significantly larger spatial dislocations of proxy and virtual counterpart go unnoticed by users. Our investigation is the first to show the value of combining DPHF and HR in practice, validating that their combination can better solve the challenges of similarity and colocation than either technique alone.
Collapse
|
19
|
Park C, Kim J, Choi S. Length Perception Model for Handheld Controllers: The Effects of Diameter and Inertia. IEEE TRANSACTIONS ON HAPTICS 2021; 14:310-315. [PMID: 33950846 DOI: 10.1109/toh.2021.3077709] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 06/12/2023]
Abstract
Typical handheld controllers for interaction in virtual reality (VR) have fixed shapes and sizes, regardless of what visual objects they represent. Resolving this crossmodal incongruence with a shape-changing interface is our long-term goal. In this paper, we seek to find a length perception model that considers the moment of inertia (MOI) and diameter of a handheld object based on the concept of dynamic touch. Such models serve as a basis for computational algorithms for shape changing. We carried out two perceptual experiments. In Experiment 1, we measured the perceived lengths of 24 physical objects with different MOIs and diameters. Then we obtained a length perception model to reproduce the desired perceived length with a handheld controller. In Experiment 2, we validated our model in a crossmodal matching scenario, where a visual rod was matched to a haptic rod in terms of the perceived length. Our results contribute to understanding the relationship between the perceived length and physical properties of a handheld object and designing shape-changing algorithms to render equivalent visual and haptic sensory cues for length perception in VR.
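A rough illustration of the dynamic-touch idea underlying such models: perceived rod length grows with the moment of inertia (MOI) about the grip, classically approximated by a power law near the one-third exponent. The coefficients and exponent below are hypothetical placeholders, not the model fitted in this paper, which additionally accounts for diameter.

```python
def rod_moi_about_end(mass, length):
    """MOI of a uniform rod rotated about one end: I = m * L^2 / 3."""
    return mass * length ** 2 / 3.0

def perceived_length(moi, k=1.0, p=1.0 / 3.0):
    """Dynamic-touch power-law sketch: perceived length ~ k * I^p.

    k and p are illustrative; a fitted model would estimate them from
    perceptual data and include diameter as a second predictor.
    """
    return k * moi ** p
```

Such a mapping can be inverted to decide what MOI a shape-changing controller should present to evoke a desired perceived length.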
Collapse
|
20
|
Multimodal Interaction Systems Based on Internet of Things and Augmented Reality: A Systematic Literature Review. APPLIED SCIENCES-BASEL 2021. [DOI: 10.3390/app11041738] [Citation(s) in RCA: 12] [Impact Index Per Article: 3.0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 11/16/2022]
Abstract
Technology developments have expanded the diversity of interaction modalities that an agent (either a human or a machine) can use to interact with a computer system. This expansion has created the need for more natural and user-friendly interfaces to achieve an effective user experience and usability. To accomplish this goal, an agent can be offered more than one modality for interacting with a system; such a system is referred to as a multimodal interaction (MI) system. The Internet of Things (IoT) and augmented reality (AR) are popular technologies that allow interaction systems to combine the real-world context of the agent and immersive AR content. However, although MI systems have been extensively studied, only a few studies have reviewed MI systems that use IoT and AR. Therefore, this paper presents an in-depth review of studies that proposed various MI systems utilizing IoT and AR. A total of 23 studies were identified and analyzed through a rigorous systematic literature review protocol. The results of our analysis of MI system architectures, the relationships between system components, input/output interaction modalities, and open research challenges are presented and discussed to summarize the findings and identify future research and development avenues for researchers and MI developers.
Collapse
|
21
|
Zenner A, Makhsadov A, Klingner S, Liebemann D, Kruger A. Immersive Process Model Exploration in Virtual Reality. IEEE TRANSACTIONS ON VISUALIZATION AND COMPUTER GRAPHICS 2020; 26:2104-2114. [PMID: 32070982 DOI: 10.1109/tvcg.2020.2973476] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.2] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 06/10/2023]
Abstract
In many professional domains, relevant processes are documented as abstract process models, such as event-driven process chains (EPCs). EPCs are traditionally visualized as 2D graphs, and their size varies with the complexity of the process. While process modeling experts are used to interpreting complex 2D EPCs, in scenarios such as professional training or education, novice users inexperienced in interpreting 2D EPC data face the challenge of learning and understanding complex process models. To communicate process knowledge in an effective yet motivating and interesting way, we propose a novel virtual reality (VR) interface for non-expert users. Our proposed system turns the exploration of arbitrarily complex EPCs into an interactive and multi-sensory VR experience. It automatically generates a virtual 3D environment from a process model and lets users explore processes through a combination of natural walking and teleportation. Our immersive interface leverages basic gamification in the form of a logical walkthrough mode to motivate users to interact with the virtual process. The generated user experience is entirely novel in the field of immersive data exploration and supported by a combination of visual, auditory, vibrotactile and passive haptic feedback. In a user study with N=27 novice users, we evaluate the effect of our proposed system on process model understandability and user experience, while comparing it to a traditional 2D interface on a tablet device. The results indicate a tradeoff between efficiency and user interest as assessed by the UEQ novelty subscale, while no significant decrease in model understanding performance was found using the proposed VR interface. Our investigation highlights the potential of multi-sensory VR for less time-critical professional application domains, such as employee training, communication, education, and related scenarios focusing on user interest.
Collapse
|
22
|
Yu R, Bowman DA. Pseudo-Haptic Display of Mass and Mass Distribution During Object Rotation in Virtual Reality. IEEE TRANSACTIONS ON VISUALIZATION AND COMPUTER GRAPHICS 2020; 26:2094-2103. [PMID: 32078548 DOI: 10.1109/tvcg.2020.2973056] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.4] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 06/10/2023]
Abstract
We propose and evaluate novel pseudo-haptic techniques to display mass and mass distribution for proxy-based object manipulation in virtual reality. These techniques are specifically designed to generate haptic effects during the object's rotation. They rely on manipulating the mapping between visual cues of motion and kinesthetic cues of force to generate a sense of heaviness, which alters the perception of the object's mass-related properties without changing the physical proxy. First, we present a technique to display an object's mass by scaling its rotational motion relative to its mass. A psychophysical experiment demonstrates that this technique effectively generates correct perceptions of relative mass between two virtual objects. We then present two pseudo-haptic techniques designed to display an object's mass distribution. One of them relies on manipulating the pivot point of rotation, while the other adjusts rotational motion based on the real-time dynamics of the moving object. An empirical study shows that both techniques can influence perception of mass distribution, with the second technique being significantly more effective.
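The first technique (scaling rotational motion relative to mass) amounts to manipulating the control/display ratio of rotation: for the same hand rotation, a virtually heavier object rotates less, which users read as greater heaviness. A minimal sketch, with a hypothetical inverse-mass mapping not taken from the paper:

```python
def pseudo_haptic_rotation(input_rotation_deg, virtual_mass, reference_mass=1.0):
    """Scale the displayed rotation inversely with the object's virtual mass.

    The inverse-proportional control/display mapping and the reference
    mass are illustrative assumptions; the paper calibrates its scaling
    perceptually rather than with this exact formula.
    """
    cd_ratio = reference_mass / virtual_mass  # heavier -> smaller displayed rotation
    return input_rotation_deg * cd_ratio
```

For example, with this mapping an object of twice the reference mass displays only half the hand's rotation, producing a relative sense of heaviness between two objects without changing the physical proxy.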
Collapse
|
23
|
Directional Force Feedback: Mechanical Force Concentration for Immersive Experience in Virtual Reality. APPLIED SCIENCES-BASEL 2019. [DOI: 10.3390/app9183692] [Citation(s) in RCA: 5] [Impact Index Per Article: 0.8] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 11/16/2022]
Abstract
In recent years, consumer-level virtual-reality (VR) devices and content have become widely available. Establishing a sense of presence is a key objective of VR, and immersive interfaces with haptic feedback for VR applications have long been in development. Despite state-of-the-art force feedback research, a study on directional feedback based on force concentration has not yet been reported. Therefore, we developed directional force feedback (DFF), a device that generates directional sensations for VR applications via mechanical force concentration. DFF uses the rotation of motors to concentrate force and deliver directional sensations to the user. To achieve this, we developed a novel method of force concentration for directional sensation; by considering both rotational rebound and gravity, the optimum motor speeds and rotation angles were identified. Additionally, we validated the impact of DFF in a virtual environment, showing that users' presence and immersion within VR were higher with DFF than without. The results of the user study demonstrated that the device significantly improves the immersiveness of virtual applications.
Collapse
|
24
|
Tong Q, Yuan Z, Liao X, Zheng M, Yuan T, Zhao J. Magnetic Levitation Haptic Augmentation for Virtual Tissue Stiffness Perception. IEEE TRANSACTIONS ON VISUALIZATION AND COMPUTER GRAPHICS 2018; 24:3123-3136. [PMID: 29990159 DOI: 10.1109/tvcg.2017.2772236] [Citation(s) in RCA: 4] [Impact Index Per Article: 0.6] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 06/08/2023]
Abstract
Haptic-based tissue stiffness perception is essential for palpation training systems, which can provide surgeons with haptic cues that improve their diagnostic abilities. However, current haptic devices, such as the Geomagic Touch, fail to provide immersive and natural haptic interaction in virtual surgery due to inherent mechanical friction, inertia, limited workspace, and flawed haptic feedback. To tackle this issue, we design a novel magnetic levitation haptic device based on electromagnetic principles to augment tissue stiffness perception in a virtual environment. Users can naturally interact with the virtual tissue by tracking the motion of a magnetic stylus using stereoscopic vision, so that they can accurately sense stiffness through the stylus as it moves in the magnetic field generated by our device. We propose, for the first time, that the effective magnetic field (EMF) is closely related to the coil attitude. To fully harness the magnetic field and flexibly generate the specific field required for a given haptic perception, we adopt probability clouds to describe the requirements of interactive applications and put forward an algorithm to calculate the best coil attitude. Moreover, we design a control interface circuit and present a self-adaptive fuzzy proportional-integral-derivative (PID) algorithm to precisely control the coil current. We evaluate our haptic device via a series of quantitative experiments, which show high consistency between the experimental and simulated magnetic flux density, high accuracy (0.28 mm) of real-time 3D positioning and tracking of the magnetic stylus, low power consumption of the adjustable coil configuration, and a 2.38 percent improvement in tissue stiffness perception accuracy with the self-adaptive fuzzy PID algorithm. We conducted a user study with 22 participants, and the results suggest most users can clearly and immersively perceive different tissue stiffness levels and easily detect tissue abnormalities. Experimental results demonstrate that our magnetic levitation haptic device can provide accurate tissue stiffness perception augmentation with natural and immersive haptic interaction.
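The coil-current loop can be sketched as a standard discrete PID controller; in the self-adaptive fuzzy variant described above, a fuzzy supervisor would additionally retune the gains online from the error and its rate of change. That supervisor is omitted here, and all gains and the plant model in the usage note are illustrative assumptions.

```python
class PID:
    """Discrete PID controller sketch for a coil-current setpoint.

    A fuzzy adaptation layer (not shown) would adjust kp/ki/kd at
    runtime; the fixed gains here are placeholders.
    """

    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint, measured):
        # Standard PID terms: proportional, accumulated integral,
        # and backward-difference derivative of the error.
        error = setpoint - measured
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative
```

As a usage sketch, driving an idealized integrator plant (`current += u * dt`, a stand-in for the real coil dynamics) toward a 1 A setpoint with `PID(2.0, 0.5, 0.0, 0.01)` converges after a few seconds of simulated time.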
Collapse
|