1
Hassan W, Joolee JB, Jeon S. Establishing haptic texture attribute space and predicting haptic attributes from image features using 1D-CNN. Sci Rep 2023; 13:11684. PMID: 37468571. DOI: 10.1038/s41598-023-38929-6.
Abstract
This study provides a haptic attribute space in which texture surfaces are located according to their haptic attributes. The main aim of the haptic attribute space is to establish a standardized model for representing and identifying haptic textures, analogous to the RGB model for colors. To this end, a four-dimensional haptic attribute space is established through a psychophysical experiment in which human participants rate 100 real-life texture surfaces according to their haptic attributes. The four dimensions of the space are rough-smooth, flat-bumpy, sticky-slippery, and hard-soft. Generalization and scalability of the haptic attribute space are achieved by training a 1D-CNN model to predict the attributes of haptic textures. The 1D-CNN is trained on the attribute data from the psychophysical experiment and on image features extracted from images of the real textures. The prediction power granted by the 1D-CNN renders the attribute space scalable. The prediction accuracy of the proposed 1D-CNN model is compared against other machine learning and deep learning algorithms; the results show that the proposed method outperforms the other models on MAE and RMSE metrics.
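The MAE and RMSE metrics mentioned in the abstract have standard definitions. As a minimal, illustrative sketch (the study's image-feature pipeline and trained 1D-CNN are not reproduced here, and the ratings below are hypothetical), they can be computed over predicted attribute ratings as follows:

```python
import math

def mae(y_true, y_pred):
    """Mean absolute error over attribute ratings."""
    return sum(abs(t - p) for t, p in zip(y_true, y_pred)) / len(y_true)

def rmse(y_true, y_pred):
    """Root-mean-square error over attribute ratings."""
    return math.sqrt(sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true))

# Hypothetical ratings on the four attribute dimensions
# (rough-smooth, flat-bumpy, sticky-slippery, hard-soft):
human_ratings = [0.8, 0.2, 0.5, 0.9]
cnn_predictions = [0.7, 0.3, 0.5, 0.8]
print(mae(human_ratings, cnn_predictions), rmse(human_ratings, cnn_predictions))
```

Lower values on both metrics indicate predictions closer to the psychophysical ratings; RMSE penalizes large errors more heavily than MAE.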
Affiliation(s)
- Waseem Hassan
- Department of Computer Science and Engineering, Kyung Hee University, Yongin-si, Gyeonggi-do, South Korea
- Joolekha Bibi Joolee
- Department of Computer Science and Engineering, Kyung Hee University, Yongin-si, Gyeonggi-do, South Korea
- Seokhee Jeon
- Department of Computer Science and Engineering, Kyung Hee University, Yongin-si, Gyeonggi-do, South Korea
2
Otaran A, Farkhatdinov I. Haptic Ankle Platform for Interactive Walking in Virtual Reality. IEEE Transactions on Visualization and Computer Graphics 2022; 28:3974-3985. PMID: 34506284. DOI: 10.1109/tvcg.2021.3111675.
Abstract
This article presents an impedance-type ankle haptic interface that provides users with an immersive navigation experience in virtual reality (VR). The ankle platform, actuated by an electric motor under feedback control, uses foot-tapping gestures to create a realistic walking experience and to haptically render different types of walking terrain. Experimental studies demonstrated that the interface is easy to use for generating virtual walking and is capable of rendering terrains such as hard and soft surfaces as well as multi-layer complex dynamic terrains. The designed system is a seated VR locomotion interface, allowing its user to maintain a stable seated posture while comfortably navigating a virtual scene.
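The abstract does not give the platform's control law, but terrain rendering of this kind is commonly done with an impedance (spring-damper) ground model; the sketch below, with purely illustrative stiffness and damping values not taken from the paper, shows how hard and soft virtual surfaces could differ:

```python
def ground_force(penetration, penetration_rate, stiffness, damping):
    """Spring-damper ground model: the virtual surface pushes back
    only while the foot penetrates it (penetration > 0, in meters)."""
    if penetration <= 0.0:
        return 0.0
    force = stiffness * penetration + damping * penetration_rate
    return max(force, 0.0)  # a surface can push, never pull

# Illustrative parameters (N/m and N*s/m), not taken from the paper:
HARD = {"stiffness": 30000.0, "damping": 200.0}
SOFT = {"stiffness": 2000.0, "damping": 400.0}
```

A multi-layer terrain could chain several such models, switching parameters as penetration depth crosses layer boundaries.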
3
Multimodal Interaction of Contextual and Non-Contextual Sound and Haptics in Virtual Simulations. Informatics 2018. DOI: 10.3390/informatics5040043.
Abstract
Touch plays a fundamental role in our daily interactions, allowing us to interact with and perceive objects and their spatial properties. Despite its importance in the real world, touch is often ignored in virtual environments. Accurately simulating the sense of touch is difficult, however, requiring high-fidelity haptic devices that are cost-prohibitive. Lower-fidelity consumer-level haptic devices are becoming more widespread, yet they are generally limited in perceived fidelity and in the range of motion (degrees of freedom) required to realistically simulate many tasks. Studies of sound and vision suggest that the presence or absence of sound can influence task performance. Here, we explore whether the presence or absence of contextually relevant sound cues influences performance of a simple haptic drilling task. Although the results of this study do not show any statistically significant difference in task performance with general (task-irrelevant) sound, we discuss how this is a necessary step in understanding the role of sound in haptic perception.
4
Soto FA, Vucovich LE, Ashby FG. Linking signal detection theory and encoding models to reveal independent neural representations from neuroimaging data. PLoS Comput Biol 2018; 14:e1006470. PMID: 30273337. PMCID: PMC6181430. DOI: 10.1371/journal.pcbi.1006470.
Abstract
Many research questions in visual perception involve determining whether stimulus properties are represented and processed independently. In visual neuroscience, there is great interest in determining whether important object dimensions are represented independently in the brain. For example, theories of face recognition have proposed either completely or partially independent processing of identity and emotional expression. Unfortunately, most previous research has only vaguely defined what is meant by “independence,” which hinders its precise quantification and testing. This article develops a new quantitative framework that links signal detection theory from psychophysics and encoding models from computational neuroscience, focusing on a special form of independence defined in the psychophysics literature: perceptual separability. The new theory allowed us, for the first time, to precisely define separability of neural representations and to theoretically link behavioral and brain measures of separability. The framework formally specifies the relation between these different levels of perceptual and brain representation, providing the tools for a truly integrative research approach. In particular, the theory identifies exactly what valid inferences can be made about independent encoding of stimulus dimensions from the results of multivariate analyses of neuroimaging data and psychophysical studies. In addition, commonly used operational tests of independence are re-interpreted within this new theoretical framework, providing insights on their correct use and interpretation. Finally, we apply this new framework to the study of separability of brain representations of face identity and emotional expression (neutral/sad) in a human fMRI study with male and female participants. A common question in vision research is whether certain stimulus properties, like face identity and expression, are represented and processed independently. 
We develop a theoretical framework that allowed us, for the first time, to link behavioral and brain measures of independence. Unlike previous approaches, our framework formally specifies the relation between these different levels of perceptual and brain representation, providing the tools for a truly integrative research approach in the study of independence. This allows us to identify what kinds of inferences can be made about brain representations from multivariate analyses of neuroimaging data or psychophysical studies. We apply this framework to the study of independent processing of face identity and expression.
Affiliation(s)
- Fabian A. Soto
- Department of Psychology, Florida International University, Miami, Florida, United States of America
- Lauren E. Vucovich
- Department of Psychological and Brain Sciences, University of California, Santa Barbara, Santa Barbara, California, United States of America
- F. Gregory Ashby
- Department of Psychological and Brain Sciences, University of California, Santa Barbara, Santa Barbara, California, United States of America
5
Terekhov AV, Hayward V. The brain uses extrasomatic information to estimate limb displacement. Proc Biol Sci 2015; 282:20151661. PMID: 26311672. PMCID: PMC4571714. DOI: 10.1098/rspb.2015.1661.
Abstract
A fundamental problem faced by the brain is to estimate whether a touched object is rigidly attached to a ground reference or is movable. A simple solution to this problem would be for the brain to test whether pushing on the object with a limb is accompanied by limb displacement. The mere act of pushing excites large populations of mechanoreceptors, generating a sensory response that is only weakly sensitive to limb displacement if the movements are small, and thus can hardly be used to determine the mobility of the object. In the mechanical world, displacement or deformation of objects frequently co-occurs with microscopic fluctuations associated with the frictional sliding of surfaces in contact or with micro-failures inside an object. In this study, we provide compelling evidence that the brain relies on these microscopic mechanical events to estimate the displacement of the limb in contact with an object, and hence the mobility of the touched object. We show that when pressing with a finger on a stiff surface, fluctuations that resemble the mechanical response of granular solids provoke a sensation of limb displacement. Our findings suggest that when acting on an external object, prior knowledge about the sensory consequences of interacting with the object contributes to proprioception.
Affiliation(s)
- Alexander V. Terekhov
- University of Paris Descartes, Paris 05, UMR 8158, LPP, Paris 75006, France
- Sorbonne Universités, UPMC Univ Paris 06, UMR 7222, ISIR, Paris 75005, France
- Vincent Hayward
- University of Paris Descartes, Paris 05, UMR 8158, LPP, Paris 75006, France
6
Hjortkjær J, McAdams S. Spectral and temporal cues for perception of material and action categories in impacted sound sources. J Acoust Soc Am 2016; 140:409. PMID: 27475165. DOI: 10.1121/1.4955181.
Abstract
Two experiments examined similarity ratings and categorization performance with recorded impact sounds representing three material categories (wood, metal, glass) manipulated by three categories of action (drop, strike, rattle). Previous research focusing on single impact sounds suggests that temporal cues related to damping are essential for material discrimination, but spectral cues are potentially more efficient for discriminating materials manipulated by different actions that involve multiple impacts (e.g., dropping, rattling). Perceived similarity between material categories across different actions was correlated with the distribution of long-term spectral energy (spectral centroid). Similarity between action categories was described by the temporal distribution of envelope energy (temporal centroid) or by the density of impacts. Moreover, perceptual similarity correlated with the pattern of confusions in categorization judgments: listeners tended to confuse materials with similar spectral centroids, and actions with similar temporal centroids and onset densities. To confirm the influence of these features, spectral cues were removed by applying the envelopes of the original sounds to a broadband noise carrier. Without spectral cues, listeners retained sensitivity to action categories but not to material categories. Conversely, listeners recognized material but not action categories after envelope scrambling that preserved long-term spectral content.
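Both features named in the abstract have standard definitions: the spectral centroid is the energy-weighted mean frequency of the spectrum, and the temporal centroid is the energy-weighted mean time of the amplitude envelope. A minimal sketch (illustrative, not the authors' analysis code):

```python
def spectral_centroid(magnitudes, freqs_hz):
    """Energy-weighted mean frequency of a magnitude spectrum (Hz)."""
    energies = [m * m for m in magnitudes]
    return sum(f * e for f, e in zip(freqs_hz, energies)) / sum(energies)

def temporal_centroid(envelope, times_s):
    """Energy-weighted mean time of an amplitude envelope (s)."""
    energies = [a * a for a in envelope]
    return sum(t * e for t, e in zip(times_s, energies)) / sum(energies)
```

By the abstract's account, materials with similar spectral centroids and actions with similar temporal centroids are the ones listeners tend to confuse.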
Affiliation(s)
- Jens Hjortkjær
- Oticon Centre of Excellence for Hearing and Speech Sciences, Department of Electrical Engineering, Technical University of Denmark, Ørsteds Plads 352, DK-2800 Kgs. Lyngby, Denmark
- Stephen McAdams
- Schulich School of Music, McGill University, 555 Sherbrooke Street West, Montreal, Quebec H3A 1E3, Canada
7
Cao Y, Giordano BL, Avanzini F, McAdams S. The dominance of haptics over audition in controlling wrist velocity during striking movements. Exp Brain Res 2016; 234:1145-58. PMID: 26790425. PMCID: PMC4785215. DOI: 10.1007/s00221-015-4529-9.
Abstract
Skilled interactions with sounding objects, such as drumming, rely on resolving the uncertainty in the acoustical and tactual feedback signals generated by vibrating objects. Uncertainty may arise from mis-estimation of the objects’ geometry-independent mechanical properties, such as surface stiffness. How multisensory information feeds back into the fine-tuning of sound-generating actions remains unexplored. Participants (percussionists, non-percussion musicians, or non-musicians) held a stylus and learned to control their wrist velocity while repeatedly striking a virtual sounding object whose surface stiffness was under computer control. Sensory feedback was manipulated by perturbing the surface stiffness specified by audition and haptics in a congruent or incongruent manner. The compensatory changes in striking velocity were measured as the motor effects of the sensory perturbations, and sensory dominance was quantified by the asymmetry of congruency effects across audition and haptics. A pronounced dominance of haptics over audition suggested a superior utility of somatosensation developed through long-term experience with object exploration. Large interindividual differences in the motor effects of haptic perturbation potentially arose from a differential reliance on the type of tactual prediction error for which participants tend to compensate: vibrotactile force versus object deformation. Musical experience did not have much of an effect beyond a slightly greater reliance on object deformation in mallet percussionists. The bias toward haptics in the presence of crossmodal perturbations was greater when participants appeared to rely on object deformation feedback, suggesting a weaker association between haptically sensed object deformation and the acoustical structure of concomitant sound during everyday experience of actions upon objects.
Affiliation(s)
- Yinan Cao
- Crossmodal Research Laboratory, Department of Experimental Psychology, University of Oxford, Oxford, UK
- Department of Music Research, Centre for Interdisciplinary Research in Music, Media and Technology, McGill University, Montreal, QC, Canada
- Bruno L Giordano
- Institute of Neuroscience and Psychology, University of Glasgow, Glasgow, Scotland, UK
- Federico Avanzini
- Department of Information Engineering, University of Padova, Padua, Italy
- Stephen McAdams
- Department of Music Research, Centre for Interdisciplinary Research in Music, Media and Technology, McGill University, Montreal, QC, Canada
8
Maculewicz J, Kofoed LB, Serafin S. A Technological Review of the Instrumented Footwear for Rehabilitation with a Focus on Parkinson's Disease Patients. Front Neurol 2016; 7:1. PMID: 26834696. PMCID: PMC4719096. DOI: 10.3389/fneur.2016.00001.
Abstract
In this review article, we summarize systems for gait rehabilitation based on instrumented footwear and present a context of their usage in Parkinson’s disease (PD) patients’ auditory and haptic rehabilitation. We focus on the needs of PD patients, but since only a few systems were made with this purpose, we go through several applications used in different scenarios when gait detection and rehabilitation are considered. We present developments of the designs, possible improvements, and software challenges and requirements. We conclude that in order to build successful systems for PD patients’ gait rehabilitation, technological solutions from several studies have to be applied and combined with knowledge from auditory and haptic cueing.
Affiliation(s)
- Justyna Maculewicz
- Sound and Music Computing Group, Department of Architecture, Design and Media Technology, Aalborg University Copenhagen, Copenhagen, Denmark
- Lise Busk Kofoed
- Sound and Music Computing Group, Department of Architecture, Design and Media Technology, Aalborg University Copenhagen, Copenhagen, Denmark
- Stefania Serafin
- Sound and Music Computing Group, Department of Architecture, Design and Media Technology, Aalborg University Copenhagen, Copenhagen, Denmark
9
Asano S, Okamoto S, Matsuura Y, Yamada Y. Toward quality texture display: vibrotactile stimuli to modify material roughness sensations. Adv Robot 2014. DOI: 10.1080/01691864.2014.913502.
10
VR-MDS: multidimensional scaling for classification tasks of virtual and real stimuli. Atten Percept Psychophys 2014; 76:877-93. PMID: 24402697. DOI: 10.3758/s13414-013-0588-9.
Abstract
Evaluating the perceptual similarity between virtual and real sensory stimuli has been a serious problem for virtual reality interface researchers for a long time. One of the most commonly used evaluation methods is a classification task where assessors classify randomly presented stimuli into multiple candidate types. The results of this method are summarized using two types of confusion matrices, which have different stimulus sets. The present study developed a method that computes the locations of simulated and real stimuli in a perceptual space on the basis of the two confusion matrices. The spatial distribution of the stimuli allows us to visually interpret the perceptual relationships between stimuli and their perceptual dimensionality. This method is recommended when the guidance index based on the answer ratios of the confusion matrices is fairly high.
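As a rough sketch of the general idea (not the paper's VR-MDS algorithm itself), a confusion matrix can be symmetrized into dissimilarities and embedded with classical (Torgerson) MDS; the 1 − p dissimilarity mapping below is a common but assumed choice:

```python
import numpy as np

def confusions_to_dissimilarity(C):
    """C[i, j]: proportion of trials where stimulus i was classified as j.
    More mutual confusion is taken to mean smaller perceptual distance."""
    P = (C + C.T) / 2.0          # symmetrize the confusion proportions
    D = 1.0 - P                  # assumed dissimilarity mapping
    np.fill_diagonal(D, 0.0)
    return D

def classical_mds(D, ndim=2):
    """Embed a dissimilarity matrix into an ndim-dimensional space."""
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n   # centering matrix
    B = -0.5 * J @ (D ** 2) @ J           # double-centered Gram matrix
    w, V = np.linalg.eigh(B)
    top = np.argsort(w)[::-1][:ndim]      # largest eigenvalues first
    return V[:, top] * np.sqrt(np.clip(w[top], 0.0, None))
```

Plotting both real and simulated stimuli in the same embedding then allows a visual reading of how close each virtual stimulus sits to its real counterpart.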
11
Larsson M. Self-generated sounds of locomotion and ventilation and the evolution of human rhythmic abilities. Anim Cogn 2013; 17:1-14. PMID: 23990063. PMCID: PMC3889703. DOI: 10.1007/s10071-013-0678-z.
Abstract
It has been suggested that the basic building blocks of music mimic sounds of moving humans, and because the brain was primed to exploit such sounds, they eventually became incorporated in human culture. However, that raises further questions. Why do genetically close, culturally well-developed apes lack musical abilities? Did our switch to bipedalism influence the origins of music? Four hypotheses are raised: (1) Human locomotion and ventilation can mask critical sounds in the environment. (2) Synchronization of locomotion reduces that problem. (3) Predictable sounds of locomotion may stimulate the evolution of synchronized behavior. (4) Bipedal gait and the associated sounds of locomotion influenced the evolution of human rhythmic abilities. Theoretical models and research data suggest that noise of locomotion and ventilation may mask critical auditory information. People often synchronize steps subconsciously. Human locomotion is likely to produce more predictable sounds than those of non-human primates. Predictable locomotion sounds may have improved our capacity of entrainment to external rhythms and to feel the beat in music. A sense of rhythm could aid the brain in distinguishing among sounds arising from discrete sources and also help individuals to synchronize their movements with one another. Synchronization of group movement may improve perception by providing periods of relative silence and by facilitating auditory processing. The adaptive value of such skills to early ancestors may have been keener detection of prey or stalkers and enhanced communication. Bipedal walking may have influenced the development of entrainment in humans and thereby the evolution of rhythmic abilities.
Affiliation(s)
- Matz Larsson
- The Cardiology Clinic, Örebro University Hospital, 701 85, Örebro, Sweden
12
Grassi M, Pastore M, Lemaitre G. Looking at the world with your ears: how do we get the size of an object from its sound? Acta Psychol (Amst) 2013; 143:96-104. PMID: 23542810. DOI: 10.1016/j.actpsy.2013.02.005.
Abstract
Identifying the properties of ongoing events by the sounds they produce is crucial for our interaction with the environment when visual information is not available. Here, we investigated the ability of listeners to estimate the size of an object (a ball) dropped on a plate under ecological listening conditions (balls were dropped in real time) and with an ecological response method (listeners estimated ball size by drawing a disk). Previous studies had shown that listeners can veridically estimate the size of objects from the sounds they produce, but it was unclear which acoustical index listeners use to produce their estimates; in particular, whether they rely on an amplitude-domain cue (related to loudness) or a frequency-domain cue (related to the sound's brightness). In the current study, to determine which cue listeners use to recover the size of the object, we manipulated the sound source event so that frequency and amplitude cues provided contrasting size information (balls were dropped from various heights). Results showed that listeners' estimates were accurate regardless of the experimental manipulations. In addition, the results suggest that listeners were likely integrating frequency and amplitude cues to produce their estimates, even though these cues often provided contrasting size information.
Affiliation(s)
- Massimo Grassi
- Dipartimento di Psicologia Generale, Università di Padova, Via Venezia 8, 35131 Padova, Italy
13
Turchet L, Burelli P, Serafin S. Haptic feedback for enhancing realism of walking simulations. IEEE Transactions on Haptics 2013; 6:35-45. PMID: 24808266. DOI: 10.1109/toh.2012.51.
Abstract
In this paper, we describe several experiments whose goal is to evaluate the role of plantar vibrotactile feedback in enhancing the realism of walking experiences in multimodal virtual environments. To achieve this goal we built an interactive and a noninteractive multimodal feedback system. With the interactive system, subjects physically walked; with the noninteractive system, locomotion was simulated while subjects sat on a chair. In both configurations, subjects were exposed to auditory and audio-visual stimuli presented with and without haptic feedback. The results show a clear preference for the simulations enhanced with haptic feedback, indicating that the haptic channel can lead to more realistic experiences in both interactive and noninteractive configurations. The majority of subjects clearly appreciated the added feedback; however, some found it unpleasant. This might be due, on one hand, to the limits of the haptic simulation and, on the other hand, to individual differences in the desire to be involved in the simulations. Our findings can be applied to physical navigation in multimodal virtual environments as well as to enhancing the user experience of watching a movie or playing a video game.