Devillard AWM, Ramasamy A, Cheng X, Faux D, Burdet E. Tactile, Audio, and Visual Dataset During Bare Finger Interaction with Textured Surfaces. Sci Data 2025;12:484. [PMID: 40122908; PMCID: PMC11930942; DOI: 10.1038/s41597-025-04670-0]
[Received: 09/16/2024; Accepted: 02/18/2025]
Abstract
This paper presents a comprehensive multi-modal dataset of concurrent haptic, audio, and visual signals recorded from ten participants as they interacted with ten textured surfaces using their bare fingers. The dataset includes stereoscopic images of the textures together with fingertip position, speed, applied load, emitted sound, and friction-induced vibrations, providing unprecedented insight into the complex dynamics underlying human tactile perception. Unlike most previous studies, which relied on rigid sensorized probes, our approach uses the human finger itself, enabling naturalistic acquisition of haptic data and addressing a significant gap in resources for studying human tactile exploration, perceptual mechanisms, and artificial tactile perception. Additionally, fifteen participants completed a questionnaire evaluating their subjective perception of the surfaces. Through carefully designed data collection protocols encompassing both controlled and free exploration scenarios, this dataset offers a rich resource for studying human multi-sensory integration and supports the development of algorithms for texture recognition from multi-modal inputs. A preliminary analysis demonstrates the dataset's potential: classifiers trained on different combinations of data modalities achieve promising accuracy in surface identification, highlighting its value for advancing research in multi-sensory perception and human-machine interfaces.
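The abstract does not specify the classifiers or feature pipelines used in the preliminary analysis. The sketch below is a hypothetical illustration of how modality-combination experiments of the kind described could be set up with scikit-learn; the feature arrays (vibration, audio, kinematics) are random placeholders standing in for features extracted from the dataset's actual signals, and the feature dimensions and trial counts are assumptions, not values from the paper.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Placeholder per-trial feature vectors for each modality; in practice
# these would be computed from the dataset's recorded signals (e.g.,
# band powers of friction-induced vibrations and emitted sound, and
# summary statistics of fingertip speed and applied load).
rng = np.random.default_rng(0)
n_trials, n_surfaces = 500, 10                # hypothetical sizes
vibration = rng.normal(size=(n_trials, 64))   # vibration features (assumed)
audio = rng.normal(size=(n_trials, 64))       # audio features (assumed)
kinematics = rng.normal(size=(n_trials, 8))   # speed/load features (assumed)
labels = rng.integers(0, n_surfaces, size=n_trials)  # surface identity

# Compare single-modality and fused-modality classifiers, mirroring the
# idea of training on different combinations of data modalities.
modality_sets = {
    "vibration only": vibration,
    "audio only": audio,
    "all modalities": np.hstack([vibration, audio, kinematics]),
}
for name, features in modality_sets.items():
    clf = make_pipeline(
        StandardScaler(),
        RandomForestClassifier(n_estimators=200, random_state=0),
    )
    scores = cross_val_score(clf, features, labels, cv=5)
    print(f"{name}: accuracy {scores.mean():.2f} +/- {scores.std():.2f}")
```

With real features in place of the random arrays, the same loop yields per-modality and fused accuracies, making it easy to quantify how much each sensing channel contributes to surface identification.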