1. A Recognition Method for Soft Objects Based on the Fusion of Vision and Haptics. Biomimetics (Basel) 2023; 8:86. PMID: 36810417; PMCID: PMC9944461; DOI: 10.3390/biomimetics8010086.
Abstract
For humans and animals to recognise an object, integrating multiple sensing modalities is essential when any single modality can acquire only limited information. Among sensing modalities, vision has been studied intensively and shown to perform well on many problems. Nevertheless, many problems are difficult to solve by vision alone, for example in dark environments or for objects with a similar appearance but different contents. Haptic sensing is another commonly used means of perception; it provides local contact information and physical features that are difficult to obtain by vision. The fusion of vision and touch is therefore beneficial for improving the robustness of object perception. To address this, an end-to-end visual-haptic fusion perception method is proposed. In particular, the YOLO deep network is used to extract visual features, while haptic explorations are used to extract haptic features. Visual and haptic features are then aggregated using a graph convolutional network, and the object is recognised by a multi-layer perceptron. Experimental results show that the proposed method excels at distinguishing soft objects that have a similar appearance but different interior fillers, compared with a simple convolutional network and a Bayesian filter. The average recognition accuracy improved to 0.95, against 0.502 (mAP) for vision alone. Moreover, the extracted physical features could be further used for manipulation tasks targeting soft objects.
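The fusion step this abstract describes can be illustrated with a minimal sketch: one graph node per modality, a single graph-convolution layer to mix them, and a toy linear read-out. All dimensions, weights, and the two-node graph below are invented for illustration and are not taken from the paper.

```python
import numpy as np

def gcn_layer(features, adj, weights):
    """One graph-convolution layer: average self and neighbour features, then project with a ReLU."""
    adj_hat = adj + np.eye(adj.shape[0])          # add self-loops
    deg = adj_hat.sum(axis=1, keepdims=True)
    return np.maximum((adj_hat / deg) @ features @ weights, 0.0)

rng = np.random.default_rng(0)
visual = rng.normal(size=8)    # stand-in for a detector's visual feature vector
haptic = rng.normal(size=8)    # stand-in for features from haptic exploration
features = np.stack([visual, haptic])             # one graph node per modality
adj = np.array([[0.0, 1.0], [1.0, 0.0]])          # the two modality nodes are connected

w = rng.normal(size=(8, 4))
fused = gcn_layer(features, adj, w)               # (2, 4) fused node embeddings
logits = fused.reshape(-1) @ rng.normal(size=(8, 3))  # toy linear head over 3 classes
print(fused.shape, int(np.argmax(logits)))
```

After one layer, each node's embedding already mixes visual and haptic information; in a real system the weights would be learned and the read-out would be a trained MLP.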
2. Sozzi S, Nardone A, Schieppati M. Specific Posture-Stabilising Effects of Vision and Touch Are Revealed by Distinct Changes of Body Oscillation Frequencies. Front Neurol 2021; 12:756984. PMID: 34880823; PMCID: PMC8645986; DOI: 10.3389/fneur.2021.756984.
Abstract
We addressed postural instability during stance with eyes closed (EC) on a compliant surface in healthy young people. Spectral analysis of the centre-of-foot-pressure oscillations was used to identify the effects of haptic information (light touch, EC-LT), vision (eyes open, EO), or both (EO-LT). The spectral median frequency was strongly reduced by EO and EO-LT, while the spectral amplitude was reduced by all "stabilising" sensory conditions. The reduction in spectrum level by EO appeared mainly in the high-frequency range. The reduction by LT was much larger than that induced by vision in the low-frequency range, less so in the high-frequency range. Touch and vision together produced a fall in spectral amplitude across all frequency windows, more so in the anteroposterior (AP) direction. The lowermost frequencies contributed little to the geometric measures (sway path and area) in all sensory conditions. The same subjects participated in control experiments on a solid base of support. The median frequency and amplitude of the spectrum and the geometric measures were much smaller when standing on the solid base than on foam, but were little affected by the sensory conditions. Frequency analysis, but not the geometric measures, disclosed a distinct tuning of the postural control mode by haptic and visual information. During standing on foam, vision did not reduce the low-frequency oscillations, while touch diminished the entire spectrum except the medium-high frequencies, as if sway reduction by touch relied on rapid balance corrections. Combining frequency analysis with sensory conditions is a promising approach for exploring altered postural mechanisms and prospective interventions in subjects with central or peripheral nervous system disorders.
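The spectral median frequency used above has a simple definition: the frequency below which half of the signal's spectral power lies. A minimal sketch on a synthetic centre-of-pressure trace (the signal, its components, and the sampling parameters are made up for illustration):

```python
import numpy as np

def spectral_median_frequency(x, fs):
    """Frequency below which half of the signal's spectral power lies."""
    x = x - x.mean()                              # remove DC offset
    psd = np.abs(np.fft.rfft(x)) ** 2             # one-sided power spectrum
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    cum = np.cumsum(psd)
    return freqs[np.searchsorted(cum, 0.5 * cum[-1])]

fs = 100.0                                        # 100 Hz sampling, 60 s trial
t = np.arange(0, 60, 1.0 / fs)
rng = np.random.default_rng(1)
# Synthetic sway: a large slow oscillation plus smaller fast corrections and noise.
cop = (0.8 * np.sin(2 * np.pi * 0.3 * t)
       + 0.2 * np.sin(2 * np.pi * 2.0 * t)
       + 0.05 * rng.normal(size=t.size))
mf = spectral_median_frequency(cop, fs)
print(round(mf, 2))                               # close to 0.3, the dominant slow component
```

A sensory condition that damps the slow component (as light touch does in the low-frequency range) would shift this median frequency upward, while a condition that damps the fast corrections would shift it down.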
Affiliation(s)
- Stefania Sozzi: Centro Studi Attività Motorie (CSAM), Istituti Clinici Scientifici Maugeri SB (Istituto di Ricovero e Cura a Carattere Scientifico, IRCCS), Pavia, Italy
- Antonio Nardone: Neurorehabilitation and Spinal Unit, Department of Clinical-Surgical, Diagnostic and Pediatric Sciences, Istituti Clinici Scientifici Maugeri SB (Istituto di Ricovero e Cura a Carattere Scientifico, IRCCS), University of Pavia, Pavia, Italy
- Marco Schieppati: Istituti Clinici Scientifici Maugeri SB, Istituto di Ricovero e Cura a Carattere Scientifico (IRCCS), Pavia, Italy
3. Delis I, Dmochowski JP, Sajda P, Wang Q. Correlation of neural activity with behavioral kinematics reveals distinct sensory encoding and evidence accumulation processes during active tactile sensing. Neuroimage 2018; 175:12-21. PMID: 29580968; PMCID: PMC5960621; DOI: 10.1016/j.neuroimage.2018.03.035.
Abstract
Many real-world decisions rely on active sensing, a dynamic process of directing our sensors (e.g. eyes or fingers) across a stimulus to maximize information gain. Though ecologically pervasive, little work has focused on identifying neural correlates of the active sensing process. In tactile perception, we often make decisions about an object or surface by actively exploring its shape and texture. Here we investigate the neural correlates of active tactile decision-making by simultaneously measuring electroencephalography (EEG) and finger kinematics while subjects interrogated a haptic surface to make perceptual judgments. Since sensorimotor behavior underlies decision formation in active sensing tasks, we hypothesized that the neural correlates of decision-related processes would be detectable by relating active sensing to neural activity. A novel brain-behavior correlation analysis revealed three distinct EEG components, localizing to right-lateralized occipital cortex (LOC), middle frontal gyrus (MFG), and supplementary motor area (SMA), respectively, that were coupled with active sensing: their activity significantly correlated with finger kinematics. To probe the functional role of these components, we fit their single-trial couplings to decision-making performance using a hierarchical drift-diffusion model (HDDM), revealing that the LOC modulated the encoding of the tactile stimulus whereas the MFG predicted the rate of information integration towards a choice. Interestingly, the MFG component was absent in control subjects who performed active sensing but were not required to make perceptual decisions. By uncovering the neural correlates of distinct stimulus encoding and evidence accumulation processes, this study delineates, for the first time, the functional roles of cortical areas in active tactile decision-making.
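The drift-diffusion model underlying the HDDM analysis can be sketched by simulating single trials: noisy evidence accumulates with a constant drift until it reaches a decision bound, and the drift rate controls how quickly (and how reliably) the favoured bound is hit. The parameter values below are arbitrary illustrations, not estimates from the study.

```python
import numpy as np

def ddm_trial(drift, threshold, noise=1.0, dt=0.001, max_t=5.0, rng=None):
    """One drift-diffusion trial: accumulate noisy evidence to a symmetric bound.
    Returns (choice, reaction_time); choice 1 is the bound favoured by the drift."""
    rng = rng or np.random.default_rng()
    x, t = 0.0, 0.0
    while abs(x) < threshold and t < max_t:
        x += drift * dt + noise * np.sqrt(dt) * rng.normal()  # Euler step of the diffusion
        t += dt
    return (1 if x >= threshold else 0), t

rng = np.random.default_rng(2)
trials = [ddm_trial(drift=1.5, threshold=1.0, rng=rng) for _ in range(200)]
accuracy = sum(c for c, _ in trials) / len(trials)
mean_rt = sum(t for _, t in trials) / len(trials)
print(round(accuracy, 2), round(mean_rt, 2))
```

In the study's framing, a component like the MFG predicting the rate of information integration corresponds to trial-by-trial modulation of the drift parameter.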
Affiliation(s)
- Ioannis Delis: Department of Biomedical Engineering, Columbia University, New York, NY 10027, USA
- Jacek P Dmochowski: Department of Biomedical Engineering, City College of New York, New York, NY 10031, USA
- Paul Sajda: Department of Biomedical Engineering, Columbia University, New York, NY 10027, USA; Data Science Institute, Columbia University, New York, NY 10027, USA
- Qi Wang: Department of Biomedical Engineering, Columbia University, New York, NY 10027, USA
4. Toprak S, Navarro-Guerrero N, Wermter S. Evaluating Integration Strategies for Visuo-Haptic Object Recognition. Cognit Comput 2017; 10:408-425. PMID: 29881470; PMCID: PMC5971043; DOI: 10.1007/s12559-017-9536-7.
Abstract
In computational systems for visuo-haptic object recognition, vision and haptics are often modeled as separate processes. This is far from what happens in the human brain, where cross-modal as well as multimodal interactions take place between the two sensory modalities. Three main principles can be identified as underlying the processing of visual and haptic object-related stimuli in the brain: (1) hierarchical processing, (2) the divergence of processing into substreams for object shape and material perception, and (3) the experience-driven self-organization of the integratory neural circuits. The question arises whether an object recognition system can benefit, in terms of performance, from adopting these brain-inspired processing principles for integrating visual and haptic inputs. To address this, we compare an integration strategy that incorporates all three principles with the two integration strategies commonly used in the literature. We collected data with a NAO robot enhanced with inexpensive contact microphones as tactile sensors. The results of our experiments with everyday objects indicate that (1) contact microphones are a good alternative for capturing tactile information and (2) organizing the processing of visual and haptic inputs hierarchically and in two pre-processing streams improves performance. Nevertheless, further research is needed to quantify the contribution of each principle, both by itself and in combination with the others.
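The two integration strategies commonly contrasted in this literature, early (feature-level) versus late (decision-level) fusion, can be sketched as follows. The feature vectors and classifier weights are random placeholders, not the paper's architecture.

```python
import numpy as np

rng = np.random.default_rng(3)
visual = rng.normal(size=16)   # placeholder per-modality feature vectors
haptic = rng.normal(size=16)

# Strategy 1 -- early fusion: concatenate raw features, one shared classifier.
w_early = rng.normal(size=(32, 3))
early_scores = np.concatenate([visual, haptic]) @ w_early

# Strategy 2 -- late fusion: one classifier per modality, then average the scores.
w_v = rng.normal(size=(16, 3))
w_h = rng.normal(size=(16, 3))
late_scores = 0.5 * (visual @ w_v + haptic @ w_h)

print(early_scores.shape, late_scores.shape)
```

The brain-inspired strategy the paper evaluates sits between these extremes: hierarchical layers with separate shape and material substreams whose outputs are integrated by a self-organizing map rather than by simple concatenation or score averaging.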
Affiliation(s)
- Sibel Toprak: Knowledge Technology, Department of Informatics, Universität Hamburg, Vogt-Kölln-Str. 30, 22527 Hamburg, Germany
- Nicolás Navarro-Guerrero: Knowledge Technology, Department of Informatics, Universität Hamburg, Vogt-Kölln-Str. 30, 22527 Hamburg, Germany
- Stefan Wermter: Knowledge Technology, Department of Informatics, Universität Hamburg, Vogt-Kölln-Str. 30, 22527 Hamburg, Germany
5. Sathian K. Analysis of haptic information in the cerebral cortex. J Neurophysiol 2016; 116:1795-1806. PMID: 27440247; DOI: 10.1152/jn.00546.2015.
Abstract
Haptic sensing of objects acquires information about a number of properties. This review summarizes current understanding about how these properties are processed in the cerebral cortex of macaques and humans. Nonnoxious somatosensory inputs, after initial processing in primary somatosensory cortex, are partially segregated into different pathways. A ventrally directed pathway carries information about surface texture into parietal opercular cortex and thence to medial occipital cortex. A dorsally directed pathway transmits information regarding the location of features on objects to the intraparietal sulcus and frontal eye fields. Shape processing occurs mainly in the intraparietal sulcus and lateral occipital complex, while orientation processing is distributed across primary somatosensory cortex, the parietal operculum, the anterior intraparietal sulcus, and a parieto-occipital region. For each of these properties, the respective areas outside primary somatosensory cortex also process corresponding visual information and are thus multisensory. Consistent with the distributed neural processing of haptic object properties, tactile spatial acuity depends on interaction between bottom-up tactile inputs and top-down attentional signals in a distributed neural network. Future work should clarify the roles of the various brain regions and how they interact at the network level.
Affiliation(s)
- K Sathian: Departments of Neurology, Rehabilitation Medicine and Psychology, Emory University, Atlanta, Georgia; Center for Visual and Neurocognitive Rehabilitation, Atlanta Department of Veterans Affairs Medical Center, Decatur, Georgia
6. Papale P, Chiesi L, Rampinini AC, Pietrini P, Ricciardi E. When Neuroscience 'Touches' Architecture: From Hapticity to a Supramodal Functioning of the Human Brain. Front Psychol 2016; 7:866. PMID: 27375542; PMCID: PMC4899444; DOI: 10.3389/fpsyg.2016.00866.
Abstract
In the last decades, the rapid growth of functional brain imaging methodologies has allowed cognitive neuroscience to address open questions in philosophy and the social sciences. At the same time, novel insights from cognitive neuroscience research have begun to influence various disciplines, leading to a turn toward cognition and emotion in the fields of planning and architectural design. Since 2003, the Academy of Neuroscience for Architecture has been supporting 'neuro-architecture' as a way to connect neuroscience with the study of behavioral responses to the built environment. Among the many topics related to multisensory perceptual integration and embodiment, the concept of hapticity was recently introduced, suggesting a pivotal role of tactile perception and haptic imagery in architectural appraisal. Arguments have thus arisen in favor of shared cognitive foundations between hapticity and the supramodal functional architecture of the human brain. Specifically, supramodality refers to the capacity of defined brain regions to process and represent specific information content in an abstract way, independently of the sensory modality conveying that information to the brain. Here, we highlight commonalities and differences between the concepts of hapticity and supramodality from the distinctive perspectives of architecture and cognitive neuroscience. Comparing and connecting these two approaches may lead to novel observations about people-environment relationships and even provide empirical foundations for a renewed evidence-based design theory.
Affiliation(s)
- Paolo Papale: Department of Engineering and Architecture, University of Trieste, Trieste, Italy
- Leonardo Chiesi: Citylab – Laboratory of Social Research on Design, Architecture and Beyond, Department of Political and Social Sciences, School of Architecture, University of Florence, Florence, Italy
- Alessandra C. Rampinini: Department of Surgical, Medical, Molecular Pathology and Critical Area, University of Pisa, Pisa, Italy
- Emiliano Ricciardi: Department of Surgical, Medical, Molecular Pathology and Critical Area, University of Pisa, Pisa, Italy