1
Navarro-Guerrero N, Toprak S, Josifovski J, Jamone L. Visuo-haptic object perception for robots: an overview. Auton Robots 2023. DOI: 10.1007/s10514-023-10091-y.
Abstract
The object perception capabilities of humans are impressive, and this becomes even more evident when trying to develop solutions with a similar proficiency in autonomous robots. While there have been notable advancements in the technologies for artificial vision and touch, the effective integration of these two sensory modalities in robotic applications still needs to be improved, and several open challenges exist. Taking inspiration from how humans combine visual and haptic perception to perceive object properties and drive the execution of manual tasks, this article summarises the current state of the art of visuo-haptic object perception in robots. Firstly, the biological basis of human multimodal object perception is outlined. Then, the latest advances in sensing technologies and data collection strategies for robots are discussed. Next, an overview of the main computational techniques is presented, highlighting the main challenges of multimodal machine learning and presenting a few representative articles in the areas of robotic object recognition, peripersonal space representation and manipulation. Finally, informed by the latest advancements and open challenges, this article outlines promising new research directions.
2
Xia Z, Deng Z, Fang B, Yang Y, Sun F. A review on sensory perception for dexterous robotic manipulation. Int J Adv Robot Syst 2022. DOI: 10.1177/17298806221095974.
Abstract
Sensory perception for dexterous robotic hands is an active research area in robotics that has seen recent progress. Effective dexterous manipulation requires robotic hands to accurately sense their own state and perceive the surrounding environment. This article reviews the state of the art of sensory perception for dexterous robotic manipulation. Two types of sensors, intrinsic and extrinsic, are introduced according to their function and placement in robotic hands. These sensors provide a robotic hand with rich information, including its posture, contact information about objects, and physical information about the environment. Then, a comprehensive analysis of perception methods, spanning planning-level, control-level, and learning-level perception, is presented. The information obtained from sensory perception helps robotic hands make decisions effectively. Previous reviews mainly focus on the design of tactile sensors, whereas we analyze and discuss the relationship among sensing, perception, and dexterous manipulation. Some potential research topics on sensory perception are also summarized and discussed.
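The intrinsic/extrinsic split the review draws can be illustrated with a minimal sketch: intrinsic sensors report the hand's own state (e.g. joint encoders), while extrinsic sensors report contact with the environment (e.g. fingertip tactile pads). The class, field names, and force threshold below are illustrative assumptions, not anything from the article:

```python
# Toy hand state combining intrinsic (proprioceptive) and extrinsic
# (tactile) readings; a controller might gate grasp closure on contact.
from dataclasses import dataclass

@dataclass
class HandState:
    joint_angles: list       # intrinsic: finger joint positions (rad)
    fingertip_forces: list   # extrinsic: normal force per fingertip (N)

    def in_contact(self, threshold=0.5):
        """Flag each fingertip whose measured force exceeds the threshold."""
        return [f >= threshold for f in self.fingertip_forces]

state = HandState(joint_angles=[0.1, 0.4, 0.8],
                  fingertip_forces=[0.0, 0.7, 1.2])
print(state.in_contact())  # → [False, True, True]
```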
Affiliation(s)
- Ziwei Xia
- School of Engineering and Technology, China University of Geosciences, Beijing, China
- Zhen Deng
- School of Mechanical Engineering and Automation, Fuzhou University, Fuzhou, China
- Bin Fang
- Tsinghua National Laboratory for Information Science and Technology, Tsinghua University, Beijing, China
- Yiyong Yang
- School of Engineering and Technology, China University of Geosciences, Beijing, China
- Fuchun Sun
- Tsinghua National Laboratory for Information Science and Technology, Tsinghua University, Beijing, China
3
Vision-Based Intelligent Perceiving and Planning System of a 7-DoF Collaborative Robot. Comput Intell Neurosci 2021; 2021:5810371. PMID: 34630547; PMCID: PMC8497130; DOI: 10.1155/2021/5810371.
Abstract
In this paper, an intelligent perceiving and planning system (IPPS) based on deep learning is proposed for a collaborative robot consisting of a 7-DoF (7-degree-of-freedom) manipulator, a three-finger robot hand, and a vision system. The lack of intelligence has long limited the application of collaborative robots, and a system that realizes the "eye-brain-hand" process is crucial for true robot intelligence. In this research, a more stable and accurate perceiving process was developed: a well-designed camera system serves as the vision system, and a new hand-tracking method is used for perceiving operations and building the recording set, improving applicability. A visual process was designed to improve the accuracy of environment perceiving. In addition, a faster and more precise planning process was proposed: a new CNN (convolutional neural network) was designed to realize intelligent grasp planning for the robot hand, and a new trajectory planning method for the manipulator was proposed to improve efficiency. The performance of IPPS was tested in simulations and in experiments in a real environment. The results show that IPPS can effectively realize intelligent perceiving and planning for the robot, yielding higher intelligence and broad applicability for collaborative robots.
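The "eye-brain-hand" process the abstract describes can be sketched, at a very high level, as a perceive → plan → execute loop. All function names and the straight-line interpolation below are illustrative placeholders, not the paper's method:

```python
# High-level sketch of an "eye-brain-hand" loop: the vision system perceives
# an object pose, a planner produces a path toward it, and the hand executes
# the path. Everything here is an illustrative stand-in.

def perceive(scene):
    """'Eye': return the target object's estimated (x, y, z) position.
    A real system would run a camera pipeline and a detector here."""
    return scene["object_position"]

def plan(target, n_waypoints=5):
    """'Brain': interpolate a straight-line Cartesian path from a fixed home
    pose to the target (a real planner would work in joint space)."""
    home = (0.0, 0.0, 0.5)
    return [tuple(h + (t - h) * k / (n_waypoints - 1) for h, t in zip(home, target))
            for k in range(n_waypoints)]

def execute(path):
    """'Hand': pretend to follow the path; return the final pose reached."""
    return path[-1]

scene = {"object_position": (0.4, 0.2, 0.1)}
reached = execute(plan(perceive(scene)))
print(reached)  # final waypoint coincides with the perceived object position
```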
4
Duan H, Wang P, Huang Y, Xu G, Wei W, Shen X. Robotics Dexterous Grasping: The Methods Based on Point Cloud and Deep Learning. Front Neurorobot 2021; 15:658280. PMID: 34177509; PMCID: PMC8221534; DOI: 10.3389/fnbot.2021.658280.
Abstract
Dexterous manipulation, especially dexterous grasping, is a primitive and crucial ability of robots that allows them to perform human-like behaviors. Deploying this ability on robots enables them to assist and substitute for humans in more complex tasks in daily life and industrial production. This paper gives a comprehensive review of point-cloud- and deep-learning-based methods for robotic dexterous grasping from three perspectives. The proposed generation-evaluation framework, a new categorization scheme for the mainstream methods, is the core concept of the classification; the other two classifications, based on learning modes and applications, are briefly described afterwards. This review aims to provide a guideline for researchers and developers of robotic dexterous grasping.
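The generation-evaluation framework named above follows a common pattern in the grasping literature: a generator proposes candidate grasps from a point cloud, an evaluator scores each candidate, and the best-scoring grasp is executed. The sketch below illustrates only that control flow; the names, the random sampling, and the centroid-distance scoring heuristic are illustrative assumptions, not the paper's classification or any specific method it reviews:

```python
# Minimal generation-evaluation grasp pipeline over a toy point cloud.
import math
import random
from dataclasses import dataclass

@dataclass
class Grasp:
    position: tuple        # (x, y, z) approach point on the object
    approach_angle: float  # gripper roll about the approach axis (rad)

def generate_candidates(point_cloud, n=32, seed=0):
    """Generation stage: sample grasp candidates anchored at cloud points."""
    rng = random.Random(seed)
    return [Grasp(position=rng.choice(point_cloud),
                  approach_angle=rng.uniform(0, math.pi))
            for _ in range(n)]

def evaluate(grasp, point_cloud):
    """Evaluation stage: stand-in for a learned grasp-quality network.
    Here: prefer grasps near the cloud centroid (a toy heuristic)."""
    centroid = tuple(sum(p[i] for p in point_cloud) / len(point_cloud)
                     for i in range(3))
    return 1.0 / (1.0 + math.dist(grasp.position, centroid))

def best_grasp(point_cloud):
    """Run both stages and return the highest-scoring candidate."""
    candidates = generate_candidates(point_cloud)
    return max(candidates, key=lambda g: evaluate(g, point_cloud))

cloud = [(0.10, 0.00, 0.20), (0.12, 0.01, 0.22),
         (0.30, 0.10, 0.20), (0.11, 0.00, 0.21)]
g = best_grasp(cloud)
print(g.position, round(g.approach_angle, 3))
```

In the reviewed learning-based methods, `evaluate` would be a trained network consuming the point cloud and a grasp encoding; the surrounding generate-then-rank structure stays the same.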
Affiliation(s)
- Haonan Duan
- The State Key Laboratory for Management and Control of Complex Systems, Institute of Automation, Chinese Academy of Sciences, Beijing, China
- Department of Information Science, School of Computing and Information, University of Pittsburgh, Pittsburgh, PA, United States
- School of Artificial Intelligence, University of Chinese Academy of Sciences, Beijing, China
- Peng Wang
- The State Key Laboratory for Management and Control of Complex Systems, Institute of Automation, Chinese Academy of Sciences, Beijing, China
- School of Artificial Intelligence, University of Chinese Academy of Sciences, Beijing, China
- Center for Excellence in Brain Science and Intelligence Technology, Chinese Academy of Sciences, Shanghai, China
- Yayu Huang
- The State Key Laboratory for Management and Control of Complex Systems, Institute of Automation, Chinese Academy of Sciences, Beijing, China
- School of Artificial Intelligence, University of Chinese Academy of Sciences, Beijing, China
- Guangyun Xu
- The State Key Laboratory for Management and Control of Complex Systems, Institute of Automation, Chinese Academy of Sciences, Beijing, China
- School of Artificial Intelligence, University of Chinese Academy of Sciences, Beijing, China
- Wei Wei
- The State Key Laboratory for Management and Control of Complex Systems, Institute of Automation, Chinese Academy of Sciences, Beijing, China
- School of Artificial Intelligence, University of Chinese Academy of Sciences, Beijing, China
- Xiaofei Shen
- The State Key Laboratory for Management and Control of Complex Systems, Institute of Automation, Chinese Academy of Sciences, Beijing, China
- School of Artificial Intelligence, University of Chinese Academy of Sciences, Beijing, China
5
Sanchez-Matilla R, Chatzilygeroudis K, Modas A, Duarte NF, Xompero A, Frossard P, Billard A, Cavallaro A. Benchmark for Human-to-Robot Handovers of Unseen Containers With Unknown Filling. IEEE Robot Autom Lett 2020. DOI: 10.1109/lra.2020.2969200.