1. Planning Multi-fingered Grasps with Reachability Awareness in Unrestricted Workspace. J Intell Robot Syst 2023. DOI: 10.1007/s10846-023-01829-y
2. Mohammed MQ, Kwek LC, Chua SC, Al-Dhaqm A, Nahavandi S, Eisa TAE, Miskon MF, Al-Mhiqani MN, Ali A, Abaker M, Alandoli EA. Review of Learning-Based Robotic Manipulation in Cluttered Environments. Sensors (Basel) 2022; 22:7938. DOI: 10.3390/s22207938. PMID: 36298284; PMCID: PMC9607868.
Abstract
Robotic manipulation refers to how robots intelligently interact with the objects in their surroundings, such as grasping an object and carrying it from one place to another. Dexterous manipulation skills enable robots to assist humans in accomplishing tasks that might be too dangerous or difficult for them. This requires robots to intelligently plan and control the actions of their hands and arms. Object manipulation is a vital skill in several robotic tasks, yet it remains a challenge for robotics. The motivation behind this review paper is to review and analyze the most relevant studies on learning-based object manipulation in clutter. Unlike other reviews, this paper provides insights into the manipulation of objects using deep reinforcement learning (deep RL) in dense clutter. Studies are examined by surveying the existing literature and investigating several aspects: the intended applications, the techniques applied, the challenges faced by researchers, and the recommendations adopted to overcome these obstacles. This review divides deep RL-based robotic manipulation tasks in cluttered environments into three categories: object removal, assembly and rearrangement, and object retrieval and singulation. It then discusses the challenges and potential prospects of object manipulation in clutter. The findings are intended to help establish guidelines and directions for academics and researchers in the future.
Affiliation(s)
- Marwan Qaid Mohammed: Faculty of Engineering and Technology, Multimedia University (MMU), Ayer Keroh, Melaka 75450, Malaysia
- Lee Chung Kwek: Faculty of Engineering and Technology, Multimedia University (MMU), Ayer Keroh, Melaka 75450, Malaysia
- Shing Chyi Chua: Faculty of Engineering and Technology, Multimedia University (MMU), Ayer Keroh, Melaka 75450, Malaysia
- Arafat Al-Dhaqm: School of Computing, Faculty of Engineering, Universiti Teknologi Malaysia, Skudai, Johor Bahru 81310, Malaysia
- Saeid Nahavandi: Institute for Intelligent Systems, Research and Innovation (IISRI), Deakin University, Geelong, VIC 3216, Australia
- Muhammad Fahmi Miskon: Faculty of Electrical Engineering, Universiti Teknikal Malaysia Melaka (UTeM), Melaka 76100, Malaysia
- Mohammed Nasser Al-Mhiqani: Faculty of Information Communication Technology, Universiti Teknikal Malaysia Melaka (UTeM), Melaka 76100, Malaysia
- Abdulalem Ali: School of Computing, Faculty of Engineering, Universiti Teknologi Malaysia, Skudai, Johor Bahru 81310, Malaysia
- Mohammed Abaker: Department of Computer Science, Community College, King Khalid University, Muhayel Aseer 61913, Saudi Arabia
- Esmail Ali Alandoli: Faculty of Engineering and Technology, Multimedia University (MMU), Ayer Keroh, Melaka 75450, Malaysia
3. Kiatos M, Sarantopoulos I, Koutras L, Malassiotis S, Doulgeri Z. Learning Push-Grasping in Dense Clutter. IEEE Robot Autom Lett 2022. DOI: 10.1109/lra.2022.3188437
Affiliation(s)
- Marios Kiatos: Department of Electrical and Computer Engineering, Aristotle University of Thessaloniki, Thessaloniki, Greece
- Iason Sarantopoulos: Department of Electrical and Computer Engineering, Aristotle University of Thessaloniki, Thessaloniki, Greece
- Leonidas Koutras: Department of Electrical and Computer Engineering, Aristotle University of Thessaloniki, Thessaloniki, Greece
- Sotiris Malassiotis: Information Technologies Institute (ITI), Centre for Research and Technology Hellas (CERTH), Thessaloniki, Greece
- Zoe Doulgeri: Department of Electrical and Computer Engineering, Aristotle University of Thessaloniki, Thessaloniki, Greece
4. Shi C, Miao C, Zhong X, Zhong X, Hu H, Liu Q. Pixel-Reasoning-Based Robotics Fine Grasping for Novel Objects with Deep EDINet Structure. Sensors 2022; 22:4283. DOI: 10.3390/s22114283. PMID: 35684904; PMCID: PMC9185561.
Abstract
Robotic grasp detection has mostly relied on extracting candidate grasping rectangles; such discrete sampling methods are time-consuming and may miss the best grasp synthesis. This paper proposes a new pixel-level grasping detection method for RGB-D images. First, a fine grasping representation is introduced to generate parallel-jaw gripper configurations, which effectively resolves gripper approaching conflicts and improves applicability to unknown objects in cluttered scenes. In addition, an adaptive grasping width is used to represent the grasping attribute at a fine-grained, per-object level. Then, the encoder–decoder–inception convolutional neural network (EDINet) is proposed to predict the fine grasping configuration. EDINet uses encoder, decoder, and inception modules to improve the speed and robustness of pixel-level grasping detection. The proposed EDINet structure was evaluated on the Cornell and Jacquard datasets, achieving 98.9% and 96.1% test accuracy, respectively. Finally, grasping experiments on unknown objects show that the average success rate of the network is 97.2% in single-object scenes and 93.7% in cluttered scenes, outperforming state-of-the-art algorithms. EDINet completes a grasp detection pipeline within only 25 ms.
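Pixel-level grasp networks of this kind typically output per-pixel quality, angle, and width maps rather than sampled rectangles. As a minimal illustrative sketch (not the authors' EDINet code; the function name and toy maps are hypothetical), decoding such maps into a single parallel-jaw grasp can be as simple as taking the highest-quality pixel and reading off the angle and width predicted there:

```python
import numpy as np

def decode_pixel_grasp(quality, angle, width):
    """Pick the best grasp from per-pixel network outputs.

    quality, angle, width: HxW arrays, one prediction per image pixel.
    Returns (row, col, angle, width) at the highest-quality pixel.
    """
    row, col = np.unravel_index(np.argmax(quality), quality.shape)
    return int(row), int(col), float(angle[row, col]), float(width[row, col])

# Toy 3x3 maps: the centre pixel carries the highest quality score.
q = np.array([[0.1, 0.2, 0.1],
              [0.3, 0.9, 0.2],
              [0.1, 0.4, 0.1]])
ang = np.full((3, 3), 0.5)   # grasp angle in radians, per pixel
wid = np.full((3, 3), 40.0)  # gripper opening in pixels, per pixel

print(decode_pixel_grasp(q, ang, wid))  # (1, 1, 0.5, 40.0)
```

Because every pixel is scored in one forward pass, this avoids the discrete rectangle sampling the abstract criticizes, which is what makes sub-30 ms pipelines feasible.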
Affiliation(s)
- Chaoquan Shi: School of Electrical Engineering and Automation, Xiamen University of Technology, Xiamen 361024, China
- Chunxiao Miao: School of Electrical Engineering and Automation, Xiamen University of Technology, Xiamen 361024, China
- Xungao Zhong: School of Electrical Engineering and Automation, Xiamen University of Technology, Xiamen 361024, China (corresponding author; Tel.: +86-189-5921-6800)
- Xunyu Zhong: School of Aerospace Engineering, Xiamen University, Xiamen 361005, China
- Huosheng Hu: School of Computer Science and Electronic Engineering, University of Essex, Colchester CO4 3SQ, UK
- Qiang Liu: Department of Psychiatry, University of Oxford, Oxford OX1 2JD, UK
5. Wei W, Li D, Wang P, Li Y, Li W, Luo Y, Zhong J. DVGG: Deep Variational Grasp Generation for Dextrous Manipulation. IEEE Robot Autom Lett 2022. DOI: 10.1109/lra.2022.3140424
6. Lundell J, Verdoja F, Kyrki V. DDGC: Generative Deep Dexterous Grasping in Clutter. IEEE Robot Autom Lett 2021. DOI: 10.1109/lra.2021.3096239