1. Ren H, Li Z, Wu Q, Ma X, Wu D. An Adaptive Shared Control Frame and Feedback Rendering in Interactive Robot-Assisted Surgical Manipulation. Int J Med Robot 2025; 21:e70069. PMID: 40387255. DOI: 10.1002/rcs.70069.
Abstract
BACKGROUND In an unstructured environment where real-time human decision-making is essential, shared control allows collaboration between humans and robotic systems, combining the advantages of both. However, existing control methods are challenged by precision loss, inconsistency, and interference from unconscious human inputs. METHODS An adaptive anisotropic control frame is presented, enabling interaction at both the operational and tactical levels. Given a predefined trajectory, a dynamic weight function is proposed to allow the human operator to override the autonomous motion. Movement along the preferred direction is encouraged and compensated, providing accurate real-time tracking performance. Haptic feedback during shared control is evaluated and optimised. RESULTS Experiments validate that the proposed method can achieve a tracking precision of ±0.17 mm under milling payload, with perceptible feedback to the operator. As the tactical-level interaction, an override manipulation can be made rapidly, within 0.4 s. CONCLUSION The proposed approach provides both stability and flexibility in interactive surgical manipulations, maintaining precision similar to autonomous execution.
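The anisotropic weight-blending idea in this abstract can be illustrated with a short sketch at the velocity level. The decomposition along the trajectory tangent, the gains, and the 0.05 m/s override threshold below are hypothetical choices for illustration, not the authors' published weight function:

```python
import numpy as np

def shared_command(v_auto, v_human, tangent, k_along=1.0, k_across=0.2):
    """Blend an autonomous tracking velocity with an operator input.

    Motion along the reference-trajectory tangent is admitted with a high
    gain, while deviation orthogonal to it is attenuated, so unconscious
    operator jitter does not pull the tool off the planned path.
    """
    t = tangent / np.linalg.norm(tangent)         # unit tangent of the path
    along = np.dot(v_human, t) * t                # operator motion along the path
    across = v_human - along                      # operator motion off the path
    v_override = k_along * along + k_across * across
    # Dynamic weight: the harder the operator pushes, the more authority
    # shifts from the autonomous tracker to the human (tactical override).
    w = min(1.0, np.linalg.norm(v_human) / 0.05)  # 0.05 m/s ~ full override
    return (1.0 - w) * v_auto + w * v_override
```

With zero operator input the command reduces to pure autonomous tracking; a firm push along the path takes full authority, while a sideways push is damped by `k_across`.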
Affiliation(s)
- Hao Ren: Department of Mechanical Engineering, Tsinghua University, Beijing, China
- Li Zhichao: Department of Mechanical Engineering, Tsinghua University, Beijing, China
- Qingyuan Wu: Department of Mechanical Engineering, Tsinghua University, Beijing, China
- Xiaodong Ma: Emergency Department, General Hospital of Chinese PLA, Beijing, China
- Dan Wu: Department of Mechanical Engineering, Tsinghua University, Beijing, China; State Key Laboratory of Tribology in Advanced Equipment, Tsinghua University, Beijing, China
2. Liu R, Song Q, Ma T, Pan H, Li H, Zhao X. SoftBoMI: a non-invasive wearable body-machine interface for mapping movement of shoulder to commands. J Neural Eng 2024; 21:066007. PMID: 39454612. DOI: 10.1088/1741-2552/ad8b6e.
Abstract
Objective. Customized human-machine interfaces for controlling assistive devices are vital in improving the self-help ability of upper limb amputees and tetraplegic patients. Given that most of them possess residual shoulder mobility, using it to generate commands to operate assistive devices can serve as a complementary approach to brain-computer interfaces. Approach. We propose a hybrid body-machine interface prototype that integrates soft sensors and an inertial measurement unit. This study introduces both a rule-based data decoding method and a user intent inference-based decoding method to map human shoulder movements into continuous commands. Additionally, by incorporating prior knowledge of the user's operational performance into a shared autonomy framework, we implement an adaptive switching command mapping approach. This approach enables seamless transitions between the two decoding methods, enhancing their adaptability across different tasks. Main results. The proposed method has been validated on individuals with cervical spinal cord injury, bilateral arm amputation, and healthy subjects through a series of center-out target reaching tasks and a virtual powered wheelchair driving task. The experimental results show that using both the soft sensors and the gyroscope exhibits the most well-rounded performance in intent inference. Additionally, the rule-based method demonstrates better dynamic performance for wheelchair operation, while the intent inference method is more accurate but has higher latency. Adaptive switching decoding methods offer the best adaptability by seamlessly transitioning between decoding methods for different tasks. Furthermore, we discussed the differences and characteristics among the various types of participants in the experiment. Significance. The proposed method has the potential to be integrated into clothing, enabling non-invasive interaction with assistive devices in daily life, and could serve as a tool for rehabilitation assessment in the future.
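The adaptive switching described above — a low-latency rule-based mapping for dynamic tasks, a more accurate but slower intent-inference decoder otherwise, selected from a prior on the user's performance — can be sketched as follows. The function names, mapping gains, task labels, and the 0.7 accuracy threshold are illustrative assumptions, not the paper's implementation:

```python
def rule_based(shoulder):
    """Direct proportional mapping of shoulder motion to velocity commands.

    Low latency, good dynamic performance (e.g. wheelchair driving).
    """
    return {"v": 0.5 * shoulder["elevation"], "w": 0.5 * shoulder["protraction"]}

def intent_inference(shoulder):
    """Stand-in for a learned intent decoder: more accurate, higher latency."""
    return {"v": 0.45 * shoulder["elevation"], "w": 0.45 * shoulder["protraction"]}

def adaptive_decoder(task, shoulder, prior_accuracy):
    """Switch decoding method based on task and the user's performance prior.

    Hypothetical rule: dynamic tasks favour the rule-based mapping;
    precision tasks use intent inference only when the user's prior
    accuracy with it is high enough.
    """
    if task == "driving" or prior_accuracy < 0.7:
        return rule_based(shoulder)
    return intent_inference(shoulder)
```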
Affiliation(s)
- Rongkai Liu: Hefei Institutes of Physical Science (HFIPS), Chinese Academy of Sciences, Hefei 230031, Anhui, People's Republic of China; University of Science and Technology of China (USTC), Hefei 230026, Anhui, People's Republic of China
- Quanjun Song: Hefei Institutes of Physical Science (HFIPS), Chinese Academy of Sciences, Hefei 230031, Anhui, People's Republic of China
- Tingting Ma: Hefei Institutes of Physical Science (HFIPS), Chinese Academy of Sciences, Hefei 230031, Anhui, People's Republic of China; University of Science and Technology of China (USTC), Hefei 230026, Anhui, People's Republic of China
- Hongqing Pan: Hefei Institutes of Physical Science (HFIPS), Chinese Academy of Sciences, Hefei 230031, Anhui, People's Republic of China
- Hao Li: Hefei Institutes of Physical Science (HFIPS), Chinese Academy of Sciences, Hefei 230031, Anhui, People's Republic of China
- Xinyan Zhao: Hefei Institutes of Physical Science (HFIPS), Chinese Academy of Sciences, Hefei 230031, Anhui, People's Republic of China; University of Science and Technology of China (USTC), Hefei 230026, Anhui, People's Republic of China
3. Dillen A, Omidi M, Ghaffari F, Romain O, Vanderborght B, Roelands B, Nowé A, De Pauw K. User Evaluation of a Shared Robot Control System Combining BCI and Eye Tracking in a Portable Augmented Reality User Interface. Sensors (Basel) 2024; 24:5253. PMID: 39204948. PMCID: PMC11359122. DOI: 10.3390/s24165253.
Abstract
This study evaluates an innovative control approach to assistive robotics by integrating brain-computer interface (BCI) technology and eye tracking into a shared control system for a mobile augmented reality user interface. Aimed at enhancing the autonomy of individuals with physical disabilities, particularly those with impaired motor function due to conditions such as stroke, the system utilizes BCI to interpret user intentions from electroencephalography signals and eye tracking to identify the object of focus, thus refining control commands. This integration seeks to create a more intuitive and responsive assistive robot control strategy. The real-world usability was evaluated, demonstrating significant potential to improve autonomy for individuals with severe motor impairments. The control system was compared with an eye-tracking-based alternative to identify areas needing improvement. Although BCI achieved an acceptable success rate of 0.83 in the final phase, eye tracking was more effective with a perfect success rate and consistently lower completion times (p<0.001). The user experience responses favored eye tracking in 11 out of 26 questions, with no significant differences in the remaining questions, and subjective fatigue was higher with BCI use (p=0.04). While BCI performance lagged behind eye tracking, the user evaluation supports the validity of our control strategy, showing that it could be deployed in real-world conditions and suggesting a pathway for further advancements.
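The division of labour this abstract describes — eye tracking identifies the object of focus, the BCI supplies the command — can be sketched minimally. The object vocabulary, gaze radius, action name, and "confirm" command label below are hypothetical, not the study's actual interface:

```python
def select_target(gaze_point, objects, radius=0.1):
    """Return the object nearest the gaze point, if within the gaze radius."""
    best, best_d = None, radius
    for name, pos in objects.items():
        d = ((pos[0] - gaze_point[0]) ** 2 + (pos[1] - gaze_point[1]) ** 2) ** 0.5
        if d < best_d:
            best, best_d = name, d
    return best

def shared_decision(bci_command, gaze_point, objects):
    """Fuse a decoded BCI command with eye-tracking object focus.

    The eye tracker narrows the action space to one object; the BCI
    classifier output then triggers the action on it.
    """
    target = select_target(gaze_point, objects)
    if target is None or bci_command != "confirm":
        return None                  # no focused object, or no BCI confirmation
    return ("grasp", target)
```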
Affiliation(s)
- Arnau Dillen: Human Physiology and Sports Physiotherapy Research Group, Vrije Universiteit Brussel, 1050 Brussels, Belgium; Equipes Traitement de l'Information et Systèmes, UMR 8051, CY Cergy Paris Université, École Nationale Supérieure de l'Electronique et de ses Applications (ENSEA), Centre National de la Recherche Scientifique (CNRS), 95000 Cergy, France; Brussels Human Robotics Research Center (BruBotics), Vrije Universiteit Brussel, 1050 Brussels, Belgium
- Mohsen Omidi: Brussels Human Robotics Research Center (BruBotics), Vrije Universiteit Brussel, 1050 Brussels, Belgium; IMEC, 1050 Brussels, Belgium
- Fakhreddine Ghaffari: Equipes Traitement de l'Information et Systèmes, UMR 8051, CY Cergy Paris Université, École Nationale Supérieure de l'Electronique et de ses Applications (ENSEA), Centre National de la Recherche Scientifique (CNRS), 95000 Cergy, France
- Olivier Romain: Equipes Traitement de l'Information et Systèmes, UMR 8051, CY Cergy Paris Université, École Nationale Supérieure de l'Electronique et de ses Applications (ENSEA), Centre National de la Recherche Scientifique (CNRS), 95000 Cergy, France
- Bram Vanderborght: Brussels Human Robotics Research Center (BruBotics), Vrije Universiteit Brussel, 1050 Brussels, Belgium; IMEC, 1050 Brussels, Belgium
- Bart Roelands: Human Physiology and Sports Physiotherapy Research Group, Vrije Universiteit Brussel, 1050 Brussels, Belgium; Brussels Human Robotics Research Center (BruBotics), Vrije Universiteit Brussel, 1050 Brussels, Belgium
- Ann Nowé: Artificial Intelligence Lab, Vrije Universiteit Brussel, 1050 Brussels, Belgium
- Kevin De Pauw: Human Physiology and Sports Physiotherapy Research Group, Vrije Universiteit Brussel, 1050 Brussels, Belgium; Brussels Human Robotics Research Center (BruBotics), Vrije Universiteit Brussel, 1050 Brussels, Belgium
4. Dillen A, Omidi M, Díaz MA, Ghaffari F, Roelands B, Vanderborght B, Romain O, De Pauw K. Evaluating the real-world usability of BCI control systems with augmented reality: a user study protocol. Front Hum Neurosci 2024; 18:1448584. PMID: 39161850. PMCID: PMC11330773. DOI: 10.3389/fnhum.2024.1448584.
Abstract
Brain-computer interfaces (BCI) enable users to control devices through their brain activity. Motor imagery (MI), the neural activity resulting from an individual imagining performing a movement, is a common control paradigm. This study introduces a user-centric evaluation protocol for assessing the performance and user experience of an MI-based BCI control system utilizing augmented reality. Augmented reality is employed to enhance user interaction by displaying environment-aware actions, and guiding users on the necessary imagined movements for specific device commands. One of the major gaps in existing research is the lack of comprehensive evaluation methodologies, particularly in real-world conditions. To address this gap, our protocol combines quantitative and qualitative assessments across three phases. In the initial phase, the BCI prototype's technical robustness is validated. Subsequently, the second phase involves a performance assessment of the control system. The third phase introduces a comparative analysis between the prototype and an alternative approach, incorporating detailed user experience evaluations through questionnaires and comparisons with non-BCI control methods. Participants engage in various tasks, such as object sorting, picking and placing, and playing a board game using the BCI control system. The evaluation procedure is designed for versatility and is intended to apply beyond the specific use case presented. Its adaptability enables easy customization to meet the specific user requirements of the investigated BCI control application. This user-centric evaluation protocol offers a comprehensive framework for iterative improvements to the BCI prototype, ensuring technical validation, performance assessment, and user experience evaluation in a systematic and user-focused manner.
Affiliation(s)
- Arnau Dillen: Human Physiology and Sports Physiotherapy Research Group, Vrije Universiteit Brussel, Brussels, Belgium; Équipes Traitement de l'Information et Systèmes, UMR 8051, CY Cergy Paris Université, École Nationale Supérieure de l'Électronique et de ses Applications (ENSEA), Centre national de la recherche scientifique (CNRS), Cergy, France; Brussels Human Robotic Research Center (BruBotics), Vrije Universiteit Brussel, Brussels, Belgium
- Mohsen Omidi: Brussels Human Robotic Research Center (BruBotics), Vrije Universiteit Brussel, Brussels, Belgium; imec, Brussels, Belgium
- María Alejandra Díaz: Human Physiology and Sports Physiotherapy Research Group, Vrije Universiteit Brussel, Brussels, Belgium; Brussels Human Robotic Research Center (BruBotics), Vrije Universiteit Brussel, Brussels, Belgium
- Fakhreddine Ghaffari: Équipes Traitement de l'Information et Systèmes, UMR 8051, CY Cergy Paris Université, École Nationale Supérieure de l'Électronique et de ses Applications (ENSEA), Centre national de la recherche scientifique (CNRS), Cergy, France
- Bart Roelands: Human Physiology and Sports Physiotherapy Research Group, Vrije Universiteit Brussel, Brussels, Belgium; Brussels Human Robotic Research Center (BruBotics), Vrije Universiteit Brussel, Brussels, Belgium
- Bram Vanderborght: Brussels Human Robotic Research Center (BruBotics), Vrije Universiteit Brussel, Brussels, Belgium; imec, Brussels, Belgium
- Olivier Romain: Équipes Traitement de l'Information et Systèmes, UMR 8051, CY Cergy Paris Université, École Nationale Supérieure de l'Électronique et de ses Applications (ENSEA), Centre national de la recherche scientifique (CNRS), Cergy, France
- Kevin De Pauw: Human Physiology and Sports Physiotherapy Research Group, Vrije Universiteit Brussel, Brussels, Belgium; Brussels Human Robotic Research Center (BruBotics), Vrije Universiteit Brussel, Brussels, Belgium
5. Gualtieri L, Fraboni F, Brendel H, Pietrantoni L, Vidoni R, Dallasega P. Updating design guidelines for cognitive ergonomics in human-centred collaborative robotics applications: An expert survey. Appl Ergon 2024; 117:104246. PMID: 38354552. DOI: 10.1016/j.apergo.2024.104246.
Abstract
Within the framework of Industry 5.0, human factors are essential for enhancing the work conditions and well-being of operators interacting with ever more advanced and smart manufacturing systems and machines, and for increasing production performance. Nevertheless, cognitive ergonomics is often underestimated when implementing advanced industrial human-robot interaction. Thus, this work aims to systematically update, develop, and validate guidelines to assist non-experts in the early stages of the design of anthropocentric and collaborative assembly applications by focusing on the main features that have positively influenced workers' cognitive responses. A methodology for structured development has been proposed. The draft guidelines have been created starting from the outcomes of a systematic and extended screening of the scientific literature. Preliminary validation has been carried out with the help of researchers working in the field. Inputs on comprehensibility and relevance have been gathered to enhance the guidelines. Lastly, a survey was used to examine in depth how international experts in different branches can interpret such guidelines. In total, 108 respondents were asked to qualitatively and quantitatively evaluate the guidelines' comprehensibility and provide general comments or suggestions for each guideline. Based on the survey's results, the guidelines have been validated and some have been reviewed and re-written in their final form. The present work highlights that integrating human factors into the design of collaborative applications can significantly bolster manufacturing operations' resilience through inclusivity and system adaptability by enhancing worker safety, ergonomics, and wellbeing.
Affiliation(s)
- Luca Gualtieri: Industrial Engineering and Automation (IEA), Faculty of Engineering, Free University of Bozen-Bolzano, Piazza Domenicani 3, 39100, Bolzano, Italy
- Federico Fraboni: Department of Psychology, Università di Bologna, Via Zamboni 33, 40126, Bologna, Italy
- Hannah Brendel: Department of Psychology, Università di Bologna, Via Zamboni 33, 40126, Bologna, Italy
- Luca Pietrantoni: Department of Psychology, Università di Bologna, Via Zamboni 33, 40126, Bologna, Italy
- Renato Vidoni: Industrial Engineering and Automation (IEA), Faculty of Engineering, Free University of Bozen-Bolzano, Piazza Domenicani 3, 39100, Bolzano, Italy
- Patrick Dallasega: Industrial Engineering and Automation (IEA), Faculty of Engineering, Free University of Bozen-Bolzano, Piazza Domenicani 3, 39100, Bolzano, Italy
6. Styler BK, Deng W, Simmons R, Admoni H, Cooper R, Ding D. Exploring Control Authority Preferences in Robotic Arm Assistance for Power Wheelchair Users. Actuators 2024; 13:104. PMID: 38586279. PMCID: PMC10996449. DOI: 10.3390/act13030104.
Abstract
This paper uses mixed methods to explore the preliminary design of control authority preferences for an Assistive Robotic Manipulator (ARM). To familiarize users with an intelligent robotic arm, we perform two kitchen task iterations: one with user-initiated software autonomy (predefined autonomous actions) and one with manual control. Then, we introduce a third scenario, enabling users to choose between manual control and system delegation throughout the task. Results showed that, while manually switching modes and controlling the arm via joystick had a higher mental workload, participants still preferred full joystick control. Thematic analysis indicates manual control offered greater freedom and a sense of accomplishment. Participants reacted positively to the idea of an interactive assistive system. Users wanted the system not only to assist by taking over certain actions, but also to provide situational feedback (e.g., 'How close am I (the gripper)?', 'Is the lid centered over the jug?'). This speaks to a future assistive system that ensures the user feels like they drive the system for the entirety of the task and provides action collaboration in addition to more granular situational awareness feedback.
Affiliation(s)
- Breelyn Kane Styler: Human Engineering Research Laboratories, VA Pittsburgh Healthcare System, Pittsburgh, PA 15206, USA
- Wei Deng: Human Engineering Research Laboratories, VA Pittsburgh Healthcare System, Pittsburgh, PA 15206, USA
- Reid Simmons: The Robotics Institute, Carnegie Mellon University, 5000 Forbes Ave, Pittsburgh, PA 15213, USA
- Henny Admoni: The Robotics Institute, Carnegie Mellon University, 5000 Forbes Ave, Pittsburgh, PA 15213, USA
- Rory Cooper: Human Engineering Research Laboratories, VA Pittsburgh Healthcare System, Pittsburgh, PA 15206, USA; The Robotics Institute, Carnegie Mellon University, 5000 Forbes Ave, Pittsburgh, PA 15213, USA; Department of Rehabilitation Science and Technology, University of Pittsburgh, Pittsburgh, PA 15213, USA
- Dan Ding: Human Engineering Research Laboratories, VA Pittsburgh Healthcare System, Pittsburgh, PA 15206, USA; Department of Rehabilitation Science and Technology, University of Pittsburgh, Pittsburgh, PA 15213, USA
7. Shi Y, Zhu P, Wang T, Mai H, Yeh X, Yang L, Wang J. Dynamic Virtual Fixture Generation Based on Intra-Operative 3D Image Feedback in Robot-Assisted Minimally Invasive Thoracic Surgery. Sensors (Basel) 2024; 24:492. PMID: 38257585. PMCID: PMC10820968. DOI: 10.3390/s24020492.
Abstract
This paper proposes a method for generating dynamic virtual fixtures with real-time 3D image feedback to facilitate human-robot collaboration in medical robotics. Seamless shared control in a dynamic environment, like that of a surgical field, remains challenging despite extensive research on collaborative control and planning. To address this problem, our method dynamically creates virtual fixtures to guide the manipulation of a trocar-placing robot arm using the force field generated by point cloud data from an RGB-D camera. Additionally, the "view scope" concept selectively determines the region for computational points, thereby reducing computational load. In a phantom experiment for robot-assisted port incision in minimally invasive thoracic surgery, our method demonstrates substantially improved accuracy for port placement, reducing error and completion time by 50% (p = 1.06 × 10⁻²) and 35% (p = 3.23 × 10⁻²), respectively. These results suggest that our proposed approach is promising in improving surgical human-robot collaboration.
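A toy version of a point-cloud force field with a "view scope" cutoff might look like the following. The inverse-distance force law, gains, and radius are assumptions for illustration, not the published model:

```python
import numpy as np

def fixture_force(tool, cloud, view_radius=0.05, k=2.0):
    """Sum a repulsive force from point-cloud points near the tool tip.

    Points outside the 'view scope' sphere are skipped entirely, which
    bounds the per-cycle computation to nearby geometry.
    """
    f = np.zeros(3)
    for p in cloud:
        d_vec = tool - p
        d = np.linalg.norm(d_vec)
        if 1e-6 < d < view_radius:                   # inside the view scope only
            f += k * (view_radius - d) * d_vec / d   # push away, stronger when close
    return f
```

Rendered through a haptic device or added to the admittance controller, such a force nudges the operator away from tissue surfaces captured by the RGB-D camera.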
Affiliation(s)
- Yunze Shi: ZJU-UIUC Institute, International Campus, Zhejiang University, Haining 314400, China; School of Mechanical Engineering, Zhejiang University, Hangzhou 310058, China
- Peizhang Zhu: Flexiv Ltd., Santa Clara, CA 95054, USA
- Tengyue Wang: ZJU-UIUC Institute, International Campus, Zhejiang University, Haining 314400, China; School of Mechanical Engineering, Zhejiang University, Hangzhou 310058, China
- Haonan Mai: ZJU-UIUC Institute, International Campus, Zhejiang University, Haining 314400, China
- Xiyang Yeh: Flexiv Ltd., Santa Clara, CA 95054, USA
- Liangjing Yang: ZJU-UIUC Institute, International Campus, Zhejiang University, Haining 314400, China; School of Mechanical Engineering, Zhejiang University, Hangzhou 310058, China; Department of Mechanical Engineering, University of Illinois Urbana-Champaign, Urbana, IL 61801, USA
- Jingfan Wang: Flexiv Ltd., Santa Clara, CA 95054, USA
8. Kim S, Hernandez I, Nussbaum MA, Lim S. Teleoperator-Robot-Human Interaction in Manufacturing: Perspectives from Industry, Robot Manufacturers, and Researchers. IISE Trans Occup Ergon Hum Factors 2024; 12:28-40. PMID: 38328969. DOI: 10.1080/24725838.2024.2310301.
Abstract
Occupational Applications. Industrial robots have become an important part of modern industry. In the context of human-robot collaboration, enabling teleoperated robots to work in close proximity to local/onsite humans can provide new opportunities to improve human engagement in a distributed workplace. Interviews with industry stakeholders highlighted several potential benefits of such teleoperator-robot-human collaboration (tRHC), including the application of tRHC to tasks requiring both expertise and manual dexterity (e.g., maintenance and highly skilled tasks in sectors including construction, manufacturing, and healthcare), as well as opportunities to expand job accessibility for individuals with disabilities and older individuals. However, interviewees also indicated potential challenges of tRHC, particularly related to human perception (e.g., perceiving remote environments), safety, and trust. Given these challenges, and the current limited information on the practical value and implementation of tRHC, we propose several future research directions, with a focus on human factors and ergonomics, to help realize the potential benefits of tRHC.
Affiliation(s)
- Sunwook Kim: Department of Industrial and Systems Engineering, Virginia Tech, Blacksburg, VA, USA
- Ivan Hernandez: Psychology Department, Virginia Tech, Blacksburg, VA, USA
- Maury A Nussbaum: Department of Industrial and Systems Engineering, Virginia Tech, Blacksburg, VA, USA
- Sol Lim: Department of Industrial and Systems Engineering, Virginia Tech, Blacksburg, VA, USA
9. Kennedy M. The role of collaborative robotics in assistive and rehabilitation applications. Sci Robot 2023; 8:eadk6743. PMID: 37878691. DOI: 10.1126/scirobotics.adk6743.
Abstract
Collaborative robotics principles and advancements may transform the field of assistive and rehabilitation robotics.
Affiliation(s)
- Monroe Kennedy: Department of Mechanical Engineering, Stanford University, Stanford, CA, USA
10. Abdulazeem N, Hu Y. Human Factors Considerations for Quantifiable Human States in Physical Human-Robot Interaction: A Literature Review. Sensors (Basel) 2023; 23:7381. PMID: 37687837. PMCID: PMC10490212. DOI: 10.3390/s23177381.
Abstract
As the global population rapidly ages with longer life expectancy and declining birth rates, the need for healthcare services and caregivers for older adults is increasing. Current research envisions addressing this shortage by introducing domestic service robots to assist with daily activities. The successful integration of robots as domestic service providers in our lives requires them to possess efficient manipulation capabilities, provide effective physical assistance, and have adaptive control frameworks that enable them to develop social understanding during human-robot interaction. In this context, human factors, especially quantifiable ones, represent a necessary component. The objective of this paper is to conduct an unbiased review encompassing the studies on human factors studied in research involving physical interactions and strong manipulation capabilities. We identified the prevalent human factors in physical human-robot interaction (pHRI), noted the factors typically addressed together, and determined the frequently utilized assessment approaches. Additionally, we gathered and categorized proposed quantification approaches based on the measurable data for each human factor. We also formed a map of the common contexts and applications addressed in pHRI for a comprehensive understanding and easier navigation of the field. We found that most of the studies in direct pHRI (when there is direct physical contact) focus on social behaviors, with belief being the most commonly addressed human factor type. Task collaboration is moderately investigated, while physical assistance is rarely studied. In contrast, indirect pHRI studies (when the physical contact is mediated via a third item) often involve industrial settings, with physical ergonomics being the most frequently investigated human factor. More research is needed on the human factors in direct and indirect physical assistance applications, including studies that combine physical social behaviors with physical assistance tasks. We also found that while the predominant approach in most studies involves the use of questionnaires as the main method of quantification, there is a recent trend that seeks to address the quantification approaches based on measurable data.
Affiliation(s)
- Yue Hu: Active & Interactive Robotics Lab, Department of Mechanical and Mechatronics Engineering, University of Waterloo, 200 University Ave. W., Waterloo, ON N2L 3G1, Canada
11. Ranieri CM, Moioli RC, Vargas PA, Romero RAF. A neurorobotics approach to behaviour selection based on human activity recognition. Cogn Neurodyn 2023; 17:1009-1028. PMID: 37522044. PMCID: PMC10374508. DOI: 10.1007/s11571-022-09886-z.
Abstract
Behaviour selection has been an active research topic for robotics, in particular in the field of human-robot interaction. For a robot to interact autonomously and effectively with humans, the coupling between techniques for human activity recognition and robot behaviour selection is of paramount importance. However, most approaches to date consist of deterministic associations between the recognised activities and the robot behaviours, neglecting the uncertainty inherent to sequential predictions in real-time applications. In this paper, we address this gap by presenting an initial neurorobotics model that embeds, in a simulated robot, computational models of parts of the mammalian brain that resemble neurophysiological aspects of the basal ganglia-thalamus-cortex (BG-T-C) circuit, coupled with human activity recognition techniques. A robotics simulation environment was developed for assessing the model, where a mobile robot accomplished tasks by using behaviour selection in accordance with the activity being performed by the inhabitant of an intelligent home. Initial results revealed that the neurorobotics model is advantageous, especially considering the coupling between the most accurate activity recognition approaches and the computational models of more complex animals.
Affiliation(s)
- Caetano M. Ranieri: Institute of Mathematical and Computer Sciences, University of Sao Paulo, Avenida Trabalhador Sao Carlense, 400, Sao Carlos, SP 13566-590, Brazil
- Renan C. Moioli: Bioinformatics Multidisciplinary Environment (BioME), Digital Metropolis Institute, Federal University of Rio Grande do Norte, Avenida Senador Salgado Filho, 3000, Natal, RN 59078-970, Brazil
- Patricia A. Vargas: Edinburgh Centre for Robotics, Heriot-Watt University, Edinburgh, EH14 4AS, Scotland, UK
- Roseli A. F. Romero: Institute of Mathematical and Computer Sciences, University of Sao Paulo, Avenida Trabalhador Sao Carlense, 400, Sao Carlos, SP 13566-590, Brazil
12. Xu B, Liu D, Xue M, Miao M, Hu C, Song A. Continuous shared control of a mobile robot with brain-computer interface and autonomous navigation for daily assistance. Comput Struct Biotechnol J 2023; 22:3-16. PMID: 37600142. PMCID: PMC10433001. DOI: 10.1016/j.csbj.2023.07.033.
Abstract
Although electroencephalography (EEG)-based brain-computer interfaces (BCIs) have been successfully developed for rehabilitation and assistance, it is still challenging to achieve continuous control of a brain-actuated mobile robot system. In this study, we propose a continuous shared control strategy combining continuous BCI and autonomous navigation for a mobile robot system. The weight of shared control is designed to dynamically adjust the fusion of continuous BCI control and autonomous navigation. During this process, the system uses the visual-based simultaneous localization and mapping (SLAM) method to construct environmental maps. After obtaining the global optimal path, the system utilizes the brain-based shared control dynamic window approach (BSC-DWA) to evaluate safe and reachable trajectories while considering shared control velocity. Eight subjects participated in two-stage training, and six of these eight subjects participated in online shared control experiments. The training results demonstrated that naïve subjects could achieve continuous control performance with an average percent valid correct rate of approximately 97% and an average total correct rate of over 80%. The results of online shared control experiments showed that all of the subjects could complete navigation tasks in an unknown corridor with continuous shared control. Therefore, our experiments verified the feasibility and effectiveness of the proposed system combining continuous BCI, shared control, autonomous navigation, and visual SLAM. The proposed continuous shared control framework shows great promise in BCI-driven tasks, especially navigation tasks for brain-driven assistive mobile robots and wheelchairs in daily applications.
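The dynamic fusion of user and autonomous commands described in this abstract can be illustrated with a minimal sketch. The function name, the confidence input, and the linear weight schedule are all hypothetical; this is not the authors' BSC-DWA implementation, only the generic blending idea behind it.

```python
# Minimal sketch of continuous shared control: blend a human (e.g. BCI-decoded)
# velocity command with an autonomous planner's command via a dynamic weight.
# The weight rises with decoder confidence, so the human dominates when the
# decoded intent is reliable and the planner dominates otherwise.

def shared_control(v_human, v_auto, confidence, w_min=0.2, w_max=0.9):
    """Blend two (linear, angular) velocity commands.

    confidence: decoder confidence in [0, 1] (hypothetical input).
    Returns the fused (linear, angular) command.
    """
    c = max(0.0, min(1.0, confidence))
    w = w_min + (w_max - w_min) * c          # human weight, in [w_min, w_max]
    lin = w * v_human[0] + (1.0 - w) * v_auto[0]
    ang = w * v_human[1] + (1.0 - w) * v_auto[1]
    return (lin, ang)
```

Keeping the human weight strictly below 1 means the planner always retains some influence, which is one simple way to preserve safety margins while still letting the user steer continuously.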
Affiliation(s)
- Baoguo Xu
- State Key Laboratory of Bioelectronics, Jiangsu Key Laboratory of Remote Measurement and Control, School of Instrument Science and Engineering, Southeast University, Nanjing 210096, China
- Deping Liu
- State Key Laboratory of Bioelectronics, Jiangsu Key Laboratory of Remote Measurement and Control, School of Instrument Science and Engineering, Southeast University, Nanjing 210096, China
- Muhui Xue
- State Key Laboratory of Bioelectronics, Jiangsu Key Laboratory of Remote Measurement and Control, School of Instrument Science and Engineering, Southeast University, Nanjing 210096, China
- Minmin Miao
- School of Information Engineering, Huzhou University, Huzhou 313000, China
- Cong Hu
- Guangxi Key Laboratory of Automatic Detecting Technology and Instruments, Guilin University of Electronic Technology, Guilin 541004, China
- Aiguo Song
- State Key Laboratory of Bioelectronics, Jiangsu Key Laboratory of Remote Measurement and Control, School of Instrument Science and Engineering, Southeast University, Nanjing 210096, China
13
Görür OC, Rosman B, Sivrikaya F, Albayrak S. FABRIC: A Framework for the Design and Evaluation of Collaborative Robots with Extended Human Adaptation. ACM TRANSACTIONS ON HUMAN-ROBOT INTERACTION 2023. [DOI: 10.1145/3585276] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 03/19/2023]
Abstract
A limitation of collaborative robots (cobots) is their lack of ability to adapt to human partners, who typically exhibit an immense diversity of behaviors. We present an autonomous framework as a cobot's real-time decision-making mechanism to anticipate a variety of human characteristics and behaviors, including human errors, toward a personalized collaboration. Our framework handles such behaviors at two levels: 1) short-term human behaviors are adapted through our novel Anticipatory Partially Observable Markov Decision Process (A-POMDP) models, covering a human's changing intent (motivation), availability, and capability; 2) long-term changing human characteristics are adapted by our novel Adaptive Bayesian Policy Selection (ABPS) mechanism, which selects a short-term decision model, e.g., an A-POMDP, according to an estimate of a human's workplace characteristics, such as their expertise and collaboration preferences. To design and evaluate our framework over a diversity of human behaviors, we propose a pipeline where we first train and rigorously test the framework in simulation over novel human models. Then, we deploy and evaluate it on our novel physical experiment setup that induces cognitive load on humans to observe their dynamic behaviors, including their mistakes, and their changing characteristics such as their expertise. We conduct user studies and show that our framework effectively collaborates non-stop for hours and adapts to various changing human behaviors and characteristics in real time. This increases the efficiency and naturalness of the collaboration, with higher perceived collaboration, positive teammate traits, and human trust. We believe that such extended human adaptation is key to the long-term use of cobots.
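The long-term adaptation layer described here, a Bayesian selection among short-term decision models, can be sketched roughly as follows. The human "types", likelihoods, and performance table are hypothetical placeholders; this is not the authors' ABPS implementation, only the generic belief-update-and-select pattern.

```python
# Sketch of Bayesian policy selection: maintain a posterior over latent human
# types, update it from observed outcomes, and pick the policy (short-term
# decision model) with the highest expected performance under the belief.

def update_belief(belief, likelihoods):
    """belief: {type: prior probability}.
    likelihoods: {type: P(observation | type)} for the latest observation.
    Returns the normalized posterior over types."""
    post = {t: belief[t] * likelihoods.get(t, 1e-9) for t in belief}
    z = sum(post.values())
    return {t: p / z for t, p in post.items()}

def select_policy(belief, performance):
    """performance: {(type, policy): expected reward estimate}.
    Returns the policy maximizing expected performance under the belief."""
    policies = {p for (_, p) in performance}
    return max(policies,
               key=lambda pol: sum(b * performance.get((t, pol), 0.0)
                                   for t, b in belief.items()))
```

After each task episode the belief is refined, so the selector gradually commits to the decision model suited to the particular collaborator, e.g. guiding a novice more actively while deferring to an expert.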
Affiliation(s)
- O. Can Görür
- DAI-Labor, Technische Universität Berlin, Germany
14
Enhancing Robot Task Completion Through Environment and Task Inference: A Survey from the Mobile Robot Perspective. J INTELL ROBOT SYST 2022. [DOI: 10.1007/s10846-022-01776-0] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 12/13/2022]
15
Laghi M, Raiano L, Amadio F, Rollo F, Zunino A, Ajoudani A. A Target-Guided Telemanipulation Architecture for Assisted Grasping. IEEE Robot Autom Lett 2022. [DOI: 10.1109/lra.2022.3188436] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/08/2022]
Affiliation(s)
- Marco Laghi
- Intelligent and Autonomous Systems, Leonardo Labs, Genova, Italy
- Luigi Raiano
- Intelligent and Autonomous Systems, Leonardo Labs, Genova, Italy
- Fabio Amadio
- Intelligent and Autonomous Systems, Leonardo Labs, Genova, Italy
- Federico Rollo
- Intelligent and Autonomous Systems, Leonardo Labs, Genova, Italy
- Andrea Zunino
- Intelligent and Autonomous Systems, Leonardo Labs, Genova, Italy
- Arash Ajoudani
- Human-Robot Interfaces and Physical Interaction (HRI2), Istituto Italiano di Tecnologia, Genoa, Italy
16
Digital Twin for Human–Robot Interactions by Means of Industry 4.0 Enabling Technologies. SENSORS 2022; 22:s22134950. [PMID: 35808462 PMCID: PMC9269811 DOI: 10.3390/s22134950] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.7] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Subscribe] [Scholar Register] [Received: 04/23/2022] [Revised: 06/14/2022] [Accepted: 06/16/2022] [Indexed: 12/11/2022]
Abstract
There has been a rapid increase in the use of collaborative robots in manufacturing industries within the context of Industry 4.0 and smart factories. Existing human–robot interaction, simulation, and robot programming methods do not keep pace with these fast-moving technological advances: they are time-consuming, require engineering expertise, and the interaction is not trivial for non-expert operators. To tackle these challenges, we propose a digital twin (DT) approach for human–robot interactions (HRIs) in hybrid teams in this paper. We achieved this using Industry 4.0 enabling technologies, such as mixed reality, the Internet of Things, collaborative robots, and artificial intelligence. We present a use case scenario of the proposed method using the Microsoft HoloLens 2 and a KUKA IIWA collaborative robot. The obtained results indicated that it is possible to achieve efficient human–robot interactions using these advanced technologies, even with operators who have not been trained in programming. The proposed method has further benefits, such as real-time simulation in natural environments and flexible system integration to incorporate new devices (e.g., robots or software capabilities).
17
Allenspach M, Vyas Y, Rubio M, Siegwart R, Tognon M. Human-State-Aware Controller for a Tethered Aerial Robot Guiding a Human by Physical Interaction. IEEE Robot Autom Lett 2022. [DOI: 10.1109/lra.2022.3143574] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/09/2022]