1
Osmani K, Schulz D. Comprehensive Investigation of Unmanned Aerial Vehicles (UAVs): An In-Depth Analysis of Avionics Systems. Sensors (Basel, Switzerland) 2024; 24:3064. [PMID: 38793917] [PMCID: PMC11125140] [DOI: 10.3390/s24103064] [Received: 04/09/2024] [Revised: 05/01/2024] [Accepted: 05/08/2024] [Indexed: 05/26/2024]
Abstract
The evolving technologies regarding Unmanned Aerial Vehicles (UAVs) have led to their extended applicability in diverse domains, including surveillance, commerce, military operations, and smart electric grid monitoring. Modern UAV avionics enable precise aircraft operation through autonomous navigation, obstacle identification, and collision prevention. Avionics architectures are generally complex, with deep hierarchies and intricate interconnections between subsystems. For a comprehensive understanding of UAV design, this paper assesses and critically reviews the electronics hardware inside UAVs, classified by purpose, with the corresponding performance metrics thoroughly analyzed. The review also explores the algorithms used for data processing, flight control, surveillance, navigation, protection, and communication. Consequently, this paper enriches the knowledge base of UAVs, offering an informative background on various UAV design processes, particularly those related to smart electric grid applications. As a recommendation for future work, a relevant ongoing project is discussed.
Affiliation(s)
- Detlef Schulz
- Department of Electrical Engineering, Helmut Schmidt University, 22043 Hamburg, Germany
2
Vysocký A, Poštulka T, Chlebek J, Kot T, Maslowski J, Grushko S. Hand Gesture Interface for Robot Path Definition in Collaborative Applications: Implementation and Comparative Study. Sensors (Basel, Switzerland) 2023; 23:4219. [PMID: 37177421] [PMCID: PMC10180605] [DOI: 10.3390/s23094219] [Received: 03/27/2023] [Revised: 04/21/2023] [Accepted: 04/21/2023] [Indexed: 05/15/2023]
Abstract
The article explores the use of hand gestures as a control interface for robotic systems in a collaborative workspace. Hand gesture control interfaces have become increasingly important both in everyday life and in professional contexts such as manufacturing. We present a system designed to facilitate collaboration between humans and robots in manufacturing processes that require frequent revision of the robot path; it allows direct definition of waypoints, which differentiates it from existing systems. We introduce a novel and intuitive approach to human-robot cooperation through the use of simple gestures. The proposed interface was developed and implemented as part of a robotic workspace, utilising three RGB-D sensors to monitor the operator's hand movements. The system employs distributed data processing through multiple Jetson Nano units, each unit processing data from a single camera. The MediaPipe solution is utilised to localise the hand landmarks in the RGB image, enabling gesture recognition. We compare conventional methods of defining robot trajectories with the developed gesture-based system in an experiment with 20 volunteers. The experiment verified the system under realistic conditions in a real workspace closely resembling the intended industrial application. Data collected during the experiment included both objective and subjective parameters. The results indicate that the gesture-based interface enables users to define a given path measurably faster than conventional methods. We critically analyse the features and limitations of the developed system and suggest directions for future research. Overall, the experimental results indicate the usefulness of the developed system, as it can speed up the definition of the robot's path.
Affiliation(s)
- Aleš Vysocký
- Department of Robotics, Faculty of Mechanical Engineering, VSB-Technical University of Ostrava, 17. Listopadu 2172/15, 708 00 Ostrava, Czech Republic
- Tomáš Poštulka
- Department of Robotics, Faculty of Mechanical Engineering, VSB-Technical University of Ostrava, 17. Listopadu 2172/15, 708 00 Ostrava, Czech Republic
- Jakub Chlebek
- Department of Robotics, Faculty of Mechanical Engineering, VSB-Technical University of Ostrava, 17. Listopadu 2172/15, 708 00 Ostrava, Czech Republic
- Tomáš Kot
- Department of Robotics, Faculty of Mechanical Engineering, VSB-Technical University of Ostrava, 17. Listopadu 2172/15, 708 00 Ostrava, Czech Republic
- Jan Maslowski
- Department of Robotics, Faculty of Mechanical Engineering, VSB-Technical University of Ostrava, 17. Listopadu 2172/15, 708 00 Ostrava, Czech Republic
- Stefan Grushko
- Department of Robotics, Faculty of Mechanical Engineering, VSB-Technical University of Ostrava, 17. Listopadu 2172/15, 708 00 Ostrava, Czech Republic
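The MediaPipe hand-landmark approach used in the study above reduces gesture recognition to geometry on 21 keypoints per hand. As a rough illustration (not the authors' implementation), the sketch below classifies an open palm versus a fist from MediaPipe-style normalized landmarks; the wrist and fingertip indices follow MediaPipe's published 21-point layout, but the distance rule and its threshold are simplified assumptions.

```python
import math

# MediaPipe's 21-point hand layout: index 0 is the wrist,
# indices 4, 8, 12, 16, 20 are the five fingertips.
WRIST = 0
FINGERTIPS = (4, 8, 12, 16, 20)

def classify_open_or_fist(landmarks, threshold=0.25):
    """Classify a hand as 'open' or 'fist' from normalized (x, y) landmarks.

    landmarks: list of 21 (x, y) tuples in image-normalized coordinates.
    threshold: assumed fingertip-to-wrist distance separating the two poses.
    """
    wx, wy = landmarks[WRIST]
    # Mean Euclidean distance from the wrist to each fingertip.
    mean_dist = sum(
        math.hypot(landmarks[i][0] - wx, landmarks[i][1] - wy)
        for i in FINGERTIPS
    ) / len(FINGERTIPS)
    return "open" if mean_dist > threshold else "fist"

# Synthetic example: fingertips far from the wrist -> open palm.
open_hand = [(0.5, 0.9)] + [(0.5, 0.5)] * 20
for i in FINGERTIPS:
    open_hand[i] = (0.5, 0.3)
# All landmarks bunched near the wrist -> fist.
fist = [(0.5, 0.9)] + [(0.5, 0.85)] * 20
```

A production system would classify from all 21 landmarks (e.g. per-finger flexion), but the geometric idea is the same.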
3
Korzeniewska E, Kania M, Zawiślak R. Textronic Glove Translating Polish Sign Language. Sensors (Basel, Switzerland) 2022; 22:6788. [PMID: 36146138] [PMCID: PMC9505883] [DOI: 10.3390/s22186788] [Received: 08/12/2022] [Revised: 08/30/2022] [Accepted: 09/05/2022] [Indexed: 06/16/2023]
Abstract
Communication between people is a basic social skill used to exchange information. It is often used for self-expression and to meet basic human needs, such as the need for closeness, belonging, and security. This process takes place at different levels, using different means, and with specific effects; it generally means a two-way flow of information in direct contact with another person. When two people communicate in the same language, information flows much more easily than when they use different languages from different language families. Social communication with deaf people is similarly difficult. It is therefore essential to use modern technologies to facilitate communication with deaf and non-speaking people. This article presents the results of work on a prototype glove using textronic elements produced by physical vacuum deposition. The signal from the sensors, in the form of resistance changes, is read by a microcontroller, processed, and displayed on a smartphone screen as single letters. During the experiment, 520 letters were signed by each author. The signs were interpreted correctly 86.5% of the time, and each letter was recognized within approximately 3 s. Another key result was the selection of an appropriate material (Velostat membrane) to serve as the sensor in the proposed solution. The proposed solution can enable communication with deaf people using the finger alphabet, which can be used to spell single words or key words.
Affiliation(s)
- Ewa Korzeniewska
- Institute of Electrical Engineering Systems, Lodz University of Technology, Stefanowskiego 18 Street, 90-537 Lodz, Poland
- Marta Kania
- Institute of Automatic Control, Lodz University of Technology, Stefanowskiego 18 Street, 90-537 Lodz, Poland
- Rafał Zawiślak
- Institute of Automatic Control, Lodz University of Technology, Stefanowskiego 18 Street, 90-537 Lodz, Poland
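The glove above reads resistance changes from its sensors and maps them to fingerspelled letters. A minimal sketch of the mapping stage, assuming five flex sensors whose raw readings are binarized into bent/straight; the ADC threshold and the letter table are hypothetical placeholders, not the authors' calibration:

```python
# Hypothetical ADC threshold: a reading above it means the finger is
# flexed (Velostat resistance changes when the film is deformed).
BENT_THRESHOLD = 600

# Hypothetical lookup from the (thumb, index, middle, ring, little)
# flex pattern to a letter; a real glove would calibrate per user and
# cover the full manual alphabet.
PATTERN_TO_LETTER = {
    (1, 1, 1, 1, 1): "A",   # all fingers flexed (closed fist)
    (0, 0, 0, 0, 0): "B",   # flat hand, all fingers extended
    (1, 0, 1, 1, 1): "D",   # only the index finger extended
}

def decode_letter(adc_readings):
    """Map five raw sensor readings to a letter, or None if unknown."""
    pattern = tuple(1 if r > BENT_THRESHOLD else 0 for r in adc_readings)
    return PATTERN_TO_LETTER.get(pattern)
```

On the actual device this lookup would run on the microcontroller, with the decoded letter forwarded to the smartphone display.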
4
Hang F, Xie L, Zhang Z, Guo W, Li H. Artificial intelligence enabled fuzzy multimode decision support system for cyber threat security defense automation. Journal of Computer Virology and Hacking Techniques 2022. [DOI: 10.1007/s11416-022-00443-0] [Indexed: 10/15/2022]
5
Virtual interaction and manipulation control of a hexacopter through hand gesture recognition from a data glove. Robotica 2022. [DOI: 10.1017/s0263574722000972] [Indexed: 11/06/2022]
Abstract
The purpose of this study is to realize virtual interaction and manipulation control of a hexacopter based on hand gesture recognition from a designed data glove, providing an intuitive, visual, real-time simulation system for verifying flight control algorithms and testing external control equipment. First, hand gesture recognition with the designed data glove is studied; the glove can recognize different actions such as mobile ready, grab, loosen, landing, take-off, and hover. Then, a virtual simulation system for hexacopter capture is designed, including models of the hexacopter and manipulator and simulation software built with CoppeliaSim. Finally, a virtual simulation experiment of hexacopter grasping and a virtual flight control experiment based on the data glove are conducted and quantitatively described. The overall recognition rate is 84.3%, indicating that the data glove can recognize gestures, though its recognition performance is not outstanding. Static gestures are recognized at a higher rate than dynamic gestures; among static gestures, the hover gesture has the highest recognition rate, and the average correct rate for static gestures reaches 94%. Among dynamic gestures, upward movement has the lowest recognition rate, and the average recognition rate for dynamic gestures is 76.1%. This research can support future remote operation of a hexacopter with a data glove and improve control performance through virtual interaction and manipulation simulation before actual deployment.
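The glove vocabulary above (mobile ready, grab, loosen, landing, take-off, hover) amounts to a dispatch table from recognized gesture labels to hexacopter and manipulator actions. A schematic sketch of that dispatch; the action names and the hover fallback are illustrative assumptions, not the paper's design:

```python
# Command set taken from the paper's gesture vocabulary; the target
# and action strings are placeholders for illustration.
GESTURE_COMMANDS = {
    "take-off":     {"target": "hexacopter",  "action": "ascend"},
    "landing":      {"target": "hexacopter",  "action": "descend"},
    "hover":        {"target": "hexacopter",  "action": "hold_position"},
    "mobile ready": {"target": "hexacopter",  "action": "enable_motion"},
    "grab":         {"target": "manipulator", "action": "close_gripper"},
    "loosen":       {"target": "manipulator", "action": "open_gripper"},
}

def dispatch(gesture):
    """Return the command for a recognized gesture.

    Falling back to hover on an unrecognized label is a conservative
    safety assumption: with an 84.3% recognition rate, misreads happen.
    """
    return GESTURE_COMMANDS.get(gesture, GESTURE_COMMANDS["hover"])
```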
6
Yoo M, Na Y, Song H, Kim G, Yun J, Kim S, Moon C, Jo K. Motion Estimation and Hand Gesture Recognition-Based Human-UAV Interaction Approach in Real Time. Sensors 2022; 22:2513. [PMID: 35408128] [PMCID: PMC9002368] [DOI: 10.3390/s22072513] [Received: 01/04/2022] [Revised: 03/10/2022] [Accepted: 03/23/2022] [Indexed: 02/04/2023]
Abstract
As an alternative to the traditional remote controller, vision-based hand gesture recognition is being actively studied for interaction between humans and unmanned aerial vehicles (UAVs). However, vision-based gesture systems struggle to recognize the motion of dynamic gestures because it is difficult to estimate the pose of multi-dimensional hand gestures from 2D images. This leads to complex algorithms that require tracking in addition to detection, which are not suitable for human-UAV interaction (HUI) systems that must combine safe design with high real-time performance. Therefore, in this paper, we propose a hybrid hand gesture system that combines an inertial measurement unit (IMU)-based motion capture system with a vision-based gesture system to increase real-time performance. First, commands are divided between the IMU-based and vision-based subsystems according to whether they are input continuously. Second, IMU-based control commands are intuitively mapped so that the UAV moves in the same direction as the orientation estimated by a thumb-mounted micro-IMU, while vision-based control commands are mapped to the hand's appearance through real-time object detection. The proposed system is verified in a simulation environment, both by comparing its efficiency on dynamic gestures with an existing vision-based system and by comparing its usability with a traditional joystick controller for participants with no prior control experience. The results show a safer and more intuitive HUI design, with a 0.089 ms processing speed and an average lap time about 19 s shorter than with the joystick controller. In other words, the proposed system is a viable alternative to existing HUIs.
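The "intuitive mapping" described above sends the UAV in the same direction the thumb-mounted IMU tilts. A simplified sketch of such a mapping, assuming roll and pitch angles in radians, a linear gain, and a small dead zone; the gain and dead-zone values are hypothetical, not taken from the paper:

```python
import math

DEAD_ZONE = math.radians(5)   # ignore tilt below 5 degrees (assumed)
GAIN = 2.0                    # m/s of commanded velocity per radian (assumed)

def imu_to_velocity(roll, pitch):
    """Map thumb-IMU orientation to a lateral UAV velocity command.

    Convention assumed here: pitch forward -> fly forward (+x),
    roll right -> fly right (+y). Returns (vx, vy) in m/s.
    """
    def axis(angle):
        if abs(angle) < DEAD_ZONE:
            return 0.0        # dead zone suppresses sensor jitter at rest
        return GAIN * angle
    return axis(pitch), axis(roll)
```

The dead zone is what makes such a continuous mapping usable in practice: without it, IMU noise would translate directly into drift commands.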
7
Song S, Kim B, Kim S, Lee J. Foot Gesture Recognition Using High-Compression Radar Signature Image and Deep Learning. Sensors 2021; 21:3937. [PMID: 34200461] [PMCID: PMC8201004] [DOI: 10.3390/s21113937] [Received: 04/26/2021] [Revised: 05/30/2021] [Accepted: 06/04/2021] [Indexed: 11/16/2022]
Abstract
Recently, Doppler radar-based foot gesture recognition has attracted attention as a hands-free interface, but recognizing a variety of foot gestures with Doppler radar remains very challenging, and no prior studies have dealt deeply with this problem using Doppler radar and a deep learning model. In this paper, we propose a foot gesture recognition method using a new high-compression radar signature image and deep learning. The high-compression radar signature is created by extracting dominant features via Singular Value Decomposition (SVD); four different foot gestures, kicking, swinging, sliding, and tapping, are then recognized with a deep learning AlexNet model. By using the high-compression signature instead of the original radar signature, the proposed method improves the memory efficiency required for deep learning training. Original and reconstructed radar images with compression levels of 90%, 95%, and 99% were applied to the AlexNet model. In experiments, all four foot gestures, as well as a rolling baseball, were recognized with an accuracy of approximately 98.64%. Given radar's inherent robustness to the surrounding environment, this Doppler radar and deep learning-based foot gesture sensor should prove widely useful in future automotive and smart-home applications.
Collapse
Affiliation(s)
- Seungeon Song
- Division of Automotive Technology, Daegu Gyeongbuk Institute of Science and Technology (DGIST), Daegu 42988, Korea; (S.S.); (B.K.); (S.K.)
- Bongseok Kim
- Division of Automotive Technology, Daegu Gyeongbuk Institute of Science and Technology (DGIST), Daegu 42988, Korea; (S.S.); (B.K.); (S.K.)
- Sangdong Kim
- Division of Automotive Technology, Daegu Gyeongbuk Institute of Science and Technology (DGIST), Daegu 42988, Korea; (S.S.); (B.K.); (S.K.)
- Department of Interdisciplinary Engineering, Daegu Gyeongbuk Institute of Science and Technology (DGIST), Daegu 42988, Korea
- Jonghun Lee
- Division of Automotive Technology, Daegu Gyeongbuk Institute of Science and Technology (DGIST), Daegu 42988, Korea; (S.S.); (B.K.); (S.K.)
- Department of Interdisciplinary Engineering, Daegu Gyeongbuk Institute of Science and Technology (DGIST), Daegu 42988, Korea
- Correspondence: Tel.: +82-53-785-4580
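The high-compression radar signature in the last entry keeps only the dominant singular values of the radar image. A generic rank-k SVD compression of a 2-D array (the radar-specific preprocessing and the AlexNet classifier are omitted) might look like this sketch:

```python
import numpy as np

def svd_compress(image, k):
    """Approximate a 2-D array by its top-k singular components.

    Storage drops from m*n values to k*(m + n + 1) values, which is
    how compression levels like 90-99% arise for small k.
    """
    u, s, vt = np.linalg.svd(image, full_matrices=False)
    return u[:, :k] @ np.diag(s[:k]) @ vt[:k, :]

# Example: a rank-1 matrix is reconstructed exactly from one component.
img = np.outer(np.arange(1.0, 5.0), np.arange(1.0, 4.0))
approx = svd_compress(img, 1)
```

For a real radar signature (full rank), increasing k trades memory for reconstruction fidelity; the paper's experiments probe exactly that trade-off at 90%, 95%, and 99% compression.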