1. Bat2Web: A Framework for Real-Time Classification of Bat Species Echolocation Signals Using Audio Sensor Data. SENSORS (BASEL, SWITZERLAND) 2024; 24:2899. [PMID: 38733008] [PMCID: PMC11086295] [DOI: 10.3390/s24092899] [Received: 02/27/2024] [Revised: 04/09/2024] [Accepted: 04/26/2024] [Indexed: 05/13/2024]
Abstract
Bats play a pivotal role in maintaining ecological balance, and studying their behaviors offers vital insights into environmental health and aids in conservation efforts. Determining the presence of various bat species in an environment is essential for many bat studies. Specialized audio sensors can be used to record bat echolocation calls that can then be used to identify bat species. However, the complexity of bat calls presents a significant challenge, necessitating expert analysis and extensive time for accurate interpretation. Recent advances in neural networks can help identify bat species automatically from their echolocation calls. Such neural networks can be integrated into a complete end-to-end system that leverages recent internet of things (IoT) technologies with long-range, low-powered communication protocols to implement automated acoustical monitoring. This paper presents the design and implementation of such a system that uses a tiny neural network for interpreting sensor data derived from bat echolocation signals. A highly compact convolutional neural network (CNN) model was developed that demonstrated excellent performance in bat species identification, achieving an F1-score of 0.9578 and an accuracy rate of 97.5%. The neural network was deployed, and its performance was evaluated on various alternative edge devices, including the NVIDIA Jetson Nano and Google Coral.
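The reported scores follow the standard definitions of accuracy and F1 computed from per-class prediction counts; as a minimal illustration (the counts below are hypothetical, not taken from the paper):

```python
def precision_recall_f1(tp, fp, fn):
    """Compute precision, recall and F1 from true/false positive and false negative counts."""
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    f1 = 2 * precision * recall / (precision + recall) if (precision + recall) else 0.0
    return precision, recall, f1

def accuracy(correct, total):
    """Fraction of correctly classified calls."""
    return correct / total

# Hypothetical counts for one bat species class:
p, r, f1 = precision_recall_f1(tp=95, fp=3, fn=5)
acc = accuracy(975, 1000)
```

Note that F1 reduces to 2·tp / (2·tp + fp + fn), which is why it differs from plain accuracy when classes are imbalanced.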
2. Non-Terrestrial Networks for Energy-Efficient Connectivity of Remote IoT Devices in the 6G Era: A Survey. SENSORS (BASEL, SWITZERLAND) 2024; 24:1227. [PMID: 38400391] [PMCID: PMC10891744] [DOI: 10.3390/s24041227] [Received: 01/17/2024] [Revised: 02/06/2024] [Accepted: 02/12/2024] [Indexed: 02/25/2024]
Abstract
The Internet of Things (IoT) is gaining popularity and market share, driven by its ability to connect devices and systems that were previously siloed, enabling new applications and services in a cost-efficient manner. Thus, the IoT fuels societal transformation and enables groundbreaking innovations like autonomous transport, robotic assistance, and remote healthcare solutions. However, when considering the Internet of Remote Things (IoRT), which refers to the expansion of IoT in remote and geographically isolated areas where neither terrestrial nor cellular networks are available, internet connectivity becomes a challenging issue. Non-Terrestrial Networks (NTNs) are increasingly gaining popularity as a solution to provide connectivity in remote areas due to the growing integration of satellites and Unmanned Aerial Vehicles (UAVs) with cellular networks. In this survey, we provide the technological framework for NTNs and Remote IoT, followed by a classification of the most recent scientific research on NTN-based IoRT systems. Therefore, we provide a comprehensive overview of the current state of research in IoRT and identify emerging research areas with high potential. In conclusion, we present and discuss 3GPP's roadmap for NTN standardization, which aims to establish an energy-efficient IoRT environment in the 6G era.
3. Gearbox Compound Fault Diagnosis in Edge-IoT Based on Legendre Multiwavelet Transform and Convolutional Neural Network. SENSORS (BASEL, SWITZERLAND) 2023; 23:8669. [PMID: 37960369] [PMCID: PMC10649726] [DOI: 10.3390/s23218669] [Received: 08/30/2023] [Revised: 10/19/2023] [Accepted: 10/19/2023] [Indexed: 11/15/2023]
Abstract
The application of edge computing combined with the Internet of Things (edge-IoT) has developed rapidly, and it is of great significance to devise a lightweight network for gearbox compound fault diagnosis in the edge-IoT context. The goal of this paper is to devise a novel, high-accuracy lightweight neural network based on the Legendre multiwavelet transform and a multi-channel convolutional neural network (LMWT-MCNN) to rapidly recognize various compound fault categories of a gearbox. The contributions of this paper mainly lie in three aspects: the feature images are designed in the LMWT frequency domain and are easily used in the MCNN model to effectively avoid noise interference; the proposed lightweight model consists of only three convolutional layers and three pooling layers to extract the most valuable fault features without any hand-crafted feature extraction; and in a fully connected layer, the specific fault type of the rotating machinery is identified by a multi-label method. This paper provides a promising technique for rotating machinery fault diagnosis in real edge-IoT applications, which can largely reduce labor costs. Finally, the PHM 2009 gearbox and Paderborn University bearing compound fault datasets are used to verify the effectiveness and robustness of the proposed method. The experimental results demonstrate that, compared with existing methods, the proposed lightweight network reliably identifies the compound fault categories with the highest accuracy under strong noise.
4. DRL-OS: A Deep Reinforcement Learning-Based Offloading Scheduler in Mobile Edge Computing. SENSORS (BASEL, SWITZERLAND) 2022; 22:9212. [PMID: 36501914] [PMCID: PMC9740101] [DOI: 10.3390/s22239212] [Received: 10/17/2022] [Revised: 11/20/2022] [Accepted: 11/25/2022] [Indexed: 06/17/2023]
Abstract
Hardware bottlenecks can throttle smart device (SD) performance when executing computation-intensive and delay-sensitive applications. Hence, task offloading can be used to transfer computation-intensive tasks to an external server or processor in Mobile Edge Computing. However, in this approach, the offloaded task can become useless when processing is significantly delayed or a deadline has expired. Due to the uncertainty of task processing via offloading, it is challenging for each SD to make its offloading decision (whether to compute locally, offload, or drop the task). This study proposes a deep-reinforcement-learning-based offloading scheduler (DRL-OS) that considers the energy balance in selecting how a task is handled: local computing, offloading, or dropping. The proposed DRL-OS is based on the double dueling deep Q-network (D3QN) and selects an appropriate action by learning the task size, deadline, queue, and residual battery charge. The average battery level, drop rate, and average latency of the DRL-OS were measured in simulations to analyze the scheduler performance. The DRL-OS exhibits a lower average battery level (by up to 54%) and a lower drop rate (by up to 42.5%) than existing schemes. The scheduler also achieves a lower average latency of 0.01 to >0.25 s, despite subtle case-wise differences in the average latency.
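The three-way action space described above (local computing, offloading, or dropping) can be illustrated with a simple rule-based stand-in for the learned policy; the rates, thresholds, and utility logic below are assumptions for illustration only, not the paper's D3QN:

```python
ACTIONS = ("local", "offload", "drop")

def choose_action(task_bits, deadline_s, queue_len, battery_pct,
                  local_rate=5e6, link_rate=1e6):
    """Pick local/offload/drop from crude latency estimates (bits per second).
    All rates and the 20% battery threshold are illustrative assumptions."""
    local_time = task_bits / local_rate + 0.01 * queue_len  # queueing penalty
    offload_time = task_bits / link_rate
    # Drop the task if no option can meet the deadline.
    if min(local_time, offload_time) > deadline_s:
        return "drop"
    # Preserve residual battery charge: prefer offloading when charge is low.
    if battery_pct < 20:
        return "offload" if offload_time <= deadline_s else "drop"
    return "local" if local_time <= offload_time else "offload"
```

A real D3QN would learn this mapping from (task size, deadline, queue, battery) to an action via trial and error rather than hand-coded rules.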
5. A blockchain-based conditional privacy-preserving authentication scheme for edge computing services. JOURNAL OF INFORMATION SECURITY AND APPLICATIONS 2022. [DOI: 10.1016/j.jisa.2022.103334] [Indexed: 10/31/2022]
6. Compact SPICE Model of Memristor with Barrier Modulated Considering Short- and Long-Term Memory Characteristics by IGZO Oxygen Content. MICROMACHINES 2022; 13:1630. [PMID: 36295983] [PMCID: PMC9610060] [DOI: 10.3390/mi13101630] [Received: 09/09/2022] [Revised: 09/20/2022] [Accepted: 09/26/2022] [Indexed: 06/16/2023]
Abstract
This paper introduces a compact SPICE model of a two-terminal memristor with a Pd/Ti/IGZO/p+-Si structure. The short- and long-term memory components are systematically separated and modeled individually. This separation is controlled by the applied bias and the oxygen flow rate (OFR) during indium gallium zinc oxide (IGZO) deposition. The short- and long-term components of the potentiation and depression curves are modeled by considering the process (OFR of the IGZO) and bias conditions. The resulting compact SPICE model, which incorporates the physical mechanism of SiO2 modulation, can be useful for optimizing the specifications of memristor devices.
7. Edge-to-Cloud IIoT for Condition Monitoring in Manufacturing Systems with Ubiquitous Smart Sensors. SENSORS (BASEL, SWITZERLAND) 2022; 22:5901. [PMID: 35957460] [PMCID: PMC9371406] [DOI: 10.3390/s22155901] [Received: 06/13/2022] [Revised: 07/20/2022] [Accepted: 08/04/2022] [Indexed: 06/15/2023]
Abstract
The Industrial Internet of Things (IIoT) connects industrial assets to ubiquitous smart sensors and actuators to enhance manufacturing and industrial processes. Data-driven condition monitoring is an essential technology for intelligent manufacturing systems to identify anomalies from malfunctioning equipment, prevent unplanned downtime, and reduce the operation costs by predictive maintenance without interrupting normal machine operations. However, data-driven condition monitoring requires massive data collected from smart sensors to be transmitted to the cloud for further processing, thereby contributing to network congestion and affecting the network performance. Furthermore, unbalanced training data with very few labelled anomalies limit supervised learning models because of the lack of sufficient fault data for the training process in anomaly detection algorithms. To address these issues, we proposed an IIoT-based condition monitoring system with an edge-to-cloud architecture and computed the relative wavelet energy as feature vectors on the edge layer to reduce the network traffic overhead. We also proposed an unsupervised deep long short-term memory (LSTM) network module for anomaly detection. We implemented the proposed IIoT condition monitoring system for a manufacturing machine in a real shop site to evaluate our proposed solution. Our experimental results verify the effectiveness of our approach which can not only reduce the network traffic overhead for the IIoT but also detect anomalies accurately.
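The relative wavelet energy feature computed on the edge layer can be sketched with a plain Haar decomposition; the paper does not specify the wavelet family or level count here, so Haar and two levels are assumptions for illustration:

```python
def haar_levels(signal, levels):
    """One-dimensional multi-level Haar decomposition (averages and differences)."""
    detail_coeffs, approx = [], list(signal)
    for _ in range(levels):
        avg = [(approx[i] + approx[i + 1]) / 2 for i in range(0, len(approx) - 1, 2)]
        det = [(approx[i] - approx[i + 1]) / 2 for i in range(0, len(approx) - 1, 2)]
        detail_coeffs.append(det)
        approx = avg
    return detail_coeffs, approx

def relative_wavelet_energy(signal, levels=2):
    """Energy per decomposition level divided by total energy: a compact
    feature vector that can replace the raw sensor stream on the uplink."""
    details, approx = haar_levels(signal, levels)
    energies = [sum(c * c for c in d) for d in details] + [sum(c * c for c in approx)]
    total = sum(energies) or 1.0
    return [e / total for e in energies]

features = relative_wavelet_energy([1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0])
```

Sending this short normalized vector instead of the full waveform is what reduces the network traffic overhead described above.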
8. A multi-objective optimization of resource management and minimum batch VM migration for prioritized task allocation in fog-edge-cloud computing. JOURNAL OF INTELLIGENT & FUZZY SYSTEMS 2022. [DOI: 10.3233/jifs-213520] [Indexed: 11/15/2022]
Abstract
Network resources and traffic priorities can be utilized to distribute requested tasks across edge nodes at the edge layer. However, due to the variety of tasks, the edge nodes have an impact on data accessibility. Resource management approaches based on Virtual Machine (VM) migration, job prioritization, and other methods have been used to overcome this problem. A Minimized Upgrading Batch VM Scheduling (MSBP) scheme was recently developed that reduces the number of batches required to complete a system-scale upgrade and assigns bandwidth to VM migration matrices. However, due to poor resource sharing caused by suboptimal VM utilization, the MSBP cannot effectively guarantee globally optimal solutions. To distribute resources and schedule tasks optimally during VM migration, this paper proposes the MSBP with a Multi-objective Optimization of Resource Allocation (MORA) method. The main goal of the proposed methodology is to consider multiple objectives and solve the Pareto-front problem to enhance the lifetime of the fog-edge network. First, it formulates an NP-hard problem for MSBP that accounts for network sustainability, path contention, network delay, and cost-efficiency. The Multi-objective Krill Herd optimization (MoKH) algorithm is then applied under the Pareto optimality rule to produce the best solution, which increases network lifetime and improves the cost efficiency of resource allocation. Finally, simulation results show that MSBP-MORA distributes resources more efficiently and hence increases network lifetime compared to other traditional algorithms.
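The Pareto optimality rule behind such multi-objective methods keeps only non-dominated solutions; a minimal sketch for minimization objectives (the delay/cost values below are illustrative, not from the paper):

```python
def dominates(a, b):
    """a dominates b if a is no worse in every objective and strictly
    better in at least one (all objectives minimized)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(points):
    """Keep only the non-dominated objective vectors."""
    return [p for p in points if not any(dominates(q, p) for q in points)]

# Objectives: (network delay, cost) -- illustrative candidate solutions.
solutions = [(0.2, 5.0), (0.3, 4.0), (0.4, 6.0), (0.25, 4.5)]
front = pareto_front(solutions)
```

An optimizer such as MoKH searches the solution space and repeatedly applies this non-dominance test to maintain its front.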
9. SLedge: Scheduling and Load Balancing for a Stream Processing EDGE Architecture. APPLIED SCIENCES-BASEL 2022. [DOI: 10.3390/app12136474] [Indexed: 11/16/2022]
Abstract
Natural disasters have a significant impact on human welfare. In recent years, disasters have become more violent and frequent due to climate change, so their impact may grow if no preemptive measures are taken. In this context, real-time data processing and analysis have shown great potential to support decision-making, rescue, and recovery after a disaster. However, disaster scenarios are challenging due to their highly dynamic nature; in particular, we focus on data traffic and available processing resources. In this work, we propose SLedge, an edge-based processing model that enables mobile devices to support stream processing system (SPS) tasks in post-disaster scenarios. SLedge relies on a two-level control loop that automatically schedules SPS tasks over mobile devices to increase the system's resilience, reduce latency, and provide accurate outputs. Our results show that SLedge can outperform a cloud-based infrastructure in terms of latency while keeping a low overhead. SLedge processes data up to five times faster than a cloud-based architecture while improving load balancing among processing resources, dealing better with traffic spikes, and reducing data loss and battery drain.
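The load-balancing idea can be illustrated with a greedy least-loaded placement of stream-processing tasks onto mobile devices; SLedge's actual two-level control loop is more elaborate, and the task names and costs here are hypothetical:

```python
def assign_tasks(tasks, devices):
    """Greedy least-loaded assignment: place each task (largest first) on the
    device with the smallest accumulated load. Costs are illustrative units."""
    load = {d: 0.0 for d in devices}
    placement = {}
    for task, cost in sorted(tasks.items(), key=lambda kv: -kv[1]):
        target = min(load, key=load.get)  # current least-loaded device
        placement[task] = target
        load[target] += cost
    return placement, load

placement, load = assign_tasks(
    {"filter": 3.0, "join": 2.0, "aggregate": 2.0},
    ["phone_a", "phone_b"],
)
```

Placing the largest tasks first is a common heuristic that keeps the final loads closer to balanced than arrival-order assignment.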
10. TinyML: Enabling of Inference Deep Learning Models on Ultra-Low-Power IoT Edge Devices for AI Applications. MICROMACHINES 2022; 13:851. [PMID: 35744466] [PMCID: PMC9227753] [DOI: 10.3390/mi13060851] [Received: 04/27/2022] [Revised: 05/26/2022] [Accepted: 05/27/2022] [Indexed: 02/04/2023]
Abstract
Recently, the Internet of Things (IoT) has gained considerable attention, as IoT devices are deployed in many fields. Many of these devices rely on machine learning (ML) models, which render them intelligent and able to make decisions. However, IoT devices typically have limited resources, which restricts the execution of complex ML models such as deep learning (DL) on them. In addition, connecting IoT devices to the cloud to transfer raw data and perform processing causes delayed system responses, exposes private data, and increases communication costs. To tackle these issues, a new technology called Tiny Machine Learning (TinyML) has paved the way to meeting the challenges of IoT devices: it allows data to be processed locally on the device without the need to send it to the cloud, and it permits the inference of ML models, including DL models, on devices as constrained as microcontrollers. The aim of this paper is to provide an overview of the TinyML revolution and a review of TinyML studies. The main contribution is an analysis of the types of ML models used in TinyML studies; the paper also presents details of the datasets and the types and characteristics of the devices, with the aim of clarifying the state of the art and envisioning development requirements.
11. Recent Advances in Internet of Things Solutions for Early Warning Systems: A Review. SENSORS (BASEL, SWITZERLAND) 2022; 22:2124. [PMID: 35336296] [PMCID: PMC8954208] [DOI: 10.3390/s22062124] [Received: 01/28/2022] [Revised: 03/01/2022] [Accepted: 03/04/2022] [Indexed: 05/27/2023]
Abstract
Natural disasters cause enormous damage and losses every year, both economic and in terms of human lives. It is essential to develop systems to predict disasters and to generate and disseminate timely warnings. Recently, technologies such as the Internet of Things solutions have been integrated into alert systems to provide an effective method to gather environmental data and produce alerts. This work reviews the literature regarding Internet of Things solutions in the field of Early Warning for different natural disasters: floods, earthquakes, tsunamis, and landslides. The aim of the paper is to describe the adopted IoT architectures, define the constraints and the requirements of an Early Warning system, and systematically determine which are the most used solutions in the four use cases examined. This review also highlights the main gaps in literature and provides suggestions to satisfy the requirements for each use case based on the articles and solutions reviewed, particularly stressing the advantages of integrating a Fog/Edge layer in the developed IoT architectures.
12. A Comprehensive Review of Internet of Things: Technology Stack, Middlewares, and Fog/Edge Computing Interface. SENSORS (BASEL, SWITZERLAND) 2022; 22:995. [PMID: 35161740] [PMCID: PMC8840251] [DOI: 10.3390/s22030995] [Received: 11/18/2021] [Revised: 01/11/2022] [Accepted: 01/21/2022] [Indexed: 06/14/2023]
Abstract
The Internet of Things (IoT) is an extensive network of heterogeneous devices that provides an array of innovative applications and services. IoT networks enable the integration of data and services to seamlessly interconnect the cyber and physical systems. However, the heterogeneity of devices, underlying technologies and lack of standardization pose critical challenges in this domain. On account of these challenges, this research article aims to provide a comprehensive overview of the enabling technologies and standards that build up the IoT technology stack. First, a layered architecture approach is presented where the state-of-the-art research and open challenges are discussed at every layer. Next, this research article focuses on the role of middleware platforms in IoT application development and integration. Furthermore, this article addresses the open challenges and provides comprehensive steps towards IoT stack optimization. Finally, the interfacing of Fog/Edge Networks to IoT technology stack is thoroughly investigated by discussing the current research and open challenges in this domain. The main scope of this study is to provide a comprehensive review into IoT technology (the horizontal fabric), the associated middleware and networks required to build future proof applications (the vertical markets).
13. Industry 4.0: A Proposal of Paradigm Organization Schemes from a Systematic Literature Review. SENSORS (BASEL, SWITZERLAND) 2021; 22:66. [PMID: 35009609] [PMCID: PMC8747394] [DOI: 10.3390/s22010066] [Received: 11/03/2021] [Revised: 12/18/2021] [Accepted: 12/21/2021] [Indexed: 06/14/2023]
Abstract
Currently, the concept of Industry 4.0 is well known; however, it is extremely complex, as it is constantly evolving and innovating. It involves many disciplines and areas of knowledge, as well as the integration of many technologies, both mature and emerging, that work in collaboration and are studied and implemented under the novel criteria of Cyber-Physical Systems. This study starts with an exhaustive search for up-to-date scientific information, on which a bibliometric analysis is carried out, with results presented in tables and graphs. Subsequently, based on a qualitative analysis of the references, we present two proposals for the schematic analysis of Industry 4.0 that will help academia and companies support digital transformation studies. The results allow a simple alternative analysis of Industry 4.0 to understand the functions and scope of the integrating technologies, so as to achieve better collaboration among areas of knowledge and professionals, considering the potential and limitations of each, and supporting the planning of an appropriate strategy, especially in the management of human resources, for the successful execution of the digital transformation of industry.
14. Highly-Optimized Radar-Based Gesture Recognition System with Depthwise Expansion Module. SENSORS (BASEL, SWITZERLAND) 2021; 21:7298. [PMID: 34770603] [PMCID: PMC8588382] [DOI: 10.3390/s21217298] [Received: 09/14/2021] [Revised: 10/08/2021] [Accepted: 10/26/2021] [Indexed: 11/16/2022]
Abstract
The increasing integration of technology in our daily lives demands the development of more convenient human-computer interaction (HCI) methods. Most current hand-based HCI strategies exhibit various limitations, e.g., sensitivity to variable lighting conditions and constraints on the operating environment. Further, such systems are often not deployed in resource-constrained contexts. Inspired by the MobileNetV1 deep learning network, this paper presents a novel hand gesture recognition system based on frequency-modulated continuous wave (FMCW) radar that exhibits a higher recognition accuracy than state-of-the-art systems. First, the paper introduces a method to simplify radar preprocessing while preserving the main information of the performed gestures. Then, a deep neural classifier with a novel Depthwise Expansion Module based on depthwise separable convolutions is presented. The classifier is optimized and deployed on the Coral Edge TPU board. The system defines and adopts eight different hand gestures performed by five users, offering a classification accuracy of 98.13% while operating in a low-power, resource-constrained environment.
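The appeal of depthwise separable convolutions (as in MobileNetV1) is their parameter and compute savings over standard convolutions; a quick arithmetic sketch (the kernel and channel sizes below are hypothetical, not this paper's layer shapes):

```python
def conv_params(k, c_in, c_out):
    """Parameters of a standard k x k convolution (bias terms omitted)."""
    return k * k * c_in * c_out

def separable_conv_params(k, c_in, c_out):
    """Depthwise k x k filter per input channel, then a 1x1 pointwise
    convolution that mixes channels."""
    return k * k * c_in + c_in * c_out

standard = conv_params(3, 64, 128)            # full 3x3 convolution
separable = separable_conv_params(3, 64, 128)  # depthwise + pointwise
savings = standard / separable
```

For this 3x3, 64-to-128-channel example the separable form needs roughly an eighth of the parameters, which is what makes such classifiers fit low-power edge accelerators.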
15. Modeling of a Generic Edge Computing Application Design. SENSORS 2021; 21:7276. [PMID: 34770582] [PMCID: PMC8587040] [DOI: 10.3390/s21217276] [Received: 09/30/2021] [Revised: 10/25/2021] [Accepted: 10/27/2021] [Indexed: 12/29/2022]
Abstract
Edge computing applications leverage advances in edge computing along with the latest trends in convolutional neural networks to achieve the ultra-low-latency, high-speed-processing, low-power-consumption scenarios necessary for deploying real-time Internet of Things systems efficiently. As the importance of such scenarios grows by the day, we propose two different kinds of models: an algebraic model, written in the process algebra ACP, and a coding model, written in the modeling language Promela. Both approaches have been used to build models of an edge infrastructure with a cloud backup, further extended with additional fog nodes, and all models have been duly verified with the appropriate techniques. Specifically, a generic edge computing design has been specified algebraically in ACP, followed by its corresponding algebraic verification, and it has also been specified in Promela code, which has been verified by means of the model checker Spin.
16.
Abstract
The Internet of Things (IoT) is a vital component of many future industries. By intelligent integration of sensors, wireless communications, computing techniques, and data analytics, IoT can increase productivity and efficiency of industries. Reliability of data transmission is key to realize several applications offered by IoT. In this paper, we present an overview of future IoT applications, and their major communication requirements. We provide a brief survey of recent work in four major areas of reliable IoT including resource allocation, latency management, security, and reliability metrics. Finally, we highlight some of the important challenges for reliable IoT related to machine learning techniques, 6G communications and blockchain based security that need further investigation and discuss related future directions.
17. State-of-the-Art Review on IoT Threats and Attacks: Taxonomy, Challenges and Solutions. SUSTAINABILITY 2021. [DOI: 10.3390/su13169463] [Indexed: 11/16/2022]
Abstract
The Internet of Things (IoT) plays a vital role in interconnecting physical and virtual objects that are embedded with sensors, software, and other technologies intended to connect and exchange data with devices and systems around the globe over the Internet. With a multitude of features to offer, IoT is a boon to mankind, but like the two sides of a coin, the technology's failure to secure information can become a serious bane. It is estimated that by the year 2030, there will be nearly 25.44 billion IoT devices connected worldwide. Owing to this unprecedented growth, IoT is endangered by numerous attacks, impairments, and misuses driven by challenges such as resource limitations, heterogeneity, lack of standardization, and architecture. It is reported that almost 98% of IoT traffic is not encrypted, exposing confidential and personal information on the network. Deploying this technology at scale therefore requires a comprehensive implementation of security, privacy, authentication, and recovery. In this paper, a comprehensive taxonomy of security threats within the IoT paradigm is discussed. We also provide insightful findings, presumptions, and outcomes of the challenges to assist IoT developers in addressing risks and security flaws for better protection. A five-layer and a seven-layer IoT architecture are presented in addition to the existing three-layer architecture, and the communication standards and protocols, along with the threats and attacks corresponding to these three architectures, are discussed. In addition, the impact of different threats and attacks, along with their detection, mitigation, and prevention, is comprehensively presented. Finally, state-of-the-art solutions to enhance security features in IoT devices are proposed based on Blockchain (BC) technology, Fog Computing (FC), Edge Computing (EC), and Machine Learning (ML), along with some open research problems.
18. Method to Increase Dependability in a Cloud-Fog-Edge Environment. SENSORS 2021; 21:4714. [PMID: 34300454] [PMCID: PMC8309580] [DOI: 10.3390/s21144714] [Received: 06/11/2021] [Revised: 06/28/2021] [Accepted: 07/07/2021] [Indexed: 11/23/2022]
Abstract
Robots can be very different, from humanoids to intelligent self-driving cars or simply IoT systems that collect and process local sensor information. This paper presents a way to increase dependability for information exchange and processing in systems with Cloud-Fog-Edge architectures. In an ideal interconnected world, recognized and registered robots must be able to communicate with each other if they are close enough, or through the Fog access points, without overloading the Cloud. In essence, the presented work addresses the Edge area and how devices can communicate in a safe and secure environment using cryptographic methods for structured systems. It emphasizes the importance of security in a system's dependability and offers a communication mechanism for several robots that does not overburden the Cloud. This solution is well suited to settings where monitoring and control demand extra degrees of safety. The extra private keys employed by the procedure further increase algorithmic complexity, limiting the probability that the method can be broken by brute-force or systematic attacks.
19. A Low-Cost Platform for Environmental Smart Farming Monitoring System Based on IoT and UAVs. SUSTAINABILITY 2021. [DOI: 10.3390/su13115908] [Indexed: 11/17/2022]
Abstract
With the integration of the Internet of Things (IoT) and Unmanned Aerial Vehicles (UAVs), dozens of applications, including smart agriculture, have emerged to offer innovative solutions that modernize the farming sector. This paper presents a low-cost platform for comprehensive environmental parameter monitoring using flying IoT. The platform was deployed and tested in a real scenario on a farm in Medenine, Tunisia, from March 2020 to March 2021. The experimental work fulfills the requirements of automated, real-time monitoring of environmental parameters using both under- and aboveground sensors. These IoT sensors collect vast amounts of environmental data on the farm and send it to ground gateways every hour; every 12 h, a drone collects the accumulated data and transmits it to the cloud for storage and analysis. This low-cost platform can help farmers, governments, and manufacturers predict environmental conditions over a geographically large farm field, which enhances crop productivity and farm management in a cost-effective and timely manner. The experimental results show that automated and human-made sets of actions can be applied and/or suggested thanks to the innovative integration of IoT sensors with the drone. These smart actions support precision agriculture, which in turn boosts crop productivity and saves natural resources.
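The hourly-gateway / 12-hour-drone data flow described above can be sketched as a simple batching step; the reading field names and values are illustrative, only the two periods come from the text:

```python
def batches_for_drone(readings, gateway_period_h=1, drone_period_h=12):
    """Group hourly gateway readings into 12 h batches for drone pickup."""
    per_batch = drone_period_h // gateway_period_h  # readings per drone visit
    return [readings[i:i + per_batch] for i in range(0, len(readings), per_batch)]

# One day of hourly readings buffered at a ground gateway (hypothetical fields).
day = [{"hour": h, "soil_moisture": 0.3} for h in range(24)]
batches = batches_for_drone(day)
```

With a 24 h buffer and a 12 h drone period, the gateway hands over exactly two batches of twelve readings per day.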
20. Real-Time Compression for Tactile Internet Data Streams. SENSORS (BASEL, SWITZERLAND) 2021; 21:1924. [PMID: 33803484] [PMCID: PMC7967243] [DOI: 10.3390/s21051924] [Received: 12/19/2020] [Revised: 03/02/2021] [Accepted: 03/05/2021] [Indexed: 11/16/2022]
Abstract
The Tactile Internet will require ultra-low latencies for combining machines and humans in systems where humans are in the control loop. Real-time and perceptual coding in these systems commonly require content-specific approaches. We present a generic approach based on deliberately reduced number accuracy and evaluate the trade-off between savings achieved and errors introduced with real-world data for kinesthetic movement and tele-surgery. Our combination of bitplane-level accuracy adaptability with perceptual threshold-based limits allows for great flexibility in broad application scenarios. Combining the attainable savings with the relatively small introduced errors enables the optimal selection of a working point for the method in actual implementations.
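A minimal sketch of the "deliberately reduced number accuracy" idea, assuming float32 samples and truncation at a chosen mantissa bitplane; `truncate_bitplanes`, the sample values, and the error measurement are illustrative stand-ins, not the paper's actual codec:

```python
import struct

def truncate_bitplanes(x: float, keep_bits: int) -> float:
    """Zero out the lowest (23 - keep_bits) mantissa bitplanes of a
    float32 value, trading precision for compressibility."""
    bits = struct.unpack('>I', struct.pack('>f', x))[0]
    drop = 23 - keep_bits              # float32 has a 23-bit mantissa
    mask = ~((1 << drop) - 1) & 0xFFFFFFFF
    return struct.unpack('>f', struct.pack('>I', bits & mask))[0]

# Hypothetical kinesthetic position samples; in the paper's setting a
# perceptual threshold would bound the acceptable per-sample error.
samples = [0.12345678, -0.98765432, 0.5, 1.0 / 3.0]
for keep in (23, 12, 6):
    worst = max(abs(s - truncate_bitplanes(s, keep)) for s in samples)
    print(f"keep {keep:2d} bitplanes -> max abs error {worst:.3e}")
```

Zeroed bitplanes compress well with any generic entropy coder, which is what makes the approach content-agnostic: the working point (how many bitplanes to keep) is chosen against a perceptual error limit rather than a signal-specific model.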
Collapse
|
21
|
Internet of Things: A General Overview between Architectures, Protocols and Applications. INFORMATION 2021. [DOI: 10.3390/info12020087] [Citation(s) in RCA: 36] [Impact Index Per Article: 12.0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/16/2022] Open
Abstract
In recent years, the number of devices connected to the internet has grown significantly. These devices interact with the external environment and with human beings through a wide range of sensors that, by digitizing parameters of interest, provide enormous amounts of data. All these data are then shared on the network with other devices and with different applications and infrastructures. This dynamic and ever-changing world underlies the Internet of Things (IoT) paradigm. To date, countless IoT-based applications have been developed; consider smart cities, smart roads, and smart industries. This article analyzes the current architectures, technologies, protocols, and applications that characterize the paradigm.
Collapse
|
22
|
A Fully Open-Source Approach to Intelligent Edge Computing: AGILE's Lesson. SENSORS (BASEL, SWITZERLAND) 2021; 21:1309. [PMID: 33673065 PMCID: PMC7918801 DOI: 10.3390/s21041309] [Citation(s) in RCA: 4] [Impact Index Per Article: 1.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Subscribe] [Scholar Register] [Received: 12/09/2020] [Revised: 01/23/2021] [Accepted: 02/10/2021] [Indexed: 01/21/2023]
Abstract
In this paper, we describe the main outcomes of AGILE (an acronym for "Adaptive Gateways for dIverse muLtiple Environments"), an EU-funded project that recently delivered a modular hardware and software framework conceived to address the fragmented market of embedded, multi-service, adaptive gateways for the Internet of Things (IoT). Its main goal is to provide a low-cost solution capable of supporting proof-of-concept implementations and rapid prototyping methodologies for both consumer and industrial IoT markets. AGILE allows developers to implement and deliver a complete (software and hardware) IoT solution for managing non-IP IoT devices through a multi-service gateway. Moreover, it simplifies startups' access to the IoT market, not only providing an efficient and cost-effective solution for industry but also allowing end-users to customize and extend it according to their specific requirements. This flexibility is the result of the joint experience of established organizations in the project consortium that already promote the principles of openness at both the software and hardware levels. We illustrate how the AGILE framework can provide a cost-effective yet solid and highly customizable technological foundation supporting the configuration, deployment, and assessment of two distinct showcases: a quantified-self application for individual consumers and an air pollution monitoring station for industrial settings.
Collapse
|
23
|
Stochastic Latency Guarantee in Wireless Powered Virtualized Sensor Networks. SENSORS (BASEL, SWITZERLAND) 2021; 21:121. [PMID: 33375504 PMCID: PMC7795341 DOI: 10.3390/s21010121] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Subscribe] [Scholar Register] [Received: 11/24/2020] [Revised: 12/16/2020] [Accepted: 12/22/2020] [Indexed: 11/16/2022]
Abstract
How to guarantee the data rate and latency requirements of an application with limited energy is an open issue in wireless virtualized sensor networks. In this paper, we integrate wireless energy transfer technology into the wireless virtualized sensor network and focus on stochastic performance guarantees. First, a joint task and resource allocation optimization problem is formulated. To characterize the stochastic latency of data transmission, effective capacity theory is employed to study the relationship between the network latency violation probability and the transmission capability of each node. The performance under the FDMA mode is first proved to be identical to that under the TDMA mode. We then propose a bisection search approach to determine the optimal task allocation, with the objective of minimizing the application latency violation probability. Furthermore, a one-dimensional search scheme is proposed to find the optimal energy harvesting time in each time block. The effectiveness of the proposed scheme is validated by extensive numerical simulations. In particular, the proposed scheme lowers the latency violation probability by factors of 11.6 and 4600 compared with the proportional task allocation scheme and the equal task allocation scheme, respectively.
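A toy sketch of the bisection idea for two nodes, assuming a stylized exponentially decaying violation probability rather than the paper's effective-capacity expression; `violation_prob`, `split_task`, and all parameter values are hypothetical:

```python
import math

def violation_prob(load: float, capacity: float) -> float:
    """Stylized latency-violation probability for one node: decays
    exponentially as capacity exceeds the offered load. A stand-in
    for the paper's effective-capacity model, not its actual form."""
    return math.exp(-max(capacity - load, 0.0))

def split_task(total: float, cap1: float, cap2: float,
               tol: float = 1e-9) -> float:
    """Bisection on the share x of the task given to node 1.
    The overall violation probability max(p1, p2) is minimized where
    p1 == p2, and p1 - p2 is monotone increasing in x, so bisection
    converges to the balanced allocation."""
    lo, hi = 0.0, total
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        p1 = violation_prob(mid, cap1)
        p2 = violation_prob(total - mid, cap2)
        if p1 > p2:      # node 1 is the bottleneck: give it less work
            hi = mid
        else:
            lo = mid
    return 0.5 * (lo + hi)

x = split_task(10.0, cap1=8.0, cap2=6.0)
print(f"node 1 gets {x:.4f}, node 2 gets {10.0 - x:.4f}")
```

The monotonicity of `p1 - p2` in the allocated share is what licenses bisection here, mirroring the structural property the abstract relies on; with more nodes or joint energy-harvesting-time optimization, the paper's added one-dimensional search over the harvesting time would wrap around an inner allocation step like this one.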
Collapse
|