1
Hakim A, Srivastava AK, Hamza A, Owais M, Habib-Ur-Rahman M, Qadri S, Qayyum MA, Ahmad Khan FZ, Mahmood MT, Gaiser T. Yolo-pest: an optimized YoloV8x for detection of small insect pests using smart traps. Sci Rep 2025; 15:14029. [PMID: 40269001] [PMCID: PMC12019348] [DOI: 10.1038/s41598-025-97825-3]
Abstract
Fruit flies and fall armyworms are among the major insect pests that adversely affect fruits and crops; the fall armyworm is a highly destructive pest of maize that also damages other economically important field crops and vegetables. Adults of both pests can fly, making them hard to monitor in the field. This study focuses on fine-tuning the YoloV8x model for automated monitoring and identification of insect pests, such as fruit flies and fall armyworms, in open fields and closed environments using IoT-based smart traps. Conventional techniques for monitoring these insect pests involve pheromone attractants and sticky traps that require regular farm visits. We developed an IoT-based device, called the Smart Trap, that attracts insect pests with pheromones and captures real-time images using cameras and IoT sensors. Its main objective is automated pest monitoring in fields or indoor grain storage houses. Images captured by smart traps are transmitted to a server, where Yolo-pest, a fine-tuned YoloV8x model with customized hyperparameters, performs object detection in real time. The performance of the smart trap was evaluated in a mango orchard (fruit flies) and a maize field (fall armyworm) in an arid climate, achieving a 94% average detection accuracy. Validation on grayscale and coloured images further confirmed the model's consistent accuracy in identifying insect pests in maize crops and mango orchards. A mobile application enhances the practical utility, providing a user-friendly interface for real-time identification of insect pests. Farmers can easily access the information and data remotely, empowering them to manage pests efficiently.
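The training pipeline is not published with the abstract, but fine-tuning a pretrained YoloV8x with customized hyperparameters maps directly onto the Ultralytics API. Below is a minimal sketch, assuming a hypothetical dataset config smart_trap.yaml (trap image paths plus fruit-fly and fall-armyworm classes) and illustrative hyperparameter values, not the authors' actual settings:

```python
# Minimal sketch of YoloV8x fine-tuning for smart-trap images.
# "smart_trap.yaml" and all hyperparameter values are assumptions.
from ultralytics import YOLO

model = YOLO("yolov8x.pt")         # start from pretrained YoloV8x weights
model.train(
    data="smart_trap.yaml",        # hypothetical config: image paths + 2 pest classes
    epochs=100,                    # assumed
    imgsz=640,                     # assumed input resolution
    batch=16,
    lr0=0.01,                      # an example "customized hyperparameter"
)
metrics = model.val()              # precision/recall/mAP on the validation split
results = model("trap_image.jpg")  # server-side inference on an uploaded trap image
```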
Affiliation(s)
- Ayesha Hakim
- Institute of Computing, MNS-University of Agriculture, Multan, 60000, Pakistan.
- School of Electrical Engineering and Computer Science (SEECS), National University of Sciences and Technology (NUST), Islamabad, Pakistan.
- Amit Kumar Srivastava
- Crop Science, Institute of Crop Science and Resource Conservation (INRES), University of Bonn, 53115, Bonn, Germany.
- Leibniz Centre for Agricultural Landscape Research (ZALF), Eberswalder Str. 84, 15374, Müncheberg, Germany.
- Ali Hamza
- Institute of Computing, MNS-University of Agriculture, Multan, 60000, Pakistan.
- Muhammad Owais
- Institute of Computing, MNS-University of Agriculture, Multan, 60000, Pakistan.
- Muhammad Habib-Ur-Rahman
- Crop Science, Institute of Crop Science and Resource Conservation (INRES), University of Bonn, 53115, Bonn, Germany.
- Tropical Plant Production and Agricultural Systems Modelling (TROPAGS), University of Göttingen, 37077, Göttingen, Germany.
- North Florida Research and Education Center, University of Florida, Gainesville, USA.
- Salman Qadri
- Institute of Computing, MNS-University of Agriculture, Multan, 60000, Pakistan.
- Mirza Abdul Qayyum
- Institute of Plant Protection, MNS-University of Agriculture, Multan, 60000, Pakistan.
- Fawad Zafar Ahmad Khan
- Department of Outreach and Continuing Education, MNS-University of Agriculture, Multan, 60000, Pakistan.
- Department of Entomology, University of Georgia, Griffin, GA, USA.
- Muhammad Tariq Mahmood
- Department of Zoology, Cholistan University of Veterinary and Animal Sciences, Bahawalpur, 63100, Pakistan.
- Thomas Gaiser
- Crop Science, Institute of Crop Science and Resource Conservation (INRES), University of Bonn, 53115, Bonn, Germany.
2
Bharti A, Jain U, Chauhan N. From lab to field: Nano-biosensors for real-time plant nutrient tracking. Plant Nano Biology 2024; 9:100079. [DOI: 10.1016/j.plana.2024.100079]
3
Maia LJ, de Oliveira CH, Silva AB, Souza PAA, Müller NFD, Cardoso JDC, Ribeiro BM, de Abreu FVS, Campos FS. Arbovirus surveillance in mosquitoes: Historical methods, emerging technologies, and challenges ahead. Exp Biol Med (Maywood) 2023; 248:2072-2082. [PMID: 38183286] [PMCID: PMC10800135] [DOI: 10.1177/15353702231209415]
Abstract
Arboviruses cause millions of infections each year; however, only limited options are available for treatment and pharmacological prevention. Mosquitoes are among the most important vectors for the transmission of several pathogens to humans. Despite advances, the sampling, viral detection, and control methods for these insects remain ineffective. Challenges arise with the increase in mosquito populations due to climate change, insecticide resistance, and human interference affecting natural habitats, all of which contribute to the increasing difficulty of controlling the spread of arboviruses. Therefore, prioritizing arbovirus surveillance is essential for effective epidemic preparedness. In this review, we offer a concise historical account of the discovery and monitoring of arboviruses in mosquitoes, from mosquito capture to viral detection. We then analyze the advantages and limitations of these traditional methods. Furthermore, we investigate the potential of emerging technologies to address these limitations, including next-generation sequencing, paper-based devices, spectroscopic detectors, and synthetic biosensors. We also provide perspectives on recurring issues and areas of interest, such as insect-specific viruses.
Affiliation(s)
- Luis Janssen Maia
- Instituto de Ciências Biológicas, Departamento de Biologia Celular, Laboratório de Baculovírus, Universidade de Brasília, Brasília 70910-900, Brasil
- Cirilo Henrique de Oliveira
- Laboratório de Comportamento de Insetos, Instituto Federal do Norte de Minas Gerais, Salinas 39560-000, Brasil
- Arthur Batista Silva
- Laboratório de Bioinformática e Biotecnologia, Universidade Federal do Tocantins, Gurupi 77402-970, Brasil
- Pedro Augusto Almeida Souza
- Laboratório de Comportamento de Insetos, Instituto Federal do Norte de Minas Gerais, Salinas 39560-000, Brasil
- Nicolas Felipe Drumm Müller
- Instituto de Ciências Básicas da Saúde, Universidade Federal do Rio Grande do Sul, Porto Alegre 90035-003, Brasil
- Jader da Cruz Cardoso
- Divisão de Vigilância Ambiental em Saúde, Centro Estadual de Vigilância em Saúde, Secretaria Estadual de Saúde do Rio Grande do Sul, Porto Alegre 90610-000, Brasil
- Bergmann Morais Ribeiro
- Instituto de Ciências Biológicas, Departamento de Biologia Celular, Laboratório de Baculovírus, Universidade de Brasília, Brasília 70910-900, Brasil
- Fabrício Souza Campos
- Laboratório de Bioinformática e Biotecnologia, Universidade Federal do Tocantins, Gurupi 77402-970, Brasil
- Instituto de Ciências Básicas da Saúde, Universidade Federal do Rio Grande do Sul, Porto Alegre 90035-003, Brasil
4
Hadipour-Rokni R, Askari Asli-Ardeh E, Jahanbakhshi A, Esmaili Paeen-Afrakoti I, Sabzi S. Intelligent detection of citrus fruit pests using machine vision system and convolutional neural network through transfer learning technique. Comput Biol Med 2023; 155:106611. [PMID: 36774891] [DOI: 10.1016/j.compbiomed.2023.106611]
Abstract
Plant pests and diseases play a significant role in reducing the quality of agricultural products. Pests such as the Mediterranean fruit fly cause significant damage to crops, and farmers consequently face substantial losses every year. Therefore, the use of modern, non-destructive methods such as machine vision systems and deep learning for early detection of pests in agricultural products is of particular importance. In this study, citrus fruit images were taken in three stages: (1) before pest infestation, (2) at the beginning of fruit infestation, and (3) eight days after the second stage, under natural light conditions (7000-11,000 lux). A total of 1519 images were prepared across all classes. For classification, 70% of the images were used for network training, with 10% and 20% used for validation and testing, respectively. Four pre-trained CNN models, namely ResNet-50, GoogleNet, VGG-16 and AlexNet, together with the SGDm, RMSProp and Adam optimization algorithms, were used to identify and classify healthy fruit and fruit infected with the Mediterranean fruit fly. Evaluation at the pest outbreak stage showed that the VGG-16 model with the SGDm algorithm was most efficient, with the highest detection accuracy and F1 of 98.33% and 98.36%, respectively. Evaluation at the third stage showed that the AlexNet model with the SGDm algorithm performed best, with the highest detection accuracy and F1 of 99.33% and 99.34%, respectively. The AlexNet model with the SGDm optimization algorithm also had the shortest network training time (323 s). These results show that convolutional neural network methods and machine vision systems can be effective for controlling and managing pests in orchards and other agricultural settings.
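As a concrete illustration of the transfer-learning setup described above, the sketch below adapts an ImageNet-pretrained VGG-16 in PyTorch to a binary healthy/infested citrus classifier and pairs it with SGD-with-momentum (SGDm). The frozen convolutional base and the hyperparameter values are our assumptions, not the paper's:

```python
# Sketch: VGG-16 transfer learning with an SGDm optimizer (assumed settings).
import torch
import torch.nn as nn
from torchvision import models

model = models.vgg16(weights=models.VGG16_Weights.IMAGENET1K_V1)
for p in model.features.parameters():
    p.requires_grad = False                  # freeze the convolutional base (assumption)
model.classifier[6] = nn.Linear(4096, 2)     # new head: healthy vs. infested

optimizer = torch.optim.SGD(model.classifier.parameters(),
                            lr=1e-3, momentum=0.9)   # SGDm; values assumed
criterion = nn.CrossEntropyLoss()
# Standard loop: train on the 70% split, validate on 10%, test on 20%.
```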
Affiliation(s)
- Ramazan Hadipour-Rokni
- Department of Biosystems Engineering, Sari Agricultural Sciences and Natural Resources University, Sari, Iran
- Ahmad Jahanbakhshi
- Department of Biosystems Engineering, University of Mohaghegh Ardabili, Ardabil, Iran
- Sajad Sabzi
- Department of Biosystems Engineering, University of Mohaghegh Ardabili, Ardabil, Iran
5
Parra C, Grijalva F, Núñez B, Núñez A, Pérez N, Benítez D. Automatic identification of intestinal parasites in reptiles using microscopic stool images and convolutional neural networks. PLoS One 2022; 17:e0271529. [PMID: 35925986] [PMCID: PMC9352023] [DOI: 10.1371/journal.pone.0271529]
Abstract
Captive environments trigger the propagation and multiplication of parasites among different reptile species, weakening their immune response and causing infections and diseases. Advances in convolutional neural networks have opened a new field for detecting and classifying diseases and have shown great potential to overcome the shortcomings of manual detection performed by experts. We therefore propose an approach to identify six parasitic agents of captive reptiles (Ophionyssus natricis, Blastocystis sp, Oxiurdo egg, Rhytidoides similis, Strongyloides, Taenia), or the absence of such parasites, from a dataset of microscopic stool images. To this end, we first use an image segmentation stage to detect the parasite within the image, combining the Contrast Limited Adaptive Histogram Equalization (CLAHE) technique, the Otsu binarization method, and morphological operations. We then carry out a classification stage with a MobileNet CNN under a transfer learning scheme. This method was validated on a stool image dataset containing 3616 image samples and 26 videos of the six parasites mentioned above. The results indicate that our transfer learning-based approach can learn a helpful representation from the dataset. We obtained an average accuracy of 94.26% across the seven classes (i.e., six parasitic agents and the absence of parasites), which statistically outperformed, at a 95% confidence level, a custom CNN trained from scratch.
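The segmentation stage chains three standard OpenCV operations; a minimal sketch, with clip limit, tile grid, and kernel size as illustrative assumptions:

```python
# Sketch of the described segmentation stage: CLAHE -> Otsu -> morphology.
# Parameter values are illustrative assumptions.
import cv2
import numpy as np

img = cv2.imread("stool_microscopy.jpg", cv2.IMREAD_GRAYSCALE)

clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
enhanced = clahe.apply(img)                      # contrast-limited equalization

_, binary = cv2.threshold(enhanced, 0, 255,
                          cv2.THRESH_BINARY + cv2.THRESH_OTSU)

kernel = np.ones((5, 5), np.uint8)
mask = cv2.morphologyEx(binary, cv2.MORPH_OPEN, kernel)   # remove small specks
mask = cv2.morphologyEx(mask, cv2.MORPH_CLOSE, kernel)    # fill small holes
# The detected region would then be cropped and passed to the MobileNet classifier.
```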
Affiliation(s)
- Carla Parra
- NuCom, Nuevas Comunicaciones Iberia S.A., Barcelona, Spain
- Felipe Grijalva
- Faculty of Engineering and Applied Sciences (FICA), Telecommunications Engineering, Universidad de Las Américas (UDLA), Quito, Ecuador
- Departamento de Electrónica, Telecomunicaciones y Redes de Información (DETRI), Escuela Politécnica Nacional, Ladrón de Guevara, Quito, Ecuador
- Bryan Núñez
- Departamento de Electrónica, Telecomunicaciones y Redes de Información (DETRI), Escuela Politécnica Nacional, Ladrón de Guevara, Quito, Ecuador
- Alejandra Núñez
- Carrera de Medicina Veterinaria y Zootecnia, Universidad Técnica de Ambato, Ambato, Ecuador
- Noel Pérez
- Colegio de Ciencias e Ingenierías “El Politécnico”, Universidad San Francisco de Quito USFQ, Quito, Ecuador
- Diego Benítez
- Colegio de Ciencias e Ingenierías “El Politécnico”, Universidad San Francisco de Quito USFQ, Quito, Ecuador
6
Reynolds J, Williams E, Martin D, Readling C, Ahmmed P, Huseth A, Bozkurt A. A Multimodal Sensing Platform for Interdisciplinary Research in Agrarian Environments. Sensors (Basel) 2022; 22:5582. [PMID: 35898084] [PMCID: PMC9331660] [DOI: 10.3390/s22155582]
Abstract
Agricultural and environmental monitoring programs often require labor-intensive inputs and substantial costs to manually gather data from remote field locations. Recent advances in the Internet of Things enable the construction of wireless sensor systems to automate these remote monitoring efforts. This paper presents the design of a modular system to serve as a research platform for outdoor sensor development and deployment. The advantages of this system include low power consumption (enabling solar charging), the use of commercially available electronic parts for lower-cost and scaled-up deployments, and the flexibility to include internal electronics and external sensors, allowing novel applications. In addition to tracking environmental parameters, the modularity of this system brings the capability to measure other non-traditional elements. This capability is demonstrated with two different agri- and aquacultural field applications: tracking moth phenology and monitoring bivalve gaping. Collecting these signals in conjunction with environmental parameters could provide holistic and context-aware data analysis. Preliminary experiments generated promising results, demonstrating the reliability of the system. Idle power consumption of 27.2 mW and 16.6 mW for the moth- and bivalve-tracking systems, respectively, coupled with 2.5 W solar cells, allows for indefinite deployment in remote locations.
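A quick energy-budget check makes the "indefinite deployment" claim concrete. The idle draw and panel rating come from the abstract; the equivalent sun hours and charging efficiency are our assumptions:

```python
# Back-of-the-envelope solar budget for the moth-tracking configuration.
IDLE_W = 27.2e-3    # idle draw from the abstract (27.2 mW)
PANEL_W = 2.5       # rated solar cell output from the abstract
SUN_HOURS = 5.0     # assumed equivalent full-sun hours per day
EFFICIENCY = 0.7    # assumed harvesting/charging efficiency

daily_use_wh = IDLE_W * 24                           # ~0.65 Wh/day consumed
daily_harvest_wh = PANEL_W * SUN_HOURS * EFFICIENCY  # ~8.75 Wh/day harvested
print(f"use {daily_use_wh:.2f} Wh/day vs. harvest {daily_harvest_wh:.2f} Wh/day")
# Harvest exceeds use by an order of magnitude, leaving margin for cloudy days.
```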
Affiliation(s)
- James Reynolds
- Department of Electrical and Computer Engineering, North Carolina State University, Raleigh, NC 27695-7911, USA
- Evan Williams
- Department of Electrical and Computer Engineering, North Carolina State University, Raleigh, NC 27695-7911, USA
- Devon Martin
- Department of Electrical and Computer Engineering, North Carolina State University, Raleigh, NC 27695-7911, USA
- Caleb Readling
- Department of Electrical and Computer Engineering, North Carolina State University, Raleigh, NC 27695-7911, USA
- Parvez Ahmmed
- Department of Electrical and Computer Engineering, North Carolina State University, Raleigh, NC 27695-7911, USA
- Anders Huseth
- Department of Entomology and Plant Pathology and North Carolina Plant Sciences Initiative, North Carolina State University, Raleigh, NC 27695-8208, USA
- Alper Bozkurt
- Department of Electrical and Computer Engineering, North Carolina State University, Raleigh, NC 27695-7911, USA
7
Saradopoulos I, Potamitis I, Ntalampiras S, Konstantaras AI, Antonidakis EN. Edge Computing for Vision-Based, Urban-Insects Traps in the Context of Smart Cities. Sensors (Basel) 2022; 22:2006. [PMID: 35271153] [PMCID: PMC8914644] [DOI: 10.3390/s22052006]
Abstract
Our aim is to promote the widespread use of electronic insect traps that report captured pests to a human-controlled agency. This work reports on edge computing as applied to camera-based insect traps. We present a low-cost device with high power autonomy and adequate picture quality that reports an internal image of the trap to a server and counts the insects it contains using quantized, embedded deep-learning models. The paper compares different aspects of the performance of three edge devices, namely the ESP32, Raspberry Pi Model 4 (RPi), and Google Coral, running a deep-learning framework (TensorFlow Lite). All edge devices were able to process images and achieved counting accuracy exceeding 95%, but at different rates and power consumption. Our findings suggest that the ESP32 is the best choice in the context of this application under our policy for low-cost devices.
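On-device counting with a quantized model follows the standard TensorFlow Lite interpreter flow; a minimal sketch, with the model filename and a dummy input as placeholders:

```python
# Sketch of quantized TFLite inference as run on the RPi/Coral class of devices.
# Model path is a placeholder; a real trap image would replace the dummy input.
import numpy as np
import tflite_runtime.interpreter as tflite

interpreter = tflite.Interpreter(model_path="insect_counter_int8.tflite")
interpreter.allocate_tensors()
inp = interpreter.get_input_details()[0]
out = interpreter.get_output_details()[0]

image = np.zeros(inp["shape"], dtype=inp["dtype"])  # stand-in trap image
interpreter.set_tensor(inp["index"], image)
interpreter.invoke()
counts = interpreter.get_tensor(out["index"])       # model's insect-count output
```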
Affiliation(s)
- Ioannis Saradopoulos
- Department of Electronic Engineering, Hellenic Mediterranean University, 73133 Chania, Greece
- Ilyas Potamitis
- Department of Music Technology and Acoustics, Hellenic Mediterranean University, 74100 Rethymno, Greece
- Antonios I. Konstantaras
- Department of Electronic Engineering, Hellenic Mediterranean University, 73133 Chania, Greece
- Emmanuel N. Antonidakis
- Department of Electronic Engineering, Hellenic Mediterranean University, 73133 Chania, Greece
8
A Review of the Challenges of Using Deep Learning Algorithms to Support Decision-Making in Agricultural Activities. Remote Sensing 2022. [DOI: 10.3390/rs14030638]
Abstract
Deep Learning has been successfully applied to image recognition, speech recognition, and natural language processing in recent years. Therefore, there has been an incentive to apply it in other fields as well. The field of agriculture is one of the most important fields in which the application of deep learning still needs to be explored, as it has a direct impact on human well-being. In particular, there is a need to explore how deep learning models can be used as a tool for optimal planting, land use, yield improvement, production/disease/pest control, and other activities. The vast amount of data received from sensors in smart farms makes it possible to use deep learning as a model for decision-making in this field. In agriculture, no two environments are exactly alike, which makes testing, validating, and successfully implementing such technologies much more complex than in most other industries. This paper reviews some recent scientific developments in the field of deep learning that have been applied to agriculture, and highlights some challenges and potential solutions using deep learning algorithms in agriculture. The results in this paper indicate that by employing new methods from deep learning, higher performance in terms of accuracy and lower inference time can be achieved, and the models can be made useful in real-world applications. Finally, some opportunities for future research in this area are suggested.
9
IoT-Based Fumigation for Insect Repellent in Food Storages: Breaking the Trade-Off between Efficiency and Safety. Sustainability 2022. [DOI: 10.3390/su14031129]
Abstract
Insect infestation can damage stored food and pose health risks when contaminated food is ingested by humans. To tackle this, food safety can be secured by fumigating the food storage with materials containing pesticides. However, because most fumigants are toxic to humans, there is a trade-off between insect repellency and safety assurance. In this paper, to overcome this problem, we first propose organic fumigation, for which a relatively low-risk pyrethrin oil is developed. Second, a novel system that can remotely monitor and control fumigation using IoT is proposed, mitigating the fact that pyrethrin can also be dangerous when inhaled directly. Third, an insect-repellent LED lamp system, which can replace insecticide through direct fumigation and ensure safety, is proposed. Fourth, a camera-based human access detection system is developed for more efficient and safe control during fumigation. The performance of the proposed system has been verified through an implemented test-bed, showing that the trade-off between efficiency and safety can be overcome.
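The abstract does not name a messaging protocol, so the sketch below is purely illustrative: fumigation telemetry published over MQTT with paho-mqtt, with broker address, topic, and payload fields all assumed:

```python
# Hypothetical telemetry for the remote monitoring loop (paho-mqtt 1.x style).
# Broker, topic, and payload schema are assumptions, not the paper's design.
import json
import paho.mqtt.client as mqtt

client = mqtt.Client()
client.connect("broker.example.com", 1883)

reading = {"pyrethrin_ppm": 12.4, "human_detected": False}  # example values
client.publish("storage/unit1/fumigation", json.dumps(reading))
# A subscribed controller could pause the fumigator whenever human_detected
# is true, enforcing the safety interlock the camera system provides.
```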
10
Classification of Fruit Flies by Gender in Images Using Smartphones and the YOLOv4-Tiny Neural Network. Mathematics 2022. [DOI: 10.3390/math10030295]
Abstract
The fruit fly Drosophila melanogaster is a classic research object in genetics and systems biology. In the genetic analysis of flies, a routine task is to determine the offspring size and gender ratio in their populations. Currently, these estimates are made manually, which is a very time-consuming process. Counting and gender determination of flies can be automated using image analysis with deep-learning neural networks on mobile devices. We propose an algorithm based on the YOLOv4-tiny network to identify Drosophila flies and determine their gender, following a protocol of photographing insects on a white sheet of paper with a cell-phone camera. Three training strategies with different types of augmentation were used. The best performance (F1 = 0.838) was achieved using synthetic images with mosaic generation. Gender determination is less accurate for females than for males. Among the factors that most strongly influence the accuracy of gender recognition, the fly's position on the paper was the most important. Increased light intensity and higher-quality device cameras also improve recognition accuracy. We implement our method in the FlyCounter Android app for mobile devices, which performs all image-processing steps using the device processors only. The YOLOv4-tiny algorithm takes less than 4 s to process one image.
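For reference, the reported F1 combines precision and recall; the worked example below reproduces F1 = 0.838 from hypothetical detection counts:

```python
# F1 from true positives, false positives, and false negatives.
def f1_score(tp: int, fp: int, fn: int) -> float:
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return 2 * precision * recall / (precision + recall)

# Hypothetical counts: 420 correct detections, 80 false alarms, 82 misses.
print(round(f1_score(420, 80, 82), 3))  # 0.838, matching the best reported F1
```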
11
Palanisamy P, Mohan RE, Semwal A, Jun Melivin LM, Félix Gómez B, Balakrishnan S, Elangovan K, Ramalingam B, Terntzer DN. Drain Structural Defect Detection and Mapping Using AI-Enabled Reconfigurable Robot Raptor and IoRT Framework. Sensors (Basel) 2021; 21:7287. [PMID: 34770593] [PMCID: PMC8587168] [DOI: 10.3390/s21217287]
Abstract
Human visual inspection of drains is laborious, time-consuming, and prone to accidents. This work presents an AI-enabled, robot-assisted remote drain inspection and mapping framework using our in-house developed reconfigurable robot Raptor. A four-layer Internet of Robotic Things (IoRT) framework serves as a bridge between the users and the robots, through which seamless information sharing takes place. Faster RCNN ResNet50, Faster RCNN ResNet101, and Faster RCNN Inception-ResNet-v2 deep-learning frameworks were trained using a transfer learning scheme with six typical concrete defect classes and deployed in the IoRT framework for the remote defect-detection task. The efficiency of the trained CNN algorithm and the drain inspection robot Raptor was evaluated through real-time drain inspection field trials using the SLAM technique. The experimental results indicate that the robot's maneuverability was stable, and its mapping and localization were accurate across different drain types. Finally, for effective drain maintenance, a SLAM-based defect map was generated by fusing the defect detection results with the lidar-SLAM map.
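Adapting a torchvision Faster R-CNN by transfer learning follows a standard pattern: load pretrained weights, then swap the box-predictor head for the task's class count (six defect classes plus background, per the abstract). A minimal sketch; everything beyond the class count is assumed:

```python
# Sketch: re-heading a pretrained Faster R-CNN (ResNet-50 FPN shown;
# the paper also trains ResNet101 and Inception-ResNet-v2 variants).
import torchvision
from torchvision.models.detection.faster_rcnn import FastRCNNPredictor

model = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT")
num_classes = 6 + 1   # six concrete defect classes + background
in_features = model.roi_heads.box_predictor.cls_score.in_features
model.roi_heads.box_predictor = FastRCNNPredictor(in_features, num_classes)
# Fine-tune on labeled drain-defect images, then deploy in the IoRT framework.
```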
Affiliation(s)
- Povendhan Palanisamy
- Engineering Product Development Pillar, Singapore University of Technology and Design (SUTD), Singapore 487372, Singapore
- Rajesh Elara Mohan
- Engineering Product Development Pillar, Singapore University of Technology and Design (SUTD), Singapore 487372, Singapore
- Archana Semwal
- Engineering Product Development Pillar, Singapore University of Technology and Design (SUTD), Singapore 487372, Singapore
- Lee Ming Jun Melivin
- Engineering Product Development Pillar, Singapore University of Technology and Design (SUTD), Singapore 487372, Singapore
- Braulio Félix Gómez
- Engineering Product Development Pillar, Singapore University of Technology and Design (SUTD), Singapore 487372, Singapore
- Selvasundari Balakrishnan
- Engineering Product Development Pillar, Singapore University of Technology and Design (SUTD), Singapore 487372, Singapore
- Karthikeyan Elangovan
- Engineering Product Development Pillar, Singapore University of Technology and Design (SUTD), Singapore 487372, Singapore
- Balakrishnan Ramalingam
- Engineering Product Development Pillar, Singapore University of Technology and Design (SUTD), Singapore 487372, Singapore
- Dylan Ng Terntzer
- LionsBot International Pte. Ltd., #03-02, 11 Changi South Street 3, Singapore 486122, Singapore
12
AI Enabled IoRT Framework for Rodent Activity Monitoring in a False Ceiling Environment. Sensors (Basel) 2021; 21:5326. [PMID: 34450767] [PMCID: PMC8398580] [DOI: 10.3390/s21165326]
Abstract
Routine rodent inspection is essential to curbing rat-borne diseases and infrastructure damage within the built environment. Rodents find false ceilings a perfect spot to seek shelter and construct their habitats. However, manual false-ceiling inspection for rodents is laborious and risky. This work presents an AI-enabled IoRT framework for rodent activity monitoring inside a false ceiling using an in-house developed robot called “Falcon”. The IoRT serves as a bridge between the users and the robots, through which seamless information sharing takes place. Images shared by the robot are inspected with a Faster RCNN ResNet 101 object detection algorithm, which automatically detects signs of rodents inside a false ceiling. The efficiency of the rodent activity detection algorithm was tested in a real-world false-ceiling environment, and detection accuracy was evaluated with standard performance metrics. The experimental results indicate that the algorithm detects rodent signs and 3D-printed rodents with a good confidence level.
13
Pathmakumar T, Kalimuthu M, Elara MR, Ramalingam B. An Autonomous Robot-Aided Auditing Scheme for Floor Cleaning. Sensors (Basel) 2021; 21:4332. [PMID: 34202746] [PMCID: PMC8271831] [DOI: 10.3390/s21134332]
Abstract
Cleaning is an important factor in most aspects of our day-to-day life. This research work addresses the fundamental question of “how clean is clean” by introducing a novel framework for auditing the cleanliness of built infrastructure using mobile robots. The proposed system presents a strategy for assessing the quality of cleaning in a given area and a novel exploration strategy that facilitates auditing of a given location by a mobile robot. An audit sensor that works by a “touch and inspect” analogy, assigning an audit score to its area of inspection, has been developed. A vision-based, dirt-probability-driven exploration is proposed to enable a mobile robot with the on-board audit sensor to perform auditing tasks effectively. The quality of cleaning is quantified using a dirt density map representing location-wise audit scores, the dirt distribution pattern obtained by kernel density estimation, and a cleaning benchmark score representing the extent of cleanliness. The framework is realized in an in-house developed audit robot that performs cleaning audits in indoor and semi-outdoor environments. The proposed method is validated through experimental trials estimating cleanliness in five different locations using the developed audit sensor and dirt-probability-driven exploration.
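The dirt-distribution step is a standard kernel density estimation; a minimal sketch with SciPy and illustrative audit-point coordinates:

```python
# Sketch: KDE over (x, y) positions where the audit sensor registered dirt.
# The sample coordinates are illustrative.
import numpy as np
from scipy.stats import gaussian_kde

dirt_xy = np.array([[0.5, 1.2], [0.6, 1.1], [2.3, 0.4], [2.4, 0.5]]).T  # shape (2, n)
kde = gaussian_kde(dirt_xy)

# Evaluate the density on a grid to render the dirt density map.
xs, ys = np.mgrid[0:3:50j, 0:2:50j]
density = kde(np.vstack([xs.ravel(), ys.ravel()])).reshape(xs.shape)
```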