1. Schnalke M, Funk J, Wagner A. Bridging technology and ecology: enhancing applicability of deep learning and UAV-based flower recognition. Front Plant Sci 2025;16:1498913. PMID: 40171479; PMCID: PMC11959073; DOI: 10.3389/fpls.2025.1498913. Received 19 Sep 2024; accepted 14 Feb 2025.

Abstract
The decline of insect biomass, including pollinators, represents a significant ecological challenge, impacting both biodiversity and ecosystems. Effective monitoring of pollinator habitats, especially floral resources, is essential for addressing this issue. This study connects drone and deep learning technologies to their practical application in ecological research, focusing on simplifying the application of these technologies. Updating an object detection toolbox to TensorFlow (TF) 2 enhanced performance and ensured compatibility with newer software packages, facilitating access to multiple object recognition models: Faster Region-based Convolutional Neural Network (Faster R-CNN), Single Shot Detector (SSD), and EfficientDet. The three object detection models were tested on two datasets of Unmanned Aerial Vehicle (UAV) images of flower-rich grasslands to evaluate their application potential in practice. A practical guide for biologists on applying flower recognition to UAV imagery is also provided. The results showed that Faster R-CNN had the best overall performance, with a precision of 89.9% and a recall of 89%, followed by EfficientDet, which excelled in recall but at lower precision. Notably, EfficientDet demonstrated the lowest model complexity, making it a suitable choice for applications requiring a balance between efficiency and detection performance. Challenges remain, such as detecting flowers in dense vegetation and accounting for environmental variability.
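The precision and recall figures quoted for the detectors are the standard object-detection metrics computed from true positives (TP), false positives (FP), and false negatives (FN); a minimal sketch, with illustrative counts that are not the paper's data:

```python
def precision_recall(tp, fp, fn):
    """Precision = TP / (TP + FP): how many reported flowers are real.
    Recall    = TP / (TP + FN): how many real flowers were found."""
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    return precision, recall

# Hypothetical counts: 89 correct detections, 10 false alarms, 11 misses.
p, r = precision_recall(89, 10, 11)
```

Faster R-CNN's reported 89.9% / 89% therefore means roughly one false alarm per ten detections and one missed flower per ten ground-truth flowers.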
Affiliation(s)
- Marie Schnalke: Faculty of Management Science and Engineering, Karlsruhe University of Applied Sciences (HKA), Karlsruhe, Germany
- Jonas Funk: Faculty of Management Science and Engineering, Karlsruhe University of Applied Sciences (HKA), Karlsruhe, Germany
- Andreas Wagner: Faculty of Management Science and Engineering, Karlsruhe University of Applied Sciences (HKA), Karlsruhe, Germany; Fraunhofer Institute for Industrial Mathematics (ITWM), Kaiserslautern, Germany
2. Zhou G, Zhang B, Li Q, Zhao Q, Zhang S. Optimizing success rate with Nonlinear Mapping Control in a high-performance Raspberry Pi-based light source target tracking system. PLoS One 2025;20:e0319071. PMID: 39999188; PMCID: PMC11856289; DOI: 10.1371/journal.pone.0319071. Received 20 Jul 2024; accepted 28 Jan 2025.
Abstract
This study addresses the limitations of linear mapping in two-dimensional gimbal control for moving-target tracking, which result in significant control errors and slow response times. To overcome these issues, we propose a nonlinear mapping control method that enhances the success rate of light source target tracking systems. Using a Raspberry Pi 4B and OpenCV, the control system performs real-time recognition of rectangular frames and laser spot images. The tracking system, which includes an OpenMV H7 Plus camera, captures and processes the laser spot path. Both systems are connected to an STM32F407ZGT6 microcontroller to drive a 42-step stepper motor with precise control. By adjusting the parameter c of the nonlinear mapping curve, we optimize the system's performance, balancing response speed and stability. Our results show a significant improvement in control accuracy, with a miss rate of 3.3%, an average error rate of 0.188% at 1.25 m, and a 100% success rate in target tracking. The proposed nonlinear mapping control method offers substantial advancements in real-time tracking and control systems, demonstrating its potential for broader application in intelligent control fields.
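The abstract tunes a curve parameter c but does not give the curve itself; a power-law mapping from pixel error to motor-step command is one plausible family with the same role (c = 1 recovers the linear baseline). The limits e_max and max_step below are hypothetical:

```python
def nonlinear_map(error_px, c=2.0, e_max=320.0, max_step=200):
    """Map a pixel tracking error to a stepper-motor step command.
    c = 1 is the linear baseline; c > 1 damps small errors (steadier
    near the target); c < 1 reacts harder to small errors (faster
    response but more oscillation). e_max and max_step are
    hypothetical saturation limits, not values from the paper."""
    e = max(-e_max, min(e_max, error_px))        # saturate the input error
    step = max_step * (abs(e) / e_max) ** c      # power-law curve
    return int(round(step)) if e >= 0 else -int(round(step))
```

With c = 2, a half-range error of 160 px commands only a quarter of max_step, which is the stability-versus-speed trade the authors tune via c.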
Affiliation(s)
- Guiyu Zhou: School of Electronic Information and Engineering, Yibin University, Yibin, China
- Bo Zhang: School of Electronic Information and Engineering, Yibin University, Yibin, China; Shanghai Judong Semiconductor Company Limited, Shanghai, China
- Qinghao Li: School of Mechanical and Electrical Engineering, Yibin University, Yibin, China
- Qin Zhao: School of Business English (Yibin Campus), Chengdu International Studies University, Yibin, China
- Shengyao Zhang: School of Mathematics and Physics, Yibin University, Yibin, China
3. Bjerge K, Frigaard CE, Karstoft H. Object Detection of Small Insects in Time-Lapse Camera Recordings. Sensors (Basel) 2023;23:7242. PMID: 37631778; PMCID: PMC10459366; DOI: 10.3390/s23167242. Received 15 Jul 2023; accepted 16 Aug 2023.
Abstract
As pollinators, insects play a crucial role in ecosystem management and world food production. However, insect populations are declining, necessitating efficient insect monitoring methods. Existing methods analyze video or time-lapse images of insects in nature, but analysis is challenging because insects are small objects in complex and dynamic natural vegetation scenes. In this work, we provide a dataset of primarily honeybees visiting three different plant species during two months of the summer. The dataset consists of 107,387 annotated time-lapse images from multiple cameras, including 9423 annotated insects. We present a two-step method for detecting insects in time-lapse RGB images. First, the images are preprocessed with a motion-informed enhancement technique that uses motion and color to make insects stand out. Second, the enhanced images are fed into a convolutional neural network (CNN) object detector. The method improves on the deep learning object detectors You Only Look Once (YOLO) and Faster Region-based CNN (Faster R-CNN). With motion-informed enhancement, the YOLO detector improves its average micro F1-score from 0.49 to 0.71, and the Faster R-CNN detector from 0.32 to 0.56. Our dataset and proposed method are a step forward for automating the time-lapse camera monitoring of flying insects.
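A simplified stand-in for the motion-informed enhancement step (the published method combines motion and color cues; only a motion term against a background estimate is sketched here, with a hypothetical gain):

```python
import numpy as np

def motion_enhance(frame, background, gain=2.0):
    """Boost pixels that differ from a background estimate so small
    moving insects stand out before the CNN detector sees the image.
    `background` would come from e.g. a running median over the
    time-lapse sequence; `gain` is a hypothetical amplification."""
    diff = np.abs(frame.astype(np.float64) - background.astype(np.float64))
    enhanced = frame.astype(np.float64) + gain * diff
    return np.clip(enhanced, 0, 255).astype(np.uint8)

# Toy example: a static scene with one faint 'insect' pixel at (4, 4).
bg = np.zeros((8, 8), dtype=np.uint8)
fr = bg.copy()
fr[4, 4] = 50
out = motion_enhance(fr, bg)
```

Static background pixels pass through unchanged while the moving pixel is amplified, which is exactly the contrast shift that helps a detector find small insects.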
Affiliation(s)
- Kim Bjerge: Department of Electrical and Computer Engineering, Aarhus University, 8200 Aarhus N, Denmark
4. Spatial Monitoring and Insect Behavioural Analysis Using Computer Vision for Precision Pollination. Int J Comput Vis 2022. DOI: 10.1007/s11263-022-01715-4.
5. Kalake L, Dong Y, Wan W, Hou L. Enhancing Detection Quality Rate with a Combined HOG and CNN for Real-Time Multiple Object Tracking across Non-Overlapping Multiple Cameras. Sensors (Basel) 2022;22:2123. PMID: 35336294; PMCID: PMC8949134; DOI: 10.3390/s22062123. Received 30 Dec 2021; accepted 5 Mar 2022.
Abstract
Multi-object tracking in video surveillance is subject to illumination variation, blurring, motion, and similarity variations during identification in real-world practice. Previously proposed approaches have difficulty learning appearances and differentiating objects among sundry detections; they rely heavily on local features and tend to lose vital global structured features such as contours, which contributes to their inability to accurately detect, classify, or distinguish fooling images. In this paper, we propose a paradigm aimed at eliminating these tracking difficulties by enhancing the detection quality rate through the combination of a convolutional neural network (CNN) and a histogram of oriented gradients (HOG) descriptor. We trained the algorithm on input images of size 120 × 32, cleaned and converted to binary to reduce the number of false positives. In testing, we eliminated the background from the frames and applied morphological operations and a Laplacian of Gaussian (LoG) mixture model after blob extraction. The images then underwent feature extraction and computation with the HOG descriptor to simplify the structural information of the objects in the captured video images. We stored the appearance features in an array and passed them into the CNN for further processing. We applied and evaluated our algorithm for real-time multiple object tracking on various city streets using the EPFL multi-camera pedestrian datasets. The experimental results illustrate that our proposed technique improves the detection rate and data association, and our algorithm outperformed the online state-of-the-art approach, recording the highest precision and specificity rates.
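The HOG descriptor named above summarizes local gradient orientations; a minimal single-histogram sketch (a full HOG computes this per cell with overlapping-block normalization, and in the paper's pipeline the resulting features feed the CNN stage):

```python
import numpy as np

def hog_descriptor(patch, nbins=9):
    """Simplified HOG: one magnitude-weighted histogram of unsigned
    gradient orientations over the whole patch, L2-normalized.
    (Real HOG partitions the patch into cells and normalizes
    histograms over blocks; that machinery is omitted here.)"""
    patch = patch.astype(np.float64)
    gy, gx = np.gradient(patch)                    # vertical / horizontal gradients
    mag = np.hypot(gx, gy)                         # gradient magnitude
    ang = np.rad2deg(np.arctan2(gy, gx)) % 180.0   # unsigned orientation in [0, 180)
    hist, _ = np.histogram(ang, bins=nbins, range=(0.0, 180.0), weights=mag)
    norm = np.linalg.norm(hist)
    return hist / norm if norm > 0 else hist

# Horizontal ramp -> purely horizontal gradients -> all mass in bin 0.
ramp = np.tile(np.arange(16.0), (16, 1))
desc = hog_descriptor(ramp)
```

The descriptor captures edge structure (e.g. a pedestrian's contour) compactly, which is the global shape information the authors combine with CNN appearance features.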
Affiliation(s)
- Lesole Kalake (corresponding author; Tel.: +86-198-2121-4680): School of Communications and Information Engineering, Institute of Smart City, Shanghai University, Shanghai 200444, China
- Yanqiu Dong: School of Communications and Information Engineering, Institute of Smart City, Shanghai University, Shanghai 200444, China
- Wanggen Wan: School of Communications and Information Engineering, Institute of Smart City, Shanghai University, Shanghai 200444, China
- Li Hou: School of Information Engineering, Huangshan University, Huangshan 245041, China
6. Filipi J, Stojnić V, Muštra M, Gillanders RN, Jovanović V, Gajić S, Turnbull GA, Babić Z, Kezić N, Risojević V. Honeybee-based biohybrid system for landmine detection. Sci Total Environ 2022;803:150041. PMID: 34500270; DOI: 10.1016/j.scitotenv.2021.150041. Received 10 Jun 2021; accepted 26 Aug 2021.
Abstract
Legacy landmines in post-conflict areas are a non-discriminatory lethal hazard and can still be triggered decades after a conflict has ended. Efforts to detect these explosive devices are expensive, time-consuming, and dangerous to the humans and animals involved. While methods such as metal detectors and sniffer dogs have been used successfully in humanitarian demining, more tools are required for both site surveying and accurate mine detection. Honeybees have emerged in recent years as efficient bioaccumulation and biomonitoring animals. The system reported here uses two complementary landmine detection methods: passive sampling and active search. Passive sampling aims to confirm the presence of explosive materials in a mine-suspected area by analyzing explosive material brought back to the colony on the bodies of honeybees returning from foraging trips; analysis is performed by light-emitting chemical sensors that detect explosives thermally desorbed from a preconcentrator strip. The active search is intended to pinpoint where individual landmines are most likely to be present. Used together, both methods are anticipated to support an end-to-end process for area surveying, suspected-hazardous-area reduction, and post-clearing internal and external quality control in humanitarian demining.
Affiliation(s)
- Janja Filipi: Department of Ecology, Agronomy and Aquaculture, University of Zadar, Trg Kneza Višeslava 9, 23000 Zadar, Croatia
- Vladan Stojnić: Faculty of Electrical Engineering, University of Banja Luka, Patre 5, 78000 Banja Luka, Bosnia and Herzegovina
- Mario Muštra: Faculty of Transport and Traffic Sciences, University of Zagreb, Vukelićeva 4, 10000 Zagreb, Croatia
- Ross N Gillanders: Organic Semiconductor Centre, SUPA, School of Physics & Astronomy, University of St Andrews, Fife KY16 9SS, Scotland, United Kingdom
- Vedran Jovanović: Faculty of Agriculture, University of Zagreb, Svetošimunska Cesta 25, 10000 Zagreb, Croatia
- Slavica Gajić: Faculty of Electrical Engineering, University of Banja Luka, Patre 5, 78000 Banja Luka, Bosnia and Herzegovina
- Graham A Turnbull: Organic Semiconductor Centre, SUPA, School of Physics & Astronomy, University of St Andrews, Fife KY16 9SS, Scotland, United Kingdom
- Zdenka Babić: Faculty of Electrical Engineering, University of Banja Luka, Patre 5, 78000 Banja Luka, Bosnia and Herzegovina
- Nikola Kezić: Faculty of Agriculture, University of Zagreb, Svetošimunska Cesta 25, 10000 Zagreb, Croatia
- Vladimir Risojević: Faculty of Electrical Engineering, University of Banja Luka, Patre 5, 78000 Banja Luka, Bosnia and Herzegovina
7. Happ C, Sutor A, Hochradel K. Methodology for the Automated Visual Detection of Bird and Bat Collision Fatalities at Onshore Wind Turbines. J Imaging 2021;7:272. PMID: 34940738; PMCID: PMC8704095; DOI: 10.3390/jimaging7120272. Received 2 Oct 2021; accepted 26 Nov 2021.
Abstract
The number of collision fatalities is one of the main quantification measures in research on wind power impacts on birds and bats. Despite being integral to ongoing investigations as well as regulatory approvals, the state-of-the-art method for detecting fatalities remains a manual search by humans or dogs, which is expensive and time-consuming, and whose efficiency varies greatly among studies. We therefore developed a methodology for automatic detection using visual/near-infrared cameras for daytime and thermal cameras for nighttime. The cameras can be installed in the nacelle of wind turbines to monitor the area below. The methodology is centered around software that analyzes the images in real time using pixel-wise and region-based methods. We found that structural similarity is the most important measure for the decision about a detection. Phantom drop tests in an actual wind test field, with the system installed 75 m above the ground, resulted in a sensitivity of 75.6% for nighttime detection and 84.3% for daylight detection. The night camera detected 2.47 false positives per hour using a time window designed for our phantom drop tests; in real applications this window can be extended to eliminate false positives caused by nocturnally active animals, and excluding these from our data reduced the false-positive rate to 0.05. The daylight camera detected 0.20 false positives per hour. Our proposed method has the advantages of being more consistent, more objective, less time-consuming, and less expensive than manual search methods.
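The decision measure the authors single out, structural similarity (SSIM), scores how much a candidate image region differs from a reference; a minimal global-window sketch using the common constants for 8-bit images (real systems evaluate it over sliding windows and tune a detection threshold, neither of which is specified in the abstract):

```python
import numpy as np

def ssim(x, y, c1=(0.01 * 255) ** 2, c2=(0.03 * 255) ** 2):
    """Global structural similarity of two equally sized patches:
    compares luminance (means), contrast (variances), and structure
    (covariance). Near-identical patches score ~1; a change such as
    a carcass appearing in the monitored area lowers the score."""
    x = x.astype(np.float64)
    y = y.astype(np.float64)
    mx, my = x.mean(), y.mean()
    vx, vy = x.var(), y.var()
    cov = ((x - mx) * (y - my)).mean()
    return ((2 * mx * my + c1) * (2 * cov + c2)) / \
           ((mx ** 2 + my ** 2 + c1) * (vx + vy + c2))

# Reference patch vs. a disturbed copy (a bright object appears).
ref = (np.arange(64).reshape(8, 8) * 3).astype(np.uint8)
changed = ref.copy()
changed[:4, :4] = 255
```

A detection would then be declared when the score for a region drops below a tuned threshold.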
8. Zhan W, Sun C, Wang M, She J, Zhang Y, Zhang Z, Sun Y. An improved YOLOv5 real-time detection method for small objects captured by UAV. Soft Comput 2021. DOI: 10.1007/s00500-021-06407-8.
9. Pilipović R, Risojević V, Božič J, Bulić P, Lotrič U. An Approximate GEMM Unit for Energy-Efficient Object Detection. Sensors (Basel) 2021;21:4195. PMID: 34207295; PMCID: PMC8234017; DOI: 10.3390/s21124195. Received 24 May 2021; accepted 15 Jun 2021.
Abstract
Edge computing brings artificial intelligence algorithms and graphics processing units closer to data sources, making autonomy and energy-efficient processing vital for their design. Approximate computing has emerged as a popular strategy for energy-efficient circuit design, where the challenge is to achieve the best tradeoff between design efficiency and accuracy. The essential operation in artificial intelligence algorithms is the general matrix multiplication (GEMM) operation, comprising matrix multiplication and accumulation. This paper presents an approximate general matrix multiplication (AGEMM) unit that employs approximate multipliers to perform matrix-matrix operations on four-by-four matrices given in sixteen-bit signed fixed-point format. Synthesis of the proposed AGEMM unit to the 45 nm Nangate Open Cell Library revealed that it consumed only up to 36% of the area and 25% of the energy required by the exact general matrix multiplication unit. The AGEMM unit is ideally suited to convolutional neural networks, which can adapt to the error induced in the computation. We evaluated the AGEMM units' usability for honeybee detection with the YOLOv4-tiny convolutional neural network. The results implied that the AGEMM units can be deployed in convolutional neural networks without noticeable performance degradation. Moreover, their employment can lead to more area- and energy-efficient convolutional neural network processing, which in turn could prolong the autonomy of sensors and edge nodes.
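The AGEMM idea, approximate multipliers inside an otherwise exact-accumulating 4x4 fixed-point GEMM, can be sketched in software. The truncation-based multiplier below is a generic stand-in for illustration, not the paper's specific hardware multiplier design:

```python
import numpy as np

def approx_mul(a, b, trunc=4):
    """Generic approximate multiplier: zero the low `trunc` bits of
    each 16-bit signed operand before multiplying, trading accuracy
    for a cheaper circuit. trunc=0 recovers the exact product."""
    mask = ~((1 << trunc) - 1)
    return (int(a) & mask) * (int(b) & mask)

def approx_gemm(A, B, C, trunc=4):
    """D = A @ B + C on 4x4 int16 matrices with approximate
    multiplies and exact accumulation, mirroring a MAC-array unit."""
    D = np.zeros((4, 4), dtype=np.int64)
    for i in range(4):
        for j in range(4):
            acc = int(C[i, j])
            for k in range(4):
                acc += approx_mul(int(A[i, k]), int(B[k, j]), trunc)
            D[i, j] = acc
    return D

rng = np.random.default_rng(0)
A = rng.integers(-2048, 2048, (4, 4), dtype=np.int16)
B = rng.integers(-2048, 2048, (4, 4), dtype=np.int16)
C = rng.integers(-2048, 2048, (4, 4), dtype=np.int16)
exact = A.astype(np.int64) @ B.astype(np.int64) + C.astype(np.int64)
```

The per-entry error stays bounded (each of the four products loses at most the low-bit cross terms), which is why an error-tolerant CNN can absorb the approximation.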
Affiliation(s)
- Ratko Pilipović (corresponding author): Faculty of Computer and Information Science, University of Ljubljana, 1000 Ljubljana, Slovenia
- Vladimir Risojević: Faculty of Electrical Engineering, University of Banja Luka, 78000 Banja Luka, Bosnia and Herzegovina
- Janko Božič: Biotechnical Faculty, University of Ljubljana, 1000 Ljubljana, Slovenia
- Patricio Bulić: Faculty of Computer and Information Science, University of Ljubljana, 1000 Ljubljana, Slovenia
- Uroš Lotrič: Faculty of Computer and Information Science, University of Ljubljana, 1000 Ljubljana, Slovenia