1
Harandi N, Vandenberghe B, Vankerschaver J, Depuydt S, Van Messem A. How to make sense of 3D representations for plant phenotyping: a compendium of processing and analysis techniques. Plant Methods 2023; 19:60. [PMID: 37353846] [DOI: 10.1186/s13007-023-01031-z] [Received: 10/18/2022] [Accepted: 05/19/2023] [Indexed: 06/25/2023]
Abstract
Computer vision technology is moving more and more towards a three-dimensional approach, and plant phenotyping is following this trend. However, despite its potential, the complexity of the analysis of 3D representations has been the main bottleneck hindering the wider deployment of 3D plant phenotyping. In this review we provide an overview of typical steps for the processing and analysis of 3D representations of plants, to offer potential users of 3D phenotyping a first gateway into its application, and to stimulate its further development. We focus on plant phenotyping applications where the goal is to measure characteristics of single plants or crop canopies on a small scale in research settings, as opposed to large scale crop monitoring in the field.
Affiliation(s)
- Negin Harandi
- Center for Biosystems and Biotech Data Science, Ghent University Global Campus, 119 Songdomunhwa-ro, Yeonsu-gu, Incheon, South Korea
- Department of Applied Mathematics, Computer Science and Statistics, Ghent University, Krijgslaan 281, S9, Ghent, Belgium
- Joris Vankerschaver
- Center for Biosystems and Biotech Data Science, Ghent University Global Campus, 119 Songdomunhwa-ro, Yeonsu-gu, Incheon, South Korea
- Department of Applied Mathematics, Computer Science and Statistics, Ghent University, Krijgslaan 281, S9, Ghent, Belgium
- Stephen Depuydt
- Erasmus Applied University of Sciences and Arts, Campus Kaai, Nijverheidskaai 170, Anderlecht, Belgium
- Arnout Van Messem
- Department of Mathematics, Université de Liège, Allée de la Découverte 12, Liège, Belgium.
2
Ye D, Wu L, Li X, Atoba TO, Wu W, Weng H. A Synthetic Review of Various Dimensions of Non-Destructive Plant Stress Phenotyping. Plants (Basel, Switzerland) 2023; 12:1698. [PMID: 37111921] [PMCID: PMC10146287] [DOI: 10.3390/plants12081698] [Received: 03/20/2023] [Revised: 04/08/2023] [Accepted: 04/16/2023] [Indexed: 06/19/2023]
Abstract
Non-destructive plant stress phenotyping began with traditional one-dimensional (1D) spectroscopy, followed by two-dimensional (2D) imaging, three-dimensional (3D), and even temporal-three-dimensional (T-3D), spectral-three-dimensional (S-3D), and temporal-spectral-three-dimensional (TS-3D) phenotyping, all of which aim to observe subtle changes in plants under stress. However, a comprehensive review covering all these dimensional types of phenotyping, ordered spatially from 1D to 3D and extended with the temporal and spectral dimensions, is lacking. In this review, we look back at the development of data-acquiring techniques for the various dimensions of plant stress phenotyping (1D spectroscopy, 2D imaging, 3D phenotyping) and their corresponding data-analysis pipelines (mathematical analysis, machine learning, or deep learning), and look forward to the trends and challenges of high-performance multi-dimensional (integrated spatial, temporal, and spectral) phenotyping. We hope this article can serve as a reference for implementing the various dimensions of non-destructive plant stress phenotyping.
Affiliation(s)
- Dapeng Ye
- College of Mechanical and Electrical Engineering, Fujian Agriculture and Forestry University, Fuzhou 350002, China
- Fujian Key Laboratory of Agricultural Information Sensing Technology, College of Mechanical and Electrical Engineering, Fujian Agriculture and Forestry University, Fuzhou 350002, China
- Libin Wu
- College of Mechanical and Electrical Engineering, Fujian Agriculture and Forestry University, Fuzhou 350002, China
- Fujian Key Laboratory of Agricultural Information Sensing Technology, College of Mechanical and Electrical Engineering, Fujian Agriculture and Forestry University, Fuzhou 350002, China
- Xiaobin Li
- College of Mechanical and Electrical Engineering, Fujian Agriculture and Forestry University, Fuzhou 350002, China
- Fujian Key Laboratory of Agricultural Information Sensing Technology, College of Mechanical and Electrical Engineering, Fujian Agriculture and Forestry University, Fuzhou 350002, China
- Tolulope Opeyemi Atoba
- College of Mechanical and Electrical Engineering, Fujian Agriculture and Forestry University, Fuzhou 350002, China
- Fujian Key Laboratory of Agricultural Information Sensing Technology, College of Mechanical and Electrical Engineering, Fujian Agriculture and Forestry University, Fuzhou 350002, China
- Wenhao Wu
- College of Mechanical and Electrical Engineering, Fujian Agriculture and Forestry University, Fuzhou 350002, China
- Fujian Key Laboratory of Agricultural Information Sensing Technology, College of Mechanical and Electrical Engineering, Fujian Agriculture and Forestry University, Fuzhou 350002, China
- Haiyong Weng
- College of Mechanical and Electrical Engineering, Fujian Agriculture and Forestry University, Fuzhou 350002, China
- Fujian Key Laboratory of Agricultural Information Sensing Technology, College of Mechanical and Electrical Engineering, Fujian Agriculture and Forestry University, Fuzhou 350002, China
3
Xu R, Li C. A Review of High-Throughput Field Phenotyping Systems: Focusing on Ground Robots. Plant Phenomics 2022; 2022:9760269. [PMID: 36059604] [PMCID: PMC9394113] [DOI: 10.34133/2022/9760269] [Received: 12/22/2021] [Accepted: 04/25/2022] [Indexed: 12/03/2022]
Abstract
Manual assessment of plant phenotypes in the field can be labor-intensive and inefficient. High-throughput field phenotyping systems, and in particular robotic systems, play an important role in automating data collection and in measuring novel, fine-scale phenotypic traits that were previously unattainable by humans. The main goal of this paper is to review the state of the art of high-throughput field phenotyping systems, with a focus on autonomous ground robotic systems. The paper first provides a brief review of nonautonomous ground phenotyping systems, including tractors, manually pushed or motorized carts, gantries, and cable-driven systems. It then reviews autonomous ground phenotyping robots in detail with regard to their main components, including mobile platforms, sensors, manipulators, computing units, and software. It also reviews the navigation algorithms and simulation tools developed for phenotyping robots, and the applications of phenotyping robots in measuring plant phenotypic traits and collecting phenotyping datasets. The review closes with a discussion of current major challenges and future research directions.
Affiliation(s)
- Rui Xu
- Bio-Sensing and Instrumentation Laboratory, College of Engineering, The University of Georgia, Athens, USA
- Changying Li
- Bio-Sensing and Instrumentation Laboratory, College of Engineering, The University of Georgia, Athens, USA
- Phenomics and Plant Robotics Center, The University of Georgia, Athens, USA
4
Barbedo JGA. Data Fusion in Agriculture: Resolving Ambiguities and Closing Data Gaps. Sensors (Basel, Switzerland) 2022; 22:2285. [PMID: 35336456] [PMCID: PMC8952279] [DOI: 10.3390/s22062285] [Received: 02/14/2022] [Revised: 03/09/2022] [Accepted: 03/14/2022] [Indexed: 06/14/2023]
Abstract
Acquiring useful data from agricultural areas has always been somewhat of a challenge, as these are often expansive, remote, and vulnerable to weather events. Despite these challenges, as technologies evolve and prices drop, a surge of new data are being collected. Although a wealth of data are being collected at different scales (i.e., proximal, aerial, satellite, ancillary data), this has been geographically unequal, causing certain areas to be virtually devoid of useful data to help face their specific challenges. However, even in areas with available resources and good infrastructure, data and knowledge gaps are still prevalent, because agricultural environments are mostly uncontrolled and there are vast numbers of factors that need to be taken into account and properly measured for a full characterization of a given area. As a result, data from a single sensor type are frequently unable to provide unambiguous answers, even with very effective algorithms, and even if the problem at hand is well defined and limited in scope. Fusing information from different sensors and from different data types is one possible solution that has been explored for some decades. The idea behind data fusion involves exploring complementarities and synergies of different kinds of data in order to extract more reliable and useful information about the areas being analyzed. While some success has been achieved, there are still many challenges that prevent a more widespread adoption of this type of approach. This is particularly true for the highly complex environments found in agricultural areas. In this article, we provide a comprehensive overview of data fusion applied to agricultural problems: we present the main successes, highlight the main challenges that remain, and suggest possible directions for future research.
5
Enhancing the Tracking of Seedling Growth Using RGB-Depth Fusion and Deep Learning. Sensors 2021; 21:s21248425. [PMID: 34960519] [PMCID: PMC8708901] [DOI: 10.3390/s21248425] [Received: 10/28/2021] [Revised: 12/07/2021] [Accepted: 12/14/2021] [Indexed: 11/16/2022]
Abstract
Using high-throughput phenotyping with imaging and machine learning to monitor seedling growth is a challenging yet intriguing subject in plant research. It has recently been addressed with low-cost RGB imaging sensors and deep learning during the daytime. RGB-Depth imaging devices are also accessible at low cost, which opens opportunities to extend the monitoring of seedlings to both day and night. In this article, we investigate the added value of fusing RGB imaging with depth imaging for the task of seedling growth stage monitoring. We propose a deep learning architecture, along with RGB-Depth fusion, to categorize the first three stages of seedling growth. Results show an average improvement of 5% in correct recognition rate compared with the sole use of RGB images during the day. The best performance is obtained with early fusion of RGB and depth. Depth is also shown to enable the detection of growth stage in the absence of light.
6
da Silva DQ, dos Santos FN, Sousa AJ, Filipe V. Visible and Thermal Image-Based Trunk Detection with Deep Learning for Forestry Mobile Robotics. J Imaging 2021; 7:176. [PMID: 34564102] [PMCID: PMC8468268] [DOI: 10.3390/jimaging7090176] [Received: 07/16/2021] [Revised: 08/20/2021] [Accepted: 08/28/2021] [Indexed: 11/18/2022]
Abstract
Mobile robotics in forests is currently a highly important topic due to the recurrence of forest wildfires, which makes on-site management of forest inventory and biomass necessary. To tackle this issue, this work presents a study on ground-level detection of forest tree trunks in visible and thermal images using deep learning-based object detection methods. For this purpose, a forestry dataset composed of 2895 images was built and made publicly available. Using this dataset, five models were trained and benchmarked for tree trunk detection: SSD MobileNetV2, SSD Inception-v2, SSD ResNet50, SSDLite MobileDet, and YOLOv4 Tiny. Promising results were obtained; YOLOv4 Tiny achieved the highest AP (90%) and F1 score (89%). Inference time was also evaluated for these models on CPU and GPU, and YOLOv4 Tiny was the fastest detector running on GPU (8 ms). This work will enhance the development of vision perception systems for smarter forestry robots.
Affiliation(s)
- Daniel Queirós da Silva
- INESC Technology and Science (INESC TEC), 4200-465 Porto, Portugal
- School of Science and Technology, University of Trás-os-Montes e Alto Douro (UTAD), 5000-801 Vila Real, Portugal
- Filipe Neves dos Santos
- INESC Technology and Science (INESC TEC), 4200-465 Porto, Portugal
- Armando Jorge Sousa
- INESC Technology and Science (INESC TEC), 4200-465 Porto, Portugal
- Faculty of Engineering, University of Porto (FEUP), 4200-465 Porto, Portugal
- Vítor Filipe
- INESC Technology and Science (INESC TEC), 4200-465 Porto, Portugal
- School of Science and Technology, University of Trás-os-Montes e Alto Douro (UTAD), 5000-801 Vila Real, Portugal
7
Skoczeń M, Ochman M, Spyra K, Nikodem M, Krata D, Panek M, Pawłowski A. Obstacle Detection System for Agricultural Mobile Robot Application Using RGB-D Cameras. Sensors 2021; 21:s21165292. [PMID: 34450732] [PMCID: PMC8399919] [DOI: 10.3390/s21165292] [Received: 06/30/2021] [Revised: 07/30/2021] [Accepted: 07/30/2021] [Indexed: 11/16/2022]
Abstract
Mobile robots designed for agricultural tasks need to deal with challenging outdoor unstructured environments that usually contain both dynamic and static obstacles. This assumption significantly limits the number of mapping, path planning, and navigation algorithms that can be used in this application. As a representative case, the autonomous lawn mowing robot considered in this work is required to determine the working area and to detect obstacles simultaneously, which is a key feature for its working efficiency and safety. In this context, RGB-D cameras are the optimal solution, providing a scene image including depth data with a compromise between precision and sensor cost. For this reason, the obstacle detection effectiveness and precision depend significantly on the sensors used, and the information processing approach has an impact on the avoidance performance. The study presented in this work aims to determine the obstacle mapping accuracy considering both hardware- and information processing-related uncertainties. The proposed evaluation is based on artificial and real data to compute the accuracy-related performance metrics. The results show that the proposed image and depth data processing pipeline introduces an additional distortion of 38 cm.
Collapse
Affiliation(s)
- Magda Skoczeń
- Unitem, ul. Kominiarska 42C, 51-180 Wrocław, Poland
- Faculty of Electronics, Wrocław University of Science and Technology, Wybrzeże Wyspiańskiego 27, 50-370 Wrocław, Poland
- Correspondence:
- Marcin Ochman
- Unitem, ul. Kominiarska 42C, 51-180 Wrocław, Poland
- Faculty of Electronics, Wrocław University of Science and Technology, Wybrzeże Wyspiańskiego 27, 50-370 Wrocław, Poland
- Krystian Spyra
- Unitem, ul. Kominiarska 42C, 51-180 Wrocław, Poland
- Maciej Nikodem
- Unitem, ul. Kominiarska 42C, 51-180 Wrocław, Poland
- Faculty of Electronics, Wrocław University of Science and Technology, Wybrzeże Wyspiańskiego 27, 50-370 Wrocław, Poland
- Damian Krata
- Unitem, ul. Kominiarska 42C, 51-180 Wrocław, Poland
- Marcin Panek
- Unitem, ul. Kominiarska 42C, 51-180 Wrocław, Poland
- Andrzej Pawłowski
- Unitem, ul. Kominiarska 42C, 51-180 Wrocław, Poland