1. Cao M, Tang F, Ji P, Ma F. Improved Real-Time Semantic Segmentation Network Model for Crop Vision Navigation Line Detection. Frontiers in Plant Science 2022; 13:898131. PMID: 35720554; PMCID: PMC9201824; DOI: 10.3389/fpls.2022.898131. Received 03/17/2022; accepted 05/10/2022.
Abstract
Field crops are generally planted in rows to improve planting efficiency and facilitate field management, so automatic detection of crop planting rows is of great significance for autonomous navigation and precise spraying in intelligent agricultural machinery, and it is an important part of smart agricultural management. To study visual navigation line extraction for unmanned aerial vehicles (UAVs) in farmland environments and to realize real-time, precise farmland UAV operations, we propose an improved ENet semantic segmentation network model that performs row segmentation of farmland images. To meet the lightweight, low-complexity requirements of crop row detection, the traditional network is compressed and its convolution layers are replaced. Building on the residual network, we designed a shunted network structure in which low-dimensional boundary information from the feature-extraction process is passed backward through the residual stream, allowing efficient extraction of low-dimensional information and significantly improving the accuracy of boundary localization and row-to-row segmentation of farmland crops. Based on the characteristics of the segmented image, an improved random sample consensus (RANSAC) algorithm is proposed to extract the navigation line: a new model-scoring index is defined to find the best point set, and the least-squares method is used to fit the navigation line. The experimental results showed that the proposed algorithm extracts farmland navigation lines accurately and efficiently and has the technical advantages of strong robustness and high applicability. The algorithm can provide technical support for subsequent agricultural UAV flight operations in farmland.
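The final stage described above, a RANSAC-style search for the best-supported point set followed by a least-squares fit, can be sketched generically as follows. The function name, inlier tolerance, and the simple slope-intercept line model are illustrative assumptions; the paper's improved model-scoring index is not reproduced here.

```python
import numpy as np

def fit_navigation_line(points, n_iter=200, inlier_tol=2.0, rng=None):
    """RANSAC-style search for the line best supported by crop-row
    points, refined by a least-squares fit over the winning inlier set.
    `points` is an (N, 2) array of (x, y) pixel coordinates."""
    rng = np.random.default_rng(rng)
    pts = np.asarray(points, dtype=float)
    best_inliers = None
    for _ in range(n_iter):
        # Hypothesise a line through two distinct sample points.
        i, j = rng.choice(len(pts), size=2, replace=False)
        p, q = pts[i], pts[j]
        d = q - p
        norm = np.hypot(d[0], d[1])
        if norm == 0:
            continue
        # Perpendicular distance of every point to the candidate line.
        dist = np.abs(d[0] * (pts[:, 1] - p[1])
                      - d[1] * (pts[:, 0] - p[0])) / norm
        inliers = dist < inlier_tol
        if best_inliers is None or inliers.sum() > best_inliers.sum():
            best_inliers = inliers
    # Least-squares refit (y = a*x + b) over the best point set.
    x, y = pts[best_inliers, 0], pts[best_inliers, 1]
    a, b = np.polyfit(x, y, deg=1)
    return a, b
```

Replacing the plain inlier count with a custom scoring index, as the paper proposes, only changes the comparison inside the loop.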
2. Crop Row Segmentation and Detection in Paddy Fields Based on Treble-Classification Otsu and Double-Dimensional Clustering Method. Remote Sensing 2021. DOI: 10.3390/rs13050901.
Abstract
Visual navigation is developing rapidly and is of great significance for improving agricultural automation. The central issue in visual navigation is extracting a guidance path from agricultural field images. Traditional image segmentation methods may fail in paddy fields, because the colors of weeds, duckweed, and eutrophic water surfaces are very similar to those of real rice seedlings. To deal with these problems, a crop row segmentation and detection algorithm designed for complex paddy fields is proposed. First, the original image is converted to grayscale, and the treble-classification Otsu method classifies the pixels into three clusters according to their gray values. Second, the resulting binary image is divided into several horizontal strips, and feature points representing green plants are extracted. Lastly, the proposed double-dimensional adaptive clustering method, which can handle gaps inside a single crop row and misleading points between real crop rows, is applied to obtain the clusters of real crop rows and the corresponding fitted lines. Quantitative validation tests of efficiency and accuracy showed that the combination of these two methods constitutes a robust integrated solution, with attitude error within 0.02° and distance error within 10 pixels. The proposed method achieved better quantitative results than a detection method based on the typical Otsu under various conditions.
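The treble-classification Otsu step above can be sketched as a brute-force search for the two thresholds that maximise the between-class variance, a direct extension of the two-class Otsu criterion. This is a generic three-class Otsu sketch, not the paper's implementation:

```python
import numpy as np

def treble_otsu(gray):
    """Find the threshold pair (t1 < t2) that splits an 8-bit grayscale
    image into three clusters by maximising the between-class variance.
    Since the total mean is fixed, maximising sum(w_k * mu_k^2) over the
    three classes is equivalent. Returns (t1, t2)."""
    hist = np.bincount(np.asarray(gray).ravel(), minlength=256).astype(float)
    p = hist / hist.sum()                                        # gray-level probabilities
    cp = np.concatenate(([0.0], np.cumsum(p)))                   # prefix class weights
    cm = np.concatenate(([0.0], np.cumsum(np.arange(256) * p)))  # prefix first moments
    best, best_t = -1.0, (0, 0)
    for t1 in range(1, 255):
        for t2 in range(t1 + 1, 256):
            var = 0.0
            # Classes are [0, t1), [t1, t2), [t2, 256).
            for lo, hi in ((0, t1), (t1, t2), (t2, 256)):
                w = cp[hi] - cp[lo]              # class weight
                if w > 0:
                    mu = (cm[hi] - cm[lo]) / w   # class mean
                    var += w * mu * mu           # between-class term (+ const)
            if var > best:
                best, best_t = var, (t1, t2)
    return best_t
```

The prefix sums keep the exhaustive 255×255 threshold search cheap; the clusters themselves are then obtained by comparing each pixel against the returned pair.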
3. Zhang S, Guo J, Wang Z. Combing K-means Clustering and Local Weighted Maximum Discriminant Projections for Weed Species Recognition. Frontiers in Computer Science 2019. DOI: 10.3389/fcomp.2019.00004. Open access.
4. Richard K, Abdel-Rahman EM, Subramanian S, Nyasani JO, Thiel M, Jozani H, Borgemeister C, Landmann T. Maize Cropping Systems Mapping Using RapidEye Observations in Agro-Ecological Landscapes in Kenya. Sensors 2017; 17:2537. PMID: 29099780; PMCID: PMC5713137; DOI: 10.3390/s17112537. Received 08/29/2017; revised 09/27/2017; accepted 10/20/2017.
Abstract
Cropping systems information at explicit scales is an important but rarely available variable in many crop modeling routines and is of utmost importance for understanding pest and disease propagation mechanisms in agro-ecological landscapes. In this study, high spatial and temporal resolution RapidEye bi-temporal data were used within a novel 2-step hierarchical random forest (RF) classification approach to map areas of mono- and mixed-maize cropping systems. A small-scale maize farming site in Machakos County, Kenya served as the study site, where field data on general land use/land cover (LULC) and the two cropping systems were collected during the satellite acquisition period. First, a LULC map was produced and non-cropland areas were masked out using this result (1st classification step). Subsequently, an optimized RF model was applied to the cropland layer to map the two cropping systems (2nd classification step). An overall accuracy of 93% was attained for the LULC classification, while the class accuracies (PA: producer's accuracy; UA: user's accuracy) for the two cropping systems were consistently above 85%. We conclude that explicit mapping of different cropping systems is feasible in complex and highly fragmented agro-ecological landscapes if high-resolution, multi-temporal satellite data such as 5 m RapidEye imagery are employed. Further research is needed on the feasibility of using freely available 10–20 m Sentinel-2 data for wide-area assessment of cropping systems as an important variable in crop productivity models.
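The 2-step hierarchical classification above can be sketched generically with scikit-learn random forests. The one-band synthetic data, class means, and label encodings (-1 = non-cropland, 0 = mono, 1 = mixed) are illustrative assumptions, not the paper's RapidEye features:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def two_step_classify(X, lulc_model, crop_model, cropland_label=1):
    """Hierarchical (2-step) classification sketch: step 1 separates
    cropland from the other LULC classes; step 2 runs only on the
    pixels labelled cropland to distinguish the two cropping systems.
    Non-cropland pixels are returned as -1."""
    lulc = lulc_model.predict(X)                 # step 1: LULC map
    out = np.full(len(X), -1)
    mask = lulc == cropland_label                # cropland layer
    if mask.any():
        out[mask] = crop_model.predict(X[mask])  # step 2: cropping system
    return out

# Tiny synthetic demo: one spectral feature, three well-separated
# classes (non-crop ~0, mono-maize ~5, mixed-maize ~10).
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(m, 0.1, (50, 1)) for m in (0.0, 5.0, 10.0)])
lulc_y = np.array([0] * 50 + [1] * 100)          # 0 = non-crop, 1 = cropland
crop_y = np.array([0] * 50 + [1] * 50)           # 0 = mono, 1 = mixed
lulc_rf = RandomForestClassifier(random_state=0).fit(X, lulc_y)
crop_rf = RandomForestClassifier(random_state=0).fit(X[50:], crop_y)
pred = two_step_classify(X, lulc_rf, crop_rf)
```

Training the second model only on cropland samples mirrors the paper's design: the cropping-system classifier never has to account for non-crop spectra.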
Affiliation(s)
- Kyalo Richard
- International Center for Insect Physiology and Ecology (ICIPE), P.O. Box 30772, 00100 Nairobi, Kenya.
- Elfatih M Abdel-Rahman
- International Center for Insect Physiology and Ecology (ICIPE), P.O. Box 30772, 00100 Nairobi, Kenya.
- Department of Agronomy, Faculty of Agriculture, University of Khartoum, Khartoum North 13314, Sudan.
- Sevgan Subramanian
- International Center for Insect Physiology and Ecology (ICIPE), P.O. Box 30772, 00100 Nairobi, Kenya.
- Johnson O Nyasani
- International Center for Insect Physiology and Ecology (ICIPE), P.O. Box 30772, 00100 Nairobi, Kenya.
- Crop Health Unit, Kenya Agricultural and Livestock Research Organization, Embu Research Centre, P.O. Box 27, 60100 Embu, Kenya.
- Michael Thiel
- Department of Remote Sensing, University of Würzburg, Oswald-Külpe-Weg 86, 97074 Würzburg, Germany.
- Hosein Jozani
- Department of Remote Sensing, University of Würzburg, Oswald-Külpe-Weg 86, 97074 Würzburg, Germany.
- Christian Borgemeister
- Center for Development Research (ZEF), Department of Ecology and Natural Resources Management, University of Bonn, Walter-Flex-Str. 3, 53113 Bonn, Germany.
- Tobias Landmann
- International Center for Insect Physiology and Ecology (ICIPE), P.O. Box 30772, 00100 Nairobi, Kenya.
5. Assessing Lodging Severity over an Experimental Maize (Zea mays L.) Field Using UAS Images. Remote Sensing 2017. DOI: 10.3390/rs9090923.
6.

7.

8. Bengochea-Guevara JM, Conesa-Muñoz J, Andújar D, Ribeiro A. Merge Fuzzy Visual Servoing and GPS-Based Planning to Obtain a Proper Navigation Behavior for a Small Crop-Inspection Robot. Sensors 2016; 16:276. PMID: 26927102; PMCID: PMC4813851; DOI: 10.3390/s16030276. Received 12/21/2015; revised 02/11/2016; accepted 02/19/2016.
Abstract
The concept of precision agriculture, which proposes farm management adapted to crop variability, has emerged in recent years. To implement precision agriculture effectively, data must be gathered from the field in an automated manner at minimal cost. In this study, a small autonomous field-inspection vehicle was developed to minimise the impact of scouting on the crop and to limit soil compaction. The proposed approach integrates a camera with a GPS receiver to obtain the set of basic behaviours an autonomous mobile robot requires to inspect a crop field with full coverage. A path planner considers the field contour and the crop type to determine the best inspection route. An image-processing method was developed that extracts the central crop row in real time, under uncontrolled lighting conditions, from images acquired with a reflex camera mounted on the front of the robot. Two fuzzy controllers were also designed and developed to achieve vision-guided navigation, together with a method for detecting the end of a crop row from camera-acquired images. In addition, the manoeuvres necessary for the robot to change rows were established. These manoeuvres enable the robot to autonomously cover the entire crop by following a previously established plan without stepping on the crop rows, an essential behaviour for covering crops such as maize without damaging them.
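A minimal fuzzy steering rule of the vision-guided kind described above can be sketched as follows. The triangular membership functions, the three-rule base, and the Sugeno-style weighted-average defuzzification are illustrative assumptions, not the paper's two controllers:

```python
def tri(x, a, b, c):
    """Triangular membership function rising on [a, b], falling on [b, c]."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def fuzzy_steer(offset):
    """Steering command in [-1, 1] from the lateral offset of the
    detected central crop row, with offset normalised to [-1, 1]
    (negative = row appears left of the image centre).  Three rules
    with singleton consequents, defuzzified by a weighted average."""
    # Rule firing strengths.
    left = tri(offset, -2.0, -1.0, 0.0)     # row well to the left
    center = tri(offset, -1.0, 0.0, 1.0)    # row roughly centred
    right = tri(offset, 0.0, 1.0, 2.0)      # row well to the right
    # Consequents: steer toward the row (-1 full left, 0 straight, +1 full right).
    num = -1.0 * left + 0.0 * center + 1.0 * right
    den = left + center + right
    return num / den if den else 0.0
```

The overlapping memberships give a smooth response: small offsets produce proportionally small corrections, while saturated offsets command a full turn.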
Affiliation(s)
- Jesus Conesa-Muñoz
- Center for Automation and Robotics, CSIC-UPM, Arganda del Rey, Madrid 28500, Spain.
- Dionisio Andújar
- Center for Automation and Robotics, CSIC-UPM, Arganda del Rey, Madrid 28500, Spain.
- Angela Ribeiro
- Center for Automation and Robotics, CSIC-UPM, Arganda del Rey, Madrid 28500, Spain.
9. Adaptive bacteria colony picking in unstructured environments using intensity histogram and unascertained LS-SVM classifier. The Scientific World Journal 2014; 2014:928395. PMID: 24955423; PMCID: PMC4052681; DOI: 10.1155/2014/928395. Received 03/13/2014; accepted 04/10/2014. Open access.
Abstract
Feature analysis is an important task that can significantly affect the performance of automatic bacteria colony picking, and unstructured environments further complicate automatic colony screening. This paper presents a novel approach for adaptive colony segmentation in unstructured environments that treats the detected peaks of intensity histograms as a morphological feature of the images. To avoid spurious peaks, an entropy-based mean-shift filter is introduced to smooth the images as a preprocessing step. The relevance and importance of these features are then determined in an improved support vector machine classifier using unascertained least-squares estimation. Experimental results show that the proposed unascertained least-squares support vector machine (ULSSVM) achieves better recognition accuracy than the other state-of-the-art techniques, and its training takes less time than most of the traditional approaches compared in this paper.
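Treating histogram peaks as a morphological image feature, as described above, can be sketched generically. The local-maximum rule, the fixed-length feature layout, and the function name are illustrative assumptions; in the paper these features would feed the unascertained LS-SVM classifier, for which any standard SVM could stand in:

```python
import numpy as np

def histogram_peak_features(gray, n_peaks=3):
    """Describe an image (or patch) by the gray levels and heights of
    the strongest local peaks of its intensity histogram, zero-padded
    to a fixed-length vector suitable for a classifier."""
    hist = np.bincount(np.asarray(gray).ravel(), minlength=256).astype(float)
    # Local maxima: strictly above the left neighbour, at least as
    # high as the right one (interior gray levels only).
    peaks = [i for i in range(1, 255)
             if hist[i] > hist[i - 1] and hist[i] >= hist[i + 1]]
    peaks.sort(key=lambda i: hist[i], reverse=True)
    feat = np.zeros(2 * n_peaks)
    for k, i in enumerate(peaks[:n_peaks]):
        feat[2 * k] = i               # peak gray level
        feat[2 * k + 1] = hist[i]     # peak height (pixel count)
    return feat
```

Smoothing the histogram first, as the paper does with its entropy-based mean-shift filter, would suppress the small local maxima this simple rule still reports.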
10. Romeo J, Guerrero JM, Montalvo M, Emmi L, Guijarro M, Gonzalez-de-Santos P, Pajares G. Camera sensor arrangement for crop/weed detection accuracy in agronomic images. Sensors 2013; 13:4348-66. PMID: 23549361; PMCID: PMC3673087; DOI: 10.3390/s130404348. Received 01/28/2013; revised 03/25/2013; accepted 03/27/2013.
Abstract
In Precision Agriculture, images coming from camera-based sensors are commonly used for weed identification and crop line detection, either to apply site-specific treatments or for vehicle guidance. The accuracy of identification and detection is an important issue to address in image processing. Two main types of parameters affect image accuracy: (a) extrinsic parameters, related to the sensor's positioning on the tractor; and (b) intrinsic parameters, related to the sensor specifications, such as CCD resolution, focal length, or iris aperture. Moreover, in agricultural applications, the uncontrolled illumination of outdoor environments is also an important factor affecting image accuracy. This paper focuses on two issues, always with the goal of achieving the highest image accuracy in Precision Agriculture applications, and makes two main contributions: (a) a camera sensor arrangement to adjust the extrinsic parameters, and (b) the design of strategies for controlling adverse illumination effects.
Affiliation(s)
- Juan Romeo
- Department of Software Engineering and Artificial Intelligence, Faculty of Informatics, Complutense University, Madrid 28040, Spain.
- Authors to whom correspondence should be addressed; Tel.: +34-1-394-7546 (G.P.); Fax: +34-1-394-7547 (G.P.).
- José Miguel Guerrero
- Department of Software Engineering and Artificial Intelligence, Faculty of Informatics, Complutense University, Madrid 28040, Spain.
- Martín Montalvo
- Department of Computer Architecture and Automatic Control, Faculty of Informatics, Complutense University, Madrid 28040, Spain.
- Luis Emmi
- Centre for Automation and Robotics (UPM-CSIC), Arganda del Rey 28500, Madrid, Spain.
- María Guijarro
- Department of Software Engineering and Artificial Intelligence, Faculty of Informatics, Complutense University, Madrid 28040, Spain.
- Pablo Gonzalez-de-Santos
- Centre for Automation and Robotics (UPM-CSIC), Arganda del Rey 28500, Madrid, Spain.
- Gonzalo Pajares
- Department of Software Engineering and Artificial Intelligence, Faculty of Informatics, Complutense University, Madrid 28040, Spain.
- Authors to whom correspondence should be addressed; Tel.: +34-1-394-7546 (G.P.); Fax: +34-1-394-7547 (G.P.).