1. Moving to Automated Tree Inventory: Comparison of UAS-Derived Lidar and Photogrammetric Data with Manual Ground Estimates. Remote Sensing 2020. DOI: 10.3390/rs13010072
Abstract
Unmanned aircraft systems (UAS) have advanced rapidly, enabling low-cost capture of high-resolution camera imagery from which three-dimensional photogrammetric point clouds can be derived. More recently, UAS equipped with laser scanners (lidar) have been employed to create similar 3D datasets. While airborne lidar from conventional aircraft has been used effectively in forest systems for many years, UAS lidar now makes it feasible to obtain important features such as height, diameter at breast height, and crown dimensions for individual trees at reasonable cost. Individual-tree resolution is crucial for detailed phenotyping and genetic analyses. This study evaluates the quality of 3D datasets from three sensors (two cameras of different quality and one lidar sensor) collected over a managed, closed-canopy pine stand with different planting densities; for reference, a ground-based timber cruise of the same stand was also collected. Three straightforward experiments then assessed the suitability of each dataset for automated forest inventory: manual mensuration of the point clouds to (1) detect trees and (2) measure tree heights, and (3) automated individual tree detection. The results demonstrate that both photogrammetric and lidar data are well-suited to single-tree forest inventory: the photogrammetric data from the higher-quality camera are sufficient for individual tree detection and height determination, but the lidar data perform best. The automated tree detection algorithm performed well with the lidar data, detecting 98% of the 2199 trees in the pine stand, though it fell short of manual mensuration within the lidar point cloud, which detected 100% of the trees. The manually mensurated heights in the lidar dataset correlated with field measurements at r = 0.95 with a bias of −0.25 m; the photogrammetric datasets were again less accurate and precise.
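The agreement statistics quoted here (Pearson r and mean bias) reduce to two one-line computations. A minimal sketch with NumPy, using made-up height pairs (not the study's data) chosen so the bias comes out at −0.25 m:

```python
import numpy as np

# Hypothetical tree heights (m): field-measured vs. manually mensurated in the
# lidar point cloud. Values are illustrative, not taken from the study.
field = np.array([18.2, 19.5, 17.8, 20.1, 18.9, 21.0])
lidar = np.array([17.9, 19.3, 17.6, 19.8, 18.7, 20.7])

r = np.corrcoef(field, lidar)[0, 1]   # Pearson correlation coefficient
bias = np.mean(lidar - field)         # mean bias (negative = underestimate)
```

With these values, `bias` evaluates to −0.25 m; a negative bias means the point-cloud heights sit slightly below the field measurements, as reported above.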
2. An Assessment of High-Density UAV Point Clouds for the Measurement of Young Forestry Trials. Remote Sensing 2020. DOI: 10.3390/rs12244039
Abstract
The measurement of forestry trials is a costly and time-consuming process. Over the past few years, unmanned aerial vehicles (UAVs) have seen significant developments that could improve cost and time efficiency. However, little research has examined the accuracy of these technologies for measuring young trees. This study compared data captured by a UAV laser scanning system (ULS) and UAV structure from motion photogrammetry (SfM) with traditional field-measured heights in a series of forestry trials in the central North Island of New Zealand. Data were captured from UAVs and processed into point clouds, from which heights were derived and compared to field measurements. The results show that predictions from both ULS and SfM were very strongly correlated with tree heights (R2 = 0.99, RMSE = 5.91%, and R2 = 0.94, RMSE = 18.5%, respectively), but that height underprediction was markedly lower for ULS than for SfM (Mean Bias Error = 0.05 vs. 0.38 m). Integrating a ULS DTM into the SfM workflow made a minor improvement in precision (R2 = 0.95, RMSE = 16.5%). By plotting error against tree height, we identified a minimum height threshold of 1 m, below which the accuracy of height measurements using ULS and SfM declines significantly. Our results show that SfM and ULS data collected by UAV remote sensing can be used to accurately measure height in young forestry trials. It is hoped that this study will give foresters and tree breeders the confidence to start operationalising this technology for monitoring trials.
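The error metrics reported here, RMSE expressed as a percentage of the mean field height and Mean Bias Error (positive = underprediction), can be sketched as follows; the young-tree heights are illustrative values, not the trial data:

```python
import numpy as np

# Hypothetical young-tree heights (m): field-measured vs. UAV-derived.
field = np.array([2.0, 3.1, 1.8, 2.6, 3.4])
uav   = np.array([1.9, 3.0, 1.7, 2.5, 3.3])

rmse = np.sqrt(np.mean((uav - field) ** 2))   # root mean square error (m)
rmse_pct = 100 * rmse / np.mean(field)        # RMSE as % of mean field height
mbe = np.mean(field - uav)                    # positive value = underprediction
```

Expressing RMSE as a percentage of the mean, as the abstract does, makes errors comparable between trials with trees of different sizes.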
3. Aboutalebi M, Torres-Rua AF, McKee M, Kustas WP, Nieto H, Alsina MM, White A, Prueger JH, McKee L, Alfieri J, Hipps L, Coopmans C, Dokoozlian N. Incorporation of Unmanned Aerial Vehicle (UAV) Point Cloud Products into Remote Sensing Evapotranspiration Models. Remote Sensing 2020; 12:50. PMID: 32355570; PMCID: PMC7192004; DOI: 10.3390/rs12010050
Abstract
In recent years, the deployment of satellites and unmanned aerial vehicles (UAVs) has produced enormous amounts of data and novel data processing and analysis techniques for monitoring crop conditions. One overlooked opportunity amid these efforts, however, is the incorporation of 3D information derived from multi-spectral imagery and photogrammetry algorithms into crop monitoring algorithms; few studies have taken advantage of 3D UAV information in monitoring and assessing plant conditions. This study presents different uses of UAV point cloud information for enhancing remote sensing evapotranspiration (ET) models, particularly the Two-Source Energy Balance (TSEB) model, over a commercial vineyard in California. Toward this end, an algorithm called the Vegetation Structural-Spectral Information eXtraction Algorithm (VSSIXA) was developed. It accurately estimates the height, volume, surface area, and projected surface area of the plant canopy solely from point cloud information; beyond this biomass information, it can attach multi-spectral UAV data to the point cloud, yielding spectral-structural canopy properties. The biomass information is used to assess its relationship with in situ Leaf Area Index (LAI), a crucial input for ET models. In addition, instead of nominal field values of plant parameters, spatial information on fractional cover, canopy height, and canopy width is input to the TSEB model. The two main objectives of incorporating point cloud information into remote sensing ET models in this study are therefore to (1) evaluate the possible improvement in estimating LAI and biomass parameters from point cloud information in order to create robust LAI maps at the model resolution, and (2) assess the sensitivity of the TSEB model to average/nominal values versus spatially distributed canopy fractional cover, height, and width derived from point cloud data. The algorithm is tested on imagery from the Utah State University AggieAir sUAS Program, collected since 2014 over multiple California vineyards as part of the ARS-USDA GRAPEX Project (Grape Remote sensing Atmospheric Profile and Evapotranspiration eXperiment). The results indicate a robust relationship between in situ LAI measurements and biomass parameters estimated from the point cloud data, and improved agreement between TSEB model output of ET and tower measurements when LAI and spatially distributed canopy structure parameters derived from the point cloud are employed.
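Gridding a height-normalized point cloud into per-cell canopy height and fractional cover, the kind of spatially distributed inputs a model like TSEB consumes, can be sketched as below. The random points, the 5 m grid, the 95th-percentile height statistic, and the 0.5 m canopy threshold are all illustrative assumptions, not VSSIXA's actual settings:

```python
import numpy as np

# Synthetic height-normalized point cloud: columns are x (m), y (m), and
# z = height above ground (m). Real data would come from the UAV survey.
rng = np.random.default_rng(42)
n = 2000
pts = np.column_stack([
    rng.uniform(0, 10, n),
    rng.uniform(0, 10, n),
    rng.uniform(0, 2.5, n),
])

cell = 5.0  # grid cell size (m); assumed, not the study's resolution
keys = (pts[:, 0] // cell).astype(int) * 100 + (pts[:, 1] // cell).astype(int)

canopy_height = {}  # per-cell robust canopy height (95th percentile of z)
frac_cover = {}     # per-cell fraction of returns above the canopy threshold
for k in np.unique(keys):
    z = pts[keys == k, 2]
    canopy_height[k] = float(np.percentile(z, 95))
    frac_cover[k] = float(np.mean(z > 0.5))
```

Each grid cell then carries its own canopy height and fractional cover, replacing the single nominal values the abstract contrasts against.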
Affiliation(s)
- Mahyar Aboutalebi
- Department of Civil and Environmental Engineering, Utah State University, Logan, UT 84322, USA
- Alfonso F. Torres-Rua
- Department of Civil and Environmental Engineering, Utah State University, Logan, UT 84322, USA
- Mac McKee
- Department of Civil and Environmental Engineering, Utah State University, Logan, UT 84322, USA
- William P. Kustas
- U. S. Department of Agriculture, Agricultural Research Service, Hydrology and Remote Sensing Laboratory, Beltsville, MD 20705, USA
- Hector Nieto
- Complutum Tecnologías de la Información Geográfica (COMPLUTIG), 28801 Madrid, Spain
- Alex White
- U. S. Department of Agriculture, Agricultural Research Service, Hydrology and Remote Sensing Laboratory, Beltsville, MD 20705, USA
- John H. Prueger
- U. S. Department of Agriculture, Agricultural Research Service, Hydrology and Remote Sensing Laboratory, Beltsville, MD 20705, USA
- Lynn McKee
- U. S. Department of Agriculture, Agricultural Research Service, Hydrology and Remote Sensing Laboratory, Beltsville, MD 20705, USA
- Joseph Alfieri
- U. S. Department of Agriculture, Agricultural Research Service, Hydrology and Remote Sensing Laboratory, Beltsville, MD 20705, USA
- Lawrence Hipps
- Plants, Soils and Climate Department, Utah State University, Logan, UT 84322, USA
- Calvin Coopmans
- Department of Electrical and Computer Engineering, Utah State University, Logan, UT 84322, USA
- Nick Dokoozlian
- E & J Gallo Winery Viticulture Research, Modesto, CA 95354, USA
4. Estimating Maize Above-Ground Biomass Using 3D Point Clouds of Multi-Source Unmanned Aerial Vehicle Data at Multi-Spatial Scales. Remote Sensing 2019. DOI: 10.3390/rs11222678
Abstract
Crop above-ground biomass (AGB) is a key parameter used for monitoring crop growth and predicting yield in precision agriculture. Estimating crop AGB at the field scale with unmanned aerial vehicles (UAVs) is promising for agronomic application, but the robustness of the estimation methods needs to be balanced with practical applicability. In this study, three UAV remote sensing flight missions (a multiSPEC-4C multispectral camera, a Micasense RedEdge-M multispectral camera, and an Alpha Series AL3-32 Light Detection and Ranging (LiDAR) sensor, each onboard a different UAV platform) were conducted over three long-term experimental plots with different tillage treatments in 2018. We investigated the performance of the multi-source UAV-based 3D point clouds at multiple spatial scales using traditional multivariable linear regression (ordinary least squares, OLS), random forest (RF), backpropagation neural network (BP), and support vector machine (SVM) methods for accurate AGB estimation. Results showed that crop height (CH) was a robust proxy for AGB estimation, and that higher spatial resolution in the CH datasets helps to improve maize AGB estimation. Furthermore, the OLS, RF, BP, and SVM methods all maintained acceptable accuracy for AGB estimation, with SVM and RF performing slightly more robustly. These findings can help optimize UAV systems and algorithms for specific agronomic applications.
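The OLS baseline, regressing AGB on point-cloud-derived crop height, can be sketched with a synthetic example (the height and biomass values are illustrative, not the study's plot data):

```python
import numpy as np

# Hypothetical plot-level data: crop height (m) from the UAV point cloud and
# measured above-ground biomass (t/ha).
ch  = np.array([1.2, 1.5, 1.8, 2.1, 2.4, 2.7])
agb = np.array([4.1, 5.0, 6.2, 7.1, 8.0, 9.2])

# Ordinary least squares fit of agb = slope * ch + intercept.
A = np.column_stack([ch, np.ones_like(ch)])
slope, intercept = np.linalg.lstsq(A, agb, rcond=None)[0]

pred = slope * ch + intercept
rmse = np.sqrt(np.mean((pred - agb) ** 2))   # in-sample fit error (t/ha)
```

The RF, BP, and SVM alternatives mentioned in the abstract replace this linear fit with nonlinear learners, trading interpretability for flexibility.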
5. AGB Estimation in a Tropical Mountain Forest (TMF) by Means of RGB and Multispectral Images Using an Unmanned Aerial Vehicle (UAV). Remote Sensing 2019. DOI: 10.3390/rs11121413
Abstract
This investigation evaluates the accuracy of estimating above-ground biomass (AGB) with two different sensors mounted on an unmanned aerial vehicle (UAV) platform (DJI Inspire I), since the high cost of very high-resolution satellite imagery or light detection and ranging (LiDAR) surveys often impedes AGB estimation and the determination of other vegetation parameters. The sensors were an RGB camera (ZENMUSE X3) and a multispectral camera (Parrot Sequoia), whose images were used for AGB estimation in a natural tropical mountain forest (TMF) in Southern Ecuador. The sensors covered a total area of 80 ha at lower elevations characterized by fast-changing topography and different vegetation covers, from which a core study site of 24 ha was selected for AGB calculation using two methods. The first method applied the structure from motion (SfM) process to the RGB images to generate point clouds for a subsequent individual tree classification; from the tree-level classification, tree height (H) and diameter at breast height (DBH) could be determined, which are the input parameters needed to calculate AGB (Mg ha−1) by means of a specific allometric equation for wet forests. The second method used the multispectral images to calculate the normalized difference vegetation index (NDVI), the basis for AGB estimation with an equation for tropical evergreen forests. The results were validated against a previous AGB estimation for the same area using LiDAR data. The study found two major results: (i) the NDVI-based AGB estimates obtained from multispectral drone imagery were less accurate due to the saturation effect in dense tropical forests; (ii) the photogrammetric approach using RGB images provided reliable AGB estimates comparable to expensive LiDAR surveys (R2: 0.85). However, the latter is only possible if an auxiliary digital terrain model (DTM) of very high resolution is available, because in dense natural forests the terrain surface is hardly detectable by passive sensors due to the canopy layer, which impedes ground detection.
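The NDVI used in the second method is the standard band ratio (NIR − Red)/(NIR + Red); a minimal sketch with illustrative reflectance values:

```python
import numpy as np

# Illustrative per-pixel surface reflectances (0-1) for the NIR and red bands.
nir = np.array([0.45, 0.50, 0.48])
red = np.array([0.05, 0.06, 0.05])

ndvi = (nir - red) / (nir + red)   # bounded in [-1, 1]
```

Because NDVI is bounded above by 1, dense closed canopies all map into a narrow range near the top of the scale regardless of further biomass gains, which is the saturation effect behind result (i).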
6. Identification of Water Body Extent Based on Remote Sensing Data Collected with Unmanned Aerial Vehicle. Water 2019. DOI: 10.3390/w11020338
Abstract
The paper presents an efficient methodology for estimating water body extent from remotely sensed data collected with a UAV (unmanned aerial vehicle). The methodology covers data collection with selected sensors and the processing of the remotely sensed data into accurate geospatial products from which water body extent is estimated. Three sensors were investigated: an RGB (red-green-blue) camera, a thermal infrared camera, and a laser scanner, each carried by an Aibot X6, a multirotor UAV. Test data were collected at six sites containing different types of water bodies, including four river sections, an old river bed, and part of a lake shore. Processing the collected data produced 2.5D and 2D geospatial products that were subsequently used for water body extent estimation. Depending on the sensor used, the geospatial product created, and the type of water body and land cover, three strategies employing image processing tools were developed to estimate water body range. The results were assessed in terms of classification accuracy (distinguishing the water body from land) and the planar geometric accuracy of the water body extent. The product identified as most suitable for water body detection was a four-band RGB+TIR (thermal infrared) orthomosaic, which achieved an average kappa coefficient above 0.9 for water body identification. The planar accuracy of the water body extent varied with the sensor, the geospatial product, and the test site conditions, but it was comparable with results obtained in similar studies.
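Cohen's kappa, the agreement measure quoted above, corrects observed classification accuracy for chance agreement and is computed from a confusion matrix. A sketch on an illustrative 2x2 water/land matrix (counts invented so that kappa lands above 0.9, as in the study):

```python
import numpy as np

# Confusion matrix: rows = reference class, columns = predicted class,
# order (water, land). Counts are illustrative, not the study's.
cm = np.array([[480, 20],
               [15, 485]])

n = cm.sum()
po = np.trace(cm) / n                          # observed agreement
pe = (cm.sum(axis=0) * cm.sum(axis=1)).sum() / n**2  # chance agreement
kappa = (po - pe) / (1 - pe)
```

Here the raw accuracy is 96.5%, but kappa is 0.93 because roughly half of the agreement would be expected by chance for a balanced two-class problem.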
7. Greenness Indices from a Low-Cost UAV Imagery as Tools for Monitoring Post-Fire Forest Recovery. Drones 2019. DOI: 10.3390/drones3010006
Abstract
During recent years, unmanned aerial vehicles (UAVs) have been increasingly used for research and application in both agriculture and forestry. Nevertheless, most of this work has been devoted to improving accuracy and explanatory power, often at the cost of usability and affordability. We tested a low-cost UAV and a simple workflow to apply four different greenness indices to monitoring pine (Pinus sylvestris and P. nigra) post-fire regeneration in a Mediterranean forest. We selected two sites and measured all pines within a pre-selected plot. Winter flights were carried out at each site at two flight heights (50 and 120 m). Automatically normalized images entered a structure from motion (SfM)-based photogrammetric software for restitution, and the resulting point cloud and orthomosaic were processed to obtain a canopy height model and the four greenness indices. The sum of pine diameter at breast height (DBH) was regressed on summary statistics of the greenness indices and the canopy height model. The excess green index (ExGI) and green chromatic coordinate (GCC) outperformed the visible atmospherically resistant index (VARI) and the green-red vegetation index (GRVI) in estimating pine DBH, while canopy height slightly improved the models. Flight height did not severely affect model performance. Our results show that low-cost UAVs may improve forest monitoring after disturbance, even in habitats and situations where resource limitation is an issue.
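The four indices are simple functions of the RGB bands. A minimal sketch on a single illustrative pixel, using the commonly cited definitions (the paper may use slightly different variants, so treat the exact formulas as assumptions):

```python
# Illustrative RGB digital numbers for one vegetated pixel.
R, G, B = 80.0, 120.0, 60.0
total = R + G + B
r, g, b = R / total, G / total, B / total   # chromatic coordinates

gcc  = g                       # green chromatic coordinate: G / (R + G + B)
exgi = 2 * g - r - b           # excess green index
vari = (G - R) / (G + R - B)   # visible atmospherically resistant index
grvi = (G - R) / (G + R)       # green-red vegetation index
```

All four rise as the pixel becomes greener relative to red and blue; ExGI and GCC use all three bands symmetrically, which may explain their more stable behavior noted in the results.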
8.
Abstract
This study aimed to characterize vineyard vegetation through multi-temporal monitoring using a commercial low-cost rotary-wing unmanned aerial vehicle (UAV) equipped with a consumer-grade red/green/blue (RGB) sensor. Ground-truth data and UAV-based imagery were acquired on nine distinct dates over two selected vineyard plots, covering the most significant part of the vegetative growing cycle up to the harvest season. The acquired UAV imagery underwent photogrammetric processing, yielding an orthophoto mosaic per flight that was used for vegetation estimation; digital elevation models were used to compute crop surface models. By filtering vegetation within a given height range, grapevine vegetation could be separated from other vegetation present in a vineyard plot, enabling the estimation of grapevine area and volume. The results showed high accuracy in grapevine detection (94.40%) and low error in grapevine volume estimation (root mean square error of 0.13 m and correlation coefficient of 0.78 for height estimation). The accuracy assessment shows that the proposed method based on UAV RGB imagery is effective and has the potential to become an operational technique. The method also allows the identification of grapevine areas that could benefit from canopy management operations.
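The height-range filtering step can be sketched as follows: subtracting the DTM from the DSM gives a crop surface model (CSM) of heights above ground, and thresholding it isolates grapevine pixels from which area and volume follow. The 0.5-2.0 m height range, the 5 cm ground sampling distance, and the raster values are illustrative assumptions, not the study's parameters:

```python
import numpy as np

# Toy crop surface model (CSM = DSM - DTM): heights above ground (m).
csm = np.array([[0.1, 0.8, 1.4, 0.2],
                [0.0, 1.1, 1.6, 0.3],
                [0.2, 0.9, 1.5, 0.1]])
pixel_area = 0.05 ** 2   # assumed 5 cm ground sampling distance -> m^2/pixel

# Keep only pixels whose height falls in the assumed grapevine range.
vine = (csm >= 0.5) & (csm <= 2.0)

area = vine.sum() * pixel_area            # grapevine canopy area (m^2)
volume = (csm[vine] * pixel_area).sum()   # canopy volume (m^3)
```

Pixels below the range (inter-row grass, bare soil) and any above it (trees, posts) are excluded, which is what lets a height filter alone separate grapevine canopy in this setting.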