Yan et al. (2025) Estimation of pear tree leaf area index using fused UAV multispectral and RGB imagery
Identification
- Journal: Smart Agricultural Technology
- Year: 2025
- Date: 2025-12-12
- Authors: Ning Yan, Juanjuan Ma, Jingming Wu, Qi Wang, Xuedong Zhang, Xu Li
- DOI: 10.1016/j.atech.2025.101717
Research Groups
- College of Information Engineering, Tarim University, China
- Key Laboratory of Tarim Oasis Agriculture, Ministry of Education, Tarim University, China
- National-Local Joint Engineering Laboratory of High Efficiency and Superior-Quality Cultivation and Fruit Deep Processing Technology on Characteristic Fruit Trees, China
- Modern Agricultural Engineering Key Laboratory at Universities of Education Department of Xinjiang Uygur Autonomous Region, China
Short Summary
This study developed a UAV-based, adaptively weighted multi-source ensemble framework for precise estimation of pear tree Leaf Area Index (LAI) during the fruit expansion stage, showing that fusing multispectral and RGB imagery through an Optimized Integrated Algorithm (OIA) substantially improves estimation accuracy and robustness.
Objective
- To achieve rapid, accurate, and non-destructive estimation of pear tree Leaf Area Index (LAI) during the critical fruit expansion stage using UAV-acquired multispectral and RGB imagery.
- To systematically compare the performance of single-source and fused data models, and various machine learning algorithms, with a focus on multi-source data fusion and ensemble algorithms for improving LAI retrieval accuracy.
Study Configuration
- Spatial Scale: A ten-thousand-mu (approximately 6.67 square kilometers) pear orchard in Matan Town, Seventh Regiment, First Division of the Xinjiang Production and Construction Corps, China. Ninety representative plots, each 10 meters × 10 meters, were established. UAV flight altitude was 30 meters, yielding a ground sampling distance of approximately 2.5 centimeters for the RGB imagery.
- Temporal Scale: Data acquisition and ground measurements were conducted on 7 July 2025, corresponding to the critical fruit expansion stage of pear trees.
Methodology and Data
- Models used:
- Support Vector Machine (SVM)
- Extreme Learning Machine (ELM)
- eXtreme Gradient Boosting (XGBoost)
- Optimized Integrated Algorithm (OIA): A linear weighted fusion strategy integrating SVM, ELM, and XGBoost predictions.
- Data sources:
- UAV-based multispectral imagery (DJI Mavic 3M): Green (560 nm), Red (650 nm), Red-Edge (730 nm), and Near-Infrared (860 nm) bands. Extracted features include vegetation indices (e.g., NDVI, GNDVI, NDRE, SAVI, MSR) and Gray-Level Co-occurrence Matrix (GLCM) texture features.
- UAV-based RGB imagery (DJI Mavic 3M, 20-megapixel sensor): Extracted features include color indices (e.g., ExG, NGRDI, GI, GRI, NRBI), GLCM texture features, and 2D discrete Haar wavelet features.
- Ground-measured LAI: Collected using the LAI-2200C Plant Canopy Analyzer (LI-COR, USA) with the A-BBBB observation method.
- Real-time kinematic (RTK) GPS: Used to record coordinates for precise spatial alignment of ground and UAV data.
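The vegetation and color indices listed above follow standard formulas. A minimal NumPy sketch of three of them (NDVI, GNDVI, ExG), using hypothetical per-plot reflectance and digital-number values rather than the study's data:

```python
import numpy as np

def ndvi(nir, red):
    # Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red)
    return (nir - red) / (nir + red)

def gndvi(nir, green):
    # Green NDVI: (NIR - Green) / (NIR + Green)
    return (nir - green) / (nir + green)

def exg(r, g, b):
    # Excess Green index on chromatic (normalized) RGB coordinates: 2g - r - b
    total = r + g + b
    r_n, g_n, b_n = r / total, g / total, b / total
    return 2 * g_n - r_n - b_n

# Hypothetical per-plot values (not from the paper)
nir, red, green = np.array([0.45]), np.array([0.08]), np.array([0.12])
r, g, b = np.array([80.0]), np.array([140.0]), np.array([60.0])
print(ndvi(nir, red))   # higher values indicate denser green canopy
print(exg(r, g, b))
```

These per-plot index values (typically averaged over each 10 m × 10 m plot) would then serve as model inputs alongside the texture and wavelet features.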
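The 2D discrete Haar wavelet features from the RGB imagery can be illustrated with a one-level decomposition. This is a simple average-based variant (not necessarily the paper's normalization), assuming even image dimensions; texture statistics such as sub-band energies would be computed from the resulting sub-bands:

```python
import numpy as np

def haar2d_level1(img):
    """One-level 2D Haar decomposition into approximation (LL) and
    detail (LH, HL, HH) sub-bands. Sketch only: averaging normalization,
    even-sized input assumed."""
    a = img[0::2, 0::2]  # top-left of each 2x2 block
    b = img[0::2, 1::2]  # top-right
    c = img[1::2, 0::2]  # bottom-left
    d = img[1::2, 1::2]  # bottom-right
    ll = (a + b + c + d) / 4.0  # approximation (local mean)
    lh = (a + b - c - d) / 4.0  # vertical change (horizontal edges)
    hl = (a - b + c - d) / 4.0  # horizontal change (vertical edges)
    hh = (a - b - c + d) / 4.0  # diagonal detail
    return ll, lh, hl, hh

img = np.arange(16, dtype=float).reshape(4, 4)  # toy 4x4 "image"
ll, lh, hl, hh = haar2d_level1(img)
print(ll)  # 2x2 approximation sub-band
```

In practice a wavelet library (e.g., PyWavelets) would be used on each color channel, with sub-band statistics pooled per plot.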
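The OIA's linear weighted fusion can be sketched as follows. The paper's exact weight-optimization procedure is not reproduced here; this sketch assumes one plausible scheme: unconstrained least squares on validation predictions, with the weights clipped to be non-negative and renormalized to sum to one. The base-model predictions are simulated (`preds` stands in for SVM, ELM, and XGBoost outputs):

```python
import numpy as np

def fit_linear_weights(preds, y):
    """Fit fusion weights for combining base-model predictions.

    preds: (n_samples, n_models) validation predictions (e.g., SVM, ELM, XGBoost).
    y:     (n_samples,) measured LAI.
    """
    w, *_ = np.linalg.lstsq(preds, y, rcond=None)  # least-squares fit
    w = np.clip(w, 0.0, None)                      # enforce non-negativity
    return w / w.sum()                             # normalize to sum to 1

def oia_predict(preds, w):
    # Weighted linear fusion of the base-model predictions
    return preds @ w

# Simulated data: three noisy "base models" around a true LAI signal
rng = np.random.default_rng(0)
y = rng.uniform(1.0, 3.0, size=50)
preds = np.column_stack([y + rng.normal(0, s, 50) for s in (0.15, 0.25, 0.10)])
w = fit_linear_weights(preds, y)
fused = oia_predict(preds, w)
print("weights:", w.round(3))
print("fused RMSE:", np.sqrt(np.mean((fused - y) ** 2)).round(3))
```

The design intuition matches the paper's finding: a weighted combination damps the uncorrelated errors of the individual learners, so the fused prediction is more stable than any single model.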
Main Results
- Multi-source features (vegetation indices, color indices, texture features, and RGB wavelet features) showed significant correlations with measured LAI, with Pearson correlation coefficients up to 0.77 for NDVI and 0.69 for ExG.
- The fusion of multispectral and RGB features substantially improved LAI estimation accuracy and stability compared to single-source models.
- The Optimized Integrated Algorithm (OIA) consistently achieved the best performance across all feature combinations.
- The optimal model, OIA, integrating vegetation indices, multispectral texture features, and RGB wavelet features, achieved the highest accuracy on the testing set with a coefficient of determination (R²) of 0.744 and a Root Mean Square Error (RMSE) of 0.141 m²/m².
- Compared to the optimal multispectral-only model, the fused OIA model increased R² by 0.03 and decreased RMSE by 0.042 m²/m².
- Compared to the optimal RGB-only model, the fused OIA model increased R² by 0.025 and decreased RMSE by 0.038 m²/m².
- The spatial distribution map of LAI, generated by the optimal OIA model, effectively visualized distinct LAI zones (low, medium, higher, high) within the orchard, aligning with pear tree row orientations and reflecting canopy structural heterogeneity.
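The R² and RMSE figures above use the standard definitions; a minimal sketch of both metrics, checked on made-up values (not the paper's data):

```python
import numpy as np

def r2(y_true, y_pred):
    # Coefficient of determination: 1 - SS_res / SS_tot
    ss_res = np.sum((y_true - y_pred) ** 2)
    ss_tot = np.sum((y_true - y_true.mean()) ** 2)
    return 1.0 - ss_res / ss_tot

def rmse(y_true, y_pred):
    # Root Mean Square Error, in the units of LAI (m²/m²)
    return np.sqrt(np.mean((y_true - y_pred) ** 2))

# Toy check
y_true = np.array([1.8, 2.1, 2.5, 2.9, 3.2])
y_pred = np.array([1.9, 2.0, 2.6, 2.8, 3.3])
print(round(r2(y_true, y_pred), 3), round(rmse(y_true, y_pred), 3))  # 0.962 0.1
```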
Contributions
- Established a novel UAV-based, adaptively weighted multi-source ensemble framework for LAI estimation in pear trees during the critical fruit expansion stage.
- Demonstrated the feasibility and significant advantages of integrating multispectral and RGB data for fruit tree monitoring.
- Incorporated RGB wavelet features, enriching the representation of canopy structure and providing novel variables for fine-scale fruit tree monitoring.
- Systematically evaluated multiple machine learning models, including the OIA, providing new insights into their accuracy and robustness for fruit tree remote sensing applications.
- Significantly enhanced spatial estimation accuracy of LAI, offering a scalable technical approach for precise monitoring, water and nutrient management, and yield assessment in fruit trees.
Funding
- Oasis Ecological Agriculture Corps Key Laboratory Open Project (Grant 202002)
- Corps Science and Technology Program (Grant 2021CB041, Grant 2021BB023, Grant 2021DB001)
- Tarim University Innovation Team Project (Grant TDZKCX202306, Grant TDZKCX202102)
- National Natural Science Foundation of China (Grant 61563046)
- China Agricultural University-Tarim University Joint Scientific Research Fund (Grant ZNLH202402)
- 2024 Aral City Science and Technology Program Project — Research and Application of an AI-Based Apple Harvesting Robot
Citation
@article{Yan2025Estimation,
author = {Yan, Ning and Ma, Juanjuan and Wu, Jingming and Wang, Qi and Zhang, Xuedong and Li, Xu},
title = {Estimation of pear tree leaf area index using fused UAV multispectral and RGB imagery},
journal = {Smart Agricultural Technology},
year = {2025},
doi = {10.1016/j.atech.2025.101717},
url = {https://doi.org/10.1016/j.atech.2025.101717}
}
Original Source: https://doi.org/10.1016/j.atech.2025.101717