Barboza et al. (2025) Corn Plant Detection Using YOLOv9 Across Different Soil Background Colors, Growth Stages, and UAV Flight Heights
Identification
- Journal: Remote Sensing
- Year: 2025
- Date: 2025-12-20
- Authors: Thiago O. C. Barboza, Adão Felipe dos Santos, Emily K. Bedwell, George Vellidis, Lorena N. Lacerda
- DOI: 10.3390/rs18010014
Research Groups
- Federal University of Lavras (UFLA), Lavras, Brazil
- University of Idaho, Kimberly, USA
- University of Georgia, Athens and Tifton, USA
Short Summary
This study evaluated the YOLOv9-small model for detecting and counting corn plants across varying soil backgrounds, growth stages (V2, V3, V5, V6), and UAV flight heights (30 m, 70 m). The V3 and V5 stages at a 30 m flight height yielded the highest accuracy, while the 70 m height remained acceptable at V5 and reduced mapping time, demonstrating the model's effectiveness for early-stage corn detection under real field conditions.
Objective
- To evaluate the performance of the YOLOv9-small model for detecting and counting corn plants under varying soil background colors, growth stages (V2, V3, V5, V6), and UAV flight heights (30 m, 70 m) in real field conditions.
- Hypothesis I: Corn plants with three or more fully developed leaves (V3, V5) offer better detection conditions than earlier stages (V2) due to reduced visual similarity with the background.
- Hypothesis II: Higher UAV flight heights (70 m) can achieve moderate accuracy for corn plant detection across different growth stages.
Study Configuration
- Spatial Scale: Three distinct agricultural fields in Georgia, USA: Stripling Irrigation Research Park (SIRP) in Mitchell County (no-tillage system), Iron Horse I (IHI) in Greene County (conventional tillage, red-brown soil), and Iron Horse II (IHII) in Greene County (conventional tillage, gray/red-brown soil). Plant populations ranged from 74,000 to 75,000 plants per hectare, with row spacing of 0.75 m or 0.90 m.
- Temporal Scale: Four corn vegetative growth stages (V2, V3, V5, V6) during the 2024 growing season, with planting dates from 8 April to 15 May 2024.
Methodology and Data
- Models used: YOLOv9-small (You Only Look Once version 9, small version) deep learning model, incorporating Programmable Gradient Information (PGI) and Generalized Efficient Layer Aggregation Network (GELAN).
- Data sources: Unmanned Aerial Vehicle (UAV) RGB imagery collected using a DJI Mavic 3M drone. Images were captured at 30 m and 70 m flight heights, with 80% front and side overlap. A total of 1920 images were generated for each corn growth stage and flight height, cropped to 640 × 640 pixels. 96,000 corn plant instances were manually annotated with bounding boxes. The dataset was split into 70% for training, 20% for testing, and 10% for validation. Data augmentation (brightness, rotation, contrast) was applied to the training set.
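The 70/20/10 split described above can be sketched as a simple shuffled partition of image IDs. This is an illustrative reconstruction, not the authors' actual pipeline; the function name and random seed are hypothetical:

```python
import random

def split_dataset(image_ids, train=0.70, test=0.20, val=0.10, seed=42):
    """Shuffle image IDs and partition them into train/test/val subsets."""
    assert abs(train + test + val - 1.0) < 1e-9
    ids = list(image_ids)
    random.Random(seed).shuffle(ids)  # fixed seed for a reproducible split
    n_train = int(len(ids) * train)
    n_test = int(len(ids) * test)
    return {
        "train": ids[:n_train],
        "test": ids[n_train:n_train + n_test],
        "val": ids[n_train + n_test:],
    }

# 1920 cropped tiles per growth stage and flight height, as in the paper
splits = split_dataset(range(1920))
print({k: len(v) for k, v in splits.items()})
# → {'train': 1344, 'test': 384, 'val': 192}
```

Augmentation (brightness, rotation, contrast) would then be applied only to the `train` subset, so that test and validation tiles remain unmodified.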
Main Results
- The V3 and V5 corn growth stages consistently showed the highest detection accuracy, with mean Average Precision at 50% Intersection over Union (mAP50) values exceeding 85% in conventional tillage fields.
- Detection performance was slightly lower in gray/red-brown soil conditions (IHII) and no-tillage systems (SIRP) due to increased background interference.
- Increasing the UAV flight height from 30 m to 70 m generally reduced detection accuracy by 8–12%, although precision remained high, particularly at the V5 stage.
- The V2 and V6 stages exhibited the poorest detection performance, especially at the 70 m flight height, with mAP50 values around 25%, which improved to approximately 50% at 30 m.
- At 30 m flight height, the model converged more quickly with lower initial classification loss values. The SIRP field at 70 m required nearly 300 epochs for model convergence and showed the highest validation classification loss.
- Confusion matrix analysis revealed significant misclassification of background as corn (67%) at the V2 stage with 30 m flight height, and high misclassification of V2 corn plants as background (95% for SIRP) at 70 m.
- Validation results showed the best performance for V5 at both 30 m (R² = 0.73, Root Mean Square Error (RMSE) = 18 plants for SIRP) and 70 m (R² = 0.79, RMSE = 168 plants for SIRP).
- Strong contrast between plants and the soil background, particularly on gray or dark-hued soils, enhanced detection performance even at the higher flight altitude (70 m).
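The mAP50 figures above hinge on the 50% Intersection-over-Union threshold: a predicted box counts as a true positive only if its IoU with a ground-truth box is at least 0.5. A minimal sketch of that check, for illustration only (not the paper's evaluation code):

```python
def iou(box_a, box_b):
    """Intersection-over-Union for axis-aligned boxes given as (x1, y1, x2, y2)."""
    ix1 = max(box_a[0], box_b[0])
    iy1 = max(box_a[1], box_b[1])
    ix2 = min(box_a[2], box_b[2])
    iy2 = min(box_a[3], box_b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union else 0.0

# A prediction offset from the ground truth by a quarter of its width:
pred = (10, 10, 50, 50)
gt = (20, 20, 60, 60)
print(round(iou(pred, gt), 3))  # → 0.391, below 0.5: a miss at mAP50
```

mAP50 is then the average precision over the precision-recall curve built from these IoU-thresholded matches, averaged across classes.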
Contributions
- Comprehensive evaluation of the YOLOv9-small model for corn plant detection under a wide range of real-world agricultural conditions, including diverse soil backgrounds, multiple growth stages, and higher UAV flight altitudes.
- Demonstrated the practical feasibility of accurate corn plant detection at a higher altitude (70 m) using a lightweight deep learning model (YOLOv9-small), which improves scalability for large-scale field mapping and enables faster decision-making, addressing a key limitation of previous studies confined to ultra-low altitudes.
- Quantified the impact of different soil background colors and no-tillage management practices on model performance, providing critical insights for model robustness in varied field environments.
- Highlighted the trade-offs between spatial resolution, model complexity, and field coverage, offering guidance for optimizing UAV-based crop monitoring strategies in precision agriculture.
- Emphasized the cost-effectiveness and practical applicability of using standard RGB cameras for scalable agricultural object detection tasks, making the technology more accessible to growers and stakeholders.
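The resolution/coverage trade-off noted above follows directly from the ground sample distance (GSD), which grows linearly with flight height: flying at 70 m instead of 30 m covers more ground per image but makes each pixel cover roughly 2.3× more ground. A small sketch using hypothetical camera parameters (not values reported in the paper):

```python
def ground_sample_distance(height_m, sensor_width_mm, focal_mm, image_width_px):
    """GSD in cm/pixel: the ground footprint of one pixel at a given flight height."""
    return (height_m * 100 * sensor_width_mm) / (focal_mm * image_width_px)

# Hypothetical parameters for illustration only (17.3 mm sensor width,
# 12.3 mm focal length, 5280 px image width) -- NOT the paper's values.
for h in (30, 70):
    gsd = ground_sample_distance(h, 17.3, 12.3, 5280)
    print(f"{h} m flight height -> {gsd:.2f} cm/pixel")
```

Because GSD scales linearly with height, the 70 m flights trade a 70/30 ≈ 2.33× coarser pixel footprint for proportionally larger per-image coverage, which is exactly the tension the study quantifies across growth stages.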
Funding
- Georgia Commodity Commission for Corn (grant number [CR2409])
- USDA Foreign Agricultural Service (grant number [6000028574])
- National Council for Scientific and Technological Development (CNPq)
- Brazilian Federal Agency for Support and Evaluation of Graduate Education (CAPES)
- Fundação de Amparo à Pesquisa do Estado de Minas Gerais (FAPEMIG)
Citation
@article{Barboza2025Corn,
author = {Barboza, Thiago O. C. and Santos, Adão Felipe dos and Bedwell, Emily K. and Vellidis, George and Lacerda, Lorena N.},
title = {Corn Plant Detection Using YOLOv9 Across Different Soil Background Colors, Growth Stages, and UAV Flight Heights},
journal = {Remote Sensing},
year = {2025},
doi = {10.3390/rs18010014},
url = {https://doi.org/10.3390/rs18010014}
}
Original Source: https://doi.org/10.3390/rs18010014