Assessment of canopy cover fraction in sugar beet field using unmanned aerial vehicle imagery and different image segmentation methods

Document Type : Research Paper

Authors

1 Department of Water Science and Engineering, Faculty of Agricultural and Natural Resources, Imam Khomeini International University, Qazvin, Iran.

2 Department of Water Science and Engineering, Faculty of Agriculture and Natural Resources, Imam Khomeini International University, Qazvin, Iran

Abstract

Canopy cover fraction is one of the most important criteria for investigating crop growth and yield, and it serves as an input to most crop models. Unlike methods that depend on field observations or image processing beyond the visible spectrum, canopy cover fraction can be measured conveniently within the visible spectrum. In this study, drone images of a sugar beet field, acquired during the 2015-2016 cropping season on four dates from late May to late June at the Lindau center of plant sciences research, Switzerland, were used. Six plant discrimination indices and three distinct thresholding algorithms were combined to segment sugar beet vegetation. Among the 18 resulting methods, the best six were evaluated by comparing their values with ground truth values in 30 regions of the farm on four dates from the beginning of the four-leaf stage to the end of the six-leaf stage. Results showed that the ExG, GLI, and RGBVI indices, in combination with the Otsu and Ridler-Calvard thresholding algorithms, demonstrated optimal performance in vegetation segmentation. The NRMSE and R2 statistics of the most accurate method, ExG&Otsu, were 5.13 % and 0.96, respectively. Conversely, the RGBVI&RC method exhibited the least accuracy in the initial evaluation, with NRMSE and R2 values of 8.18 % and 0.87, respectively. Comparative analysis of the statistical indicators showed that the ExG&Otsu and ExG&RC methods performed similarly and displayed the highest correlation with the ground truths, while the GLI&Otsu method consistently demonstrated the lowest error compared to the ground truths.




EXTENDED ABSTRACT

 

Introduction

Canopy cover fraction (CCF) is the fraction of the ground surface covered by the vertical projection of crop canopies. CCF is one of the most important criteria for investigating crop growth and yield and is an input to most crop models. Unlike measurement methods relying on field observations or image processing beyond the visible spectrum, CCF can be conveniently estimated within the visible spectrum. CCF can be applied to controlling plant growth conditions, identifying leaf diseases, monitoring the status of essential nutrients, and controlling plant stress symptoms such as drought stress, nutrient deficiency, and weed stress. Nowadays, digital image segmentation methods play an important role in image processing for agriculture. Segmentation mainly means discriminating the leaf pixels (the green body as foreground) from the background pixels. Different techniques have been employed to segment canopy cover; one widely used approach combines canopy cover discrimination indices with thresholding algorithms. In this study, 18 different methods were applied, comprising six indices and three thresholding algorithms, across four dates and 30 regions within a UAV image of a sugar beet field. Utilizing discrete spatial analysis enables a thorough examination of factors affecting canopy cover estimation, including variations in light intensity and other influencing phenomena.
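As a minimal illustration of the definition above (a sketch, not the authors' code), once segmentation has produced a binary canopy mask, CCF is simply the share of pixels classified as canopy:

```python
import numpy as np

def canopy_cover_fraction(mask: np.ndarray) -> float:
    """CCF = fraction of image pixels classified as canopy (foreground)."""
    return float(np.count_nonzero(mask)) / mask.size

# Toy 4x4 mask with 6 canopy pixels out of 16
mask = np.array([[1, 1, 0, 0],
                 [1, 1, 0, 0],
                 [0, 1, 0, 0],
                 [0, 1, 0, 0]], dtype=bool)
print(canopy_cover_fraction(mask))  # 0.375
```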

Methods

A dataset of drone images of the sugar beet field, captured by the University of Bonn during the 2015-2016 cropping season, was utilized. These data were acquired with a DJI MATRICE 100 drone at dimensions of 4000 x 2000 pixels over the field of the Lindau Plant Research Institute, Switzerland (47.45°N, 8.68°E). Canopy cover segmentation was performed using the ExG, ExGR, ExGB, GLI, VARI, and RGBVI indices and the Otsu, Ridler-Calvard, and Two-Peaks thresholding algorithms.
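The index-plus-threshold pipeline can be sketched as below. This is a hedged illustration, not the authors' code: the three index formulas follow their standard literature definitions (e.g. Woebbecke et al., 1995 for ExG; Louhaichi et al., 2001 for GLI), the Otsu and Ridler-Calvard thresholds are implemented from their original descriptions, and the 0-1 channel scaling and variable names are assumptions.

```python
import numpy as np

def greenness_indices(img: np.ndarray) -> dict:
    """Per-pixel greenness indices from an RGB image with channels scaled to 0-1."""
    r, g, b = img[..., 0], img[..., 1], img[..., 2]
    eps = 1e-9  # avoid division by zero on black pixels
    return {
        "ExG": 2 * g - r - b,                            # Excess Green
        "GLI": (2 * g - r - b) / (2 * g + r + b + eps),  # Green Leaf Index
        "RGBVI": (g**2 - r * b) / (g**2 + r * b + eps),  # RGB Vegetation Index
    }

def otsu_threshold(values: np.ndarray, bins: int = 256) -> float:
    """Otsu (1979): pick the threshold maximizing between-class variance."""
    hist, edges = np.histogram(values.ravel(), bins=bins)
    centers = (edges[:-1] + edges[1:]) / 2
    w = hist / hist.sum()
    best_t, best_var = centers[0], -1.0
    for k in range(1, bins):
        w0, w1 = w[:k].sum(), w[k:].sum()
        if w0 == 0 or w1 == 0:
            continue
        mu0 = (w[:k] * centers[:k]).sum() / w0  # mean of background class
        mu1 = (w[k:] * centers[k:]).sum() / w1  # mean of foreground class
        var_b = w0 * w1 * (mu0 - mu1) ** 2      # between-class variance
        if var_b > best_var:
            best_var, best_t = var_b, centers[k]
    return best_t

def ridler_calvard_threshold(values: np.ndarray, tol: float = 1e-4) -> float:
    """Ridler-Calvard (ISODATA): iterate t = midpoint of the two class means."""
    v = values.ravel()
    t = v.mean()
    for _ in range(100):
        lo, hi = v[v <= t], v[v > t]
        if lo.size == 0 or hi.size == 0:
            break
        new_t = 0.5 * (lo.mean() + hi.mean())
        if abs(new_t - t) < tol:
            break
        t = new_t
    return t

# Segment a toy image: canopy = pixels whose index value exceeds the threshold
img = np.zeros((2, 2, 3))
img[0, :, 1] = 1.0   # top row: pure green (canopy)
img[1, :, :] = 0.4   # bottom row: grey soil
exg = greenness_indices(img)["ExG"]
mask = exg > ridler_calvard_threshold(exg)
print(mask.mean())   # canopy cover fraction of the toy image: 0.5
```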

Results and Discussion

Different segmentation methods were assessed for accuracy through comparison with ground truth images produced using ENVI 5.6 software. Initially, all 18 methods were evaluated across all growth stages and in 30 regions of the image, utilizing the NRMSE and R2 statistics. The top six methods (ExG&Otsu, ExG&Ridler-Calvard, GLI&Otsu, GLI&Ridler-Calvard, RGBVI&Otsu, and RGBVI&Ridler-Calvard) were then selected for detailed analysis on each of the four dates. The study revealed that the choice of index has a greater impact on method accuracy than the choice of thresholding algorithm. This is due to the limitations and weaknesses of some indices under very high light intensity (such as light reflection) or very low light intensity (such as shadows). Among the indices, ExG (NRMSE = 5.13, R2 = 0.96), GLI (NRMSE = 6.74, R2 = 0.92), and RGBVI (NRMSE = 8.15, R2 = 0.87) performed better than ExGR (NRMSE = 16.89, R2 = 0.76), ExGB (NRMSE = 10.74, R2 = 0.77), and VARI (NRMSE = 12.87, R2 = 0.89).
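For reference, the two evaluation statistics can be computed as below. Normalizing the RMSE by the mean of the observed values is a common convention but an assumption here, since the section does not state which normalization the authors used; the CCF arrays are hypothetical.

```python
import numpy as np

def nrmse_percent(obs: np.ndarray, est: np.ndarray) -> float:
    """RMSE normalized by the mean of the observations, in percent (assumed convention)."""
    rmse = np.sqrt(np.mean((est - obs) ** 2))
    return 100.0 * rmse / obs.mean()

def r_squared(obs: np.ndarray, est: np.ndarray) -> float:
    """Coefficient of determination: 1 - SS_res / SS_tot."""
    ss_res = np.sum((obs - est) ** 2)
    ss_tot = np.sum((obs - obs.mean()) ** 2)
    return 1.0 - ss_res / ss_tot

obs = np.array([0.20, 0.35, 0.50, 0.65])  # hypothetical ground-truth CCF values
est = np.array([0.22, 0.33, 0.52, 0.63])  # hypothetical segmented CCF values
print(round(nrmse_percent(obs, est), 2), round(r_squared(obs, est), 3))  # 4.71 0.986
```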

Conclusions

Such an accurate, fast, and automated method for estimating CCF from digital images is potentially beneficial for many applications, including crop modelling. Unlike direct field methods, indirect methods such as image-segmentation processing are non-destructive, save time and resources, and are less expensive. Selecting a suitable greenness discrimination index for segmentation is crucial, and both the strengths and limitations of the chosen index should be carefully considered in future research.

Abdullah, S. L. S., Hambali, H., & Jamil, N. (2012). Segmentation of natural images using an improved ‎thresholding-based technique. Procedia Engineering, 41(Iris), 938–944. ‎https://doi.org/10.1016/j.proeng.2012.07.266‎.
Agapiou, A. (2020). Vegetation extraction using visible-bands from openly licensed unmanned aerial vehicle imagery. Drones, 4(2), 1-15. https://doi.org/10.3390/drones4020027.
Azimi S, Gandhi TK. (2020). Water Stress Identification in Chickpea Images using Machine Learning. IEEE Region 10 Humanitarian Technology Conference, R10-HTC2020-December: https://doi.org/10.1109/R10-HTC49770.2020.9356973‎.
Biabi H, Abdanan Mehdizadeh S, Salehi Salmi M. (2019). Design and implementation of a smart system for water ‎management of lilium flower using image processing. Computers and Electronics in Agriculture, 160:131–143. ‎https://doi.org/10.1016/j.compag.2019.03.019‎.
Chandel NS, Chakraborty SK, Rajwade YA., et al. (2020). Identifying crop water stress using deep learning models. Neural Computing and Applications. https://doi.org/10.1007/s00521-020-05325-4.
Clover, G. R. G., Smith, H. G., Azam-Ali, S. N., et al. (1999). The effects of drought on sugar beet growth in isolation and in combination with beet yellows virus infection. Journal of Agricultural Science, 133(3), 251–261. https://doi.org/10.1017/S0021859699007005.
Coy, A. et al. (2016) Increasing the accuracy and automation of fractional vegetation cover estimation from digital photographs, Remote Sensing, 8(7), 21–25. doi: 10.3390/rs8070474.
Fawcett D, Panigada C, Tagliabue G., et al. (2020). Multi-Scale Evaluation of Drone-Based Multispectral Surface Reflectance and Vegetation Indices in Operational Conditions. Remote Sensing, 12(3), 514. https://doi.org/10.3390/RS12030514.
Gašparović M, Zrinjski M, Barković Đ., et al. (2020). An automatic method for weed mapping in oat fields ‎based on UAV imagery. Computers and Electronics in Agriculture, 173, ‎‎105385. ‎https://doi.org/10.1016/J.COMPAG.2020.105385‎.
Ghosal S, Blystone D, Singh AK., et al. (2018). An explainable deep machine vision framework for plant stress ‎phenotyping. Proceedings of the National Academy of Sciences of the United States of America,115, 4613–4618. https://doi.org/10.1073/pnas.1716999115‎.
Góraj M, Wróblewski C, Ciężkowski W., et al. (2019). Free water table area monitoring on wetlands using satellite and ‎UAV orthophotomaps – Kampinos National Park case study. Meteorology Hydrology and Water Management, 7, 23–30. ‎https://doi.org/10.26491/MHWM/95086‎.
Gooyandeh, M., Mirlatifi, S. M., & Akbari, M. (2019). Estimating Leaf Area Index of a corn silage field using a Modified Commercial Digital Camera. Iranian Journal of Irrigation & Drainage, 12(6), 1396-1406. (In Persian)
Haddadi, S. R., Soltani M., & Hashemi M. (2023). Evaluation of different vegetation discriminator indices ‎and image ‎processing algorithms to ‎estimate water productivity. Water Management in Agriculture 10(1), 159-‎‎174. (In Persian)
Haddadi, S. R., Soltani, M., & Hashemi, M. (2022). Comparing the accuracy of different image processing methods to estimate sugar beet canopy cover by digital camera images. Water and Irrigation Management, 12(2), 295-308. doi: 10.22059/jwim.2022.336225.954 (In Persian)
He, H. J., Zheng, C. & Sun, D. W. (2016). Image ‎segmentation techniques. International Computer Vision ‎Technology for Food Quality Evaluation: ‎Second Edition. Elsevier Inc. ‎https://doi.org/10.1016/B978-0-12-802232-‎‎0.00002-5‎.
Hernández-Hernández, J. L., García-Mateos, G., González-Esquiva, J.M., et al. (2016). Optimal color space selection method for plant/soil segmentation in agriculture. Computers and Electronics in Agriculture. 122, 124-132. doi: 10.1016/j.compag.2016.01.020.
Inoue Y. (2020). Satellite- and drone-based remote sensing of crops and soils for smart farming – a review. Soil Science and Plant Nutrition, 66, 798–810. https://doi.org/10.1080/00380768.2020.1738899.
Kalischuk M, Paret ML, Freeman JH., et al. (2019). An improved crop scouting technique incorporating unmanned aerial vehicle-assisted multispectral crop imaging into conventional scouting practice for gummy stem blight in Watermelon. Plant Disease, 103, 1642–1650. https://doi.org/10.1094/PDIS-08-18-1373-RE.
Latifoltojar, S., Jafari, A., Nassiri, S. M., et al. (2014). Yield estimation of sugar beet based on plant canopy using machine vision methods. Journal of Agricultural Machinery, 4(2), 275–284. (In Persian)
Lee, K. J. & Lee, B. W. (2011). Estimating canopy cover from color digital camera image of rice field, Journal of Crop Science and Biotechnology, 14(2), 151–155. doi: 10.1007/s12892-011-0029-z.
Liu, Y., Hatou, K., Aihara, T., et al (2021). A robust vegetation index based on different uav rgb images to estimate SPAD values of naked barley leaves. Remote Sensing, 13(4), 1-21. https://doi.org/10.3390/rs13040686.
Louhaichi, M, Borman, M. M. & Johnson, D. E. (2001). Spatially Located Platform and Aerial Photography for Documentation of Grazing Impacts on Wheat. Geocarto International, 16(1), 65-70, doi: 10.1080/10106040108542184.
Luna I, Lobo A. (2016). Mapping crop planting quality in sugarcane from UAV imagery: A pilot study in Nicaragua. ‎Remote Sensing, 8, 1–18. https://doi.org/10.3390/rs8060500‎.
Melville B, Lucieer A, Aryal J. (2019). Classification of Lowland Native Grassland Communities Using Hyperspectral ‎Unmanned Aircraft System (UAS) Imagery in the Tasmanian Midlands. Drones, 3, 5. ‎https://doi.org/10.3390/DRONES3010005‎.
Miraki, M., Sohrabi, H., & Fatehi, P. (2022). Citrus trees identification and trees stress detection based on spectral data derived from UAVs. Research in Horticultural Sciences, 1(1), 27-40. doi: 10.22092/rhsj.2022.127815. (In Persian)
Negash L, Kim HY, Choi HL. (2019). Emerging UAV Applications in Agriculture. 2019 7th International Conference on Robot Intelligence Technology and Applications, RiTA 2019, 254–257. https://doi.org/10.1109/RITAPP.2019.8932853‎.
Nguyen LQ, Bui LK, Cao CX., et al. (2024). Application of artificial neural networks and UAV-based air quality ‎monitoring sensors for simulating dust emission in quarries. Applications of Artificial Intelligence in Mining, Geotechnical and Geoengineering, 7–22. ‎https://doi.org/10.1016/B978-0-443-18764-3.00012-6‎.
Niu, Y., Han, W., Zhang, H., et al. (2021). Estimating fractional vegetation cover of maize under water stress from UAV multispectral imagery using machine learning algorithms. Computers and Electronics in Agriculture, 189(August), 106414. https://doi.org/10.1016/j.compag.2021.106414.
Orak, H., Abdanan Mehdizeh, S., & Sadi, M. (2018). Predicting sugar beet performance by online image processing. Journal of Sugar Beet, 34(2), 181–191. https://doi.org/10.22092/jsb.2019.120670.1178 (In Persian)
Otsu, N. (1979). A Threshold Selection Method from Gray-Level Histograms. IEEE Transactions on Systems, Man, and Cybernetics, 9(1), 62–66.
Parker, J. R. (2011). Algorithms for image processing and computer vision (2nd ed.). Wiley.
Possoch, M., Bieker, S., Hoffmeister, D., et al. (2016). Multi-temporal crop surface models combined with the RGB vegetation index from UAV-based images for forage monitoring in grassland. International Archives of the Photogrammetry Remote Sensing and Spatial Information Sciences, 41, 991–998. https://doi.org/10.5194/ isprsarchives-XLI-B1-991-2016.
Riehle, D., Reiser, D., & Griepentrog, H. W. (2020). Robust index-based semantic plant/background segmentation for RGB- images. Computers and Electronics in Agriculture, 169(December 2019), 105201. https://doi.org/10.1016/j.compag.2019.105201.
Richards, J. A. (1999). Remote Sensing Digital Image Analysis. Berlin: Springer-Verlag, 240.
Saberioon, M. M., Gholizadeh, A. (2016). Novel approach for estimating nitrogen content in paddy fields using low altitude remote sensing system. International Archives of the Photogrammetry Remote Sensing and Spatial Information Sciences, 41, 1011–1015. https://doi.org/ 10.5194/isprsarchives-XLI-B1-1011-2016.
Soltani, M. (2023). Estimating maize canopy cover percent by means of image processing algorithms. Water and Irrigation Management. doi: 10.22059/jwim.2023.364331.1098.
Su J, Coombes M, Liu C., et al. (2018). Wheat Drought Assessment by Remote Sensing Imagery Using Unmanned ‎Aerial Vehicle. Chinese Control Conference, CCC, 2018(July):10340–10344. ‎https://doi.org/10.23919/ChiCC.2018.8484005‎.
Wakamori K, Mizuno R, Nakanishi G., et al. (2020). Multimodal neural network with clustering-based drop for ‎estimating plant water stress. Computers and Electronics in Agriculture, 168, 105118. ‎https://doi.org/10.1016/j.compag.2019.105118‎.
Wan, L., Li, Y., Cen, H., et al. (2018). Combining UAV-based vegetation indices and image classification to estimate flower number in oilseed rape. Remote Sensing, 10(9). https://doi.org/10.3390/rs10091484.
Wenhua M, Yiming W, Yueqing W. (2003). Real-time Detection of Between-row Weeds Using Machine Vision. 2003 ASAE Annual International Meeting. https://doi.org/10.13031/2013.15381.
Woebbecke, D. M., Meyer, G. E., Von Bargen, K., ‎et al. (1995). Color indices for ‎weed identification under various soil, residue, ‎and lighting conditions. Transactions of the ‎American Society of Agricultural Engineers, ‎‎38(1): 259–269.
Yang, B.H., Wang, M.X., Sha, Z.X., et al. (2019). Evaluation of aboveground nitrogen content of winter wheat using digital imagery of unmanned aerial vehicles. Sensors (Basel), 19(20). https://doi.org/ 10.3390/s19204416.
Zhang X, Han L, Dong Y., et al. (2019). A Deep Learning-Based Approach for Automated Yellow Rust Disease ‎Detection from High-Resolution Hyperspectral UAV Images. Remote Sensing, 11, 1554. ‎https://doi.org/10.3390/RS11131554‎.
Zhuang S, Wang P, Jiang B., et al. (2017). Early detection of water stress in maize based on digital images. Computers and Electronics in Agriculture, 140, 461–468. https://doi.org/10.1016/j.compag.2017.06.022‎.