Fakultät Agrarwissenschaften
Permanent URI for this community: https://hohpublica.uni-hohenheim.de/handle/123456789/9
In teaching and research, the faculty develops sustainable production techniques for the agricultural and food sector. It contributes to rural development and to consumer, animal, and environmental protection.
Homepage: https://agrar.uni-hohenheim.de/
Browsing Fakultät Agrarwissenschaften by Sustainable Development Goals "2"
Publication: Effects of different ground segmentation methods on the accuracy of UAV-based canopy volume measurements (2024)

Authors: Han, Leng (College of Science, China Agricultural University, Beijing, China); Wang, Zhichong (Tropics and Subtropics Group, Institute of Agricultural Engineering, University of Hohenheim, Stuttgart, Germany); He, Miao (College of Science, China Agricultural University, Beijing, China); He, Xiongkui (College of Science, China Agricultural University, Beijing, China)

Abstract: The nonuniform spatial distribution of fruit tree canopies poses a challenge for precision management. In recent years, with the development of Structure from Motion (SfM) technology, unmanned aerial vehicle (UAV) remote sensing has been widely used to measure canopy features in orchards, balancing efficiency and accuracy. A pipeline for canopy volume measurement based on UAV remote sensing was developed: RGB and digital surface model (DSM) orthophotos were constructed from captured RGB images, the canopy was segmented using U-Net, OTSU, and RANSAC methods, and the canopy volume was then calculated. The accuracies of the segmentation and of the canopy volume measurement were compared. The results show that the U-Net trained with RGB and DSM achieves the best segmentation accuracy, with a mean intersection over union (MIoU) of 84.75% and a mean pixel accuracy (MPA) of 92.58%. In the canopy volume estimation task, however, the U-Net trained with DSM alone achieved the best accuracy, with a root mean square error (RMSE) of 0.410 m³, a relative root mean square error (rRMSE) of 6.40%, and a mean absolute percentage error (MAPE) of 4.74%. The deep learning-based segmentation method thus achieved higher accuracy in both the segmentation task and the canopy volume measurement task. For canopy volumes up to 7.50 m³, OTSU and RANSAC achieve RMSEs of 0.521 m³ and 0.580 m³, respectively.
Therefore, when a manually labeled dataset is available, using U-Net to segment the canopy region yields more accurate canopy volume measurements. If the cost of data labeling cannot be covered, ground segmentation with partitioned OTSU yields more accurate canopy volumes than RANSAC.
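The OTSU-based branch of the pipeline described above can be sketched in a few lines: threshold the height-above-ground raster derived from the DSM to separate canopy from ground pixels, then integrate the heights of canopy pixels to obtain a volume. This is a minimal illustrative sketch, not the paper's implementation: the flat-ground assumption, the NumPy-only Otsu routine, and the function names (`otsu_threshold`, `canopy_volume`) are ours, and a real DSM would need the partitioned ground model the authors describe rather than a single global ground height.

```python
import numpy as np

def otsu_threshold(values: np.ndarray, bins: int = 256) -> float:
    """Return the threshold that maximizes between-class variance (Otsu's method)."""
    hist, edges = np.histogram(values, bins=bins)
    hist = hist.astype(float)
    centers = (edges[:-1] + edges[1:]) / 2
    w0 = np.cumsum(hist)                 # pixel count at or below each candidate
    w1 = hist.sum() - w0                 # pixel count above each candidate
    m0 = np.cumsum(hist * centers)       # cumulative height mass below
    with np.errstate(divide="ignore", invalid="ignore"):
        mu0 = m0 / w0                    # mean of the lower (ground) class
        mu1 = (m0[-1] - m0) / w1         # mean of the upper (canopy) class
    between = np.nan_to_num(w0 * w1 * (mu0 - mu1) ** 2)
    return float(centers[int(np.argmax(between))])

def canopy_volume(dsm: np.ndarray, ground_height: float, pixel_area: float) -> float:
    """Segment canopy pixels by Otsu on height above ground, then integrate volume.

    dsm          : 2D raster of surface heights (m)
    ground_height: assumed flat ground elevation (m) -- a simplification
    pixel_area   : ground area covered by one DSM cell (m^2)
    """
    heights = dsm - ground_height
    canopy_mask = heights > otsu_threshold(heights)
    return float(heights[canopy_mask].sum() * pixel_area)
```

For example, a synthetic 20 × 20 DSM with a 10 × 10 block of 3 m canopy over flat ground at 0 m and a cell size of 0.01 m² yields a volume of 100 × 3 × 0.01 = 3.0 m³. Replacing the single `ground_height` with a per-tile ground estimate is what distinguishes the partitioned OTSU variant favored in the conclusion.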