Paraforos, Dimitrios S.; Sharipov, Galibjon M.; Heiß, Andreas; Griepentrog, Hans W. (2022). Position accuracy assessment of a UAV-mounted Sequoia+ multispectral camera using a robotic total station. Article. License: CC BY.
Handle: https://hohpublica.uni-hohenheim.de/handle/123456789/16765
DOI: https://doi.org/10.3390/agriculture12060885
Keywords: Total station; Drone; Multispectral imagery; 3D positioning; Cross-track error

Abstract: Remote sensing data in agriculture originating from multispectral cameras mounted on unmanned aerial vehicles (UAVs) offer substantial information for assessing crop status and for developing prescription maps for site-specific variable-rate applications. The position accuracy of the multispectral imagery plays an important role in the quality of the final prescription maps and in how well these correspond to the specific spatial characteristics of the field. Although software products and dedicated algorithms can provide position corrections, they are time- and cost-intensive. This paper presents a methodology to assess the position accuracy of the imagery by mounting a target prism on the UAV and tracking it with a ground-based total station. A Parrot Sequoia+ multispectral camera, widely utilized in agriculture-related remote sensing applications, was used. Two sets of experiments were performed following routes along the north–south and east–west axes, and the cross-track error was calculated in all three planes as well as in three-dimensional (3D) space. The results indicated that the camera's D-GNSS receiver can provide imagery with a 3D position accuracy of up to 3.79 m, with the accuracy in the horizontal plane being higher than in the vertical ones.
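The record does not spell out how the cross-track error is computed, but one plausible reading is the perpendicular distance of each camera-geotagged position from the reference flight line, evaluated in the XY, XZ and YZ planes and in full 3D. The following Python sketch illustrates that reading; the coordinate values, the local east-north-up frame, and the function names are assumptions for illustration, not values or code from the study.

```python
import numpy as np

def cross_track_error_2d(p, a, b):
    # Perpendicular distance of point p from the line through a and b,
    # all given as 2D coordinates in one plane (e.g. XY).
    d = b - a
    r = p - a
    return abs(d[0] * r[1] - d[1] * r[0]) / np.linalg.norm(d)

def cross_track_error_3d(p, a, b):
    # Perpendicular distance of point p from the line through a and b in 3D space.
    d = b - a
    return np.linalg.norm(np.cross(d, p - a)) / np.linalg.norm(d)

# Illustrative values only (metres, local east-north-up frame): a roughly
# north-south pass with slight drift in easting and altitude, so that the
# projection onto every plane stays non-degenerate, plus one camera-reported position.
track_start = np.array([0.0,  0.0, 30.0])
track_end   = np.array([2.0, 50.0, 32.0])
camera_pos  = np.array([1.5, 25.0, 31.8])

xte_xy = cross_track_error_2d(camera_pos[[0, 1]], track_start[[0, 1]], track_end[[0, 1]])
xte_xz = cross_track_error_2d(camera_pos[[0, 2]], track_start[[0, 2]], track_end[[0, 2]])
xte_yz = cross_track_error_2d(camera_pos[[1, 2]], track_start[[1, 2]], track_end[[1, 2]])
xte_3d = cross_track_error_3d(camera_pos, track_start, track_end)
print(f"XY: {xte_xy:.2f} m, XZ: {xte_xz:.2f} m, YZ: {xte_yz:.2f} m, 3D: {xte_3d:.2f} m")
```

In the study itself the reference positions come from the total station tracking the UAV-mounted prism, so a full evaluation would compare each image geotag against that reference track and aggregate the per-image errors over a pass.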