ACCURACY ANALYSIS OF MEASURING CLOSE-RANGE IMAGE POINTS
Jurate Suziedelyte-Visockiene, Renata Bagdziunaite
Pages: 305-313
Published: 1 Jan 2013
Abstract: The investigations aim to estimate the accuracy of image processing using different image-point measurement methods. For this purpose, digital close-range images were processed with the photogrammetric software PhotoMod. Measurements were made in two modes: stereo and manual. Two or more overlapping images are matched when control and tie points are estimated. Images of two objects were taken for the experimental investigation, and control points and tie points were measured in either stereo or manual mode using the software. The control points of the first object are distributed over the surface of a smooth facade and over surfaces at several different levels. The image-matching process includes calculating the correlation coefficient, the vertical-parallax residuals and the root-mean-square error of the object points. After image transformation (adjustment) to the created 3D model, the accuracy of the measured points is determined. Together, these values characterize the precision of the close-range photogrammetric process. The achieved accuracy satisfies the requirements for creating a proper digital terrain model and for orthophoto generation.
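The correlation coefficient mentioned in the abstract is the standard similarity measure used when matching tie points between overlapping images. As a minimal sketch (not the PhotoMod implementation, whose internals are not described here), the normalized cross-correlation of two grayscale patches can be computed as follows; the patch values and function name are illustrative:

```python
def correlation_coefficient(patch_a, patch_b):
    """Pearson (normalized cross-) correlation of two equal-sized
    grayscale patches, each given as a list of rows of intensities."""
    a = [v for row in patch_a for v in row]
    b = [v for row in patch_b for v in row]
    if len(a) != len(b):
        raise ValueError("patches must have the same size")
    n = len(a)
    mean_a = sum(a) / n
    mean_b = sum(b) / n
    # covariance and variances about the patch means
    cov = sum((x - mean_a) * (y - mean_b) for x, y in zip(a, b))
    var_a = sum((x - mean_a) ** 2 for x in a)
    var_b = sum((y - mean_b) ** 2 for y in b)
    if var_a == 0 or var_b == 0:
        return 0.0  # a flat patch carries no texture to correlate
    return cov / (var_a * var_b) ** 0.5

# Example: the same pattern, uniformly brighter in the right image,
# still correlates perfectly (the measure is invariant to brightness offset).
left = [[10, 20], [30, 40]]
right = [[12, 22], [32, 42]]
print(correlation_coefficient(left, right))  # → 1.0
```

Values close to 1.0 indicate a good match between the candidate windows; matching software typically accepts a tie point only when the coefficient exceeds a chosen threshold.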
Keywords: close-range photogrammetry, correlation coefficient, manual mode, stereo mode
Cite as: Jurate Suziedelyte-Visockiene, Renata Bagdziunaite. ACCURACY ANALYSIS OF MEASURING CLOSE-RANGE IMAGE POINTS. Journal of International Scientific Publications: Materials, Methods & Technologies 7, 305-313 (2013). https://www.scientific-publications.net/en/article/1003061/
© 2026 The Author(s). This is an open access article distributed under the terms of the
Creative Commons Attribution License https://creativecommons.org/licenses/by/4.0/, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited. This permission does not cover any third-party copyrighted material that may appear in the work.
Disclaimer: The Publisher and/or the editor(s) are not responsible for the statements, opinions, and data contained in any published works. These are solely the views of the individual author(s) and contributor(s). The Publisher and/or the editor(s) disclaim any liability for injury to individuals or property arising from the ideas, methods, instructions, or products mentioned in the content.