Publication:
Visual-SLAM Based 3-Dimensional Modelling of Indoor Environments

dc.authorscopusid: 59460556000
dc.authorscopusid: 56572918500
dc.authorwosid: İlçi, Veli/Aai-1611-2020
dc.contributor.author: Ozbayrak, Simla
dc.contributor.author: Ilci, Veli
dc.contributor.authorID: İlçi, Veli/0000-0002-9485-874X
dc.contributor.authorID: Özbayrak, Simla/0009-0003-5398-8972
dc.date.accessioned: 2025-12-11T01:20:38Z
dc.date.issued: 2024
dc.department [en_US]: Ondokuz Mayıs Üniversitesi
dc.department-temp [en_US]: [Ozbayrak, Simla; Ilci, Veli] Ondokuz Mayis Univ, Dept Geomat Engn, Samsun, Turkiye
dc.description [en_US]: İlçi, Veli/0000-0002-9485-874X; Özbayrak, Simla/0009-0003-5398-8972
dc.description.abstract [en_US]: Simultaneous localization and mapping (SLAM) is used in many fields to enable robots to map their surroundings and locate themselves in new circumstances. Visual-SLAM (VSLAM), which uses a camera sensor, and LiDAR-SLAM, which uses a light detection and ranging (LiDAR) sensor, are the most prevalent SLAM methods. Thanks to its benefits, including lower cost than LiDAR, low energy consumption, durability, and rich environmental data, VSLAM is currently attracting much attention. This study aims to produce a three-dimensional (3D) model of an indoor environment using image data captured by a stereo camera mounted on an unmanned ground vehicle (UGV). Easily measured objects from the field of operation were chosen to assess the generated model's accuracy. The actual dimensions of the objects were measured, and these values were compared to those derived from the VSLAM-based 3D model. When the data were evaluated, it was found that object dimensions derived from the model differed from the measured values by within ±2 cm. The surface accuracy of the produced 3D model was also analysed. For this investigation, flat wall and floor areas in the field were selected, and the plane accuracy of these areas was analysed. The plane accuracy values of the specified surfaces were determined to be below ±1 cm.
dc.description.sponsorship [en_US]: Ondokuz Mayis University Scientific Research Projects [PYO.MUH.1906.22.002, PYO.MUH.1908.22.079]
dc.description.sponsorship [en_US]: Acknowledgement: This study was funded by Ondokuz Mayis University Scientific Research Projects (Project Nos. PYO.MUH.1906.22.002 and PYO.MUH.1908.22.079). We also appreciate the LOCUS-TEAM members for their support during this study.
dc.description.woscitationindex: Emerging Sources Citation Index
dc.identifier.doi: 10.26833/ijeg.1459216
dc.identifier.endpage [en_US]: 376
dc.identifier.issn: 2548-0960
dc.identifier.issue [en_US]: 3
dc.identifier.scopus: 2-s2.0-85211119200
dc.identifier.scopusquality: Q2
dc.identifier.startpage [en_US]: 368
dc.identifier.uri: https://doi.org/10.26833/ijeg.1459216
dc.identifier.uri: https://hdl.handle.net/20.500.12712/43038
dc.identifier.volume [en_US]: 9
dc.identifier.wos: WOS:001375727000006
dc.language.iso [en_US]: en
dc.publisher [en_US]: Selçuk Univ Press
dc.relation.ispartof [en_US]: International Journal of Engineering and Geosciences
dc.relation.publicationcategory [en_US]: Article - International Peer-Reviewed Journal - Institutional Faculty Member
dc.rights [en_US]: info:eu-repo/semantics/openAccess
dc.subject [en_US]: Indoor Modelling; Visual-SLAM; Unmanned Ground Vehicle; Stereo Camera
dc.title [en_US]: Visual-SLAM Based 3-Dimensional Modelling of Indoor Environments
dc.type [en_US]: Article
dspace.entity.type: Publication
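
The abstract describes a surface-accuracy check in which flat wall and floor patches are selected and the deviation of the reconstructed points from a best-fit plane is reported. The sketch below is not taken from the paper; it only illustrates one common way to perform such a check, assuming the points are available as a NumPy array (the data, noise level, and function names are hypothetical). A plane is fitted by SVD and the RMS and maximum point-to-plane residuals are computed.

```python
import numpy as np

def fit_plane_svd(points):
    """Fit a plane to an (N, 3) array of 3D points by least squares.

    Returns the centroid and unit normal; the best-fit plane passes
    through the centroid with the returned normal.
    """
    centroid = points.mean(axis=0)
    # The normal is the direction of least variance of the centred
    # points, i.e. the right-singular vector with the smallest
    # singular value.
    _, _, vt = np.linalg.svd(points - centroid, full_matrices=False)
    normal = vt[-1]
    return centroid, normal

def plane_residual_stats(points, centroid, normal):
    """Signed point-to-plane distances and their RMS / max magnitude."""
    d = (points - centroid) @ normal
    return {"rms": float(np.sqrt(np.mean(d ** 2))),
            "max_abs": float(np.max(np.abs(d)))}

if __name__ == "__main__":
    # Hypothetical example: a noisy patch of a flat wall (metres),
    # with roughly 5 mm of surface noise.
    rng = np.random.default_rng(0)
    xy = rng.uniform(0.0, 2.0, size=(500, 2))
    z = 0.005 * rng.standard_normal(500)
    wall = np.column_stack([xy, z])

    c, n = fit_plane_svd(wall)
    stats = plane_residual_stats(wall, c, n)
    print(f"RMS deviation from plane: {stats['rms'] * 100:.2f} cm")
    print(f"Max deviation from plane: {stats['max_abs'] * 100:.2f} cm")
```

Reporting the RMS (or maximum) point-to-plane residual in centimetres gives a figure directly comparable to the sub-±1 cm plane accuracy quoted in the abstract, although the paper itself may use a different residual statistic.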

Files