Publication:
High Definition 3D Map Creation Using GNSS/IMU Sensor Integration to Support Autonomous Vehicle Navigation

dc.authorscopusid: 56572918500
dc.authorscopusid: 7102287794
dc.contributor.author: Ilci, V.
dc.contributor.author: Tóth, C.
dc.date.accessioned: 2020-06-21T12:18:33Z
dc.date.available: 2020-06-21T12:18:33Z
dc.date.issued: 2020
dc.department: Ondokuz Mayıs Üniversitesi
dc.department-temp: [Ilci] Veli, Department of Geomatics Engineering, Ondokuz Mayıs Üniversitesi, Samsun, Turkey; [Tóth] Charles K., College of Engineering, Columbus, OH, United States
dc.description.abstract: Recent developments in sensor technologies such as Global Navigation Satellite Systems (GNSS), Inertial Measurement Units (IMU), Light Detection and Ranging (LiDAR), radar, and cameras have enabled state-of-the-art autonomous systems, such as driverless vehicles and UAS (Unmanned Aerial Systems) swarms. These systems require accurate object-space information about the physical environment around the platform. This information is generally provided by a suitable selection of sensors, including sensor types and capabilities, the number of sensors, and their spatial arrangement. Since these sensor technologies have different error sources and characteristics, rigorous sensor modeling is needed to eliminate or mitigate errors and obtain an accurate, reliable, and robust integrated solution. Mobile mapping systems are similar to autonomous vehicles in that both reconstruct the environment around the platform; however, they differ considerably in operation and objectives. Mobile mapping vehicles use professional-grade sensors, such as geodetic-grade GNSS, tactical-grade IMUs, mobile LiDAR, and metric cameras, and the solution is computed in post-processing. In contrast, autonomous vehicles use simple, inexpensive sensors, require real-time operation, and are primarily concerned with identifying and tracking moving objects. The main objective of this study was to assess the performance potential of autonomous vehicle sensor systems for producing high-definition maps, using only Velodyne sensor data to create accurate point clouds; no other sensor data were considered in this investigation. The results confirm that cm-level accuracy can be achieved. © 2020 by the authors. Licensee MDPI, Basel, Switzerland.
dc.identifier.doi: 10.3390/s20030899
dc.identifier.issn: 1424-8220
dc.identifier.issue: 3
dc.identifier.pmid: 32046232
dc.identifier.scopus: 2-s2.0-85079301514
dc.identifier.scopusquality: Q2
dc.identifier.uri: https://doi.org/10.3390/s20030899
dc.identifier.volume: 20
dc.identifier.wos: WOS:000517786200323
dc.identifier.wosquality: Q2
dc.language.iso: en
dc.publisher: MDPI AG, Basel, Switzerland
dc.relation.ispartof: Sensors
dc.relation.journal: Sensors
dc.relation.publicationcategory: Article - International Peer-Reviewed Journal - Institutional Faculty Member
dc.rights: info:eu-repo/semantics/openAccess
dc.subject: Autonomous Vehicle
dc.subject: LiDAR
dc.subject: Mobile Mapping
dc.subject: Point Cloud
dc.subject: Sensor Fusion
dc.title: High Definition 3D Map Creation Using GNSS/IMU Sensor Integration to Support Autonomous Vehicle Navigation
dc.type: Article
dspace.entity.type: Publication