Large-scale mapping in complex field scenarios using an autonomous car

Filipe Mutz, Lucas P. Veronese, Thiago Oliveira-Santos, Edilson De Aguiar, Fernando A. Auat Cheein, Alberto Ferreira De Souza

Research output: Contribution to journal › Article

40 Citations (Scopus)


© 2015 Elsevier Ltd. All rights reserved.

In this paper, we present an end-to-end framework for precise large-scale mapping with applications in autonomous driving. In particular, we study the problem of mapping complex environments whose features change from tree-lined streets to urban areas with dense traffic. The robotic car is equipped with an odometry sensor, a Velodyne HDL-32E 3D LiDAR, an IMU, and a low-cost GPS, and the data generated by these sensors are integrated in a pose-based GraphSLAM estimator. A new strategy for identifying and correcting odometry data using evolutionary algorithms is presented; it makes odometry data significantly more consistent with GPS. Loop closures are detected using GPS data, and GICP, a 3D point cloud registration algorithm, is used to estimate the displacement between different traversals of the same region. After path estimation, 3D LiDAR data is used to build an occupancy grid map of the environment. A detailed mathematical description of how occupancy evidence can be calculated from the point clouds is given, and a submapping strategy to handle memory limitations is presented as well. The proposed framework is tested in three real-world environments of different sizes and characteristics: a parking lot, a university beltway, and a city neighborhood. In all cases, satisfactory maps were built, with precise loop closures even when the vehicle traveled long distances between them.
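The abstract mentions computing occupancy evidence from point clouds. As a minimal sketch (not the paper's exact formulation), the standard approach accumulates log-odds evidence per grid cell: cells a LiDAR beam passes through gain "free" evidence, and the cell containing the beam endpoint gains "occupied" evidence. The increments `L_OCC` and `L_FREE` below are hypothetical tuning values, and the 2-D grid is a simplification of the paper's setting.

```python
import math

L_OCC, L_FREE = 0.85, -0.4  # log-odds increments per observation (hypothetical tuning)

def bresenham(x0, y0, x1, y1):
    """Grid cells on the line from (x0, y0) to (x1, y1), endpoint excluded."""
    cells = []
    dx, dy = abs(x1 - x0), abs(y1 - y0)
    sx, sy = (1 if x1 > x0 else -1), (1 if y1 > y0 else -1)
    err = dx - dy
    x, y = x0, y0
    while (x, y) != (x1, y1):
        cells.append((x, y))
        e2 = 2 * err
        if e2 > -dy:
            err -= dy
            x += sx
        if e2 < dx:
            err += dx
            y += sy
    return cells

def update_grid(grid, origin, hits, resolution=1.0):
    """Integrate one scan: origin and hit points are (x, y) in metres."""
    ox, oy = int(origin[0] / resolution), int(origin[1] / resolution)
    for hx, hy in hits:
        cx, cy = int(hx / resolution), int(hy / resolution)
        for cell in bresenham(ox, oy, cx, cy):          # beam traversed these cells
            grid[cell] = grid.get(cell, 0.0) + L_FREE   # free-space evidence
        grid[(cx, cy)] = grid.get((cx, cy), 0.0) + L_OCC  # occupied evidence at endpoint

def occupancy_prob(log_odds):
    """Convert accumulated log-odds back to an occupancy probability."""
    return 1.0 - 1.0 / (1.0 + math.exp(log_odds))

# Two beams from the origin hitting the same obstacle cell at (5, 0):
grid = {}
update_grid(grid, (0.0, 0.0), [(5.0, 0.0), (5.0, 0.0)])
print(occupancy_prob(grid[(5, 0)]))  # endpoint cell: probability above 0.5
print(occupancy_prob(grid[(2, 0)]))  # traversed cell: probability below 0.5
```

Because evidence is additive in log-odds space, repeated observations of the same cell sharpen its probability toward 0 or 1, which is what makes the representation robust to single-scan noise.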
Original language: English
Pages (from-to): 439-462
Number of pages: 24
Journal: Expert Systems with Applications
Publication status: Published - 15 Mar 2016

