Uncertainty estimation of LiDAR matching aided by dynamic vehicle detection and high definition map


LiDAR matching between real-time point clouds and a pre-built point cloud map is a popular approach to providing an accurate localisation service for autonomous vehicles. However, its performance degrades severely in dense traffic scenes, where dynamic vehicles unavoidably introduce additional uncertainty into the matching result. The main cause is that the pre-built map can be occluded from the ego vehicle's LiDAR by the surrounding dynamic vehicles. A novel uncertainty of LiDAR matching (ULM) estimation method aided by dynamic vehicle (DV) detection and a high definition map is proposed in this Letter. In contrast to the conventional Hessian matrix-based ULM estimation approach, the proposed method estimates the ULM by explicitly modelling the surrounding DVs: the ULM is correlated both with the detected DVs and with the convergence feature of the matching algorithm. Evaluation on real data collected at an intersection with dense traffic shows that the proposed method estimates the ULM accurately.
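To make the contrast concrete, the conventional baseline approximates the pose covariance by the inverse of the matching cost's Hessian at the optimum; the Letter's idea is to further grow that uncertainty when dynamic vehicles occlude the map. The sketch below illustrates this general shape only; the diagonal Hessian, the occlusion ratio input, and the gain `k` are illustrative assumptions, not the authors' actual model.

```python
def matching_variances(hessian_diag):
    # Conventional Hessian-based uncertainty: at the cost optimum the
    # pose covariance is approximated by the inverse Hessian; for a
    # diagonal Hessian this reduces to 1/h per pose axis.
    return [1.0 / h for h in hessian_diag]

def inflate_by_occlusion(variances, occluded_ratio, k=4.0):
    # Hypothetical correction (gain k is an assumed tuning parameter):
    # enlarge the variance with the fraction of the map view that
    # detected dynamic vehicles block from the ego vehicle's LiDAR.
    scale = 1.0 + k * occluded_ratio
    return [v * scale for v in variances]

# 3-DoF pose (x, y, yaw): a stiff cost surface gives small variances.
var_free = matching_variances([400.0, 400.0, 1600.0])
# Half the map view occluded -> each variance grows by a factor of 3.
var_dense = inflate_by_occlusion(var_free, occluded_ratio=0.5)
```

The point of the sketch is that the Hessian alone cannot see occlusion: a matcher that converges confidently onto dynamic vehicles still yields a stiff Hessian, so an occlusion-aware term is needed on top.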


