Longitudinal error improvement by visual odometry trajectory trail and road segment matching



Accurate and precise localisation is a key requirement for intelligent vehicles, as it is essential for reliable route planning during the drive. In this study, the authors aim to reduce the longitudinal positioning error, which remains a challenge in accurate localisation. To this end, they propose a data fusion method that integrates visual odometry (VO), noisy GPS, and road information from a publicly available digital map using a particle filter. The curve of the VO trajectory trail is matched against the curves of road segments to increase longitudinal accuracy. The method is validated on the KITTI dataset under different GPS noise conditions, and the results show improved localisation in both lateral and longitudinal positioning error.
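The particle-filter fusion described above can be sketched in its simplest form: particles are propagated with the VO displacement (predict) and re-weighted against the noisy GPS fix (update). This is a minimal illustrative sketch, not the authors' implementation; the function name, noise parameters, and the 2-D position-only state are assumptions, and the paper's filter additionally incorporates road-map information and curve matching, which are omitted here.

```python
import numpy as np

rng = np.random.default_rng(0)

def particle_filter_step(particles, weights, odom_delta, gps_fix,
                         gps_sigma=5.0, odom_sigma=0.5):
    """One predict/update cycle: propagate particles by the VO
    displacement, then re-weight them by the noisy GPS fix."""
    # Predict: apply the VO displacement plus motion noise to every particle.
    particles = particles + odom_delta + rng.normal(0.0, odom_sigma, particles.shape)
    # Update: Gaussian likelihood of each particle under the GPS measurement.
    d2 = np.sum((particles - gps_fix) ** 2, axis=1)
    weights = weights * np.exp(-d2 / (2.0 * gps_sigma ** 2))
    weights = weights / weights.sum()
    # Resample when the effective sample size collapses below half the particles.
    if 1.0 / np.sum(weights ** 2) < 0.5 * len(weights):
        idx = rng.choice(len(weights), size=len(weights), p=weights)
        particles = particles[idx]
        weights = np.full(len(weights), 1.0 / len(weights))
    return particles, weights

# Toy run: the vehicle moves 1 m east per step; GPS is noisy around the true pose.
particles = rng.normal(0.0, 5.0, (500, 2))
weights = np.full(500, 1.0 / 500)
true_pos = np.zeros(2)
for _ in range(20):
    true_pos = true_pos + np.array([1.0, 0.0])
    gps = true_pos + rng.normal(0.0, 5.0, 2)
    particles, weights = particle_filter_step(
        particles, weights, np.array([1.0, 0.0]), gps)
estimate = np.average(particles, axis=0, weights=weights)
```

Because the odometry increment is accurate while each individual GPS fix is noisy, the weighted estimate converges toward the true pose much faster than raw GPS alone, which is the same intuition behind the paper's VO/GPS fusion.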


