Intersection detection and recognition for autonomous urban driving using a virtual cylindrical scanner

In this study, the authors propose an effective real-time approach for intersection detection and recognition during autonomous driving in an unknown urban environment. The authors' approach uses point cloud data acquired by a three-dimensional laser scanner mounted on the vehicle. Intersection detection and recognition are formulated as a classification problem whereby roads are classified as segments or intersections, and intersections are further classified as T-shaped or +-shaped. They first construct a novel model called a virtual cylindrical scanner for efficient feature-level representation of the point cloud data. They then use support vector machine (SVM) classifiers to solve the classification problem based on the extracted features. A series of experiments on real-world data sets and in a simulation environment demonstrates the effectiveness and robustness of the authors' approach, even in highly dynamic urban environments. They also performed simulation experiments to investigate the effects of several critical factors on the proposed approach, such as other vehicles on the road and the advance detection distance.
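The abstract describes a two-stage pipeline: a virtual cylindrical scanner converts the raw point cloud into a fixed-length feature vector, and SVM classifiers then label the current road region as a segment, a T-shaped intersection, or a +-shaped intersection. The sketch below (Python) illustrates that pipeline under stated assumptions; the angular-binning scheme, the free-range feature, the constants N_BINS and MAX_RANGE, and the placeholder training data are hypothetical and not taken from the paper. Only the use of SVM classifiers over scanner-derived features follows the abstract.

```python
# Minimal sketch of the abstract's pipeline, NOT the authors' exact formulation:
# (1) reduce a LIDAR point cloud to one feature per angular bin around the vehicle
#     (a simplified stand-in for the virtual cylindrical scanner), then
# (2) classify the frame as segment / T-shaped / +-shaped with an SVM.
import numpy as np
from sklearn.svm import SVC

N_BINS = 72        # assumed angular resolution: 5 degrees per bin
MAX_RANGE = 40.0   # assumed range cap in metres

def cylindrical_features(points):
    """points: (N, 3) array of x, y, z in the vehicle frame.
    Returns one normalised free-range value per angular bin: the distance to
    the nearest return in that direction, capped at MAX_RANGE."""
    x, y = points[:, 0], points[:, 1]
    angles = np.arctan2(y, x)                                  # -pi .. pi
    ranges = np.hypot(x, y)
    bins = ((angles + np.pi) / (2 * np.pi) * N_BINS).astype(int) % N_BINS
    feat = np.full(N_BINS, MAX_RANGE)
    for b in range(N_BINS):
        r = ranges[bins == b]
        if r.size:
            feat[b] = min(r.min(), MAX_RANGE)                  # nearest return blocks the ray
    return feat / MAX_RANGE                                    # normalise to [0, 1]

# Labels: 0 = road segment, 1 = T-shaped intersection, 2 = +-shaped intersection.
# Real training data would come from labelled point-cloud frames; random placeholders here.
X_train = np.vstack([cylindrical_features(np.random.rand(2000, 3) * 40 - 20)
                     for _ in range(10)])
y_train = np.random.randint(0, 3, size=10)

clf = SVC(kernel='rbf')                                        # SVM classifier, as in the abstract
clf.fit(X_train, y_train)
print(clf.predict(X_train[:1]))
```

In practice, each labelled frame would be processed by the same feature extractor before training, so that detection at run time reduces to one feature computation and one SVM prediction per frame.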

Inspec keywords: optical scanners; automated highways; image classification; feature extraction; support vector machines; shape recognition; data acquisition

Other keywords: intersection detection; dynamic urban environment; support vector machine classifier; real-world data sets; autonomous urban driving; feature-level representation; three-dimensional laser scanner; +-shaped intersections; real-time approach; feature extraction; simulation environment; road classification; unknown urban environment; point cloud data acquisition; virtual cylindrical scanner; intersection recognition; T-shaped intersections

Subjects: Image recognition; Computer vision and image processing techniques; Knowledge engineering techniques; Traffic engineering computing
