© The Institution of Engineering and Technology
In this study, the authors propose an effective real-time approach for intersection detection and recognition during autonomous driving in an unknown urban environment. The approach uses point cloud data acquired by a three-dimensional laser scanner mounted on the vehicle. Intersection detection and recognition are formulated as a classification problem whereby roads are classified as segments or intersections, and intersections are subclassified as T-shaped or +-shaped. The authors first construct a novel model called a virtual cylindrical scanner for efficient feature-level representation of the point cloud data. They then use support vector machine classifiers to resolve the classification problem according to the extracted features. A series of experiments on real-world data sets and in a simulation environment demonstrates the effectiveness and robustness of the approach, even in highly dynamic urban environments. The authors also performed simulation experiments to investigate the effects of several critical factors on the proposed approach, such as other vehicles on the road and the advance detection distance.
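The pipeline described above (point cloud → beam-wise feature vector → classifier) can be illustrated with a minimal sketch. The paper's virtual cylindrical scanner is not specified here, so the code below only assumes one plausible reading: each of a fixed number of azimuth beams around the vehicle records the nearest obstacle distance, with free beams saturating at the maximum range. The function name `beam_features`, the beam count, and the toy point cloud are all illustrative, not the authors' implementation; the resulting fixed-length vectors would then be fed to an SVM classifier (e.g. via a library such as scikit-learn) to distinguish segments from T- and +-shaped intersections.

```python
import numpy as np

def beam_features(points, n_beams=64, max_range=50.0):
    """Hypothetical virtual-cylindrical-scanner features.

    For each of n_beams azimuth sectors around the vehicle origin,
    record the distance to the nearest point falling in that sector;
    sectors with no points saturate at max_range (an open direction,
    e.g. a branch of an intersection).
    """
    x, y = points[:, 0], points[:, 1]
    ang = np.arctan2(y, x) % (2 * np.pi)           # azimuth in [0, 2*pi)
    dist = np.hypot(x, y)                          # range to each point
    sector = (ang / (2 * np.pi) * n_beams).astype(int) % n_beams
    feats = np.full(n_beams, max_range)
    for s, d in zip(sector, dist):                 # keep per-sector minimum
        if d < feats[s]:
            feats[s] = d
    return feats

# Toy cloud: obstacles only to the left and right of the vehicle,
# i.e. a road corridor open ahead and behind.
pts = np.array([[0.0, 5.0], [0.0, -5.0], [1.0, 5.0]])
f = beam_features(pts, n_beams=8)
```

On this toy input the sectors facing the side obstacles hold small distances while the forward and backward sectors stay at the maximum range, which is the kind of pattern a classifier could use to count open road branches.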