Cooperative perception in autonomous ground vehicles using a mobile-robot testbed

Autonomous vehicles are limited in their perception to the field of view of their onboard sensors, and the environment may not be fully perceivable owing to occlusions and blind spots. To overcome this limitation, wireless vehicle-to-vehicle communication can be used to share sensory information about the surroundings among vehicles in the vicinity. This form of cooperative perception (CP) turns every vehicle into a moving sensor platform, extending each vehicle's field of view and line of sight. This study proposes such a technique for short-range CP. The system uses visual and inertial sensors, augmented by a positioning system, to perform cooperative relative localisation between two vehicles that share a common field of view, allowing one vehicle to locate the other in its own frame of reference. Information about objects in the field of view of one vehicle, localised using a monocular camera, is then relayed to the other vehicle over the communication link, as sketched below. A mobile multi-robot testbed was developed to emulate autonomous vehicles and to evaluate the proposed method experimentally in a series of driving-scenario test cases in which CP can be crucial to the safety and comfort of driving.
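
A minimal sketch of the relaying step described above (an illustration only, assuming a planar world; the `Pose2D` type and `transform_to_ego` helper are hypothetical names, not the authors' implementation): once cooperative relative localisation yields the remote vehicle's pose in the ego vehicle's frame, an object the remote vehicle has localised with its monocular camera can be expressed in the ego frame by a rigid-body transform.

```python
import numpy as np
from dataclasses import dataclass

@dataclass
class Pose2D:
    """Planar pose: position (x, y) in metres, heading theta in radians."""
    x: float
    y: float
    theta: float

def transform_to_ego(relative_pose: Pose2D, obj_xy: np.ndarray) -> np.ndarray:
    """Express a point observed in the remote vehicle's frame in the ego frame.

    relative_pose: remote vehicle's pose in the ego frame, as produced by
                   cooperative relative localisation (hypothetical interface).
    obj_xy:        object position (x, y) in the remote vehicle's frame,
                   e.g. from monocular ground-plane localisation.
    """
    c, s = np.cos(relative_pose.theta), np.sin(relative_pose.theta)
    R = np.array([[c, -s], [s, c]])            # rotation: remote frame -> ego frame
    t = np.array([relative_pose.x, relative_pose.y])
    return R @ obj_xy + t

# Example: the remote vehicle is 5 m ahead of the ego vehicle, rotated 90 deg,
# and sees an occluded pedestrian 2 m to its left. The ego vehicle recovers
# the pedestrian's position in its own frame of reference.
remote_in_ego = Pose2D(x=5.0, y=0.0, theta=np.pi / 2)
pedestrian_in_remote = np.array([0.0, 2.0])
print(transform_to_ego(remote_in_ego, pedestrian_in_remote))  # -> [3. 0.]
```

In a real system this transform would be applied to object messages received over the vehicle-to-vehicle link; timestamping and uncertainty handling are omitted from this sketch.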

Inspec keywords: vehicular ad hoc networks; mobile robots; multi-robot systems; cameras

Other keywords: inertial sensors; positioning system; wireless vehicle-to-vehicle communication; onboard sensors; autonomous vehicles; mobile multi-robot testbed; moving sensor platform; receive sensory information; cooperative relative localisation; perception capabilities; cooperative perception; autonomous ground vehicles

Subjects: Mobile robots; Mobile radio systems
