Aerodrome situational awareness of unmanned aircraft: an integrated self-learning approach with Bayesian network semantic segmentation

It is expected that soon there will be a significant number of unmanned aerial vehicles (UAVs) operating side-by-side with manned civil aircraft in national airspace systems. To integrate UAVs safely with civil traffic, a number of challenges must first be overcome. This study investigates situational awareness for the autonomous taxiing of UAVs in an aerodrome environment. The work is based on real outdoor experimental data collected at Walney Island Airport in the UK, and it aims to further develop and test autonomous taxiing of UAVs in a challenging outdoor environment. To address practical issues arising in an outdoor aerodrome, such as camera vibration, taxiway feature extraction, and unknown obstacles, the authors develop an integrated approach that combines Bayesian-network-based semantic segmentation with a self-learning method to enhance the situational awareness of UAVs. Detailed analysis of the outdoor experimental data shows that the integrated method developed in this study improves the robustness of situational awareness for autonomous taxiing.
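The abstract describes fusing a per-pixel semantic segmentation with a Bayesian network and a self-learning update. The sketch below is a minimal, hedged illustration of that general idea only, not the authors' implementation: the class labels, Gaussian colour-likelihood model, fusion rule, and confidence-gated prior update are illustrative assumptions introduced here for clarity.

```python
# Minimal sketch (assumed, not the paper's method): per-pixel Bayesian fusion of a
# colour-likelihood classifier with a slowly self-learned prior for taxiway
# segmentation. All class names and parameters below are illustrative.
import numpy as np

CLASSES = ["taxiway", "grass", "obstacle"]          # assumed label set

def likelihood(features, means, covs):
    """Gaussian class likelihoods for per-pixel feature vectors of shape (N, D)."""
    probs = []
    for mu, cov in zip(means, covs):
        diff = features - mu
        inv = np.linalg.inv(cov)
        norm = np.sqrt((2 * np.pi) ** mu.size * np.linalg.det(cov))
        expo = np.einsum("ni,ij,nj->n", diff, inv, diff)
        probs.append(np.exp(-0.5 * expo) / norm)
    return np.stack(probs, axis=1)                  # (N, n_classes)

def bayes_fuse(prior, lik):
    """Posterior proportional to prior x likelihood, normalised per pixel."""
    post = prior * lik
    return post / post.sum(axis=1, keepdims=True)

def self_learn(prior, posterior, rate=0.1):
    """Adapt the prior toward confident posteriors (a simple self-learning step)."""
    confident = posterior.max(axis=1, keepdims=True) > 0.9
    return np.where(confident, (1 - rate) * prior + rate * posterior, prior)

# Toy usage: random RGB features standing in for a 4x4 image patch.
rng = np.random.default_rng(0)
feats = rng.random((16, 3))
means = [np.array([0.5, 0.5, 0.5]),                 # "taxiway" colour centre (assumed)
         np.array([0.2, 0.6, 0.2]),                 # "grass"
         np.array([0.1, 0.1, 0.1])]                 # "obstacle"
covs = [0.05 * np.eye(3)] * 3
prior = np.full((16, 3), 1 / 3)                     # uniform prior over classes

post = bayes_fuse(prior, likelihood(feats, means, covs))
prior = self_learn(prior, post)                     # carried forward to the next frame
labels = np.array(CLASSES)[post.argmax(axis=1)]
print(labels.reshape(4, 4))
```

In this toy setup the prior carried between frames plays the role of accumulated scene knowledge; gating the update on high-confidence pixels is one common way to keep a self-learning loop from reinforcing its own mistakes, though the paper's actual mechanism may differ.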

Inspec keywords: autonomous aerial vehicles; image segmentation; learning (artificial intelligence); aerospace computing; air traffic; control engineering computing; Bayes methods

Other keywords: Bayesian network semantic segmentation; manned civil aircraft; UK; unmanned aerial vehicles; real outdoor experimental data collection; unmanned aircraft; outdoor aerodrome environment; integrated self-learning approach; civil traffic; national airspace systems; UAV autonomous taxiing; Walney Island Airport; aerodrome situational awareness

Subjects: Aerospace control; Telerobotics; Control engineering computing; Computer vision and image processing techniques; Aerospace engineering computing; Other topics in statistics; Mobile robots
