Unit dual quaternion-based pose optimisation for visual runway observations

This study addresses the problem of estimating the pose of an aircraft runway from visual observations during the landing approach. The authors exploit the fact that the geodetic coordinates of most runways are known precisely and that runways carry highly visible markings. Runway observations can therefore increase situational awareness during the landing approach, providing additional navigation redundancy and reducing reliance on the global positioning system. A novel pose optimisation algorithm based on unit dual quaternions is proposed for runway corner observations obtained from a monocular camera. The estimated runway pose is further fused with an inertial navigation system in an extended Kalman filter. An open-source flight simulator is used to collect and process visual and flight data during the landing approach, demonstrating reliable runway pose estimates and an improved inertial navigation solution.
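The core representation in the proposed optimisation is the unit dual quaternion, which encodes a rigid-body rotation and translation in a single algebraic object. The minimal sketch below is not the authors' implementation; the function names, runway dimensions, and example pose are illustrative assumptions. It shows how a pose built from a unit rotation quaternion and a translation can be applied to runway corner points expressed in the runway frame, for instance before comparing them with corner observations from the monocular camera.

import numpy as np

def qmul(a, b):
    # Hamilton product of two quaternions given as [w, x, y, z].
    w1, x1, y1, z1 = a
    w2, x2, y2, z2 = b
    return np.array([
        w1*w2 - x1*x2 - y1*y2 - z1*z2,
        w1*x2 + x1*w2 + y1*z2 - z1*y2,
        w1*y2 - x1*z2 + y1*w2 + z1*x2,
        w1*z2 + x1*y2 - y1*x2 + z1*w2,
    ])

def qconj(q):
    # Quaternion conjugate.
    return q * np.array([1.0, -1.0, -1.0, -1.0])

def dualquat_from_rt(q_r, t):
    # Unit dual quaternion (q_r, q_d) from a unit rotation quaternion and a translation:
    # q_d = 0.5 * t * q_r, with t embedded as the pure quaternion [0, tx, ty, tz].
    t_quat = np.concatenate(([0.0], t))
    q_d = 0.5 * qmul(t_quat, q_r)
    return q_r, q_d

def transform_point(q_r, q_d, p):
    # Rotate and translate a 3-D point with the pose encoded by (q_r, q_d).
    p_quat = np.concatenate(([0.0], p))
    rotated = qmul(qmul(q_r, p_quat), qconj(q_r))[1:]   # sandwich product gives the rotated vector
    t = 2.0 * qmul(q_d, qconj(q_r))[1:]                 # translation recovered from the dual part
    return rotated + t

if __name__ == "__main__":
    # Hypothetical camera pose relative to the runway on final approach (values illustrative only).
    yaw = np.deg2rad(3.0)
    q_r = np.array([np.cos(yaw / 2), 0.0, 0.0, np.sin(yaw / 2)])
    t = np.array([0.0, -50.0, 900.0])                   # metres
    q_r, q_d = dualquat_from_rt(q_r, t)
    # Four runway corners in the runway frame (45 m wide, 3000 m long; assumed dimensions).
    corners = np.array([[-22.5, 0.0,    0.0],
                        [ 22.5, 0.0,    0.0],
                        [-22.5, 0.0, 3000.0],
                        [ 22.5, 0.0, 3000.0]])
    for c in corners:
        print(transform_point(q_r, q_d, c))

The transformed corner coordinates would then enter the pose optimisation as predictions to be matched against the detected runway corners in the image; the resulting pose estimate is what the study fuses with the inertial navigation system in the extended Kalman filter.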
