Image fusion via feature residual and statistical matching

To address a shortcoming of traditional image fusion based on the discrete wavelet transform (DWT), namely unclear textural information, this study proposes an effective visible-light and infrared image fusion algorithm based on feature residual and statistical matching. First, the source images are decomposed into low-frequency and high-frequency coefficients by the DWT. Second, separate fusion schemes are designed for the two types of coefficients: the low-frequency coefficients are fused by a local feature-residual-based scheme to achieve adaptive fusion, while the high-frequency coefficients are fused by a local statistical-matching-based scheme to extract edge information effectively. Finally, the fused image is obtained by the inverse DWT. Experimental results demonstrate that the proposed method produces a more accurate fused image, yielding improved performance compared with existing methods.
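The overall pipeline described above can be sketched in a few lines of NumPy. Note that this is only an illustrative skeleton, not the authors' method: it uses a single-level Haar DWT, a local-energy-weighted average as a stand-in for the paper's feature-residual rule on the low-frequency band, and a "pick the coefficient with larger local activity" rule as a stand-in for the statistical-matching rule on the high-frequency bands. The function names (`haar_dwt2`, `fuse_dwt`, etc.) are hypothetical.

```python
import numpy as np

def haar_dwt2(img):
    """Single-level 2-D Haar DWT: returns LL and the (LH, HL, HH) bands."""
    a = img[0::2, 0::2]; b = img[0::2, 1::2]
    c = img[1::2, 0::2]; d = img[1::2, 1::2]
    ll = (a + b + c + d) / 2.0   # low-frequency approximation
    lh = (a + b - c - d) / 2.0   # horizontal detail
    hl = (a - b + c - d) / 2.0   # vertical detail
    hh = (a - b - c + d) / 2.0   # diagonal detail
    return ll, (lh, hl, hh)

def haar_idwt2(ll, bands):
    """Inverse of haar_dwt2 (exact reconstruction)."""
    lh, hl, hh = bands
    h, w = ll.shape
    out = np.empty((2 * h, 2 * w))
    out[0::2, 0::2] = (ll + lh + hl + hh) / 2.0
    out[0::2, 1::2] = (ll + lh - hl - hh) / 2.0
    out[1::2, 0::2] = (ll - lh + hl - hh) / 2.0
    out[1::2, 1::2] = (ll - lh - hl + hh) / 2.0
    return out

def local_energy(x, r=1):
    """Sum of squared values over a (2r+1)x(2r+1) window (reflect-padded)."""
    p = np.pad(x, r, mode='reflect')
    out = np.zeros_like(x)
    for dy in range(2 * r + 1):
        for dx in range(2 * r + 1):
            out += p[dy:dy + x.shape[0], dx:dx + x.shape[1]] ** 2
    return out

def fuse_dwt(img_a, img_b, eps=1e-12):
    """Fuse two equally sized, even-dimension grayscale images."""
    ll_a, hb_a = haar_dwt2(img_a)
    ll_b, hb_b = haar_dwt2(img_b)
    # Low-frequency band: adaptive weighted average driven by local
    # feature energy (a proxy for the paper's feature-residual scheme).
    ea = local_energy(ll_a - ll_a.mean())
    eb = local_energy(ll_b - ll_b.mean())
    w = ea / (ea + eb + eps)
    ll_f = w * ll_a + (1.0 - w) * ll_b
    # High-frequency bands: keep the coefficient with larger local
    # activity (a proxy for the paper's statistical-matching scheme).
    hb_f = tuple(np.where(local_energy(ha) >= local_energy(hb), ha, hb)
                 for ha, hb in zip(hb_a, hb_b))
    return haar_idwt2(ll_f, hb_f)
```

A useful sanity check on any such scheme is that fusing an image with itself must return the image unchanged, since both fusion rules then degenerate to the identity and the inverse DWT reconstructs exactly.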

