FQI: feature-based reduced-reference image quality assessment method for screen content images

In this study, a reduced-reference image quality assessment (IQA) method for screen content images, named the feature quality index (FQI), is proposed. The method builds on the observation that the human visual system is more sensitive to changes in features than to changes in intensity or structure. Reduced feature sets are first extracted from the reference and distorted images. To identify the features preserved in the distorted image, a feature matching process that requires fewer distance calculations, termed the reduced-distance method, is proposed. To reflect both the importance of the matched features and their distances, the inner product of the normalised scale and distance vectors is computed. Extensive comparisons are performed on two benchmark databases, SIQAD and QACS, against eight reduced-reference and nine full-reference state-of-the-art IQA techniques to demonstrate the consistency, accuracy, and robustness of the proposed FQI. Evaluation against subjective mean opinion scores shows that FQI outperforms the current state-of-the-art IQA techniques.
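The abstract only outlines the pipeline, so the following is a minimal sketch, not the authors' implementation, of the general idea it describes: match reduced feature sets from the reference and distorted images, then combine the matched features' scales and descriptor distances through a normalised inner product. The function names match_features and fqi_score, the Lowe-style ratio test, and the final 1 - dot(s, d) mapping are illustrative assumptions; feature descriptors and scales are assumed to have already been extracted (e.g. SIFT-style keypoints).

# Hypothetical sketch of a feature-based reduced-reference quality index.
import numpy as np

def match_features(ref_desc, dist_desc, ratio=0.8):
    """Greedy nearest-neighbour matching with a Lowe-style ratio test.
    ref_desc: (N, d) reference descriptors; dist_desc: (M, d) distorted-image descriptors."""
    matches = []
    if len(dist_desc) < 2:
        return matches
    for i, d in enumerate(ref_desc):
        # Euclidean distance from one reference descriptor to every
        # distorted-image descriptor (a full search; the paper's
        # reduced-distance method would prune this candidate set).
        dists = np.linalg.norm(dist_desc - d, axis=1)
        j, k = np.argsort(dists)[:2]
        if dists[j] < ratio * dists[k]:  # keep only unambiguous matches
            matches.append((i, j, dists[j]))
    return matches

def fqi_score(ref_scales, matches, eps=1e-8):
    """Illustrative quality index: inner product of the normalised scale
    vector (importance of matched features) and the normalised
    descriptor-distance vector (degradation of those features)."""
    if not matches:
        return 0.0
    s = np.array([ref_scales[i] for i, _, _ in matches], dtype=float)
    d = np.array([dist for _, _, dist in matches], dtype=float)
    s /= (np.linalg.norm(s) + eps)
    d /= (np.linalg.norm(d) + eps)
    # Large descriptor distances on important (large-scale) features
    # indicate stronger distortion; invert so higher means better quality.
    return float(1.0 - np.dot(s, d))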
