Entropic uncertainty inequalities on sparse representation

In this study, new entropic inequalities on sparse representation for pairs of bases are investigated. First, the generalised Shannon entropic uncertainty principle and the generalised Rényi entropic uncertainty principle are proved via a newly derived Hausdorff–Young inequality. These uncertainty principles show that signals cannot be arbitrarily concentrated, in the sense of minimum entropies, in pairs of bases. Second, conditions for the uniqueness of sparse representation under minimum entropies are given. This study also demonstrates that entropic greedy-like algorithms can approximately achieve the 'sparsest' representation with respect to minimum entropies. Third, the relations between the minimum l0 solution and the minimum entropy are discussed. It is shown that even when the sparsest representation in the l0-norm sense is obtained, the entropy is not always minimal; it may only be confined to an interval. These inequalities contribute to a better understanding of sparse representation in the sense of limited entropic uncertainty bounds. Finally, experiments are presented to verify the authors' ideas.
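The abstract's claim that a signal cannot be simultaneously concentrated in two bases can be illustrated with the classical Maassen–Uffink relation (reference [41] of the paper): for a pair of orthonormal bases related by a unitary U with mutual coherence μ = max|U_ij|, the Shannon entropies of the signal's energy distributions satisfy H(p) + H(q) ≥ −2 log μ. A minimal numerical check of this bound for the identity/Fourier pair (where μ = 1/√N) follows; it is an illustration of the standard relation, not the authors' generalised versions.

```python
import numpy as np

def shannon_entropy(p, eps=1e-12):
    """Shannon entropy (in nats) of a probability vector."""
    p = p[p > eps]
    return -np.sum(p * np.log(p))

N = 64
rng = np.random.default_rng(0)
x = rng.standard_normal(N) + 1j * rng.standard_normal(N)

# Energy distributions of x in the standard basis and the Fourier basis.
p = np.abs(x) ** 2
p /= p.sum()
X = np.fft.fft(x) / np.sqrt(N)   # unitary DFT
q = np.abs(X) ** 2
q /= q.sum()

# Mutual coherence of the identity/DFT pair is 1/sqrt(N), so the
# Maassen-Uffink bound is H(p) + H(q) >= -2 log(1/sqrt(N)) = log N.
mu = 1.0 / np.sqrt(N)
bound = -2.0 * np.log(mu)
print(shannon_entropy(p) + shannon_entropy(q), ">=", bound)
```

Concentrating the signal in one basis (e.g. a single spike) forces it to spread in the other, so the entropy sum can never drop below log N.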

Inspec keywords: signal representation; greedy algorithms; minimum entropy methods

Other keywords: generalised Rényi entropic uncertainty principle; l0-norm sparse signal representation; entropic greedy-like algorithm; generalised Shannon entropic uncertainty principle; Hausdorff–Young inequality; minimum entropic uncertainty inequality

Subjects: Signal processing and detection; Signal processing theory
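The 'greedy-like algorithms' mentioned in the abstract build a sparse representation one atom at a time. A minimal sketch of orthogonal matching pursuit (the classical greedy baseline of Pati et al., reference [26] of the paper, not the authors' entropic variant) shows the idea: repeatedly pick the dictionary atom most correlated with the residual, then re-fit by least squares. The dictionary size and sparsity level below are illustrative choices.

```python
import numpy as np

def omp(A, y, k):
    """Orthogonal matching pursuit: greedily select k atoms of A to fit y."""
    residual = y.copy()
    support = []
    for _ in range(k):
        # Pick the atom most correlated with the current residual.
        j = int(np.argmax(np.abs(A.T @ residual)))
        support.append(j)
        # Least-squares fit on the chosen support, then update the residual.
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coef
    x = np.zeros(A.shape[1])
    x[support] = coef
    return x

# Recover a 3-sparse vector from an overcomplete random dictionary.
rng = np.random.default_rng(1)
A = rng.standard_normal((40, 64))
A /= np.linalg.norm(A, axis=0)      # unit-norm atoms
x_true = np.zeros(64)
x_true[[5, 20, 40]] = [1.0, -2.0, 1.5]
y = A @ x_true
x_hat = omp(A, y, k=3)
print("residual:", np.linalg.norm(y - A @ x_hat))
```

The entropic variants discussed in the paper replace the l0-driven stopping/selection rule with entropy-based criteria, but the greedy structure is the same.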

DOI: 10.1049/iet-spr.2014.0072