Energy-efficient signal acquisition in wireless sensor networks: a compressive sensing framework

The sampling rate of the sensors in a wireless sensor network (WSN) largely determines the network's energy consumption, since most of the energy is used for sampling and transmission. To save energy in WSNs and thus prolong the network lifetime, the authors present a novel approach based on the compressive sensing (CS) framework for monitoring 1-D environmental information in WSNs. The proposed technique exploits CS theory to minimise the number of samples taken by the sensor nodes. An innovative feature of the approach is a new random sampling scheme that accounts for the causality of sampling, hardware limitations and the trade-off between the degree of randomisation and computational complexity. In addition, a sampling rate indicator (SRI) feedback scheme is proposed that enables each sensor to adjust its sampling rate so as to maintain acceptable reconstruction performance while minimising the number of samples. A significant reduction in the number of samples required to achieve an acceptable reconstruction error is demonstrated using real data gathered by a WSN located in the Hessle Anchorage of the Humber Bridge.
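To make the pipeline described in the abstract concrete, the following is a minimal, self-contained Python sketch of the general idea only: a sensor node records a random subset of the Nyquist-rate samples of a 1-D signal block, a sink reconstructs the block by sparse recovery (orthogonal matching pursuit over a DCT basis, chosen here purely for illustration), and a simple feedback rule nudges the sampling rate up or down according to reconstruction quality. The synthetic signal model, the omp helper and the target_nrmse/rate-update constants are assumptions made for this sketch; the paper's actual random sampling scheme, hardware constraints and SRI protocol are not reproduced here.

# Minimal sketch (not the authors' implementation): random sampling of a 1-D block,
# OMP reconstruction in a DCT basis, and a simple sampling-rate feedback loop.
import numpy as np
from scipy.fft import idct


def omp(A, y, n_nonzero):
    """Orthogonal Matching Pursuit: greedily select columns of A to explain y."""
    residual = y.copy()
    support = []
    coef = np.zeros(0)
    for _ in range(n_nonzero):
        # Pick the atom most correlated with the current residual.
        j = int(np.argmax(np.abs(A.T @ residual)))
        if j not in support:
            support.append(j)
        # Re-fit all selected atoms by least squares and update the residual.
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coef
    x = np.zeros(A.shape[1])
    x[support] = coef
    return x


rng = np.random.default_rng(0)
N = 256                                       # length of one acquisition block
Psi = idct(np.eye(N), axis=0, norm="ortho")   # inverse-DCT sparsifying basis

# Synthetic 1-D "environmental" signal, sparse in the DCT domain (illustrative only).
s_true = np.zeros(N)
s_true[rng.choice(N, 8, replace=False)] = rng.normal(0, 10, 8)
x_true = Psi @ s_true

sampling_rate = 0.15                          # fraction of the N Nyquist-rate samples taken
target_nrmse = 0.05                           # hypothetical quality target fed back by the sink

for block in range(5):
    # Sensor side: take only M randomly timed samples instead of all N.
    M = max(1, int(round(sampling_rate * N)))
    idx = np.sort(rng.choice(N, M, replace=False))
    y = x_true[idx]                           # the only values actually "sampled"

    # Sink side: reconstruct the full block from the M samples via sparse recovery.
    A = Psi[idx, :]                           # measurement matrix = selected rows of Psi
    s_hat = omp(A, y, n_nonzero=16)
    x_hat = Psi @ s_hat

    nrmse = np.linalg.norm(x_true - x_hat) / np.linalg.norm(x_true)

    # Rate feedback: request more samples if quality is poor, fewer if there is headroom
    # (a stand-in for the paper's SRI feedback scheme).
    if nrmse > target_nrmse:
        sampling_rate = min(1.0, sampling_rate * 1.25)
    else:
        sampling_rate = max(0.05, sampling_rate * 0.9)
    print(f"block {block}: M={M}, NRMSE={nrmse:.3f}, next rate={sampling_rate:.2f}")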

