Consumer electronics control system based on hand gesture moment invariants

IET Computer Vision

Almost all consumer electronic equipment today relies on remote controls for its user interface. However, the variety of physical shapes and command sets across remote controls raises numerous problems: difficulty locating the required remote, confusion over button layouts, replacement costs and so on. A consumer electronics control system driven by hand gestures is an innovative user interface that resolves the complications of managing numerous remote controls for domestic appliances. Using one unified set of hand gestures, the system interprets the user's gestures as pre-defined commands and can control one or many devices simultaneously. The system has been tested and verified under both incandescent and fluorescent lighting conditions. The experimental results are encouraging: the system responds in real time and recognises the various gestures with high accuracy.
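The moment invariants named in the title are, in the standard formulation, Hu's seven moment invariants: shape descriptors of a segmented silhouette that are unchanged under translation, scaling and rotation, which makes them a natural feature for matching a hand pose against a fixed gesture vocabulary regardless of where the hand appears in the frame. The paper's exact pipeline is not reproduced here, but a minimal NumPy-only sketch of computing the invariants from a binary hand mask (the function name and implementation details are illustrative assumptions, not the authors' code) might look like:

```python
import numpy as np

def hu_moments(mask):
    """Hu's seven moment invariants of a 2-D binary mask.

    Central moments make the result translation-invariant, the eta()
    normalisation makes it scale-invariant, and Hu's seven algebraic
    combinations make it rotation-invariant.
    """
    ys, xs = np.nonzero(mask)
    m00 = float(len(xs))                 # area (zeroth raw moment)
    x = xs - xs.mean()                   # coordinates relative to centroid
    y = ys - ys.mean()

    def mu(p, q):                        # central moment mu_pq
        return np.sum(x**p * y**q)

    def eta(p, q):                       # scale-normalised central moment
        return mu(p, q) / m00 ** (1 + (p + q) / 2)

    n20, n02, n11 = eta(2, 0), eta(0, 2), eta(1, 1)
    n30, n03, n21, n12 = eta(3, 0), eta(0, 3), eta(2, 1), eta(1, 2)
    return np.array([
        n20 + n02,
        (n20 - n02)**2 + 4 * n11**2,
        (n30 - 3*n12)**2 + (3*n21 - n03)**2,
        (n30 + n12)**2 + (n21 + n03)**2,
        (n30 - 3*n12) * (n30 + n12) * ((n30 + n12)**2 - 3*(n21 + n03)**2)
            + (3*n21 - n03) * (n21 + n03) * (3*(n30 + n12)**2 - (n21 + n03)**2),
        (n20 - n02) * ((n30 + n12)**2 - (n21 + n03)**2)
            + 4 * n11 * (n30 + n12) * (n21 + n03),
        (3*n21 - n03) * (n30 + n12) * ((n30 + n12)**2 - 3*(n21 + n03)**2)
            - (n30 - 3*n12) * (n21 + n03) * (3*(n30 + n12)**2 - (n21 + n03)**2),
    ])
```

In a recogniser of this kind, the seven-element vector for each video frame would be compared (e.g. by a nearest-neighbour rule or, as the references suggest, a neural network classifier) against vectors precomputed for the gesture vocabulary; because shifting a mask on the pixel grid leaves its central moments unchanged, the descriptor is identical wherever the hand sits in the frame.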

http://iet.metastore.ingenta.com/content/journals/10.1049/iet-cvi_20060198