Recognizing human actions based on motion information and SVM
- Conference: 2nd IET International Conference on Intelligent Environments (IE 06)
- DOI: 10.1049/cp:20060648
- ISBN: 0 86341 663 2
- Location: Athens, Greece
- Conference date: 5-6 July 2006
- Format: PDF
In this paper, we propose a new system for human action recognition with a view to applications in security systems, man-machine communication and intelligent environments. The system is based on very simple features in order to achieve high-speed recognition in real-world applications. We chose three main techniques to build a system that can work in real time. First, we use motion history images and related features. Second, we use a template matching method instead of state-space methods, which require an expensive modelling process. Finally, we use a linear support vector machine (SVM) classifier for fast classification. Experimental results show that the system achieves good performance in human action recognition for real-time embedded applications, such as intelligent environments. (7 pages)
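The pipeline the abstract describes — accumulate a motion history image (MHI) over a clip, flatten it into a feature vector, and classify with a linear SVM — can be sketched as follows. This is an illustrative reconstruction, not the authors' implementation: the frame-difference threshold, decay duration and toy clips are assumptions, and scikit-learn's `LinearSVC` stands in for whatever SVM package the paper used.

```python
import numpy as np
from sklearn.svm import LinearSVC

def update_mhi(mhi, frame_diff, timestamp, duration=10, thresh=30):
    """Update a motion history image: moving pixels take the current
    timestamp; pixels older than `duration` decay to zero."""
    motion = frame_diff > thresh
    mhi = np.where(motion, float(timestamp), mhi)
    mhi[mhi < timestamp - duration] = 0.0
    return mhi

def mhi_features(frames, duration=10, thresh=30):
    """Build an MHI over a grayscale frame sequence and return a
    flattened, normalized feature vector."""
    mhi = np.zeros(frames[0].shape, dtype=float)
    for t in range(1, len(frames)):
        diff = np.abs(frames[t].astype(int) - frames[t - 1].astype(int))
        mhi = update_mhi(mhi, diff, t, duration, thresh)
    return (mhi / max(len(frames) - 1, 1)).ravel()

# Toy data (assumption for illustration): two "actions", a bright bar
# sweeping left-to-right vs. right-to-left across a 16x16 frame.
def make_clip(left_to_right):
    frames = [np.zeros((16, 16), dtype=np.uint8) for _ in range(8)]
    for t, f in enumerate(frames):
        f[:, t if left_to_right else 15 - t] = 255
    return frames

X = [mhi_features(make_clip(d)) for d in [True] * 5 + [False] * 5]
y = [0] * 5 + [1] * 5
clf = LinearSVC(C=1.0).fit(X, y)        # fast, linear classification
print(clf.predict([mhi_features(make_clip(True))]))  # class 0
```

Because the MHI collapses a whole clip into one image whose intensities encode recency of motion, the feature extraction is a single pass over the frames, which is what makes the template-matching-plus-linear-SVM combination cheap enough for real-time use.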
Inspec keywords: gesture recognition; image classification; support vector machines; image motion analysis; image matching
Subjects: User interfaces; Optical, image and video signal processing; Knowledge engineering techniques; Computer vision and image processing techniques