Object detection and tracking under complex environment using deep learning-based LPM


IET Computer Vision

Object detection and tracking in complex environments are challenging because of disturbances induced by background clutter, illumination changes, occlusions and other factors. Most traditional algorithms rely on hand-crafted features, which are not sufficiently robust in a complex environment. Moreover, detection and tracking are treated as separate processes, which reduces overall efficiency. In this study, a novel local probability model (LPM)-based mean shift (MS) algorithm is proposed to integrate object detection and tracking. The main contributions are: (i) a new framework combining LPM and MS is established to integrate object detection and tracking; (ii) for object detection, the training and prediction of the LPM are built on deep learning with stacked denoising autoencoders; (iii) for object tracking, an MS tracking algorithm leveraging the LPM is modified to improve tracking efficiency in a complex environment. Experimental results demonstrate that the proposed method is superior to colour-histogram-based MS and histograms-of-oriented-gradients-based MS in terms of robustness and tracking accuracy.
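The core idea of combining a per-pixel object-probability model with mean shift can be illustrated with a minimal sketch: the tracking window is repeatedly shifted toward the probability-weighted centroid of the pixels it covers until the shift falls below a threshold. This is a generic mean-shift step, not the authors' exact algorithm; the `prob_map` input stands in for whatever object likelihood the paper's LPM (here, the deep-learning-based detector) would produce, and the function and parameter names are illustrative.

```python
import numpy as np

def mean_shift_track(prob_map, center, win=(30, 30), max_iter=20, eps=1.0):
    """Shift a tracking window toward the centroid of a per-pixel
    object-probability map.

    prob_map : 2-D array of object likelihoods in [0, 1]
               (assumed to come from a detector such as an LPM).
    center   : initial (row, col) of the window centre.
    win      : (height, width) of the tracking window.
    """
    cy, cx = center
    hh, hw = win[0] // 2, win[1] // 2
    H, W = prob_map.shape
    for _ in range(max_iter):
        # Clip the window to the image bounds.
        y0, y1 = max(0, int(cy) - hh), min(H, int(cy) + hh + 1)
        x0, x1 = max(0, int(cx) - hw), min(W, int(cx) + hw + 1)
        patch = prob_map[y0:y1, x0:x1]
        total = patch.sum()
        if total <= 0:
            break  # no object evidence inside the window
        # Probability-weighted centroid of the window contents.
        ys, xs = np.mgrid[y0:y1, x0:x1]
        ny = (ys * patch).sum() / total
        nx = (xs * patch).sum() / total
        shift = np.hypot(ny - cy, nx - cx)
        cy, cx = ny, nx
        if shift < eps:
            break  # converged: the window barely moved
    return cy, cx
```

In a full tracker this update would run once per frame, with the detector refreshing `prob_map` so that detection and tracking share one representation, which is the kind of integration the abstract describes.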
