Accurate gaze estimation based on average-binary-connected-component-centroid



The iris centre is widely used in feature-based gaze estimation systems. However, the performance of existing iris centre localisation methods degrades on low-resolution images. To address this problem, an accurate regression-based gaze estimation system is proposed in which the iris centre is replaced by a novel average-binary-connected-component-centroid (ABCCC) feature. First, the ABCCC is obtained by averaging a series of binary connected component centroids, each computed at a different percentile grey threshold within the normalised eye region of interest. Then, the head pose is estimated with the OpenCV iterative algorithm. Finally, the ABCCC and head pose are mapped to the gaze through linear mapping functions regressed in the calibration phase. Compared with traditional methods based on the iris centre, the ABCCC is easier to compute and correlates more strongly with changes of gaze, which improves the accuracy of the gaze estimation system. Experimental results on the EYEDIAP database verify the effectiveness of the proposed method.
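The ABCCC computation described above can be sketched as follows. This is a minimal NumPy/SciPy illustration, not the authors' implementation: the particular percentile values, the assumption that iris/pupil pixels are the darkest in the eye region, and the choice to keep the largest connected component at each threshold are all illustrative assumptions.

```python
import numpy as np
from scipy import ndimage

def abccc(eye_roi, percentiles=(5, 10, 15, 20, 25)):
    """Average-binary-connected-component-centroid of a normalised
    grayscale eye region. Percentile values are illustrative."""
    centroids = []
    for p in percentiles:
        thr = np.percentile(eye_roi, p)        # percentile grey threshold
        binary = eye_roi <= thr                # dark pixels ~ iris/pupil
        labels, n = ndimage.label(binary)      # binary connected components
        if n == 0:
            continue
        # Keep the largest component, assumed to be the iris blob.
        sizes = ndimage.sum(binary, labels, range(1, n + 1))
        largest = int(np.argmax(sizes)) + 1
        cy, cx = ndimage.center_of_mass(binary, labels, largest)
        centroids.append((cx, cy))
    # Average the per-threshold centroids into one robust feature point.
    return tuple(np.mean(centroids, axis=0))
```

Averaging centroids over several thresholds is what makes the feature tolerant to low resolution: any single binarisation is noisy, but the mean centroid varies smoothly with gaze.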


