Eye tracking for interaction: adapting multimedia interfaces

In: Signal Processing to Drive Human-Computer Interaction: EEG and eye-controlled interfaces

This chapter describes how eye tracking can be used for interaction. The term eye tracking refers to the process of tracking the movement of the eyes in relation to the head in order to estimate the direction of eye gaze. The gaze direction can be related to the absolute head position and the geometry of the scene, such that a point-of-regard (POR) may be estimated. In the following, we refer to sequential estimations of the POR as a gaze signal, and to a single estimation as a gaze sample. In Section 5.1, we provide a basic description of the anatomy of the eye, which is required to understand the technologies behind eye tracking and their limitations. Moreover, we discuss popular technologies for eye tracking and explain how to process gaze signals for real-time interaction. In Section 5.2, we describe the unique challenges of eye tracking for interaction, as we use the eyes primarily for perception and risk overloading them with interaction. In Section 5.3, we survey graphical interfaces for multimedia access that have been adapted to work effectively with eye-controlled interaction. After discussing the state of the art in eye-controlled multimedia interfaces, we outline in Section 5.4 how the contextualized integration of gaze signals might proceed in order to provide richer interaction with eye tracking.
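The distinction between a gaze sample (one POR estimate) and a gaze signal (the sequence of estimates) can be illustrated with a minimal sketch. The `smooth_gaze` helper below and its moving-average window are illustrative assumptions for this page, not the filtering methods developed in Section 5.1.3:

```python
from collections import deque

def smooth_gaze(samples, window=5):
    """Moving-average filter over a stream of gaze samples.

    Each sample is an (x, y) point-of-regard estimate in screen
    coordinates; the returned list is the smoothed gaze signal.
    """
    buf = deque(maxlen=window)  # sliding window of recent samples
    smoothed = []
    for x, y in samples:
        buf.append((x, y))
        n = len(buf)
        # Average the samples currently in the window.
        smoothed.append((sum(p[0] for p in buf) / n,
                         sum(p[1] for p in buf) / n))
    return smoothed

# Noisy samples scattered around a fixation at (100, 100):
raw = [(98, 101), (102, 99), (100, 100), (97, 103), (103, 97)]
signal = smooth_gaze(raw, window=3)
```

For real-time eye-controlled interaction such a filter would run online, emitting one smoothed sample per incoming raw sample, which is why the window is bounded rather than spanning the whole recording.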

Chapter Contents:

  • 5.1 Tracking of eye movements
  • 5.1.1 Anatomy of the eye
  • 5.1.2 Techniques to track eye movements
  • 5.1.2.1 Electro-oculography
  • 5.1.2.2 Scleral search coils
  • 5.1.2.3 Video-oculography
  • 5.1.3 Gaze signal processing
  • 5.1.3.1 Calibration of the eye-tracking system
  • 5.1.3.2 Error modeling of gaze signals
  • 5.1.3.3 Filtering of gaze signals
  • 5.1.3.4 Online filtering of gaze signals for eye-controlled interaction
  • 5.2 Eye-controlled interaction
  • 5.2.1 Selection methods
  • 5.2.2 Unimodal interaction
  • 5.2.2.1 Eye pointing
  • 5.2.2.2 Eye typing
  • 5.2.3 Multimodal interaction
  • 5.2.4 Emulation software
  • 5.3 Adapted multimedia interfaces
  • 5.3.1 Adapted single-purpose interfaces
  • 5.3.1.1 Drawing with eye movements
  • 5.3.1.2 Writing with eye movements
  • 5.3.1.3 Gaming with eye movements
  • 5.3.1.4 Social media with eye movements
  • 5.3.2 Framework for eye-controlled interaction
  • 5.3.2.1 Gaze-adapted interface with eyeGUI
  • 5.3.2.2 Architecture of eyeGUI
  • 5.3.3 Adaptation of interaction with multimedia in the Web
  • 5.3.3.1 Eye-controlled Web browsing
  • 5.3.3.2 Introspection of dynamic web page interfaces
  • 5.3.3.3 Gaze-adapted interaction with webpages
  • 5.3.3.4 Challenges and limitations
  • 5.4 Contextualized integration of gaze signals
  • 5.4.1 Multimedia browsing
  • 5.4.2 Multimedia search
  • 5.4.3 Multimedia editing
  • 5.5 Summary
  • References

Inspec keywords: gaze tracking; graphical user interfaces

Other keywords: graphical interfaces; eye anatomy; eye tracking; absolute head position; eye gaze direction; POR gaze signals; multimedia interfaces; eye-controlled interaction; point-of-regard

Subjects: Graphical user interfaces
