Arnon Amir, IBM Almaden Research Center
Eye detection is the problem of locating the eyes in video frames. Eye gaze tracking is the task of determining where the eyes are looking. Both have multiple applications. Eye detection is used for face detection and recognition, surveillance, iris recognition, video conferencing, autostereoscopic displays, and more. Commercial eye gaze tracking systems have been available for several decades and are used in psychology, ophthalmology, and for special purposes in critical tasks. Tracking gaze, or the point of regard on a computer screen, has also been shown to benefit many computer applications and could serve as a generic input device to improve Human-Computer Interaction (HCI). However, despite the long history of eye-movement research, most existing remote gaze trackers are still not adequate for daily HCI use. A major limitation of such systems is the need for manual initialization and user calibration at the beginning of every session, after which the user must not move his or her head. In this talk I will present the work we have done in this domain in the IBM Attentive Environments project, going from detection of eyes, through detection of eye contact, to a gaze tracking system that requires no user calibration and allows free head motion. The talk will include a live demonstration of the first prototype of an embedded system for eye detection. As time permits, I will describe several applications of eye gaze tracking for HCI.
Joint work with Carlos Morimoto, David Koons, Myron Flickner, and David Beymer.