Emerging Technologies

NaturalEyezer: an interaction system based on natural reading eye movement detection

Eye gaze-based interaction can free the hands, so many researchers have tried to use gaze behavior in human–computer interfaces. However, solving the “Midas touch” problem is essential for interactions based on human activities such as hand gestures, speech recognition, and eye gaze: when a user looks at a point, the system must determine whether the user is observing it deliberately or merely glancing at it without concentration, i.e., it must infer the user’s internal state. The simplest solution is to require a trigger action, such as pushing a button or issuing a voice command. Trigger actions are used frequently, but demanding many actions unrelated to the user’s intent undermines the convenience of a natural user interface. We therefore propose a novel method, and a prototype eye tracker, that recognizes the natural eye movements of reading in real time with high accuracy. Using this detection method, interactions based on natural eye movements can be designed that avoid the Midas touch problem. For example, when a user tries to read a sentence written in a foreign language, the system recognizes the reading eye movements, translates the sentence, and speaks the translation aloud automatically; the user’s desire to read is what initiates the interaction. The system thus supports the user’s reading selectively, only when they want to read, without requiring any unrelated trigger action. Our prototype eye tracker is also highly wearable and suitable for daily use. We believe this method can contribute to advancing research on human vision and enable natural gaze-based interactions in everyday life.
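
The detection method itself is not detailed in this abstract. As a rough illustration of the underlying idea only, the following Python sketch flags “reading-like” gaze from a stream of (time, x, y) samples by looking for runs of small forward saccades, the left-to-right scanning pattern characteristic of reading a line of text. All names and thresholds here (SACCADE_VEL_PX_S, MIN_RUN, and so on) are illustrative assumptions, not values or code from the NaturalEyezer system.

    # Illustrative sketch (not the authors' algorithm): flag "reading-like" gaze
    # from (t, x, y) samples by detecting runs of small forward saccades, as
    # occur when a reader scans a line of text from left to right.
    from typing import List, Tuple

    Sample = Tuple[float, float, float]  # (time in s, x in px, y in px)

    # Assumed thresholds for illustration only.
    SACCADE_VEL_PX_S = 1000.0  # gaze velocity above this counts as a saccade
    MIN_FORWARD_PX = 20.0      # minimum rightward jump for a reading saccade
    MAX_FORWARD_PX = 200.0     # larger jumps are treated as non-reading
    MAX_VERT_PX = 30.0         # reading saccades stay roughly on one line
    MIN_RUN = 3                # forward saccades needed to call it "reading"

    def detect_saccades(samples: List[Sample]) -> List[Tuple[float, float]]:
        """Return (dx, dy) displacements of velocity-threshold saccades."""
        if len(samples) < 2:
            return []
        saccades = []
        in_saccade = False
        start = samples[0]
        for prev, cur in zip(samples, samples[1:]):
            dt = cur[0] - prev[0]
            if dt <= 0:
                continue
            vel = ((cur[1] - prev[1]) ** 2 + (cur[2] - prev[2]) ** 2) ** 0.5 / dt
            if vel > SACCADE_VEL_PX_S and not in_saccade:
                in_saccade, start = True, prev
            elif vel <= SACCADE_VEL_PX_S and in_saccade:
                in_saccade = False
                saccades.append((cur[1] - start[1], cur[2] - start[2]))
        return saccades

    def looks_like_reading(samples: List[Sample]) -> bool:
        """True if the gaze shows a run of small forward (rightward) saccades."""
        run = 0
        for dx, dy in detect_saccades(samples):
            if MIN_FORWARD_PX <= dx <= MAX_FORWARD_PX and abs(dy) < MAX_VERT_PX:
                run += 1
                if run >= MIN_RUN:
                    return True  # reading detected; e.g. trigger translation here
            else:
                run = 0  # a return sweep or unrelated saccade resets the run
        return False

In a complete system a detector of this kind would run continuously on the eye-tracker stream, and a positive detection, rather than a button press or voice command, would trigger the translate-and-speak behavior for the sentence currently under the user’s gaze.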

Yoshio Ishiguro
The University of Tokyo

Jun Rekimoto
The University of Tokyo / Sony Computer Science Laboratories, Inc.