Our eyes constantly indicate what we are interested in and act as a remote pointer for our attention. They are accurate, fast, and carry a lot of information about what we are doing. For these reasons, eye tracking is an interesting modality for human-computer interaction, and there has been much research on using the eyes to control interfaces. However, our eyes are our primary sensor for understanding the world around us and are not naturally used as a means of control. Eye-based interfaces can thus feel frustrating, uncomfortable, or counter-intuitive. What information can we harvest from the eyes without disrupting the sensory process? I will cover the tools necessary to perform eye tracking, then detail ways and strategies to create spontaneous and seamless eye-based interfaces.
Bio [ web: http://www.melodie-vidal.eu/ ]
Mélodie Vidal is a 4th-year PhD student in Human-Computer Interaction at Lancaster University, UK. Her research focuses on the use of eye movements to create natural and seamless interactive experiences. She recently spent 7 months at Nokia working on wearable computing interfaces. Mélodie holds a Master's degree in Artificial Intelligence and a degree in Software Engineering from INSA Toulouse, France.