EyeControl - Innovative Eye Gaze Research
Short Description
Eye gaze research has recently made significant contributions to the fields of human visual perception, attention and non-verbal communication. Recent advances in affordable mobile eye-tracker technology are rapidly opening up new applications.
While research has so far focused primarily on usability studies and on assessing the effectiveness of advertisements or websites, the EyeControl project aims to create a new form of human-machine interaction in industrial manufacturing settings by introducing eye gaze as a modality for implicit and explicit interaction with industrial machines.
Eye-Tracking Approach for Eye-Gaze-Based Interaction
EyeControl aims to establish eye gaze as an interaction modality for industrial machinery, building upon eye-tracking techniques developed within the consortium's research team (opportunistic gaze sensing, real-time cognitive load estimation from gaze behaviour). It will ultimately develop a framework for gaze-only machine controls as an alternative to traditional augmented and mixed reality approaches to human-machine interaction in industrial manufacturing.
The project develops a methodological apparatus of universal, general-purpose, re-usable control components (e.g. for pointing, selecting and manipulating), serving as a repository of plug-and-play modules that assist the creation of gaze-based controls in a wide range of industrial application domains (construction, maintenance, repair, quality assurance, etc.).
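To make the idea of plug-and-play control components more tangible, the following minimal sketch shows what such a reusable component interface could look like. It is purely illustrative: the GazeSample, GazeControl and PointAt names, the screen-coordinate representation and the region-based pointing logic are assumptions for this sketch, not part of the EyeControl framework itself.

```python
# Illustrative sketch only; all names and data structures are hypothetical
# assumptions, not taken from the EyeControl project.
from abc import ABC, abstractmethod
from dataclasses import dataclass
from typing import Dict, Optional, Tuple


@dataclass
class GazeSample:
    """One gaze estimate in scene coordinates (pixels) with a timestamp (s)."""
    x: float
    y: float
    timestamp: float


class GazeControl(ABC):
    """Minimal interface a reusable gaze control component could expose."""

    @abstractmethod
    def update(self, sample: GazeSample) -> None:
        """Feed one gaze sample into the component."""

    @abstractmethod
    def triggered(self) -> bool:
        """True once the component recognises its interaction."""


class PointAt(GazeControl):
    """Hypothetical pointing component: reports which region is looked at."""

    def __init__(self, regions: Dict[str, Tuple[float, float, float, float]]):
        self.regions = regions            # name -> (x_min, y_min, x_max, y_max)
        self.current: Optional[str] = None

    def update(self, sample: GazeSample) -> None:
        # Find the first region containing the current gaze point, if any.
        self.current = next(
            (name for name, (x0, y0, x1, y1) in self.regions.items()
             if x0 <= sample.x <= x1 and y0 <= sample.y <= y1),
            None,
        )

    def triggered(self) -> bool:
        return self.current is not None
```

Components for selecting or manipulating would implement the same interface, so that applications in different domains could combine them like building blocks.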
Technologically, EyeControl builds upon mobile eye-tracking sensors to
- analyse gaze behaviour in real time,
- assess the perception and awareness of industrial workers based on cognitive load models, and
- develop explicit and implicit interaction triggers as gaze-based means of controlling industrial machinery (see the sketch below).
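To illustrate what an explicit interaction trigger might look like, the following minimal sketch implements a dwell-time selection, a well-known gaze-interaction technique that fires when the gaze rests within a small radius for a minimum duration. The class name, thresholds and fixation criterion are illustrative assumptions, not the project's actual mechanism.

```python
# Hedged sketch of one possible explicit trigger (dwell-time selection).
# Class name, thresholds and fixation criterion are assumptions, not the
# mechanism used in EyeControl.


class DwellSelectTrigger:
    """Fires once gaze stays within radius_px of a point for dwell_s seconds."""

    def __init__(self, radius_px: float = 40.0, dwell_s: float = 0.8):
        self.radius_px = radius_px
        self.dwell_s = dwell_s
        self._anchor = None   # (x, y) of the current fixation candidate
        self._start = None    # timestamp at which the candidate started

    def update(self, t: float, x: float, y: float) -> bool:
        """Feed one gaze sample; return True when a selection should fire."""
        if self._anchor is None or self._moved_away(x, y):
            # Gaze jumped elsewhere: start a new fixation candidate here.
            self._anchor, self._start = (x, y), t
            return False
        if t - self._start >= self.dwell_s:
            # Dwell time reached: fire once, then reset.
            self._anchor, self._start = None, None
            return True
        return False

    def _moved_away(self, x: float, y: float) -> bool:
        ax, ay = self._anchor
        return (x - ax) ** 2 + (y - ay) ** 2 > self.radius_px ** 2
```

The dwell threshold is the central design parameter of such a trigger: shorter dwell times speed up interaction but increase "Midas touch" false activations, which is why an explicit trigger of this kind would typically be combined with implicit, cognitive-load-aware signals such as those described above.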
Collaboration with World Market Leaders
EyeControl will be developed and validated together with world-leading Austrian and European industrial stakeholders (Industry 4.0) with the aim
- of optimising product quality based on human visual inspections and
- of optimising cognitive-load-sensitive, guided interactions in complex assembly tasks.
The research results are expected to lay the groundwork for a new generation of human-machine interaction modalities, with potential applications in other domains such as medical engineering and maintenance engineering.
Project Partners
Consortium Manager
Johannes Kepler University Linz
Further Project Partners
- Research Studios Austria Forschungsgesellschaft mbH
- voestalpine Polynorm GmbH & Co. KG
- voestalpine Stahl GmbH
Contact Address
Project Coordinator
Prof. Mag. Dr. Alois Ferscha
E-mail: ferscha@pervasive.jku.at