EyeControl

The EyeControl project developed an innovative interaction system based on eye-tracking to enable efficient and intuitive human-machine interactions. Mobile eye-tracking sensors were used to reduce technological overload and integrate gaze control into various application systems.

Short Description

Enabling efficient interaction with future cognition-aware systems requires innovative interaction modalities that are suited to the growing need to integrate information processing quickly and seamlessly into everyday life and interactions. One challenge of the digital transformation is the increasing fusion of the digital and physical worlds. In the interaction between humans and machines, this challenge is being met with augmented reality, mixed reality and virtual reality solutions, which increasingly require methods of interaction that go beyond traditional input media (mouse, keyboard, voice input).

The EyeControl project pursued a device-free approach to explicit and implicit interaction between humans and machines, based on the use of the human gaze as the method of human interaction that comes closest to perception, information processing and thereby cognition. The human gaze is particularly well suited to implementing fast, efficient and convenient user interaction, as

  • it is the main sensor for perceiving information,
  • services based on eye gaze directly exploit natural user behaviour for truly user-centred design, and
  • it is more versatile than device-based methods of interaction.

Furthermore, gaze-based interaction

  • is free from wear and tear and requires no maintenance,
  • is hygienic,
  • enables remote access and interaction at a distance, which increases convenience and safety,
  • provides deep insights into the user's activities, and
  • keeps the hands free for manual activities carried out at the same time.

The EyeControl project created a natural and intuitive system for gaze control based on the evaluation and interpretation of gaze behaviour with regard to the underlying attention and intention mechanisms. The aim was not to replace conventional pointing and selection methods and devices, but to create new possibilities for interaction grounded in cognition and perception.
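
As an illustration of the kind of gaze-behaviour evaluation involved, the following is a minimal sketch of a dispersion-threshold (I-DT) fixation detector, a standard first step for turning raw gaze samples into the fixations that attention and intention models work with. The sample format and thresholds are assumptions for illustration, not the project's actual implementation.

    # Simplified dispersion-threshold (I-DT) fixation detection.
    # samples: chronologically ordered (t_ms, x, y) gaze points;
    # thresholds are illustrative assumptions, not project values.
    from dataclasses import dataclass

    @dataclass
    class Fixation:
        start_ms: float  # time of the first sample in the fixation
        end_ms: float    # time of the last sample in the fixation
        x: float         # centroid x of the fixation
        y: float         # centroid y of the fixation

    def detect_fixations(samples, max_dispersion=30.0, min_duration_ms=100.0):
        fixations, window = [], []

        def close_window():
            # Emit the current window as a fixation if it lasted long enough.
            if window and window[-1][0] - window[0][0] >= min_duration_ms:
                xs = [s[1] for s in window]
                ys = [s[2] for s in window]
                fixations.append(Fixation(window[0][0], window[-1][0],
                                          sum(xs) / len(xs), sum(ys) / len(ys)))

        for sample in samples:
            candidate = window + [sample]
            xs = [s[1] for s in candidate]
            ys = [s[2] for s in candidate]
            # Dispersion = horizontal spread + vertical spread (pixels here).
            if (max(xs) - min(xs)) + (max(ys) - min(ys)) <= max_dispersion:
                window = candidate          # sample still fits the fixation
            else:
                close_window()              # dispersion exceeded: close it
                window = [sample]           # start a new window
        close_window()                      # flush any trailing fixation
        return fixations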

The EyeControl project

  • used mobile eye-tracking sensors to reduce technological overload
  • identified characteristic gaze features (gaze behaviour, somatic indicators, gaze modelling) in order to achieve an independent gaze control system
  • supplemented conventional technical approaches with findings from cognitive psychology and HCI in order to distinguish conscious from unconscious processing and perception and so overcome the Midas touch problem (a baseline mitigation is sketched after this list)
  • developed a general eye-gaze-based control tool kit, adaptable to various fields of application
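
As a concrete example of the Midas touch problem named above: a common baseline mitigation, which a gaze-control tool kit would typically include, is dwell-time activation, where a target fires only after the gaze has rested on it longer than a typical spontaneous fixation. The sketch below uses hypothetical names and thresholds and is not the EyeControl tool kit's actual API.

    # Dwell-time selection sketch: a target activates only after the gaze
    # rests on it continuously for dwell_ms, so casual glances do not
    # trigger actions. Class name, rect format and threshold are
    # illustrative assumptions, not the EyeControl tool kit's API.
    class DwellTarget:
        def __init__(self, rect, on_activate, dwell_ms=600.0):
            self.rect = rect                # (x, y, width, height) in pixels
            self.on_activate = on_activate  # callback to run on selection
            self.dwell_ms = dwell_ms        # longer than a casual fixation
            self._enter_ms = None           # when the gaze entered the target
            self._fired = False             # latch: one dwell, one action

        def _contains(self, gx, gy):
            x, y, w, h = self.rect
            return x <= gx <= x + w and y <= gy <= y + h

        def update(self, gx, gy, now_ms):
            # Feed one gaze sample; fires the callback at most once per dwell.
            if not self._contains(gx, gy):
                self._enter_ms, self._fired = None, False  # gaze left: reset
                return
            if self._enter_ms is None:
                self._enter_ms = now_ms                    # gaze just arrived
            if not self._fired and now_ms - self._enter_ms >= self.dwell_ms:
                self._fired = True
                self.on_activate()

    # Usage: one DwellTarget per on-screen control, fed from the tracker loop.
    button = DwellTarget((100, 100, 80, 40), lambda: print("selected"))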

Working with voestalpine, the EyeControl system was applied in two quality-assurance use cases for cyber-physical systems (visual inspection and complex workflow support). The technologies developed were carried from initial laboratory implementations through iterative on-site field studies to a final on-site installation and evaluation of the system.

Publications

Brochure: Digital Technologies (2024)

Ready for the Future: Smart, Green and Visionary. Project Highlights of the Years 2016 to 2021. FFG: Olaf Hartmann, Anita Hipfinger, Peter Kerschl
Publisher: Federal Ministry for Climate Action, Environment, Energy, Mobility, Innovation, and Technology
English, 72 pages

Project Partners

Consortium leader

  • Johannes Kepler Universität Linz/Institut für Pervasive Computing

Additional consortium partners

  • Research Studios Austria FG – Pervasive Computing Applications (project coordinator)
  • voestalpine Stahl GmbH
  • voestalpine Automotive Components Schwäbisch Gmünd GmbH & Co KG