What is it about?

Automatic gaze estimation that does not rely on expensive commercial eye-tracking hardware can enable several applications in human-computer interaction (HCI) and human behavior analysis. In this work, we propose a real-time gaze estimation system that requires no person-dependent calibration, copes with illumination changes and head pose variations, and works over a wide range of distances from the camera. Our solution is based on a 3-D appearance-based method that processes images from a built-in laptop camera. Real-time performance is obtained by combining head pose information with geometrical eye features to train a machine learning algorithm.
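
To make the general idea concrete, below is a minimal, hypothetical sketch (Python with scikit-learn, not the authors' actual implementation) of the kind of regression setup described above: head-pose angles and geometrical eye features are stacked into a feature vector and fed to a regressor that predicts the on-screen gaze point. The specific feature layout, the synthetic data, and the choice of a random forest are illustrative assumptions.

```python
# Minimal sketch (assumptions, not the paper's exact pipeline): gaze estimation
# as regression from head pose + geometrical eye features to a 2-D gaze point.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Hypothetical per-frame feature vector:
#   [yaw, pitch, roll,                          # head pose angles
#    dx_left, dy_left, dx_right, dy_right]      # pupil-center offsets w.r.t. eye corners
n_frames = 2000
X = rng.normal(size=(n_frames, 7))

# Hypothetical targets: normalized 2-D gaze coordinates on the screen,
# generated here from a synthetic linear relation purely for illustration.
true_map = rng.normal(size=(7, 2))
y = X @ true_map + 0.05 * rng.normal(size=(n_frames, 2))

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Any regressor fast enough for per-frame inference could be used;
# a random forest is just one reasonable choice.
model = RandomForestRegressor(n_estimators=100, random_state=0)
model.fit(X_train, y_train)

pred = model.predict(X_test)
print("Mean absolute error (normalized screen units):",
      np.abs(pred - y_test).mean())
```

In a real system, the features would of course come from a face-landmark detector, a head-pose estimator, and a pupil-center tracker running on the webcam stream rather than from synthetic data.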


Why is it important?

Very few camera-based systems proposed in the literature are both real-time and robust. In this work, we also provide real examples of user-computer interaction.

Perspectives

The system is ready to be used in real-time applications. Future work will investigate how to improve performance by using state-of-the-art deep learning approaches.

Dario Cazzato
Université du Luxembourg

Read the Original

This page is a summary of: Real-time gaze estimation via pupil center tracking, Paladyn, Journal of Behavioral Robotics, February 2018, De Gruyter.
DOI: 10.1515/pjbr-2018-0002.
You can read the full text via the DOI above.
