What is it about?

This paper introduces a classification model that uses webcam-based eye-tracking data to detect Alzheimer's disease (AD). The aim is to develop lightweight, low-cost predictors for initial AD screening; previous research relied on high-end eye-trackers during picture-description and reading tasks.

Our webcam gaze classifier, Webcam-GRU, outperforms a majority-class baseline, indicating that webcam-based gaze data carries predictive signal for AD. This suggests that an affordable webcam, rather than an expensive eye-tracker, could support the development of more accessible screening tools for dementia. To our knowledge, ours is the first study to investigate webcam gaze data for AD classification, providing insight into its potential as a substitute for high-end eye-tracking data.

We trained our classifiers end-to-end with deep learning models, bypassing the challenge of hand-identifying meaningful features. Webcam-GRU performed better on reading tasks, suggesting that the approach suits tasks in which gaze patterns are similar across participants. For tasks with more varied gaze patterns, combining webcam-based gaze data with other modalities, such as language data, could improve classification.
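The summary above mentions a GRU-based classifier trained end-to-end on gaze sequences. The paper's actual implementation is not reproduced here; as an illustration only, the following minimal NumPy sketch shows the standard GRU recurrence applied step by step to a sequence of gaze points, with a sigmoid read-out for a binary (AD vs. control) decision. All names, dimensions, and the read-out layer are hypothetical and not taken from the paper.

```python
import numpy as np

def gru_step(x, h, params):
    """One GRU step: x is a gaze feature vector (e.g. estimated x/y
    screen coordinates), h is the hidden state carried across the
    gaze sequence. params holds the weight matrices and biases."""
    Wz, Uz, bz, Wr, Ur, br, Wh, Uh, bh = params
    sigmoid = lambda a: 1.0 / (1.0 + np.exp(-a))
    z = sigmoid(Wz @ x + Uz @ h + bz)              # update gate
    r = sigmoid(Wr @ x + Ur @ h + br)              # reset gate
    h_tilde = np.tanh(Wh @ x + Uh @ (r * h) + bh)  # candidate state
    return (1.0 - z) * h + z * h_tilde             # gated interpolation

def classify_gaze_sequence(seq, params, w_out, b_out):
    """Run the GRU over a (T, D) gaze sequence and return a
    probability for the positive (AD) class via a sigmoid read-out.
    This read-out head is an assumption for illustration."""
    h = np.zeros(params[0].shape[0])
    for x in seq:
        h = gru_step(x, h, params)
    logit = w_out @ h + b_out
    return 1.0 / (1.0 + np.exp(-logit))
```

In practice such a model would be trained with an autodiff framework rather than hand-written NumPy; the sketch only makes concrete how a recurrent model consumes a variable-length gaze trace without any hand-engineered fixation or saccade features.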


Why is it important?

This research matters for several reasons. First, it contributes to the development of lightweight, low-cost predictors that could serve as an initial screening tool for Alzheimer's disease (AD). By leveraging webcam-based eye-tracking, the study explores whether a common, affordable webcam can stand in for an expensive high-end eye-tracker. The results offer promising evidence that webcam gaze data carries predictive signal for AD classification, pointing toward more accessible screening tools for dementia.

Second, as the first investigation of webcam gaze data for AD classification, the study provides valuable insight into its potential as a substitute for high-end eye-tracking data, extending our understanding of gaze-based user classification tasks that previously relied on high-end eye-trackers. Training deep learning classifiers end-to-end addresses a key obstacle: webcam gaze data lacks the well-defined constructs, such as precise fixations and saccades, found in high-end eye-tracking data, which makes meaningful features hard to identify by hand.

Finally, the stronger performance on reading tasks suggests the approach is particularly suited to classification tasks where gaze patterns are similar across participants, while tasks with more varied gaze patterns may benefit from combining webcam-based gaze data with other modalities, such as language data. Overall, this research opens up possibilities for more accessible, cost-effective AD screening and sheds light on the potential of webcam gaze data for user classification tasks beyond AD alone.

Read the Original

This page is a summary of: Classification of Alzheimer's using Deep-learning Methods on Webcam-based Gaze Data, Proceedings of the ACM on Human-Computer Interaction, May 2023, ACM (Association for Computing Machinery).
DOI: 10.1145/3591126.
