What is it about?

There has long been an interest in understanding how we decide when and where to move our eyes, and psychophysical experiments have uncovered many underlying mechanisms. Under controlled laboratory conditions, objects in the scene play an important role in guiding our attention. Due to the visual complexity of the world around us, however, it is hard to assess experimentally how objects influence eye movements when observing dynamic real-world scenes. Computational models have proved to be a powerful tool for investigating visual attention, but existing models are either only applicable to images or restricted to predicting where humans look on average. Here, we present a computational framework for simulating where and when humans decide to move their eyes when observing dynamic real-world scenes.


Why is it important?

Using our framework, we can assess the influence of objects on the model predictions. We find that including object-based attention in the modeling increases the resemblance of simulated eye movements to human gaze behavior, showing that objects indeed play an important role in guiding our gaze when exploring the world around us. We hope that the availability of this framework encourages more research on attention in dynamic real-world scenes.


The world around us is dynamic and much more complex than the typical stimuli used in psychological experiments. Such experiments are usually restricted to static images or compositions of simple geometric shapes, and previous models describing how humans explore their environment typically only work in these reduced scenarios. With our modeling framework, we found a simple but powerful approach to test different assumptions about how the visual system might work.

Nicolas Roth
Technische Universität Berlin

Read the Original

This page is a summary of: Objects guide human gaze behavior in dynamic real-world scenes, PLoS Computational Biology, October 2023, PLOS, DOI: 10.1371/journal.pcbi.1011512.
You can read the full text:
