What is it about?

Attention helps us distinguish important speech from surrounding noise. Traditionally, it has been assumed that attention achieves this by tuning neural processes to relevant sounds, much as you tune into a particular radio frequency. Using a novel integration of EEG data (good temporal resolution) and fMRI data (good spatial resolution), we show that selective attentional mechanisms in natural scenes are more complex than assumed, because they are shaped by expectations, experiences, and listening conditions (see the video at the bottom of the page).


Why is it important?

Our findings highlight two key insights into the dynamics of auditory attention. First, attention rapidly facilitates perception by integrating and updating sensory expectations (i.e., it enhances predictive coding). Second, attention also operates on slower timescales, potentially reflecting plastic changes in the auditory regions of the brain. We suggest that both mechanisms act to optimise the differentiation between relevant and irrelevant sounds in the brain. These insights not only enhance our understanding of human cognition but also have practical implications, such as improving how AI transcription systems and sensory aids handle challenging, noisy speech conditions.

Perspectives

I hope this article will help us rethink how our nervous system processes sensory information. We seldom realise that our perception of the world is actually an elaborate illusion our brain creates. Because attention selects information relevant to our current behavioural goals and expectations, the brain can make sense of noisy information within a fraction of a second. Shifting the perspective towards studying natural scenes rich in information may provide a lens into how the brain actually achieves this.

Patrik Wikman
University of Helsinki

Read the Original

This page is a summary of: Attention to audiovisual speech shapes neural processing through feedback-feedforward loops between different nodes of the speech network, PLoS Biology, March 2024, PLOS,
DOI: 10.1371/journal.pbio.3002534.
You can read the full text (open access) via the DOI above.


Contributors

The following have contributed to this page