What is it about?
Spoken languages are expressed through the audio-oral modality, while signed languages use the visuo-spatial modality. This study examines how language modality influences lexical access, that is, how a language is accessed in the brain. It also asks whether access crosses modalities in bilinguals: whether seeing a signed language activates a spoken language, and whether hearing a spoken language activates a signed language. To this end, we ran a series of eye-tracking experiments.
Why is it important?
Our findings show how access to spoken and signed languages unfolds over time and how the two languages interact in bilinguals. For signed language processing, the study teases apart the contributions of two components of signs, location and handshape, and the time-course of each: location is accessed before handshape.
Read the Original
This page is a summary of: Lexical access in bimodal bilinguals. Sign Language & Linguistics, July 2022, John Benjamins. DOI: 10.1075/sll.00070.vil
Resources
Language modality shapes the dynamics of word and sign recognition: journal article in Cognition, with data from the PhD thesis.
Cross-modal and cross-language activation in bilinguals reveals lexical competition even when words or signs are unheard or unseen: journal article in PNAS, with data from the same study.