What is it about?

Spoken languages are expressed through the audio-oral modality, while signed languages use the visuo-spatial modality. This study examines how language modality shapes lexical access in the brain. It also explores cross-modal access: whether seeing a signed language activates a spoken language, and whether hearing a spoken language activates a signed language. To this end, we ran a series of eye-tracking experiments.


Why is it important?

Our findings show how spoken and signed language access unfolds over time and how these languages relate in bilinguals. For signed language processing, the study teases apart the contribution of two components of signs, location and handshape, and the time-course of each (location is accessed before handshape).

Perspectives

This article is a summary of my PhD thesis, supervised by Dr. Manuel Carreiras and Dr. Brendan Costello. My aim was to disentangle the connections between a bilingual's languages when those connections cannot rely on phonological overlap or any kind of shared phonology (words and signs cannot sound or look like each other).

Saul Villameriel
Centro de Normalización Lingüística de la Lengua de Signos Española, CNLSE

Read the Original

This page is a summary of: Lexical access in bimodal bilinguals, Sign Language & Linguistics, July 2022, John Benjamins,
DOI: 10.1075/sll.00070.vil.
