What is it about?

In this paper we investigate the leakage of sensitive information to third-party voice applications in voice assistant ecosystems, focusing on how such information is exposed through the conversational interface.


Why is it important?

Current privacy and security measures for third-party voice applications do not prevent the leakage of all types of sensitive information via the conversational interface. Effectively preventing this leakage remains an open challenge.

Perspectives

Using our testing infrastructure to interact with a subset of Google Actions and Alexa Skills, we found several data leakages, including third-party voice applications leaking personal, employment, and mental health data. We make key recommendations for redesigning voice assistant architectures to better prevent the leakage of sensitive information via the conversational interface of third-party voice applications in the future.
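To make the idea of detecting such leakages concrete, here is a minimal, hypothetical sketch (not the authors' actual testing infrastructure): a scanner that flags which sensitive-data categories a voice application's conversation transcript touches, using simple keyword patterns. The category names and keyword lists are illustrative assumptions only.

```python
import re

# Hypothetical keyword patterns per sensitive-data category
# (illustrative only; a real tool would use far richer detection).
SENSITIVE_PATTERNS = {
    "personal": re.compile(r"\b(name|address|birthday|phone number)\b", re.I),
    "employment": re.compile(r"\b(employer|salary|job title)\b", re.I),
    "mental_health": re.compile(r"\b(depression|anxiety|therapy)\b", re.I),
}

def flag_leakages(transcript):
    """Return the set of sensitive-data categories a transcript touches."""
    return {cat for cat, pat in SENSITIVE_PATTERNS.items()
            if pat.search(transcript)}

# Example: a skill prompt requesting personal and employment data.
prompt = "Tell me your name, your address, and your current employer."
print(sorted(flag_leakages(prompt)))  # → ['employment', 'personal']
```

A keyword scanner like this would only catch explicit requests; detecting subtler leakage through dialogue flow is exactly why the paper argues for architectural changes rather than surface-level filtering.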

Guillermo Suarez-Tangil
IMDEA Networks Institute

Read the Original

This page is a summary of: Leakage of Sensitive Information to Third-Party Voice Applications, July 2022, ACM (Association for Computing Machinery),
DOI: 10.1145/3543829.3544520.

