What is it about?

Algorithmic systems that recommend content often lack transparency about how they arrive at their suggestions. One area in which recommender systems are increasingly prevalent is online news distribution. In this paper, we explore how the lack of transparency of (news) recommenders can be addressed by involving users in the design of interface elements. In the context of automated decision-making, legislative frameworks such as the GDPR in Europe introduce a specific conception of transparency, granting 'data subjects' specific rights and imposing obligations on service providers.

Why is it important?

An important question is how people using personalized recommender systems relate to the issue of transparency, not as legal data subjects but as users. This paper builds on a two-phase study of how users conceive of transparency and related issues in the context of algorithmic news recommenders. We organized co-design workshops to elicit participants' 'algorithmic imaginaries' and invited them to ideate interface elements for increased transparency. This revealed the importance of combining legible transparency features with features that increase user control. We then evaluated mock-up prototypes to investigate users' preferences and concerns regarding design features that increase transparency and control. Our investigation illustrates how users' expectations and impressions of news recommenders closely relate to their news-reading practices.

Perspectives

On a broader level, we show how transparency and control are conceptually intertwined. Transparency without control leaves users frustrated. Conversely, without a basic level of transparency into how a system works, users remain unsure of the impact of controls.

Oscar Alvarado
Universidad de Costa Rica

We show in detail the process of involving users in the co-design and evaluation of new ways to implement the right to an explanation provided for in the GDPR. We aim to empower users to understand and gain autonomy in their use of algorithmic news recommendations.

Luciana Monteiro-Krebs
Universidade Federal do Rio Grande do Sul

Read the Original

This page is a summary of: 'Transparency is Meant for Control' and Vice Versa: Learning from Co-designing and Evaluating Algorithmic News Recommenders, Proceedings of the ACM on Human-Computer Interaction, November 2022, ACM (Association for Computing Machinery). DOI: 10.1145/3555130.