What is it about?

This is a preliminary user study looking into users' perceptions of and reactions to fairness objectives in personalized recommender systems. Users typically think of recommender systems as working on their behalf, but fairness might require objectives that include other stakeholders.


Why is it important?

There is increasing interest in fairness in machine learning generally, and in fairness in recommender systems specifically, but relatively little is understood about how these kinds of objectives can be made transparent to users, or how users react when fairness is injected into what is typically understood as a user-focused application. A better understanding of these questions will go a long way toward ensuring that fairness-aware recommendation gains user acceptance when it is deployed.

Perspectives

Recommender systems in practice often include multiple objectives beyond personalization -- often business KPIs or larger ecosystem concerns. Users are largely unaware of this, partly because platforms emphasize personalization and are rarely transparent about these other objectives. I think transparency is particularly important when a system works differently than users expect, so we really need to get better at this.

Robin Burke
University of Colorado Boulder

Read the Original

This page is a summary of: Fairness and Transparency in Recommendation: The Users’ Perspective, June 2021, ACM (Association for Computing Machinery),
DOI: 10.1145/3450613.3456835.
