What is it about?

This experiment shows that intelligent driving assistance, such as autosteering, can suppress social norms: it hinders people from taking turns to give way, and communication does not restore the norm. When machine intelligence is involved in human decision-making without normative commitments, social norms of reciprocity may collapse.


Why is it important?

Autonomous assistants, such as active driver assistance systems, are increasingly available to enhance individual safety. However, the social implications of these systems have been largely ignored. To investigate them with robust causal evidence, we designed and performed a new cyber-physical lab experiment. Our findings suggest that social norms of reciprocity may break down when machine intelligence is involved in human decision-making without any normative commitments.

Read the Original

This page is a summary of: Emergence and collapse of reciprocity in semiautomatic driving coordination experiments with humans, Proceedings of the National Academy of Sciences, December 2023, DOI: 10.1073/pnas.2307804120.
You can read the full text via the DOI above.
