What is it about?

As people increasingly interact with social AI systems and robots, it is important to understand how trust in these technologies develops. We ran a study in which participants interacted with a social robot during a card-based divination task inspired by Tarot rituals. They bonded by discussing personal issues and interpreting the cards together. We found that different kinds of trust develop in different ways. Trust in the robot’s competence formed quickly, while affective trust (trust relevant to emotional bonds and feeling supported) grew more gradually through repeated interactions. In addition, robots with a warmer, more socialized attitude were trusted more overall.


Why is it important?

As AI systems increasingly act as companions, assistants, and conversational partners, trust becomes more than just believing that a system works correctly. People may also form emotional connections with these technologies. Our work highlights the affective dimension of trust, which is especially relevant for emotionally meaningful interactions like these. We also introduce a novel study method, the Card Divination Task, which creates a setting where people naturally share emotional vulnerabilities and bond with the robot. Understanding these different forms of trust can help designers build social robots and AI systems that interact with people more responsibly.

Read the Original

This page is a summary of: “What’s on your mind?”: Understanding the Development of Multidimensional Trust in Social Robots, March 2026, ACM (Association for Computing Machinery),
DOI: 10.1145/3757279.3785556.
You can read the full text:

