What is it about?

In many robotic applications, a robot's body must have a functional shape that cannot include bio-inspired elements; still, it is important that the robot can express emotions, moods, or a character, to make it acceptable and to engage its users. Dynamic signals from movement can be exploited to provide this expression while the robot performs its task. A research effort has been started to find general emotion expression models for actions that could be applied to any kind of robot to obtain believable and easily detectable emotional expressions. Along this path, the need for a unified representation of emotional expression emerged. This paper proposes a framework to define action characteristics that can be used to represent emotions. Guidelines are provided to identify quantitative models and numerical values for the parameters, which can be used to design and engineer emotional robot actions. A set of robots with different shapes, movement possibilities, and goals has been implemented following these guidelines. Thanks to the proposed framework, different models of emotional expression can now be compared in a sound way, and the question in the title can be answered in a justified manner.
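To give a flavor of what "quantitative models and numerical values for parameters" can mean in practice, here is a minimal sketch of one common approach: mapping a point in a valence/arousal emotion space to low-level motion parameters such as speed and amplitude. The function name, parameter ranges, and coefficients are illustrative assumptions, not the model from the paper.

```python
# Hypothetical sketch: map an emotion, described by valence and arousal
# in [-1, 1], to motion parameters any robot platform could realize.
# Coefficients and ranges are illustrative assumptions only.

from dataclasses import dataclass

@dataclass
class MotionParams:
    speed: float      # fraction of maximum speed, in [0, 1]
    amplitude: float  # fraction of maximum excursion, in [0, 1]
    pause: float      # seconds of hesitation between movements

def emotion_to_motion(valence: float, arousal: float) -> MotionParams:
    """High arousal -> faster, wider movements; negative valence with
    low arousal (e.g., sadness) -> slow movement with long pauses."""
    clamp = lambda x: min(1.0, max(0.0, x))
    speed = clamp(0.5 + 0.5 * arousal)                      # arousal drives tempo
    amplitude = clamp(0.5 + 0.25 * arousal + 0.25 * valence)
    pause = max(0.0, -0.5 * (valence + arousal))            # hesitate when "down"
    return MotionParams(speed, amplitude, round(pause, 2))

# "Happy": high valence and arousal -> fast, wide, no hesitation.
happy = emotion_to_motion(valence=0.8, arousal=0.8)
# "Sad": low valence and arousal -> slow, small, hesitant.
sad = emotion_to_motion(valence=-0.8, arousal=-0.6)
```

Because the mapping is expressed only in platform-independent fractions of the robot's own speed and excursion limits, the same emotional specification can drive robots with very different embodiments, which is the kind of generality the framework aims at.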

Why is it important?

We provide guidelines to identify quantitative models to express emotions through a robot's movement.

Perspectives

With this paper and the research still ongoing, we are trying to frame the problems related to emotional expression through movement. The literature contains many presentations of emotional robots, but it is hard to find a unifying model for the problem. Here we identify some movement features and how they relate to emotional expression, also considering different robot embodiments.

Prof. Andrea Bonarini
Politecnico di Milano

Read the Original

This page is a summary of: Can my robotic home cleaner be happy? Issues about emotional expression in non-bio-inspired robots, Adaptive Behavior, October 2016, SAGE Publications,
DOI: 10.1177/1059712316664187.
