What is it about?
In the foreseeable future, as more sophisticated artificial intelligence techniques are integrated into robotic systems, robots collaborating with humans are expected to have intentions of their own. If a robot's intentions do not match those of its human partner, conflicting behaviors will arise. These conflicts create undesired interactions that make the task tiring for the human unless the robot recognizes them and reacts to resolve them. This research proposes an approach that enables a collaborative robot to detect conflicts occurring during interaction with a human partner and act accordingly, making the interaction between human and robot more efficient and natural.
Featured Image
Photo by Aarón Blanco Tejedor on Unsplash
Why is it important?
During interaction between a human and a proactive robot, conflicts naturally arise when the partners have different movement intentions. To resolve these conflicts, we argue that it is important to study the dyadic interaction behaviors of the pair rather than the individual behavior of the human partner alone. Our proposed conflict resolution mechanism significantly reduced human force and effort compared with both a passive robot that always follows the human partner and a proactive robot that cannot resolve conflicts.
Perspectives
This article was developed through insights from industrial experts who use robotics on factory floors. This collaboration not only enabled us to create a comprehensive piece but also highlighted a promising area of research for further development.
Dr Ayse Kucukyilmaz
University of Nottingham
Read the Original
This page is a summary of: Resolving Conflicts During Human-Robot Co-Manipulation, March 2023, ACM (Association for Computing Machinery),
DOI: 10.1145/3568162.3576969.