All Stories

  1. How Do Naturalistic Visuo-Auditory Cues Guide Human Attention? Insights from Systematic Explorations in Visual Perception of Embodied Multimodal Interaction
  2. “Take Nothing on Its Look”: Revealing Users’ Expectations and Experiences in Social Human–Robot Interaction
  3. Language Models for Human-Robot Interaction
  4. The DREAM Dataset: Supporting a data-driven study of autism spectrum disorder and robot enhanced therapy
  5. Simultaneous recognition and reproduction of demonstrated behavior