All Stories

  1. Take a Seat, Make a Gesture: Charting User Preferences for On-Chair and From-Chair Gesture Input
  2. The Eclectic User Experience of Combined On-Screen and On-Wrist Vibrotactile Feedback in Touchscreen Input
  3. Might as Well Be on Mars: Insights on the Extraterrestrial Applicability of Interaction Design Frameworks from Earth
  4. What Is the User Experience of Eyes-Free Touch Input with Vibrotactile Feedback Decoupled from the Touchscreen?
  5. Lifelogging in Mixed Reality
  6. A Systematic Literature Review of Gestures and Referents in Gesture Elicitation Studies
  7. From Natural to Non-Natural Interaction: Embracing Interaction Design Beyond the Accepted Convention of Natural
  8. Leveraging Sensorimotor Realities for Assistive Technology Design Bridging Smart Environments and Virtual Worlds
  9. Accessibility Research in Digital Audiovisual Media: What Has Been Achieved and What Should Be Done Next?
  10. An Expressivity-Complexity Tradeoff?: User-Defined Gestures from the Wheelchair Space are Mostly Deictic
  11. New Insights into User-Defined Smart Ring Gestures with Implications for Gesture Elicitation Studies
  12. Understanding Wheelchair Users’ Preferences for On-Body, In-Air, and On-Wheelchair Gestures
  13. iFAD Gestures: Understanding Users’ Gesture Input Performance with Index-Finger Augmentation Devices
  14. Fingerhints: On-Finger Kinesthetic Notifications
  15. “I Could Wear It All of the Time, Just Like My Wedding Ring:” Insights into Older People’s Perceptions of Smart Rings
  16. RadarSense: Accurate Recognition of Mid-Air Hand Gestures with Radar Sensing and Few Training Examples
  17. The User Experience of Journeys in the Realm of Augmented Reality Television
  18. Scenario-based Exploration of Integrating Radar Sensing into Everyday Objects for Free-Hand Television Control
  19. Informing Future Gesture Elicitation Studies for Interactive Applications that Use Radar Sensing
  20. RepliGES and GEStory: Visual Tools for Systematizing and Consolidating Knowledge on User-Defined Gestures
  21. Transhumanism as a Philosophical and Cultural Framework for Extended Reality Applied to Human Augmentation
  22. Tap4Light: Smart Lighting Interactions by Tapping with a Five-Finger Augmentation Device
  23. Understanding Gesture Input Articulation with Upper-Body Wearables for Users with Upper-Body Motor Impairments
  24. Designing Interactive Experiences in the Interplay between Ambient Intelligence and Mixed Reality
  25. Measuring the User Experience of Vibrotactile Feedback on the Finger, Wrist, and Forearm for Touch Input on Large Displays
  26. Are Ambient Intelligence and Augmented Reality Two Sides of the Same Coin? Implications for Human-Computer Interaction
  27. WearSkill
  28. Personalized wearable interactions with WearSkill
  29. Clarifying Agreement Calculations and Analysis for End-User Elicitation Studies
  30. How Do HCI Researchers Describe Their Software Tools? Insights From a Synopsis Survey of Tools for Multimodal Interaction
  31. Demonstration of GestuRING, a Web Tool for Ring Gesture Input
  32. GestuRING: A Web-based Tool for Designing Gesture Input with Rings, Ring-Like, and Ring-Ready Devices
  33. Accessibility of Interactive Television and Media Experiences: Users with Disabilities Have Been Little Voiced at IMX and TVX
  34. Taking That Perfect Aerial Photo: A Synopsis of Interactions for Drone-based Aerial Photography and Video
  35. AR-TV and AR-Diànshì: Cultural Differences in Users’ Preferences for Augmented Reality Television
  36. MR4ISL: A Mixed Reality System for Psychological Experiments Focused on Social Learning and Social Interactions
  37. Software Architecture Based on Web Standards for Gesture Input with Smartwatches and Smartglasses
  38. Challenges in Designing Inclusive Immersive Technologies
  39. XR4ISL: Enabling Psychology Experiments in Extended Reality for Studying the Phenomenon of Implicit Social Learning
  40. Uncovering Practical Security and Privacy Threats for Connected Glasses with Embedded Video Cameras
  41. A Research Agenda Is Needed for Designing for the User Experience of Augmented and Mixed Reality: A Position Paper
  42. From Do You See What I See? to Do You Control What I See? Mediated Vision, From a Distance, for Eyewear Users
  43. Design Space and Users’ Preferences for Smartglasses Graphical Menus: A Vignette Study
  44. Addressing Inattentional Blindness with Smart Eyewear and Vibrotactile Feedback on the Finger, Wrist, and Forearm
  45. Aggregating Life Tags for Opportunistic Crowdsensing with Mobile and Smartglasses Users
  46. A Newcomer’s Guide to EICS, the Engineering Interactive Computing Systems Community
  47. Life-Tags