What is it about?
Social virtual reality (VR) platforms have grown in popularity in recent years, allowing people to interact with one another in real time through voice and gestures. However, this kind of communication can give rise to new forms of harmful behavior, posing challenges for moderators. To investigate this, we studied three social VR platforms (AltspaceVR, Horizon Worlds, and Rec Room), observing 100 scheduled events and conducting 11 interviews to identify harms and to explore moderators' assessments of real-time interactions and their use of moderation tools.
Why is it important?
Research on potentially harmful behavior in social VR platforms matters for several reasons. First, as more people use these platforms, there is a growing need to understand the risks and challenges of real-time communication in virtual environments. Second, the study sheds light on the limitations of current moderation practices and tools, and suggests ways they can be improved to address the unique challenges of social VR. Third, by identifying specific types of harmful behavior, the research provides a foundation for developing targeted interventions to prevent and mitigate harm in these environments. Overall, this study is a valuable contribution to the emerging field of research on social VR and online safety.
Read the Original
This page is a summary of: Challenges of Moderating Social Virtual Reality, April 2023, ACM (Association for Computing Machinery),
DOI: 10.1145/3544548.3581329.