What is it about?

This study examines BitChute, an alternative video-sharing platform popular with people who feel mainstream social media censors them. In 2021, BitChute introduced an "incitement to hatred" policy for its users in Europe. Analyzing over 5.2 million user comments and 800,000 video descriptions, the researchers found that hate speech increased after the policy's adoption, possibly as a backlash from users.


Why is it important?

Understanding how content moderation works, or backfires, on alternative platforms is crucial. These platforms often attract users who resist moderation, so rules intended to reduce harmful speech may have the opposite effect. The findings help policymakers, researchers, and platform designers better anticipate challenges when creating rules for online spaces that value free expression but still need to address hate.

Read the Original

This page is a summary of: Content Moderation and Hate Speech on Alternative Platforms: A Case Study of BitChute, Proceedings of the ACM on Human-Computer Interaction, May 2025, ACM (Association for Computing Machinery), DOI: 10.1145/3710950.
