What is it about?

Vision-based cognitive services (CogS) have become crucial in a wide range of applications, from real-time security and social networks to smartphone apps. When it comes to facial analysis, these services can be misleading or even inaccurate, raising ethical concerns such as the amplification of social stereotypes. We analyzed popular Image Analysis Services that infer emotion from a person's face, and we document evidence that CogS may be even more likely than humans to perpetuate the stereotype of the "angry black man", often attributing emotions of hostility to Black individuals.

Read the Original

This page is a summary of: Emotion-based Stereotypes in Image Analysis Services, July 2020, ACM (Association for Computing Machinery),
DOI: 10.1145/3386392.3399567.