While the rules of content moderation may seem opaque or poorly defined to everyday users, in Abuse Standards Violations artists Eva & Franco Mattes help to visualize the clearly delineated lines drawn by today's social media platforms. These internal-only documents, leaked to the artists in the course of their investigation, are used to train new online moderators and define the acceptable boundaries of content, both text and imagery, in categories such as gore, hate speech, pornography, and violence.

Today an increasing share of moderation work is subject to automated or algorithmic decision making, but large portions still require a level of human interpretation. This labor is most often performed by precarious gig workers from around the globe. Long and irregular work hours, combined with the need to routinely view disturbing imagery, lead to an extremely high turnover rate in the field. Part-time moderators rarely know which platform(s) they are contracted to; the origin of these documents remains a mystery.