posted by Amy Harrison
06/05/2022
We describe what the internet's first digital responders do to keep the web safe for us all
What is a Content Moderator? In today’s world, you might instantly make the link to ‘Content Creators’: individuals who post pictures or videos online, some of whom are able to make a living from it. Most of these videos and photographs are harmless; they depict the latest fashion trends or the most ‘Instagrammable’ place to go on holiday (who hasn’t seen a video of the Greek islands popping up on their timeline and felt the desire to immediately dig out their swimsuit and jump on a plane?). But what about the videos that we don’t want to see? The violent and traumatic videos that stick in our minds and haunt us? The videos uploaded by paedophiles, abusers and terrorists?
This is where Content Moderators, Analysts, and Specialists step in. Content Moderators are the individuals who work behind our screens to keep us safe online. They identify and remove any images and videos that are illegal or potentially harmful. One aspect of this is removing Child Sexual Abuse Material (CSAM): content that depicts the sexual abuse of children who have been coerced, forced or groomed by the perpetrator and who have no control over where the images end up. These videos can be extremely violent, graphic and upsetting to view. Child sexual abuse causes significant and ongoing psychological and physical trauma for the child victim, and the production and distribution of CSAM online further perpetuates that trauma.
These online videos can be flagged by members of the public and sent to Content Moderators to assess, so that the image or video can be removed as quickly as possible. It also means that details about the perpetrator, the victim and the location of the crime can be detected, and the relevant authorities notified so that legal action can be taken. The actions of Content Moderators are vital in protecting the victim from the further traumatisation of having hundreds, if not thousands, of people witness their abuse, or of having the video distributed more extensively by other perpetrators. They also provide vital support to the police by referring these videos on to teams who can track down and arrest the perpetrator so that they cannot reoffend. And finally, Content Moderators protect us too, the public, from seeing these traumatic videos, so that we can focus on daydreaming about that trip to Greece rather than witnessing the horrors of online abuse.
But what effect does the job have on CMs themselves? The repeated exposure to extremely traumatic images, videos and sounds is likely to have a lasting psychological impact. Just as exposure to a single traumatic event, such as a sudden bereavement or a serious accident, may leave you feeling detached, terrified and alone, or even experiencing intrusive images, anxiety and symptoms of Post-Traumatic Stress Disorder (PTSD), chronic and repeated exposure to traumatic CSAM images can leave CMs experiencing symptoms of Burnout, Vicarious Trauma (VT), Secondary Traumatic Stress, Compassion Fatigue and even PTSD.
Yet CMs are not only subject to the adverse psychological impacts of viewing the content; they are also exposed to significant workplace stressors. CMs often view multiple CSAM images per hour in order to meet pre-determined quotas. They often work alone, in restricted rooms, to meet these high quotas whilst preventing others from witnessing the distressing images on their screens. The combination of workplace stressors, such as the pressure to meet high quotas in relative isolation, with the extreme content of the job can lead to a culture where CMs take minimal breaks, interact less with their colleagues and are thus less likely to build supportive relationships at work. Considering this, they are likely at an increased risk of VT, burnout and PTSD.
Whilst CMs are often routinely offered psychological therapies at their workplace to help them discuss and process this chronic exposure and any subsequent psychological difficulties, many CMs may feel unable to talk about what they have seen with their families, friends and partners. Due to the graphic nature of the content, many CMs are concerned that by disclosing what they have seen, heard and witnessed each day, they may traumatise their family members, partners or friends too. This leaves CMs to deal with their experiences alone whilst continuing to protect other people: children online, victims of CSAM, the public and their own support systems.
Content Moderation is vital in protecting child victims from further trauma and in protecting the general public from exposure to such unthinkable acts. Yet relatively little is known about the individuals completing this important work. A research team from the Centre for Abuse and Trauma Studies (CATS) at Middlesex University is working with CMs to understand the impact of this exposure and whether anything more can be done to support them. These superheroes sit in the shadows of social media, classifying and removing CSAM to keep children, and us, safe online.