Why Daniel Motaung’s lawsuit could be good news for us all

posted by Ruth Spence

18/08/2022

Daniel Motaung’s case may make Facebook liable for the mental health of its outsourced content moderators, opening the door to other lawsuits. This is something we should all support.

We're used to the idea that social media is bad for our mental health; what fewer people realise is that it also comes at a terrible price to a hidden workforce. Content moderators, often based in the developing world, remove harmful content but can develop post-traumatic stress disorder (PTSD) from the relentless horror they have to witness. That could be about to change if an upcoming court case finds Facebook liable for the PTSD suffered by a worker in Kenya. Ex-content moderator Daniel Motaung and legal firm Foxglove are suing Sama and Facebook after his working conditions left him with the disorder. TikTok, Meta and Microsoft have all been sued by content moderators before, but if Daniel's lawsuit is successful, Facebook will be forced to make specific changes that prioritise the health and wellbeing of its moderators in Kenya. This could open the door to moderators in other countries following suit - something we should all support.

We rely on moderators to police our online spaces and remove content like child abuse, terrorism, hate speech and misinformation. Their online decisions impact our offline lives more than we might like to think - when the process fails, events like the mass shooting in Buffalo occur, or information about abortion is erroneously removed. Content moderation is always going to be somewhat subjective, driven by platform policies that are open to interpretation and change with the zeitgeist. This means firms need a workforce that can reliably and accurately make complex and nuanced decisions at scale. However, the first annual report of Meta’s Oversight Board suggests that moderators might be performing poorly. The report published 20 decisions, 70% of which overturned Meta’s original content moderation decisions. In a further sample of 130 content removal decisions, Meta identified 51 cases (around 40%) where the original moderation decision was wrong. If this failure rate holds across the appeals Meta receives, there could be up to half a million cases where Meta’s own rules have been incorrectly applied. That would mean 500,000 cases of hate speech, misinformation or graphic material staying up, or legitimate posts and accounts being removed. This rate of failure could be increasing too, as Meta’s latest transparency report showed the number of appeals to Facebook and Instagram had grown by 66%.

There are many reasons why moderators might make the wrong decision, including working in unfamiliar languages and contexts. However, the effects of stress should not be underestimated – moderators have high workloads and stringent accuracy quotas, with consequences for any workers who cannot meet both. Chronic work stress is associated with desensitisation, which can reduce the ability to accurately assess potentially harmful content. To some extent this is protective - a failure to habituate to the content may increase the risk of developing mental health problems, such as those highlighted in the lawsuits against technology companies. However, adapting to viewing the content can also be a sign of trouble, as internalised moral standards are side-stepped and the brutality of some content might not seem so bad when contrasted with something worse. This unconscious process of weighing up the immorality of one post against another can make less extreme examples appear benign. One moderator told us: “you get used to it. So basically, you just devolve a resiliency towards anything... Sometimes, you admire that some people can even eat in this kind of environment. I have the same feelings to my job. I have no feelings.”

We have found that symptoms of mental health problems like PTSD and anxiety are common amongst moderators. This can get in the way of problem solving or thinking creatively. One moderator said: “when I go into stress mode, I have panic attacks, and most of the time they happen when I saw something that was a little bit too hard, and at that time, like I said, I can't think straight”. The ability to think critically when applying company policy is crucial given how ambiguous many posts are, especially problematic posts designed to thwart moderation attempts. If content moderators are provided with the adequate mental health support that Motaung’s lawsuit demands, it will be better for us all - an engaged workforce is likely to be a better-functioning workforce. Companies like Meta owe it to their staff's mental health, and to their users, to improve the situation. A court may soon make that decision for them.
