Facebook work filtering posts ‘cost me my humanity’
Readers may find material in this article disturbing.
Behind the scenes on Facebook, thousands of moderators protect users from graphic content by filtering out posts that break its rules. The BBC has spoken to one moderator based in Kenya, who is taking legal action against its parent company Meta.
On his first day of work screening posts, South African Trevin Brownie watched a man take his own life.
“The problem was not [the taking of his own life]. The problem was the three-year-old boy that was in the video with this guy. So the boy was playing on the floor with these toys, like not even understanding what’s happening”.
It took two or three minutes for the child to realise something was wrong and to call out for his dad. He then started crying. Eventually an adult entered the room, and the recording was stopped.
“I felt sick. I, you know, I was vomiting because I didn’t understand why people would even do things like that,” Mr Brownie said.
In the course of his work, Mr Brownie would see the worst of humanity – from child abuse to torture and suicide bombings.
His experience, he believes, deadened his feelings. The tremor in his voice and his evident sympathy suggest he still cares deeply about others, but Mr Brownie believes part of his humanity is gone.
“Because I’m basically so much used to death and seeing death. It became a norm for me,” he says. Deaths no longer affect him as he feels they should.
Mr Brownie sees those who work in moderation as a front line of defence protecting users, especially during the pandemic, when many relied on the internet. The way Facebook connects people around the world also appeals to him.
In January, Sama, Facebook’s main moderation hub for East Africa, announced it would stop providing content-review services to social media firms.
Last month Sama laid off 260 moderators, including Mr Brownie, as it concentrated on work annotating videos to help train artificial intelligence computer vision systems.
“I sacrificed my human side for this job. I don’t think you can give any more than your soul, and then to be kicked out like this,” Mr Brownie said.
He is concerned about the future, as he and his fiancée hoped to get married, and his family in South Africa rely on the money he sends them.
Mr Brownie says he would not have taken the job had he known what it involved, but feels it is important work that he is good at, having earned promotion to a more senior role. He wants his employment to continue, but with more support for his mental health.
He is one of a group of 184 moderators, supported by the campaign group Foxglove, who are taking legal action against Meta, Facebook’s parent company, Sama, and Meta’s new contractor, Luxembourg-based firm Majorel.
Meta has sought to extricate itself from the action, but a ruling on Thursday now means it can be sued for unfair termination.
Cori Crider, a director at Foxglove, called the decision “a milestone” and said that “no tech giant, however wealthy, should be above the law”.
An interim ruling against Meta and Sama already means the moderators’ contracts cannot be terminated and they must still be paid until the case is decided.
The moderators say they were laid off in retaliation for complaints about working conditions and attempts to form a union.
They also allege they were unfairly discriminated against and refused work at Majorel “on the basis that they previously worked at the [Sama] facility”, the petition to the court states.
Text messages shared with the moderators’ legal team, and seen by the BBC, show moderators interested in applying for a job at Majorel were told by a third-party recruiter that: “The company will not accept candidates from Sama. It’s a strict no.”
Meta has declined to comment, citing continuing legal action. But the company requires its contractors to provide round-the-clock on-site support with trained practitioners, and access to private healthcare from the first day of employment.
Majorel declined to comment while legal action was continuing.
A Sama spokesperson told the BBC that it paid moderators fair, local living wages, placing the role among the top 12 highest-paying jobs in Kenya.
It said it provided “extensive mental health services, including on-site licensed and trained mental health professionals, a 24-hour hotline and virtual consultations. In addition, employees are free to see a mental health professional of their choosing using the healthcare benefits”.
It said its wellbeing service would continue for 12 months after the last day of employment.
Sama said accusations against the firm had proved to be untrue, which was why “former moderators are suing to keep their jobs – other companies offer a fraction of the pay and benefits compared to Sama”.
The BBC has also seen emails sent to Sama from a small number of moderators, expressing their frustration that the injunction means the company cannot pay termination benefits such as free flights to home countries. Two emails praise working conditions at Sama, and one person voices their unhappiness at the court action.
In February, a Kenyan court ruled that Meta could be sued by ex-moderator Daniel Motaung over claims of poor working conditions.
Meta also faces legal action in Nairobi concerning allegations its algorithm helped fuel the viral spread on social media of hate and violence during Ethiopia’s civil war.
If you are having thoughts of suicide, or know someone who might be, you can find support lines via Befrienders Worldwide. In the UK, you can call the Samaritans help-line on 116 123 or visit samaritans.org.
Published at Mon, 24 Apr 2023 23:42:11 +0000