TikTok to lay off hundreds of UK content moderators

Tom Gerken

Technology reporter

TikTok is planning to lay off hundreds of staff in the UK who moderate the content that appears on the social media platform.

According to TikTok, the plan would see work moved to its other offices in Europe as it invests in the use of artificial intelligence (AI) to scale up its moderation.

“We are continuing a reorganisation that we started last year to strengthen our global operating model for Trust and Safety, which includes concentrating our operations in fewer locations globally,” a TikTok spokesperson told the BBC.

But a spokesperson for the Communication Workers Union (CWU) said the decision was “putting corporate greed over the safety of workers and the public”.

“TikTok workers have long been sounding the alarm over the real-world costs of cutting human moderation teams in favour of hastily developed, immature AI alternatives,” CWU National Officer for Tech John Chadfield said.

He added the cuts had been announced “just as the company’s workers are about to vote on having their union recognised”.

But TikTok said it would “maximize effectiveness and speed as we evolve this critical function for the company with the benefit of technological advancements”.

The affected staff work in TikTok's Trust and Safety team in London; hundreds more workers in the same department in parts of Asia are also impacted.

TikTok uses a combination of automated systems and human moderators. According to the firm, 85% of posts which break the rules are removed by its automated systems, including AI.

According to the firm, this investment is helping to reduce how often human reviewers are exposed to distressing footage.

Affected staff will be able to apply to other internal roles and will be given priority if they meet the job’s minimum requirements.

‘Major investigation’

The move comes at a time when the UK has increased the requirements on companies to check the content that appears on their platforms, and in particular the age of those viewing it.

The Online Safety Act came into force in July, bringing with it potential fines of up to 10% of a business’ total global turnover for non-compliance.

TikTok brought in new parental controls that month, which allowed parents to block specific accounts from interacting with their child, as well as giving them more information about the privacy settings their older teenagers are using.

But it has also faced criticism in the UK for not doing enough, with the UK data watchdog launching what it called a “major investigation” into the firm in March.

TikTok told the BBC at the time its recommender systems operated under “strict and comprehensive measures that protect the privacy and safety of teens”.

Published at Fri, 22 Aug 2025 12:32:06 +0000
