Unions and campaigners have called on MPs to urgently review TikTok’s plans to axe hundreds of jobs in the UK amid growing fears over online safety.
Organisers said the cuts, which would vastly reduce the platform’s trust and safety team in the country, risk leaving users exposed to harmful online content, including deepfakes and abuse.
In an open letter to Chi Onwurah MP, chair of the Science, Innovation and Technology Committee, campaigners and members of the Trades Union Congress (TUC) and the Communication Workers Union (CWU) said the government must “act now” to investigate concerns over UK online safety and workers’ rights.
It comes after TikTok’s Beijing-based owner ByteDance announced in August that 400 members of staff in its London office would be made redundant, with their roles reallocated to offices across Europe or outsourced to third-party providers.
ByteDance denied it was deprioritising online safety, saying the decision was linked to plans to scale up its use of artificial intelligence (AI) in content moderation, aiming to “maximise effectiveness and speed”.
But workers and unions criticised the move, which they said would place up to 30 million users of the app in the UK at greater risk of exposure to harmful content, including posts about eating disorders, self-harm and suicide.
Speaking to The Independent, CWU national officer John Chadfield called on TikTok to “invest time and resources” into content moderation.
“Their current strategy priority of monetising TikTok Shop and onboarding influencers isn’t one that means safety can take a backseat,” he said. “TikTok at times can talk a good game about Trust and Safety, and will fly tech journalists out to parade some marketing, but they know the internal reality doesn’t match with their own PR.
“If TikTok took online safety seriously, then instead of a short-sighted offshoring move that endangers their biggest userbase in Europe, they should actually invest time and resources into understanding the expertise and processes required to effectively moderate the millions of pieces of dangerous content that flood the platform, and in that process acknowledge and appreciate the existing frontline professionals in Trust and Safety already in their employ.”
Signatories to the letter, a group of union leaders and campaigners, include Ian Russell, the father of Molly Russell, a 14-year-old who a coroner ruled died from an act of self-harm while suffering from depression and “the negative effects of online content”, and Imran Ahmed, CEO of the Center for Countering Digital Hate.
“Every single redundancy is targeted at the ‘Trust and Safety Team’, effectively ending content moderation in London - with similar cuts to human moderation happening worldwide,” the letter to Ms Onwurah says.
“These safety-critical workers are the frontline of protecting users and communities from deep fakes, toxicity and abuse.”
Unions have also accused TikTok of “union busting” after the redundancies were announced just eight days before workers were due to vote on union recognition with the United Tech and Allied Workers, the CWU’s tech-focused branch. TikTok said its decision was global and not related to discussions with the trade union.
The restructuring plans come shortly after the UK’s Online Safety Act, enforced by Ofcom, came into force in July. The legislation requires online platforms to protect UK users from illegal material, such as child sexual abuse and extreme pornography.
Platforms are also required to prevent children from accessing harmful and age-inappropriate content.
A TikTok spokesperson said the company “strongly rejected” allegations of deprioritising online safety and union busting.
"We are continuing a reorganisation that we started last year to strengthen our global operating model for Trust and Safety, including concentrating our operations in fewer locations globally, ensuring we maximise effectiveness and speed as we evolve this critical function for the company with the benefit of technological advancements,” it added.