TikTok removes 23 million videos for breaches of child safety over three months

The report comes less than a month after the platform apologised for a beheading video that went viral.

TikTok removed nearly 62 million videos and over 11 million accounts in the first quarter of 2021, the social media network has revealed in its latest transparency report.

And while that seems like a huge number of videos, it makes up less than one percent of the total uploaded, the company said.

By far the biggest reason for removal was 'minor safety' at 36.8 percent, representing nearly 23 million videos.

"We do not tolerate activities that perpetuate the abuse, harm, endangerment, or exploitation of minors on TikTok," the company states in its community guidelines.

"Any content, including animation or digitally created or manipulated media, that depicts abuse, exploitation, or nudity of minors is a violation on our platform and will be removed when detected."

'Illegal activities and regulated goods' and 'adult nudity and sexual activity' were the next biggest reasons for deletion, at 21.1 percent and 15.6 percent respectively.

The report comes less than a month after the company was forced to apologise when a graphic decapitation video went viral on the platform.

That video started with an innocent clip of a girl dancing but cut to a man being graphically decapitated in a bathroom.

The platform's AI scans videos for inappropriate content like nudity and gore, but was likely fooled by the beheading being spliced into the dancing clip, TikTok told Newsweek at the time.

"Not much can be done to prevent these with current systems. We try and stop the wrong ones, but there will be one-offs. Humans make errors, but AI systems get tricked too," the company said.

A human moderator looks at content on the platform only once it reaches 500 views.

The AI technology tasked with detecting and automatically removing that kind of violating content flagged and deleted 8.8 million videos during the three-month window.

Of all the videos deleted by the platform, 91.3 percent were taken down before a user reported them, and 81.8 percent were removed before they received any views.

Most videos deleted were from the United States, followed by Pakistan, Brazil, Russia and Indonesia.

Of the 11,149,514 accounts removed for violating guidelines or terms of service, 7,263,952 were suspected to belong to underage users.

As well as removing over 11 million accounts, the platform prevented 71 million accounts from being created through automated means and rejected 1.9 million ads for violating policies and guidelines.

TikTok has been publishing transparency reports since 2019.