YouTube faces fresh accusations of promoting child porn videos

Warning: This article contains disturbing content.

YouTube has been accused of promoting soft-core child pornography through its recommended videos algorithm.

In a 20-minute video titled 'YouTube is Facilitating the Sexual Exploitation of Children, and it's Being Monetized (2019)', Canadian YouTuber Matt Watson discusses a disturbing discovery he made.

"Over the past 48 hours I have discovered a wormhole into a soft-core paedophilia ring on YouTube," he wrote in the video description.

"YouTube's recommended algorithm is facilitating paedophiles' ability to connect with each other, trade contact info, and link to actual CP [child porn] in the comments."

YouTube's recommended videos function is a key part of how content reaches its audience. The website uses an algorithm that takes information from videos a user has watched in the past, and suggests other videos they might be interested in.

Mr Watson claims the algorithm is quick to recommend videos of young girls that paedophiles may find erotic, saying he was often directed to such content within five clicks of watching an unrelated video.

As an experiment, he installed a virtual private network (VPN) to mask his computer's browsing history. He then opened YouTube and clicked on a 'bikini haul' video posted by an adult woman, explaining that he had watched such videos before because he's a "red-blooded heterosexual male".

He clicked on another swimsuit-themed clip in the sidebar of recommended videos, and was instantly suggested one titled 'Gymnastics video' which had a thumbnail of a young girl with her legs splayed on either side of her.

"Two clicks," he said in exasperation. "It couldn't be more perfect. I am now in the wormhole. Now that I've clicked on this, look at this sidebar."

His recommended videos list now consisted exclusively of clips of little girls wearing bikinis or shorts.

Most of the footage seems to have been innocently uploaded by children, showing them and their friends playing or dancing. Yet many of the videos have received hundreds of thousands of views, which Mr Watson says is because of the algorithm.

He scrolled through the comments, many of which were written in Russian or Portuguese. Some comments were suggestive, calling the girls "beautiful goddesses" or saying they would make "a great mother sometime".

However, far more comments consisted simply of a timestamp from the video above. When clicked, the timestamp would jump the clip to a moment when the child was in a compromising position, such as with their legs open or stretched out in front of them.

"These guys aren't timestamping because the little girl made a funny joke," Mr Watson said.

He also said many people would use the comment section to trade social media details or even links to explicit child pornography.

"Once you enter into this wormhole, there is no other content available. YouTube's algorithm is glitching out to a point that nothing but these videos exist."

Mr Watson's video also showed that many of the videos are monetised, with ads for companies such as Disney and McDonald's appearing before videos as well as alongside them.

He cited an update to YouTube's official guidelines in 2017 that promised to disable all comments on videos featuring minors once inappropriate remarks had been detected in the comment section.

"This is significant because we know that YouTube has an algorithm in place that detects some kind of unusual predatory behavior on these kinds of videos. And yet all that's happening is that the comments are being disabled?"

Mr Watson's video, which was posted on Monday (local time), has received more than a million views so far, with many comparing the revelations to YouTube's Elsagate scandal, which saw thousands of creepy or adult-themed videos targeted at young children.

Some commenters said they had created new accounts or installed VPNs so they could try the recommended videos experiment themselves, and found they too were being recommended videos of young girls.

Others questioned why YouTube relies solely on algorithms and doesn't employ actual humans to manually check for inappropriate comments.

Newshub.