YouTube recommends videos of kids to users searching for sexual content

YouTube has been automatically recommending family videos featuring children to paedophiles or users who have watched sexual content.

The website's automatic recommendation system is already under fire for funnelling users towards extremely violent or white supremacist content.

Canadian YouTuber Matt Watson exposed the algorithm in a video showing that, after clicking through just two videos, he was being recommended clips of young girls.

One of the signs a family video is being targeted by paedophiles is commenters sharing timecodes that direct other users to images of children.

It's happening in New Zealand too, according to Netsafe CEO Martin Cocker.

"People go into YouTube looking for that kind of content, and the algorithm can help them find it," he told Newshub.

A new study from Harvard University found that in many cases, videos recommended by YouTube became more extreme, eventually showing a "near-endless" stream of videos of children.

Many of these are innocent home videos, raising difficult questions about how family content should be shared online.

"I make my own choices the same way other parents do," Prime Minister Jacinda Ardern said. "I think it's fair to acknowledge it is a really difficult environment now."

Algorithms also contribute to the spread of violent terrorist content. At the Prime Minister's Christchurch Call in Paris, tech giants pledged to be more transparent with how their algorithms work.

"It's fair to say the more light we have shed on that issue. it will actually be relevant to a range of areas," Ardern said.

In some cases those algorithms are driving people to more and more white supremacist content.

"The platforms understand what is in their content and they understand a lot about their users, so if they dig into the data they absolutely have everything they need to break the cycle," Cocker said.

YouTube told Newshub in a statement that it has already disabled comments this year on videos featuring minors in situations it describes as "potentially risky".

In the first quarter it removed 800,000 videos for violating child safety policies, but given the Harvard study's findings, that will be cold comfort to parents.

Jonas Kaiser, one of the study's researchers, told Newshub that analysis shows moderate political content can serve as a gateway to more extreme content.

"As YouTube's algorithms cannot really differentiate between levels of extremism, it can happen that from one moderately conservative channel you might get recommended to a more extreme one," he said.

"Against this background, we urge YouTube to rethink its algorithms since recommendations around music and entertainment are one thing, but around politics and, indeed, children are something else entirely."

Newshub.
