Facebook's algorithms are creating terrorist propaganda videos

Facebook has been caught inadvertently creating Islamic State and Nazi propaganda.

The site regularly generates 'celebration' and 'memories' clips for its users - including members of extremist organisations. Its artificial intelligence even created a business page for al-Qaeda, the group responsible for the September 11, 2001 attacks.

Over five months, the US National Whistleblower Centre monitored pages belonging to 3000 people who had 'liked' or had connections to groups the US government has designated as terror organisations.

Attorney John Kostyack told the Associated Press (AP) that disturbing content was frighteningly easy to find on the social network.

"Image of severed heads on spikes, other kinds of really violent and horrible images that you can't get out of your mind once you've taken a look at it."

Facebook automatically generates pages for terror groups based on users' activity, descriptions and imagery such as flags and logos. It then creates videos for those pages, which are shared to spread hate and recruit new followers.

A page for al-Qaeda, for example, had more than 7000 likes and provided recruiters with "valuable data" for finding new members, the Whistleblower Centre said.

"To think that that actually is a tool organisations are using to recruit people suggests that this is the worst kind of crime that's happening on Facebook. We feel that we have a responsibility to put an end to that."

In response, Facebook said it was taking down terrorist content "at a far higher success rate than even two years ago".

"We don't claim to find everything and we remain vigilant in our efforts against terrorist groups around the world."

Shawnee State University history professor Amr Al Azm blasted Facebook's claims.

"That's just stretching the imagination to beyond incredulity," he told AP.

"If I can easily find those, I'm sure the people at Facebook who have all the algorithms and the technology, they should be able to see that."

The study found that only 38 percent of posts featuring prominent symbols of terror groups were taken down.

One person with terrorist sympathies appeared to slip through the net because they embedded their words in an image, rather than posting plain text.

Facebook claims to have more than 30,000 people reviewing potentially harmful material on its site.

Hany Farid, a digital forensics expert at the University of California, Berkeley, told AP that Facebook is not motivated to fix the problem because doing so would be expensive.

"The whole infrastructure is fundamentally flawed, and there's very little appetite to fix it because what Facebook and the other social media companies know is that once they start being responsible for material on their platforms it opens up a whole can of worms."

Facebook has come under increasing criticism since it allowed an alleged white supremacist to broadcast the killing of 51 people in Christchurch in March.

Newshub.