Facebook changes how it deals with fake news

Social media giant Facebook is altering its approach to fake news, after discovering its 'red flag' system actually "entrenched deeply held beliefs", rather than challenging them.

The company was pushed to introduce measures to combat fake news last year, after it became common on the platform during the 2016 US presidential election.

Its remedy was to alert users when an article they were accessing included statements disputed by third-party fact-checkers.

But Facebook has now revealed the method was unsuccessful, with the red flags failing to have the desired effect.

"Academic research on correcting misinformation has shown that putting a strong image, like a red flag, next to an article may actually entrench deeply held beliefs - the opposite effect to what we intended," Facebook's Tessa Lyons wrote.

A visual representation of Facebook's old system for dealing with fake news. Photo credit: Facebook

The social media giant is now changing tack, displaying alternative, more credible news articles next to ones that have been disputed.

"[Displaying] related articles [is] simply designed to give more context, which our research has shown is a more effective way to help people get to the facts," Ms Lyons explained.

"Indeed, we've found that when we show related articles next to a false news story, it leads to fewer shares than when the disputed flag is shown."

A visual representation of what'll happen when you see fake news now. Photo credit: Facebook

Facebook says it is now "investing in better technology and more people to help prevent the spread of misinformation", and is also launching an initiative to work out which news sources are dependable.

Newshub.