Social media giant Facebook is altering its approach to fake news after discovering its 'red flag' system actually "entrenched deeply held beliefs" rather than challenging them.
The company was pushed to introduce measures to combat fake news last year, after it became common on the platform during the 2016 US presidential election.
Its remedy was to alert users when an article they were accessing included statements disputed by third-party fact-checkers.
But Facebook has now revealed that the method was unsuccessful: the red flags did not have the intended effect.
"Academic research on correcting misinformation has shown that putting a strong image, like a red flag, next to an article may actually entrench deeply held beliefs - the opposite effect to what we intended," Facebook's Tessa Lyons wrote.
The social media giant is now changing its approach, displaying alternative, more credible news articles alongside those that have been disputed.
"[Displaying] related articles [is] simply designed to give more context, which our research has shown is a more effective way to help people get to the facts," Ms Lyons explained.
"Indeed, we've found that when we show related articles next to a false news story, it leads to fewer shares than when the disputed flag is shown."
Facebook says it is now "investing in better technology and more people to help prevent the spread of misinformation", and is also launching an initiative to determine which news sources are trustworthy.