Right-wing fake news gets far more engagement on Facebook than left-wing fake news or real news - study

Far-right fake news has higher levels of engagement on Facebook than any other type of fake news or real news, according to a comprehensive new study.

While left-wing and centrist news organisations suffer an engagement penalty when they spread falsehoods and misinformation, the research found the opposite is true of far-right sources.

The study, published by New York University's Cybersecurity for Democracy project, found that far-right news outlets that regularly lie to their audiences get 65 percent more Facebook engagement than those that do not.

"We found that politically extreme sources tend to generate more interactions from users. In particular, content from sources rated as far-right by independent news rating services consistently received the highest engagement per follower of any partisan group," the study said.

The research looked at nearly 3000 US news sources, grouped into five categories: far-right, slightly right, center, slightly left and far-left. The categorisation was guided by the independent organisations NewsGuard and Media Bias/Fact Check.

Then the researchers used CrowdTangle, a Facebook-owned tool that analyses interactions on the platform, to map out reactions to each news source's posts between August 2020 and January 2021.
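The study's core metric - average engagement per follower within each partisan and misinformation bucket - can be sketched in a few lines. The post data below is entirely invented for illustration; the real analysis drew on CrowdTangle exports covering nearly 3000 sources.

```python
from collections import defaultdict

# Toy posts: (partisan category, spreads misinformation?, interactions, source followers).
# All numbers here are invented for illustration only.
posts = [
    ("far-right", True, 900, 10_000),
    ("far-right", False, 500, 10_000),
    ("center", True, 100, 10_000),
    ("center", False, 300, 10_000),
]

def engagement_per_follower(posts):
    """Average interactions-per-follower for each (category, misinformation) bucket."""
    buckets = defaultdict(list)
    for category, misinfo, interactions, followers in posts:
        buckets[(category, misinfo)].append(interactions / followers)
    return {key: sum(vals) / len(vals) for key, vals in buckets.items()}

rates = engagement_per_follower(posts)
# With these toy numbers, far-right misinformation out-engages far-right
# non-misinformation, while centrist misinformation underperforms centrist
# real news - the pattern the study reports.
```

Comparing buckets this way (per follower, not raw interaction counts) is what lets sources of very different sizes be compared on an equal footing.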

The results show misinformation from the far-left is, concerningly, almost as engaging as centrist real news on Facebook - but far-right fake news gets by far the most engagement.

Graph: for far-right news sources, misinformation significantly outperforms non-misinformation; for all other political leanings, there is a misinformation penalty resulting in lower engagement per follower. Photo credit: New York University

"What we find is that among the far right in particular, misinformation is more engaging than non-misinformation," lead researcher Laura Edelson told Wired.

"I think this is something that a lot of people thought might be the case, but now we can really quantify it, we can specifically identify that this is really true on the far-right, but not true in the center or on the left."

There has been growing concern in recent years about social media platforms using algorithms that drive people towards extremist content, because it's generally the most engaging.

This study will only add to those fears.

However, a Facebook spokesperson told Wired that the study doesn't tell the full story.

"This report looks mostly at how people engage with content, which should not be confused with how many people actually see it on Facebook," they said.

"When you look at the content that gets the most reach across Facebook, it's not at all as partisan as this study suggests."