Facebook to finally allow disabling of comments on posts in algorithm shake-up


Facebook is now allowing comments to be turned off on public posts for the first time since the social media platform launched in 2004.

Users will also be offered more insight into and control over what content appears in their News Feed as part of what the company calls a "significant shift" in how it operates its algorithms, which have been widely criticised in recent years.

The ability to disable comments on posts will be given to all people and pages as of today, Facebook says, adding that the feature is "intended to be used reactively in the case of harassment and other unwanted interactions".

But it could be used however users want.

For news outlets, this could free up staff time currently used for moderating and removing comments that reveal legally suppressed information and other content that breaches their guidelines.

For businesses or politicians, it could mean turning off comments whenever they don't like what the commenters are saying, for whatever reason.

Facebook users can now disable comments on posts: selecting the third option and not mentioning anyone in the post will disable all comments. Photo credit: Facebook

To coincide with Facebook announcing the changes, the company's vice president of global affairs Nick Clegg has written an essay defending its algorithms.

"It is alleged that social media fuels polarisation, exploits human weaknesses and insecurities, and creates echo chambers where everyone gets their own slice of reality, eroding the public sphere and the understanding of common facts," writes Clegg.

"Perhaps it is time to acknowledge it is not simply the fault of faceless machines? Consider, for example, the presence of bad and polarising content on private messaging apps - iMessage, Signal, Telegram, WhatsApp - used by billions of people around the world.

"None of those apps deploy content or ranking algorithms. It's just humans talking to humans without any machine getting in the way. In many respects, it would be easier to blame everything on algorithms, but there are deeper and more complex societal forces at play."

Facebook's algorithms have often been cited in recent years as a leading cause of increased polarisation around the world, which has been linked to atrocities such as the attack on the US Capitol earlier this year and the 2019 terrorist attack on two mosques in Christchurch.

That particular atrocity was closely associated with Facebook, as the terrorist used the platform to livestream the mass murder.

Prime Minister Jacinda Ardern has frequently cited the role of social media algorithms in her responses to the Christchurch terror attack, and signatories of the Christchurch Call committed to "review the operation of algorithms", among other things.

A release on violent extremism and disinformation online, published by New Zealand's Classification Office in late 2020, specifically blamed algorithms for spreading the Christchurch terrorist's footage.

"The video was amplified by algorithms on platforms like Facebook, and even reached victims' family members and friends. It was a horrific wake-up call to how digital technology can be weaponised in new and devastating ways," said the Classification Office.

"Globally, we have since seen attacks with clear links to the Christchurch terrorist attacks and a growing movement of white supremacists, 'incels', and violent extremist action often linked to disinformation and conspiracy theories spread online."

The changes give users more information on why suggested posts appear in their News Feed. Photo credit: Facebook

Amid a public spat between Facebook and Apple over privacy, Apple CEO Tim Cook earlier this year called out social media algorithms as being responsible for real-world violence, among other ills.

"At a moment of rampant disinformation and conspiracy theories juiced by algorithms, we can no longer turn a blind eye to a theory of technology that says all engagement is good engagement - the longer the better - and all with the goal of collecting as much data as possible," said Cook.

"What are the consequences of prioritising conspiracy theories and violent incitement simply because of their high rates of engagement? What are the consequences of not just tolerating, but rewarding content that undermines public trust in life-saving vaccinations? What are the consequences of seeing thousands of users join extremist groups, and then perpetuating an algorithm that recommends even more?

"It is long past time to stop pretending that this approach doesn't come with a cost - of polarisation, of lost trust and, yes, of violence. A social dilemma cannot be allowed to become a social catastrophe."

Among the defences in Clegg's essay is an insistence that polarisation and extremist content are bad for Facebook's bottom line.

"Before we credit 'the algorithm' with too much independent judgement, it is of course the case that these systems operate according to rules put in place by people. It is Facebook's decision makers who set the parameters of the algorithms themselves, and seek to do so in a way that is mindful of potential bias or unfairness," writes Clegg.

"The reality is, it's not in Facebook's interest - financially or reputationally - to continually turn up the temperature and push users towards ever more extreme content. Bear in mind, the vast majority of Facebook's revenue is from advertising.

"Advertisers don't want their brands and products displayed next to extreme or hateful content - a point that many made explicitly last summer during a high-profile boycott by a number of household-name brands."

But while Facebook argues against criticisms of its algorithms, its latest actions show it clearly agrees there is room for improvement.

The impact of the changes will likely be monitored closely by concerned governments and regulatory bodies around the world.