New online safety code could end up failing vulnerable Kiwis - safety groups

"This could be perceived as a way for big online services to write rules that suit themselves." Photo credit: Getty Images

A new code of practice being developed to improve online safety in New Zealand could end up failing the people it intends to help, according to three different organisations.

The current process is largely focusing on input from tech platforms, one said, which could be seen as those platforms trying to write rules to suit themselves.

The Aotearoa New Zealand Code of Practice for Online Safety and Harms is being developed by Netsafe, New Zealand's non-profit online safety organisation, alongside major technology companies.

The code, which is due to go for public consultation soon, is designed to deliver a self-regulatory framework to improve safety and reduce harmful content on the likes of Facebook, Instagram, Twitter, TikTok and Twitch.

While Tohatoha NZ, InternetNZ and the Inclusive Aotearoa Collective Tāhono agree that online services should be taking responsibility for content online, they suggest those who are most impacted are being ignored by the process.

"The people whose voices matter most in regard to online safety are those who are most affected by the harmful behaviour this Code sets out to tackle," Anjum Rahman, project co-lead for Inclusive Aotearoa Collective Tāhono, said.

"The draft Code has been developed without equitable opportunities for people and communities to engage."

The code of practice was announced in July, with Netsafe CEO Martin Cocker describing it as an "opportunity for New Zealand to establish a world-leading online safety code of practice to ensure a safer internet experience for the community".

"If successful, the Code will see technology companies commit to ensuring their products and services are delivered in a way that makes the safety of New Zealand internet users of paramount importance," he said. 

But by focusing on input from tech platforms before coming to communities and individuals, the process simply highlights the power imbalance, Rahman said.

"This could be perceived as a way for big online services to write rules that suit themselves and avoid efforts at regulation."

Mandy Henk, CEO of Tohatoha NZ, agreed with Rahman's concerns regarding the lack of community involvement.

"To address serious issues like misinformation and hate speech, this code will need broad buy-in from diverse communities across Aotearoa, particularly those most impacted by harmful behaviour online," she said.

"The lack of significant community engagement so far means this Code may be set up to fail these people."