New Zealand is falling "massively" behind other countries when it comes to protecting New Zealanders from AI nudes, a lawyer is warning.
Australia is moving to criminalise non-consensual, digitally manipulated sexual material, or 'deepfakes', but back home the lawyer says such material isn't fully captured by legislation.
Deepfakes use artificial intelligence (AI) to manipulate or alter videos and photos of someone. In instances of deepfake pornography, the technology is used to apply somebody's face to existing pornographic material.
Australia introduced legislation this week to create new criminal offences banning the sharing of non-consensual, sexually explicit deepfake material. Once the bill passes, offenders could face up to seven years' imprisonment for sharing and/or creating such material. The bill strengthens Australian law, which since 2021 has regulated the sharing of altered images such as deepfakes.
The change is intended to help the law catch up with the technology.
In recent years, there has been a rise in AI being used to make explicit images of adults without their consent.
Research by visual threat intelligence company Sensity AI found 90 to 95 percent of online deepfake videos are non-consensual pornography. Almost all of them depict women.
The issue was put under the spotlight earlier this year after the White House condemned sexually graphic deepfake images of pop star Taylor Swift which were spread across social media platforms. One post was reportedly viewed 47 million times before it was taken down.
The case reignited global calls for tougher laws and regulations around sexualised deepfakes.
However, in New Zealand, our law is lagging behind.
Currently, our Harmful Digital Communications Act 2015 makes it an offence to post an intimate visual recording without consent.
Justice Minister Paul Goldsmith told Newshub he is advised that, in most circumstances, posting deepfakes of a sexual nature would be an offence under this Act.
However, Arran Hunt, partner at law firm McVeagh Fleming, said the law only covers instances where police can show an offender intended to cause distress or a high level of harm.
"If someone creates a deepfake without intention to cause that level of harm, or with negligence to it or not really thinking about it or any other reason, then it won't fall under the legislation," Hunt said.
By contrast, if someone shares a real naked image of somebody, the legislation assumes by default an intent to cause harm.
Hunt said New Zealand is "massively" behind jurisdictions like Australia and the UK, where legislation covers non-consensual sexual material that "depicts or appears to depict", or "appears to alter", certain body parts.
"Whether it's real or not is irrelevant, so if someone's creating the material it should be seen to be done with intent," Hunt said.
Goldsmith did not answer Newshub's question on whether the Government would consider following Australia in creating a criminal offence for non-consensual deepfake pornography. He said the Crimes Act already contains several offences related to intimate visual recordings, including prohibitions on making them, possessing them in certain circumstances, and publishing, importing, exporting, or selling them.
Hunt said this law does not cover deepfakes because its wording refers to a recording of the actual person.
"When Goldsmith says that deepfakes are covered, they're not," he said.
In 2021, Hunt and others tried to get the previous Government to include deepfake pornography under its revenge porn law change.
"We said 'look this isn't going to cover deepfakes' and for whatever reason they choose not to cover deepfakes," Hunt said.
Hunt said it is common sense to include deepfakes in the legislation.
"All Governments have always had issues creating laws for technology, they usually do it several years too late - this is another example of that," he said.