'Deepfake' videos could be next social media minefield - researchers

There's growing concern about the damage sophisticated fake videos could do to the public.

A new Law Foundation report says deepfake media could prove troublesome to tech companies and the Government, following the Christchurch Call.

Deepfakes involve the use of artificial intelligence to create a simulated video that is almost indistinguishable from the real thing.

Co-author Tom Barraclough told Newshub the technology behind the content is rapidly improving.

"There's a risk that when that happens politicians or celebrities or even everyday people could be made to look or sound like they've done something when they haven't."

There are currently at least 16 Acts and guidelines that touch on fake media, and Barraclough said reviewing that existing legislation would be a good starting point.

"What that will let us do is work out where there are gaps through new legislation or amendments and it will also just let us be prepared so that when one of these things does happen the law is ready to respond."

He's not fond of the idea of rushing out to make new laws though.

"Before calling for a new law or new regulators, let's work out what we've already got, and why existing law is or isn't working," Barraclough said in a statement.

"Enforcing the existing law will be difficult enough, and it is not clear that any new law would be able to do better. Overseas attempts to draft law for deepfakes have been seriously criticised."

Barraclough also said in a statement private companies need to consider what they can do to curb the problem.

"We must ask what private companies can do that governments can't. We have to consider access to justice: often a quick solution is more important than a perfect one.

"Social media companies can restrict content in ways that Governments can't: is that something we should encourage or restrict?"

Something does need to change, though: Barraclough told Newshub there's concern the current system could be letting people down.

"We're a little bit concerned that because there's such a wide range of legislation that touches on synthetic media that people who are victims of it might fall through the cracks."