Australia is pushing ahead with plans to introduce jail terms and massive fines for social media providers that don't quickly remove violent material.
Prime Minister Scott Morrison called for a crackdown last week after the alleged Christchurch gunman used Facebook to livestream his shooting, which was then shared via YouTube, Twitter, Facebook and Reddit.
Under the proposed laws, offences would be punishable by up to three years' jail for executives, or fines of up to 10 per cent of a platform's global annual turnover.
Communications Minister Mitch Fifield was left unimpressed by a meeting with social media companies earlier this week and is now resolved to find a legislative solution.
"They did not present any immediate solutions to the issues arising out of the horror that occurred in Christchurch," Fifield said.
"We will not allow social media platforms to be weaponised by terrorists and violent extremists who seek to harm and kill and nor would we allow a situation that a young Australian child could log onto social media and watch a mass murder take place," Fifield told Al Jazeera.
The proposed law targets the showing of violent material, defined as the playing or streaming of acts of terrorism, murder, attempted murder, torture, rape or kidnapping on social media.
The new law would make it a criminal offence not to remove such material as quickly as possible. Social media platforms anywhere in the world would also be required to notify the Australian Federal Police if they become aware their service is streaming violent conduct occurring in Australia.
News media will still be able to report on events that are in the public interest, without showing the violent material itself.
Morrison expects to discuss the new laws when world leaders meet in Osaka in June for the G20 Summit.
Meanwhile, Facebook chief operating officer Sheryl Sandberg said in a post on Instagram, which is owned by Facebook, that the company is considering restricting who can access its livestreaming service, depending on factors such as previous violations of the site's community standards.
The company is also looking into new technology to quickly identify edited versions of violent videos and take them down.