Christchurch terror attack: Facebook, Instagram carry remembrance message for victims

Facebook and Instagram have unveiled a message of remembrance to victims of the Christchurch terror attack.

Users of both platforms who log in on Friday will be met with a full-screen popup displaying a silver fern and the words: "Together in Tears, Together in Hope. In remembrance of every person lost in the Christchurch attacks. If you need support, call or text 1737."

The message marks a week since the March 15 attack on two Christchurch mosques, which killed 50 and injured dozens.

Facebook says it was approached by the Government to discuss how it could show a mark of respect for the victims and communities affected.

The 1737 number being publicised is a free, 24/7 support service run by the Ministry of Health. It connects people with trained counsellors.

The popup will be delivered to all Facebook and Instagram users in New Zealand until midnight.

The message appears on both Instagram and Facebook. Photo credit: Newshub

But the message has angered some, who say it detracts from Facebook's failures to act against the spread of violent material on its platforms.

"Hey @facebook, we didn't as you to make a fern image a week after a terror attack," said David Tong on Twitter.

"We asked you to make sure that Nazi terrorists couldn't share live videos of their murders.We asked you to stop providing a haven for white racists to spread their hate."

Facebook Inc, which owns both Facebook and Instagram, has been heavily criticised for its slow response to last Friday's attack.

The shooter livestreamed part of the attack on Facebook and it was subsequently shared around the internet by other users.

Other online platforms such as YouTube and Twitter have also been criticised for allowing the video to be spread further.

Facebook has released two statements since the attack, saying the shooter's livestream was viewed fewer than 200 times and was not reported until 12 minutes after the 17-minute broadcast ended. The company says the video was viewed about 4000 times in total before being removed.

The company has admitted its artificial intelligence (AI) is not perfect, and the video did not trigger its automatic detection system.

"While (AI's) effectiveness continues to improve, it is never going to be perfect," Facebook said in a statement on Thursday night (NZ time).

"People will continue to be part of the equation, whether it's the people on our team who review content, or people who use our services and report content to us."

Facebook users encouraged to go dark today

An online campaign - organised via Facebook - is encouraging users to log off the platform at 1:40pm today, marking the exact time a week ago that the attack began.

The "FBFree 4 Christchurch" event is asking people to stay off Facebook for 50 hours - one hour for each victim  and to instead focus on spending quality time with family and friends.

"In a time when technology has been developed that can automatically remove photos, videos, and text that violate their policies we believe they should be doing more to prevent the ability for these atrocities to be shared via social media and impacting victims further," the page says.

"Let's use the positive power of social media to strive to make a difference."

If you would like to donate to the official Victim Support fund, go here. Victim Support's number is 0800 842846. If you need to talk to someone about what you've read or seen in the past few days, call or text Need To Talk? on 1737.
