'Instagram helped kill my daughter': Why the app is banning self-harm images

Instagram has announced it will ban graphic self-harm images after they were blamed for causing a 14-year-old's suicide.

UK teen Molly Russell killed herself shortly after viewing material relating to depression and suicide on the social media site, her father, Ian Russell, claims.

A heartbroken Mr Russell says he believes Instagram's algorithms allowed his daughter to view more and more disturbing content.

"I have no doubt that Instagram helped kill my daughter. She had so much to offer and that's gone," he told the BBC.

"The social media companies, through their algorithms, expose young people to more and more harmful content, just from one click on one post.

"In the same way that someone who has shown an interest in a particular sport may be shown more and more posts about that sport, the same can be true of topics such as self-harm or suicide."

In a meeting with UK Health Secretary Matt Hancock, Instagram head Adam Mosseri admitted the company had not done enough.

"Over the past month we have seen that we are not where we need to be on self-harm and suicide, and that we need to do more to keep the most vulnerable people who use Instagram safe," Mr Mosseri said in a statement.

In response to public outrage, Instagram says it will now block graphic images of self-harm. It will also prevent non-graphic self-harm content - such as images of healed scars - from showing up in parts of the app like the Explore tab and search results.

But these non-graphic images won't be removed completely, Instagram says, because it doesn't want to stigmatise self-harm survivors.

"I might have an image of a scar and say, 'I'm 30 days clean,' and that's an important way to tell my story," Mr Mosseri said.

"That kind of content can still live on the site but the next change is that it won't show up in any recommendation services so it will be harder to find."

Newshub.