Instagram pledges to remove self-harm content

Elias Hubbard
February 9, 2019

Instagram has agreed to ban graphic images of self-harm after objections were raised in Britain following the suicide of a teenager whose father said the photo-sharing platform had contributed to her decision to take her own life. Instagram head Adam Mosseri said the platform will also provide more resources to people who post and search for self-harm related content and will direct them to organizations that can help.

Experts such as the Centre for Mental Health and Save.org advised Instagram that although safe spaces for people to discuss their experiences are essential, graphic images of self-harm could unintentionally promote more harm, Mosseri said. "We have allowed content that shows contemplation or admission of self-harm, because experts have told us it can help people get the support they need".

"We need to do more to consider the effect of these images on other people who might see them".

At present, Instagram relies on users to report graphic images of self-harm.

The Facebook-owned social media site has faced an outcry after Ian Russell, father of 14-year-old Molly Russell, claimed it "helped kill my daughter".

However, not everyone is keen on Instagram's proposed changes.

MPs and advocacy groups in the United Kingdom also blasted the platform in the wake of Molly's death, with the minister for suicide prevention, Jackie Doyle-Price, warning that social media sites were "normalising" self-harm. "We're doing things that feel good and look good instead of doing things that are effective", she said.

Jo Robinson: It's not simple, no. As a parent myself, my heart goes out to any parent who's lost a child to suicide - it really does. But suicide and self-harm are terribly complex behaviours. Young people can access this material in many other places, so they could be exposed to it despite the best will in the world.

However, as Peter points out, such online communities are not always helpful to young people trying to overcome their mental health problems. Instagram, for its part, is not imposing a blanket ban: "We are not removing non-graphic self-harm related content from Instagram entirely, as we don't want to stigmatise or isolate people who may be in distress". The platform also plans to blur self-harm content and put it behind a sensitivity screen so users cannot come across it accidentally.

Certainly, if you now attempt to search for the hashtag #selfharm, you will be met with new options on your screen - to seek support or confirm that you want to view the content despite Instagram's warnings.

It may take time to get right, and there could be unintended impacts, but Instagram's latest moves on this front are worthy of encouragement.

Mr Russell was buoyed by Instagram's commitment, and in an interview with the BBC called on other social media platforms to follow suit.

Health Secretary Matt Hancock said yesterday that proposals which could have seen firms fined for allowing vile and illegal material on their sites were too drastic.

If you are struggling and need to talk, you may benefit from seeking professional support.
