The social network will also prevent nongraphic self-harm content from appearing in search, hashtags and the explore tab, says Instagram head Adam Mosseri.


Instagram won’t allow graphic images of self-harm, such as cutting, on its site, head Adam Mosseri wrote in a Thursday post. The platform will also prevent nongraphic self-harm related content, such as images of healed scars, from showing up in search, hashtags and the explore tab. It also won’t recommend that content, he said.

“Up until now, we’ve focused most of our approach on trying to help the individual who is sharing their experiences around self-harm,” Mosseri wrote. “We have allowed content that shows contemplation or admission of self-harm, because experts have told us it can help people get the support they need. But we need to do more to consider the effect of these images on other people who might see them.”

Instagram won’t remove nongraphic self-harm content entirely because “we don’t want to stigmatize or isolate people who may be in distress and posting self-harm related content as a cry for help,” Mosseri said. The platform will also work to provide more resources to people who post and search for self-harm related content and will direct them to organizations that can help, he added.

Instagram will continue to consult with experts on how best to approach the issue, which might include blurring nongraphic self-harm content with a sensitivity screen, so that users wouldn't see the content unless they actively choose to view it, Mosseri said.

The new policy expands on measures Mosseri discussed in an op-ed in the Daily Telegraph on Monday, in which he said he'd do more to protect vulnerable users from seeing content promoting suicide and self-harm. This includes using sensitivity screens to obscure images depicting cutting, and being more supportive of people who post images indicating they might be struggling with these issues, he wrote. The piece touched on the death of British teenager Molly Russell, who took her life in 2017. Russell had used Instagram to engage with and post content about depression and suicide, leading her family to blame the social network for her death.

Experts such as the Centre for Mental Health and Save.org advised Instagram that although safe spaces for people to discuss their experiences are essential, graphic images of self-harm could unintentionally promote more harm, Mosseri said.

The goal of Instagram’s new policy is to eliminate graphic self-harm or graphic suicide-related content from the platform, Mosseri said, and to reduce and eventually eliminate all self-harm and suicide images from hashtags, search, the explore tab and recommended content.

“We will not be able to remove these images immediately and we must make sure that people posting self-harm related content do not lose their ability to express themselves and connect with help in their time of need,” Mosseri wrote.
