
Instagram says it will now hide self-harm images behind ‘sensitivity screens’

Published Feb 4th, 2019 8:08PM EST
Image: Simon Belcher/imageBROKER/REX/Shutterstock


Instagram’s new head, Adam Mosseri, penned an article in The Telegraph this week explaining the steps Facebook’s subsidiary will take to prevent users from being exposed to self-harm and suicide-related images, while also looking for ways to help users who post such content.

Instagram will use so-called “sensitivity screens,” Mosseri explained in the article, to blur sensitive content and remove it from search results, hashtag pages, and account recommendations.
The images won’t be immediately visible, but they will still be on the platform, as Instagram doesn’t want to ban people from sharing self-harm content outright. The CEO said the company has been trying to stop the spread of such images but hasn’t been able to find the content fast enough:

We are not yet where we need to be on the issues of suicide and self-harm. We need to do everything we can to keep the most vulnerable people who use our platform safe. To be very clear, we do not allow posts that promote or encourage suicide or self-harm. We rely heavily on our community to report this content, and remove it as soon as it’s found. The bottom line is we do not yet find enough of these images before they’re seen by other people.

Instagram says that engineers and trained content reviewers will be “working around the clock” to make it harder for people to find these images, and that it has already put measures in place to stop recommending “related images, hashtags, accounts, and typeahead suggestions.” It’s unclear from the wording of the article whether Instagram will rely on artificial intelligence tools to automatically recognize elements in a photo or video that fall under the self-harm category.

Mosseri also said Instagram wants to help users who are struggling with self-harm or suicidal thoughts by directing them to organizations that can help, including Papyrus and Samaritans.

Instagram’s announcement follows a tragic event in the UK, where a 14-year-old took her own life after seeing graphic images of self-harm on Instagram and Pinterest, Engadget reports.

Facebook rolled out AI-based suicide prevention tools in November 2017, so it makes sense to see similar programs reach its other services, especially Instagram.

Chris Smith Senior Writer

