Instagram announced changes Thursday to adjust what content related to self-harm and suicide is allowed on the app. Facebook, Instagram's parent company, is making similar changes to its own platform.
The social media companies consulted experts from around the globe on what content should and shouldn't be allowed on their platforms, aiming to keep users safe while still allowing them to express themselves and share their stories.
"Over the past month we have seen that we are not where we need to be on self-harm and suicide, and that we need to do more to keep the most vulnerable people who use Instagram safe," said a post from Instagram about the changes to the platform.
For some users, this means they may no longer see some posts they saw before, and others may see sensitivity warning screens over certain posts. Instagram said it is continuing to work with experts to find the right path forward, but that process will take time, so the changes may not take effect immediately.
The platform has never allowed photos or posts that promote suicide or self-harm, but it has historically allowed posts of admission or contemplation in which the poster discusses their experiences with those topics, Instagram said. While posts such as images of healed scars will still be allowed on the app, Instagram is making them more difficult to find: they won't appear under hashtags or in the Explore tab.
Instagram said that it's also working to add more resources to the app for people searching for such hashtags or looking for that type of content in Explore. "Our aim is to have no graphic self-harm or graphic suicide related content on Instagram and to significantly reduce – with the goal of removing – all self-harm and suicide imagery from hashtags, search, the explore tab or as recommended content, while still ensuring we support those using Instagram to connect with communities of support," said Instagram.
The changes to Instagram and Facebook come after controversy and threats of legal action. The United Kingdom's health secretary warned social media platforms such as Facebook and Twitter that they could face legal consequences if they didn't remove inappropriate content, The Guardian reported.