Meta said Tuesday it would start hiding inappropriate content from teen accounts on Instagram and Facebook, including posts about suicide, self-harm and eating disorders.
The Menlo Park, Calif.-based social media giant said in a blog post that while it already aims to not recommend such “age-inappropriate” content to teens, it will now no longer display it in their feeds, even if it is shared by an account they follow.
“We want teens to have safe, age-appropriate experiences on our apps,” Meta said.
Teenage users – provided they didn’t lie about their age when they signed up for Instagram or Facebook – will also have their accounts placed on the platforms’ most restrictive settings, and they will no longer be able to search for terms likely to be harmful.
“Let’s take the example of someone who posts about their ongoing struggle with thoughts of self-harm. This is an important story that can help destigmatize these issues, but it is a complex topic that is not necessarily suitable for all young people,” Meta said. “Now we’ll start removing this type of content from teens’ experiences on Instagram and Facebook, along with other types of age-inappropriate content.”
Meta’s announcement comes as the company faces legal action from dozens of U.S. states that accuse it of harming young people and contributing to the youth mental health crisis by knowingly and deliberately designing features on Instagram and Facebook that get children addicted to its platforms.
Critics have said Meta’s actions don’t go far enough.
“Today’s announcement by Meta is yet another desperate attempt to avoid regulation and an incredible slap in the face to parents who have lost their children to online harm on Instagram,” said Josh Golin, executive director of the online children’s rights group Fairplay. “If the company is capable of hiding pro-suicide and eating disorder content, why did it wait until 2024 to announce these changes?”