Responding to global regulatory pressure, Meta has announced that it will hide harmful content from teenagers on Instagram and Facebook. The move comes in the wake of a lawsuit by 33 U.S. states alleging that the company made its platforms addictive and harmful to teenagers' mental health. The European Commission had also sought information from Meta about its plans to safeguard children from harmful online content.
Meta’s strategy involves removing self-harm content from Instagram and Facebook and tightening controls on other age-inappropriate material. Specifically, content related to self-harm will no longer appear in Instagram’s Reels and Explore sections for teenage users, even when posted by someone they follow. Under the new policies, teenage accounts will automatically be placed under the most restrictive content settings, which block searches for topics such as suicide, self-harm, and eating disorders.
While the platforms will still permit users to share content related to these sensitive topics, Meta has pledged to direct anyone posting such content to expert resources for help. The updated policies are scheduled to roll out to all users in the coming weeks.
For existing teenage users of Instagram and Facebook, Meta plans to send notifications encouraging them to strengthen their safety and privacy settings. Choosing ‘Turn on recommended settings’ will enforce restrictions on who can repost their content, tag or mention them, or include their profile in Reels Remixes.
In January of the previous year, Meta had already tightened its advertising policies, restricting advertisers to targeting teens based only on age and location. It also changed the types of ads teenagers could encounter on its platforms.