After regulators around the globe pressed the social media giant to protect children from harmful content on its apps, Meta Platforms will hide more content from teens on Instagram and Facebook. All teens will now be placed into the most restrictive content control settings on the apps, and additional search terms will be limited on Instagram, Meta said in a blog post.
According to Meta, the move will make it more difficult for teens to come across sensitive content such as suicide, self-harm and eating disorders when they use features like Search and Explore on Instagram. The company further said the measures, expected to roll out over the coming weeks, would help deliver a more "age-appropriate" experience.
Meta is under pressure both in the United States and Europe over allegations that its apps are addictive and have helped fuel a youth mental health crisis.
Attorneys general of 33 U.S. states including California and New York sued the company in October, saying it repeatedly misled the public about the dangers of its platforms.
The regulatory scrutiny increased following testimony in the U.S. Senate by a former Meta employee who alleged the company was aware of harassment and other harms facing teens on its platforms but failed to act.