
Meta has announced new safety features for Teen Accounts on Instagram, including an option to block and report users directly from their private messages (DMs). The new features are designed to give teen users more context about the accounts messaging them and to help them spot potential scammers. Teen Accounts on Instagram come with enhanced privacy settings and parental controls.
The move comes amid growing concern about how social media affects young users, and as Meta grapples with legal action over the alleged impact of its apps on children’s mental health.
Meta revealed in an announcement on Wednesday that it had recently removed hundreds of thousands of accounts for inappropriate behaviour involving minors. According to the company, more than 135,000 accounts were flagged for posting sexualised comments on content shared by children under 13, while another 500,000 were associated with adult accounts that had “inappropriate interactions” with kids.
The restrictions form part of Meta’s broader effort to tighten guardrails for teenagers, particularly in direct messages. The new safety tools include clearer information about accounts that message teens, as well as a streamlined option to block and report users with a single tap.
"These features are designed to give teens more control over their online experience and help them make safer choices," Meta said in a blog post. One such feature includes a safety alert that pops up when a teen receives a message from someone they don’t follow, encouraging them to be cautious and to block or report if anything feels off.