
Meta, the parent company of Instagram and Facebook, is introducing new rules to protect teens online. One of the biggest updates is that children under 16 will no longer be allowed to livestream on Instagram unless their parents give permission. The move is part of a broader effort by Meta to make its platforms safer for young users as concerns grow over how social media affects teens' mental health and online safety.
The new rules will roll out first in the US, UK, Canada, and Australia, then expand to Europe and other regions in the coming months. Alongside the livestream restriction, Meta is adding more controls for teens using direct messages: under-16 users will also need parental permission to "turn off our feature that blurs images containing suspected nudity," the company said in a blog post.
Meta is not stopping with Instagram. The company is extending similar protections to Facebook and Messenger. These include:
● Making all teen accounts private by default
● Blocking direct messages from strangers
● Limiting exposure to sensitive content, such as violent videos
● Sending reminders to log off after 60 minutes of screen time
● Disabling notifications during bedtime hours
These changes are part of Meta’s “teen account” initiative, launched in September, which aims to give parents more control and ensure a safer online experience for young users. According to Meta, over 54 million teen accounts have been created since this program started. The company says these updates are designed to limit inappropriate content, reduce unwanted contact, and help teens use their time online more wisely.