Meta has announced that it will roll out a new "nudity protection" feature on Instagram to shield children from "sextortion" schemes. The move comes as US lawmakers accuse the platform of harming children's mental health. Sextortion gangs coerce victims into sending intimate images and then threaten to release them unless they are paid.
The AI-driven "nudity protection" tool will detect and blur images containing nudity that are sent to minors through the app's messaging system.
"This way, the recipient is not exposed to unwanted intimate content and has the choice to see the image or not," Capucine Tuffier, who is in charge of child protection at Meta France, told AFP.
The company said it would also offer advice and safety tips to anyone sending or receiving such messages. Some 3,000 young people fell victim to sextortion scams in the United States in 2022, according to the authorities.
Separately, more than 40 US states began suing Meta in October in a case that accuses the company of having "profited from children's pain". The legal filing alleges that Meta exploited young users by building a business model designed to maximise the time they spend on the platform despite the harm to their health.