Apple will scan for child exploitation images on devices, winning praise and raising concerns
2021-08-12
Apple has announced the rollout of a new feature that is both winning praise and raising concerns. The technology can scan photographs on its devices to check for content that could be classified as Child Sexual Abuse Material (CSAM). The tool, scheduled for rollout later this year, will scan photos and text messages on Apple devices for known images of child sex abuse. Jim Lewis, a cybersecurity expert, said Apple "has gone out of its way to make this as privacy friendly as possible." Other experts called it "a major step forward in the fight to eliminate" child sexual abuse material from the internet.
If there is a match, the photos will be reviewed by an Apple employee. Verified sensitive material will then be forwarded to the National Center for Missing & Exploited Children, and the user's iCloud account will be locked.
The move is drawing both applause and outcry on Twitter, Tom Hanson reports for "CBS This Morning: Saturday."
But security watchdogs are concerned the new software could be exploited by hackers and foreign governments.
The head of WhatsApp, Will Cathcart, said he's "concerned."
"I think this is the wrong approach and a setback for people's privacy all over the world," he said in a series of tweets.
"Can this scanning software running on your phone be error proof? Researchers have not been allowed to find out," he tweeted.