Apple officially scraps plan to scan iCloud for child abuse images

Photo: Anton_Ivanov (Shutterstock)

Apple has officially ended one of its most controversial proposals of all time: a plan to scan iCloud images for signs of child sexual abuse material (or CSAM).

Yes, last summer, Apple announced that it would introduce on-device scanning, a new feature in iOS that would use advanced technology to quietly scan individual users’ photos for signs of child abuse material. The feature was designed so that, if the scanner found evidence of CSAM, it would alert human reviewers, who would then presumably alert the police.
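
For readers curious about the mechanics, here is a minimal, purely illustrative sketch of the threshold-based hash-matching idea behind such a system. The hash function, type names, and threshold handling below are hypothetical placeholders, not Apple’s actual NeuralHash implementation:

```swift
import Foundation

// Hypothetical sketch only: this is NOT Apple's NeuralHash or its real
// CSAM-detection pipeline. The hash function, names, and threshold are
// placeholders illustrating the general idea of matching photos against
// known hashes and flagging for human review only past a threshold.

struct ScanResult {
    let matchCount: Int
    let flaggedForReview: Bool
}

/// Placeholder "perceptual hash". A real perceptual hash produces a compact
/// fingerprint that stays stable under resizing or re-encoding; here we just
/// hash the raw bytes for illustration.
func placeholderHash(of imageData: Data) -> String {
    String(imageData.hashValue, radix: 16)
}

/// Compare each photo against a set of known-bad hashes, and only flag the
/// library for human review once the number of matches crosses a threshold.
func scanLibrary(photos: [Data],
                 knownBadHashes: Set<String>,
                 threshold: Int) -> ScanResult {
    let matches = photos
        .map { placeholderHash(of: $0) }
        .filter { knownBadHashes.contains($0) }
        .count
    return ScanResult(matchCount: matches, flaggedForReview: matches >= threshold)
}
```

The threshold step was central to Apple’s pitch: no human would look at anything until an account crossed a minimum number of matches, which was meant to limit false positives.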

The plan immediately triggered a torrential backlash from privacy and security experts, with critics arguing that the scanning feature could eventually be repurposed to search for other types of content. Even having such scanning capabilities in iOS, critics claimed, was a slippery slope to broader surveillance abuses, and the general consensus was that the tool could quickly become a back door for the police.

Apple initially fought back against that criticism, but the company finally relented, saying not long after it originally announced the new feature that it would “move” implementation to a later date.

Now it looks like that date will never come. On Wednesday, amid announcements for a bevy of new iCloud security features, the company also announced that it would not be moving forward with its plans for on-device scanning. In a statement shared with Wired magazine, Apple made it clear that it had decided to take a different path:

After extensive consultations with experts to gather feedback on child safeguarding initiatives we proposed last year, we are deepening our investment in the Communication Safety feature, which we first made available in December 2021. We have further decided not to proceed with our previously proposed CSAM detection tool for iCloud Photos. Children can be protected without companies sifting through personal data, and we will continue to work with governments, child advocates, and other companies to protect young people, uphold their right to privacy, and make the internet a safer place for children and for all of us.

Apple’s plans seemed well intentioned. The digital distribution of CSAM is a major problem, and experts say it has only gotten worse in recent years. Clearly, an attempt to solve this problem was a good thing. That said, the underlying technology Apple proposed to use, and the surveillance dangers that come with it, just didn’t seem like the right tool for the job.
