Apple Abandons Plan To Check iOS Devices For Child Abuse Images

Apple has stepped back from using its controversial child sexual abuse image detection system on iOS devices and iCloud photo libraries.

The plan surprised many in August 2021, when Apple suddenly announced it would scan photo libraries on iPhones and iPads, as well as iCloud Photos, for child sexual abuse material (CSAM).

Days after that announcement Apple defended the move, but did not respond to specific concerns that it was proposing to check images on people’s actual handsets, unlike its competitors, which only scan images after they have been uploaded to the cloud.

CSAM scanning

Apple did, however, pledge at the time not to allow repressive governments to abuse its controversial child sexual abuse image detection system.

But such was the backlash from campaigners, who slammed the system’s potential privacy implications, that Apple put the system on hold.

Concern centred on the fact that part of Apple’s checking process for child abuse images would be carried out directly on user devices.

And there were fears that repressive governments could force Apple to add hashes of non-CSAM images to its matching list.
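For illustration, the sketch below shows what an opaque hash-list check of this kind might look like in principle. It assumes a simplified exact SHA-256 comparison rather than the perceptual NeuralHash and private set intersection approach Apple actually described, and the list entry and function names are hypothetical.

```swift
import CryptoKit
import Foundation

// A minimal sketch of a hash-list check, assuming a simple exact SHA-256 match.
// Apple's proposed system actually used a perceptual hash (NeuralHash) plus
// private set intersection; the list entry and function names here are
// hypothetical, for illustration only.

// The device only ever sees opaque hash values -- it cannot tell what images
// the entries on the list actually correspond to.
let providedHashList: Set<String> = [
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b"
]

func sha256Hex(_ data: Data) -> String {
    SHA256.hash(data: data).map { String(format: "%02x", $0) }.joined()
}

func flaggedForReview(_ imageData: Data) -> Bool {
    providedHashList.contains(sha256Hex(imageData))
}

// Usage: any image whose hash appears on the supplied list gets flagged.
let photo = Data("example image bytes".utf8)
print(flaggedForReview(photo))   // false for this sample data
```

The critics’ point is visible in the sketch itself: the matching code only ever handles opaque hash strings, so it has no way of knowing whether an entry on the list corresponds to CSAM or to any other image a government might demand be added.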

Many child safety and security experts, however, praised the attempt.

This week, however, Apple announced its decision to kill off CSAM scanning, in a statement provided to Wired.

Apple did not respond to other media outlets.

In the statement Wednesday, Apple said it had “decided to not move forward with our previously proposed CSAM detection tool for iCloud Photos.”

“Children can be protected without companies combing through personal data, and we will continue working with governments, child advocates, and other companies to help protect young people, preserve their right to privacy, and make the internet a safer place for children and for us all,” the company added.

Instead, Apple is refocusing its efforts on growing its Communication Safety feature, which was first made available in December 2021.

Essentially, the Communication Safety tool is an opt-in parental control feature that warns minors and their parents when incoming or outgoing image attachments in iMessage are sexually explicit, and blurs the offending images.

Privacy, security changes

The dropping of CSAM scanning is not the only change Apple announced this week.

The firm also announced three new security and privacy features, amid ongoing threats to user data.

Its iMessage Contact Key Verification system means users can verify they are communicating only with whom they intend.

Meanwhile, Security Keys for Apple ID gives users the choice to require a physical security key to sign in to their Apple ID account.

And Advanced Data Protection for iCloud uses end-to-end encryption, giving users the option to further protect important iCloud data, including iCloud Backup, Photos, Notes and more.
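As a rough illustration of what end-to-end encryption means here, the sketch below encrypts data with a key that exists only on the user’s device before it is stored in the cloud. It uses Apple’s CryptoKit framework with AES-GCM, but the key handling and function names are simplified assumptions, not Apple’s actual Advanced Data Protection implementation.

```swift
import CryptoKit
import Foundation

// A minimal sketch of end-to-end encryption: the symmetric key exists only on
// the user's devices, so the cloud provider stores ciphertext it cannot read.
// Key management here is a deliberate simplification, not Apple's actual
// Advanced Data Protection design.

let deviceOnlyKey = SymmetricKey(size: .bits256)   // never leaves the device

func encryptForCloud(_ plaintext: Data) throws -> Data {
    // AES-GCM returns a single combined blob: nonce + ciphertext + auth tag.
    try AES.GCM.seal(plaintext, using: deviceOnlyKey).combined!
}

func decryptFromCloud(_ blob: Data) throws -> Data {
    let box = try AES.GCM.SealedBox(combined: blob)
    return try AES.GCM.open(box, using: deviceOnlyKey)
}

// Usage: only a device holding `deviceOnlyKey` can recover the original note.
do {
    let note = Data("Shopping list".utf8)
    let stored = try encryptForCloud(note)       // what the server sees
    let recovered = try decryptFromCloud(stored) // what the device recovers
    print(recovered == note)                     // prints "true"
} catch {
    print("Encryption round trip failed: \(error)")
}
```

Because the server only ever stores the sealed blob and never the key, neither the cloud provider nor anyone who breaches it can read the underlying data.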

Tom Jowitt

Tom Jowitt is a leading British tech freelancer and long standing contributor to Silicon UK. He is also a bit of a Lord of the Rings nut...
