Apple Acknowledges ‘Confusion’ Over ‘iPhone Scanning’ System

Apple has acknowledged that its announcement of tools to scan for illegal images on iPhones and iPads was “jumbled pretty badly”.

Following criticism from privacy campaigners, the company has now given more details on the system, saying device-level scanning would allow independent experts to verify how Apple was using the system and what was being scanned for.

On 5 August Apple announced it would scan images uploaded from iPhones and iPads to its iCloud storage, looking for matches against a database of known child sex abuse material (CSAM) maintained by the US National Center for Missing and Exploited Children (NCMEC).

Companies that operate cloud-based services, including Facebook, Google and Microsoft, commonly scan for CSAM, but do so remotely.

Upload scanning

Apple said it plans to add hashes from the CSAM database directly to iPhones and iPads in an operating system update later this year, with devices scanning images before they reach iCloud.

An image is to be scanned only when a user uploads it to iCloud, and the system only detects exact matches against the database.
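The upload-time matching described above can be illustrated with a simplified sketch. All names here are hypothetical, and for clarity this toy version uses an exact SHA-256 digest, whereas Apple's actual system uses NeuralHash, a perceptual hash designed to survive resizing and recompression:

```python
import hashlib

def image_hash(image_bytes: bytes) -> str:
    """Exact-match digest for illustration only; Apple's real system
    uses a perceptual hash (NeuralHash), not a cryptographic one."""
    return hashlib.sha256(image_bytes).hexdigest()

# Hypothetical on-device database of hashes of known images,
# shipped with the operating system update
known_hashes = {image_hash(b"<bytes of a known image>")}

def flag_on_upload(image_bytes: bytes) -> bool:
    # Per the article: scanning happens only when a photo is being
    # uploaded to iCloud, and only exact database matches count.
    return image_hash(image_bytes) in known_hashes
```

Because only database membership is checked, an image that is not in the known set, however personal, produces no match.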

The system would not flag images of a person’s children in the bath, or search for pornography, Apple’s head of software, Craig Federighi, told The Wall Street Journal.

He said the announcement was “misunderstood” and that people had become concerned that Apple was scanning iPhones for images.

“That is not what is happening,” Federighi said.

“We feel very positively and strongly about what we’re doing and we can see that it’s been widely misunderstood.”

Account review

If a user tries to upload several CSAM images, the account will be flagged for review by Apple staff.

Federighi said this would only happen if the user tried to upload in the region of 30 matching images.
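The threshold behaviour Federighi describes amounts to a per-account tally that only triggers human review after roughly 30 matches, never on a single hit. A minimal sketch, with invented names and the threshold taken from his remarks:

```python
from collections import Counter

REVIEW_THRESHOLD = 30  # "in the region of 30 matching images" (Federighi)

# Hypothetical per-account tally of database matches
match_counts: Counter = Counter()

def record_match(account_id: str) -> bool:
    """Record one database match for an account; return True once
    the account crosses the threshold and goes to human review."""
    match_counts[account_id] += 1
    return match_counts[account_id] >= REVIEW_THRESHOLD
```

A single match (or a handful) therefore leaves an account untouched; only a sustained run of matches escalates it.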

Apple said it plans to add the same database to all versions of iOS and iPadOS, but that it would only be used for scanning in the US initially, with rollouts in other countries to be considered on a case-by-case basis.

Apple said putting the database on the device would add accountability and that an independent auditor would be able to verify what was being scanned for.

‘Confusion’

The company is also rolling out a separate parental-control feature that involves image scanning, and Federighi said there had been “confusion” between the two.

If activated by a parent, the second feature scans messages sent or received by a child using the iMessage app. If nudity is detected the tool obscures the photo and warns the child.

Parents can also choose to receive an alert if the child chooses to view the photo.

Privacy groups said the tool could be expanded and used by authoritarian governments to spy on citizens.

Will Cathcart, head of WhatsApp, said Apple’s tools were “very concerning”, and whistleblower Edward Snowden called the iPhone a “spyPhone”.

Matthew Broersma

Matt Broersma is a long-standing freelance technology journalist who has worked for Ziff-Davis, ZDNet and other leading publications.
