Apple Urged To Drop CSAM Scanning Plans By Campaigners

The controversy surrounding Apple’s plans to scan children’s messages, and the iPhones of adults for known child sexual abuse material (CSAM), continues.

A new open letter signed by more than 90 policy and rights groups has made clear the signatories’ concerns about Apple’s intentions, citing the risk of the technology being misused by rogue governments and agencies.

It comes after Apple surprised many earlier this month, when it suddenly announced that it will scan iPhone photo libraries for known images of child sexual abuse before they are uploaded to iCloud, in an unprecedented move.

Child protection

The move was immediately slammed by campaigners, who also accused the iPhone maker of creating an encryption backdoor by using AI to scan children’s encrypted messages for sexual red flags.

Apple called the development ‘Expanded Protections for Children’, which means it will scan photo libraries stored on the iPhones of adults in the US for known images of child sexual abuse (Child Sexual Abuse Material, or CSAM) before they are uploaded to iCloud.

Before upload to iCloud, each image will be compared against a database of known child abuse imagery compiled by child protection agencies.

If a sufficiently strong match is flagged, Apple staff will be able to manually review the reported images and, if child abuse is confirmed, the user’s account will be disabled and the National Center for Missing and Exploited Children (NCMEC) notified.

Apple says this will work on iOS and iPadOS operating systems, and since the tool only looks for images that are already in NCMEC’s database, parents taking photos of a child in the bath, for example, apparently need not worry.
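
The matching step described above can be illustrated with a minimal sketch. This is not Apple’s implementation: the real system is understood to use a perceptual ‘NeuralHash’ and encrypted matching rather than the plain SHA-256 digest, in-memory hash set and placeholder threshold assumed here.

```swift
import Foundation
import CryptoKit

// Minimal sketch only: the hash function, database format and threshold
// are stand-ins, not Apple's actual NeuralHash or matching protocol.
struct CSAMMatcher {
    let knownHashes: Set<String>   // hypothetical on-device list of known-image hashes
    let reportThreshold: Int       // matches required before any human review

    // Stand-in for a perceptual hash: digest the raw image bytes.
    func hash(of imageData: Data) -> String {
        SHA256.hash(data: imageData).map { String(format: "%02x", $0) }.joined()
    }

    // Count how many photos queued for iCloud upload match the known database.
    func matchCount(for photos: [Data]) -> Int {
        photos.filter { knownHashes.contains(hash(of: $0)) }.count
    }

    // Only once the preset threshold is met would the account be flagged for review.
    func shouldFlag(photos: [Data]) -> Bool {
        matchCount(for: photos) >= reportThreshold
    }
}
```

In the real design the comparison is understood to happen against encrypted vouchers, so that nothing is revealed below the threshold; the sketch above ignores that layer entirely.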

Privacy concerns

Apple said that governments cannot force it to add non-CSAM images to a hash list.

It should be noted that other tech companies including Microsoft, Google and Facebook have for years been sharing digital fingerprints of known child sexual abuse images.

But they scan images stored on their own hardware and servers.

Apple itself has used those fingerprints to scan user files stored in its iCloud service for child abuse images.

Apple, however, believes its new system is more private than those used by other companies, because it uses both its servers and software that will be installed on people’s iPhones through an iOS update.

But this new direction marks an unprecedented step: the first time a tech company will actively scan images “on-device”.

And it has caused serious concerns among privacy campaigners.

Open letter

Last week the open letter, signed by more than 90 policy and rights groups, urged Apple “to abandon plan to build surveillance capabilities into iPhones, iPads, and other products.”

The letter was organised by the US-based nonprofit Center for Democracy & Technology (CDT).

The letter explains that although the new features are designed to protect children and reduce the spread of child sexual abuse material (CSAM), they will create new risks for children and could be used to censor speech and threaten the privacy and security of people around the world.

Of particular concern is that the scanning and alert feature in Messages could result in alerts that threaten the safety and wellbeing of some young people, including LGBTQ+ youths with unsympathetic parents.

Another concern is that once the CSAM hash scanning for photos is built into Apple products, the company will face enormous pressure, and possibly legal requirements, from governments around the world to scan for all sorts of images that the governments find objectionable.

The letter has been signed by groups including the American Civil Liberties Union, Big Brother Watch, the Electronic Frontier Foundation, Liberty, Privacy International, and X-Lab among many others.

“The undersigned organisations committed to civil rights, human rights and digital rights around the world are writing to urge Apple to abandon the plans it announced on 5 August 2021 to build surveillance capabilities into iPhones, iPads and other Apple products,” said the letter, addressed to CEO Tim Cook.

“Though these capabilities are intended to protect children and to reduce the spread of child sexual abuse material (CSAM), we are concerned that they will be used to censor protected speech, threaten the privacy and security of people around the world, and have disastrous consequences for many children,” the letter states.

AI message scanning

The letter raises two main concerns. The first is Apple using AI to scan children’s messages for inappropriate images or text.

“Apple announced that it is deploying a machine learning algorithm to scan images in its text messaging service, Messages, to detect sexually explicit material sent to or from people identified as children on family accounts,” said the letter. “This surveillance capability will be built right into Apple devices.”

“Algorithms designed to detect sexually explicit material are notoriously unreliable,” the letter states.

“They are prone to mistakenly flag art, health information, educational resources, advocacy messages, and other imagery.”
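
As a rough illustration of the kind of on-device check the letter objects to, and only under assumed names (explicitContentScore(for:) stands in for Apple’s unnamed, unpublished classifier, and the threshold is invented), the flow might look something like this:

```swift
import Foundation

// Hypothetical sketch; not Apple's Communication Safety code or API.
struct ChildMessageFilter {
    let flagThreshold: Double   // assumed confidence cut-off, e.g. 0.9

    // Placeholder for an on-device model scoring an image from 0 to 1.
    // A real implementation would run a machine learning model here.
    func explicitContentScore(for imageData: Data) -> Double {
        return 0.0
    }

    enum Action { case show, blurAndWarn }

    // Decide how an incoming image is presented on a child's account.
    func action(for imageData: Data) -> Action {
        explicitContentScore(for: imageData) >= flagThreshold ? .blurAndWarn : .show
    }
}
```

The campaigners’ point is that any misclassification by such a model (art, health information, educational material) produces the same flagged outcome as genuinely explicit content.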

Image scanning

The second concern is Apple building into the operating system the ability to scan the images that adults upload to iCloud from their iPhones for CSAM content, a capability which could be misused by governments.

“Apple also announced that it would build into the operating system of its products a hash database of CSAM images provided by the National Center for Missing and Exploited Children in the United States and other child safety organisations,” said the letter.

“It will scan against that database every photo its users upload to iCloud,” it stated. “When a preset threshold number of matches is met, it will disable the account and report the user and those images to authorities. Many users routinely upload the photos they take to iCloud. For these users, image surveillance is not something they can opt out of; it will be built into their iPhone or other Apple device, and into their iCloud account.”

“Once this capability is built into Apple products, the company and its competitors will face enormous pressure – and potentially legal requirements – from governments around the world to scan photos not just for CSAM, but also for other images a government finds objectionable,” said the letter.

“Those images may be of human rights abuses, political protests, images companies have tagged as ‘terrorist’ or violent extremist content, or even unflattering images of the very politicians who will pressure the company to scan for them,” said the letter.

“And that pressure could extend to all images stored on the device, not just those uploaded to iCloud,” the letter added. “Thus, Apple will have laid the foundation for censorship, surveillance and persecution on a global basis.”

Tom Jowitt

