
Communications of the ACM

ACM News

Apple's Plan to Scan Handset Images Stopped Before It Started


[Image: a smartphone whose screen indicates something severe is happening. Credit: fatherly.com]

Possessing child pornography, or Child Sexual Abuse Material (CSAM), on your phone or any other device is not only morally wrong; it is a crime. In that context, it is understandable that Apple would want to purge its cloud service, iCloud Photos, of CSAM.

In August, the company announced a set of child safety features that were to be installed on every iPhone and iPad by a software update "later this year." Among them was a mechanism to detect CSAM in images uploaded to iCloud Photos. If the number of flagged images crossed a certain threshold, the user's account would automatically be suspended and the user reported to the National Center for Missing and Exploited Children (NCMEC).

The notion that Apple planned to enter users' phones to root around for CSAM caused an uproar among consumer groups and privacy advocates, but that is not what the system does. It only inspects images uploaded from a phone to iCloud Photos (usually for automatic synchronization between devices), and it is not truly 'looking' at those images, either. The new software calculates a 'neural hash' of every photo or video that is set for upload to iCloud Photos, and compares it to the neural hashes of a database of known CSAM provided to Apple by NCMEC.

A neural hash is a 96-bit digital fingerprint of an image. Unlike a cryptographic hash, which changes completely if a single pixel of an image is altered, a neural hash is designed to stay the same, or nearly the same, when an image is cropped, rotated, resized, or subjected to similar changes that leave the content essentially intact. Apple uses an 'intelligent' convolutional neural network to extract a neural hash from an image.
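To make the distinction concrete, here is a minimal sketch in Python. The toy 'average hash' below stands in for Apple's CNN-based NeuralHash (which is 96 bits and far more sophisticated); the function names and parameters are illustrative, not Apple's.

```python
# A toy perceptual hash, standing in for Apple's CNN-based NeuralHash.
# (NeuralHash is 96 bits; this illustrative "average hash" is 64 bits.)
import hashlib
import numpy as np

def average_hash(img: np.ndarray, grid: int = 8) -> int:
    """Block-average the image to grid x grid, threshold at the mean.
    Assumes the image dimensions are divisible by grid."""
    h, w = img.shape
    small = img.reshape(grid, h // grid, grid, w // grid).mean(axis=(1, 3))
    bits = (small > small.mean()).ravel()
    return int("".join("1" if b else "0" for b in bits), 2)

def hamming(a: int, b: int) -> int:
    """Number of bits on which two fingerprints differ."""
    return bin(a ^ b).count("1")

original = np.add.outer(np.arange(64.0), np.arange(64.0))  # a smooth gradient "photo"
resized = original[::2, ::2]                               # same content, half the size

# The perceptual fingerprints of the two versions coincide...
print(hamming(average_hash(original), average_hash(resized)))   # 0

# ...while cryptographic hashes of the underlying bytes are completely unrelated.
print(hashlib.sha256(original.tobytes()).hexdigest()[:16])
print(hashlib.sha256(resized.tobytes()).hexdigest()[:16])
```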

This enables the system to flag photos or videos whose content matches CSAM in the NCMEC database. To avoid false accusations, more than one match is required (the threshold can be set at any number, but it might be 10) before the system alerts Apple that a user has child porn on their phone.

Around that, Apple constructed an elaborate double-encryption shell using Private Set Intersection and Threshold Secret Sharing to protect the privacy of its users. Before the threshold is reached, Apple cannot identify or see images for which the system found a match, and is not told how many matches may have been found on someone's phone. Only when the alert goes out can Apple decrypt the suspicious data and have a human employee review the material to check for errors. 
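The cryptographic details are beyond a news article, but the threshold idea itself can be illustrated with Shamir's classic secret-sharing scheme: split the decryption key so that any 10 shares reconstruct it while 9 reveal nothing. The sketch below is a simplified stand-in under that assumption; the names and parameters are ours, not Apple's.

```python
# A minimal sketch of threshold secret sharing (Shamir's scheme), the idea
# behind the threshold mechanism: each CSAM match could release one share of
# the key that decrypts the flagged material; below the threshold, the shares
# reveal nothing. This is a simplified stand-in, not Apple's protocol.
import random

PRIME = 2**127 - 1  # a prime large enough for a toy key

def make_shares(secret: int, threshold: int, n_shares: int):
    """Split `secret` into points on a random degree-(threshold-1) polynomial."""
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(threshold - 1)]
    def f(x):
        return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
    return [(x, f(x)) for x in range(1, n_shares + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x=0 recovers the secret (Python 3.8+ for pow(-1))."""
    secret = 0
    for xi, yi in shares:
        num, den = 1, 1
        for xj, _ in shares:
            if xj != xi:
                num = num * (-xj) % PRIME
                den = den * (xi - xj) % PRIME
        secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
    return secret

key = 123456789                                       # stand-in decryption key
shares = make_shares(key, threshold=10, n_shares=30)  # say, one share per match
assert reconstruct(shares[:10]) == key                # 10 matches: decryptable
# Any 9 or fewer shares yield a field element statistically unrelated to the key.
```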

Because of the backlash from privacy watchdogs and consumer groups, Apple announced on September 3 that the implementation of these features was being postponed "to collect input and make improvements." No new date for the software update has been set.

Ann Dooms, a data scientist at Belgium's Vrije Universiteit Brussel (Free University of Brussels), recently wrote a column about the plan in the popular science magazine Eos Wetenschap (EOS Science), calling it "a balancing act between privacy and security." Apple could hardly be surprised that its customers were shocked; the company has always made data security a selling point, so most users assumed automatic backups to iCloud were "safe."

"That is a common misperception," says Dooms. "iCloud data are not truly end-to-end encrypted, like messaging services WhatsApp and Signal are, because Apple has the encryption keys. They can read the back-up data in the iCloud." And so can U.S. law enforcement, if a judge orders Apple to hand over the "keys" for a certain customer. Apple will not confirm it, but in January Reuters reported the company dropped a plan for end-to-end encryption of iCloud back-ups under pressure from the U.S. Federal Bureau of Investigation (FBI).  

Obviously, the child safety software would not let an iPhone user choose which outgoing images are scanned; it scans them all, so users can only opt out by never synchronizing their devices to the cloud. Moreover, once the system is rolled out, it will scan all existing iCloud Photos data for CSAM.

Dooms and other experts have been quick to point out that neural hashing is quite vulnerable to adversarial attack. The neural network that does the 'intelligent' hashing can be fooled into flagging doctored but innocent images as CSAM (a toolkit enabling this is available on GitHub). Said Dooms, "By creating tens of thousands of false positives, the entire system can be crashed, similar to DDoS attacks on websites."
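Continuing the toy average-hash sketch from above, the fragility is easy to demonstrate: a bland, innocent image can be nudged, block by block, until its fingerprint matches an arbitrary target, with pixel changes of roughly one gray level. Published collisions against the real NeuralHash reportedly worked differently, using gradient methods against the extracted network, but the effect is the same: matches on demand.

```python
# Continues the average_hash sketch above: forge an innocent image whose
# fingerprint matches an arbitrary target. All names here are illustrative;
# real attacks on NeuralHash reportedly used gradient methods instead.
import numpy as np

def forge(img: np.ndarray, target_bits: np.ndarray, grid: int = 8) -> np.ndarray:
    """Return a visually similar image whose average_hash equals target_bits."""
    out = img.astype(float)
    bh, bw = img.shape[0] // grid, img.shape[1] // grid
    mean = img.mean()
    for i in range(grid):
        for j in range(grid):
            block = out[i * bh:(i + 1) * bh, j * bw:(j + 1) * bw]
            # Shift the whole block so its mean lands just above or just
            # below the global mean, as the target bit demands.
            goal = mean + 1 if target_bits[i * grid + j] else mean - 1
            block += goal - block.mean()
    return out

rng = np.random.default_rng(1)
innocent = 63 + rng.standard_normal((64, 64))  # a bland, low-contrast image
target = rng.integers(0, 2, 64)                # pretend: a fingerprint from the database
fake = forge(innocent, target)

assert average_hash(fake) == int("".join(map(str, target)), 2)
print(np.abs(fake - innocent).max())           # roughly one gray level out of 255
```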

Mission creep is another concern. Once the system is in place, it can just as well compare the neural hashes of images on an iPhone to neural hashes of other kinds of material, and report any matches. Why not scan for pictures of missing persons? Or criminals on the run? Or dissidents in hiding? Apple now says it will never comply with such requests. The Russian and Chinese governments might think differently.

Even with all privacy safeguards in place and all glitches fixed, Dooms would be unimpressed by the system. After all, it only detects CSAM already labelled as such by the authorities. Perhaps some pedophiles will be careless enough to let their iPhones share such material with the cloud unencrypted, but certainly not heavy users or producers. Said Dooms, "You will only catch the small fish at the end of the food chain."

It is a cruel irony that an iPhone with all these 'child safety features' installed can still be used with impunity to record child pornography and upload it to iCloud Photos.

Apple did not reply to a request for comment, other than referring to its webpage about CSAM detection.

 

Arnout Jaspers is a freelance science writer based in Leiden, the Netherlands.


 
