Apple to check iCloud photo uploads for child abuse images

The tech giant has said it will check photos on iPhones in the United States for matches with known images of child sexual abuse. Photo: Sean Gallup/Getty

By Stephen Nellis

Apple has said it will implement a system that checks photos on iPhones in the United States for matches with known images of child sexual abuse before they are uploaded to its iCloud storage services.

If enough child abuse image uploads are detected, Apple will initiate a human review of the content and report the user to law enforcement officials, the company said. Apple said the system is designed to reduce false positives to one in one trillion.

With the new system, Apple is trying to address two imperatives: Requests from law enforcement to help stem child sexual abuse, and the privacy and security practices that the company has made a core tenet of its brand.

Apple has now joined most other major technology providers – including Alphabet's Google, Facebook and Microsoft – in checking images against a database of known child sexual abuse imagery.


“With so many people using Apple products, these new safety measures have lifesaving potential for children who are being enticed online and whose horrific images are being circulated in child sexual abuse material,” John Clark, chief executive of the National Center for Missing & Exploited Children, said in a statement. “The reality is that privacy and child protection can co-exist.”

Here is how Apple's system works. Law enforcement officials maintain a database of known child sexual abuse images and translate those images into “hashes” – numerical codes that positively identify an image but cannot be used to reconstruct it.

Apple has made its own implementation of that database using a technology called “NeuralHash” that is designed to also catch edited but visually similar versions of the original images. That database will be stored on iPhones.
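NeuralHash itself is a proprietary, neural-network-based perceptual hash and its details are not public. As a rough analogy only, a toy "average hash" shows the key property – visually similar images produce the same hash even after simple edits, where a cryptographic hash of the raw bytes would not. The function and values below are purely illustrative, not Apple's algorithm.

```python
# Toy "average hash": a minimal perceptual hash over an 8x8 grayscale
# image supplied as a flat list of 64 pixel values (0-255).
# This is an analogy for NeuralHash's behaviour, not its algorithm.
def average_hash(pixels):
    avg = sum(pixels) / len(pixels)
    # Each bit records whether a pixel is brighter than the average,
    # so the hash captures the bright/dark pattern, not exact bytes.
    return sum(1 << i for i, p in enumerate(pixels) if p > avg)

original = [10] * 32 + [200] * 32            # half dark, half bright
brightened = [p + 20 for p in original]      # globally edited copy

# The edit shifts every pixel, but the bright/dark pattern survives,
# so both images hash to the same value.
assert average_hash(original) == average_hash(brightened)
```

A genuinely different image (a different bright/dark pattern) would yield a different hash, which is what lets the database distinguish known imagery from unrelated photos.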

When a user uploads an image to Apple's iCloud storage service, the iPhone will create a hash of the image to be uploaded and compare it against the database.
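The upload-time check described above can be sketched as follows. This is a simplified model under stated assumptions: `image_hash` stands in for NeuralHash (here a plain SHA-256 of the bytes, for illustration only), the known-hash set and the `REVIEW_THRESHOLD` value are hypothetical, and Apple's real system uses cryptographic techniques (threshold secret sharing) rather than a plain counter.

```python
import hashlib

# Stand-in for the proprietary NeuralHash; a real perceptual hash
# would also match edited copies, this only matches exact bytes.
def image_hash(image_bytes: bytes) -> str:
    return hashlib.sha256(image_bytes).hexdigest()

# On-device database of known hashes (illustrative values only).
KNOWN_HASHES = {image_hash(b"known-bad-image")}

# Number of matches before human review is triggered (assumed value;
# Apple has not published the real threshold).
REVIEW_THRESHOLD = 3

def should_flag_for_review(upload_queue: list[bytes]) -> bool:
    """Hash each image queued for iCloud upload, compare against the
    on-device database, and flag the account only once the number of
    matches reaches the threshold."""
    matches = sum(1 for img in upload_queue
                  if image_hash(img) in KNOWN_HASHES)
    return matches >= REVIEW_THRESHOLD
```

Note that only images queued for iCloud upload pass through the check, which mirrors the article's point that photos stored solely on the phone are not scanned.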

Human review

Photos stored only on the phone are not checked, Apple said, and human review before reporting an account to law enforcement is meant to ensure any matches are genuine before suspending an account.

Apple said users who feel their account was improperly suspended can appeal to have it reinstated.

The Financial Times earlier reported some aspects of the programme.


One key aspect of the system that sets it apart from other technology companies is that Apple checks photos stored on phones before they are uploaded, rather than checking the photos after they arrive on the company's servers.

On Twitter, some privacy and security experts expressed concerns that the system could eventually be expanded to scan phones more generally for prohibited content or political speech.

“Regardless of what Apple’s long term plans are, they’ve sent a very clear signal. In their (very influential) opinion, it is safe to build systems that scan users’ phones for prohibited content,” Matthew Green, a security researcher at Johns Hopkins University, wrote in response to the earlier reporting.

“Whether they turn out to be right or wrong on that point hardly matters. This will break the dam – governments will demand it from everyone.”
