Apple Explains How iPhones Will Scan Photos for Child-Sexual-Abuse Images

Shortly after reports today that Apple will start scanning iPhones for child abuse images, the company confirmed its plan and provided details in a news release and technical summary.

“Apple’s method of detecting known CSAM (child sexual abuse material) is designed with user privacy in mind,” Apple’s announcement said. “Instead of scanning images in the cloud, the system performs on-device matching using a database of known CSAM image hashes provided by NCMEC (National Center for Missing and Exploited Children) and other child safety organizations. Apple further transforms this database into an unreadable set of hashes that is securely stored on users’ devices.”

Apple provided more detail on the CSAM detection system in a technical summary and said its system uses a threshold “set to provide an extremely high level of accuracy and ensures less than a one in one trillion chance per year of incorrectly flagging a given account.”

The changes will roll out “later this year in updates to iOS 15, iPadOS 15, watchOS 8, and macOS Monterey,” Apple said. Apple will also deploy software that can analyze images in the Messages application for a new system that will “warn children and their parents when receiving or sending sexually explicit photos.”

Apple accused of building “infrastructure for surveillance”

Despite Apple’s assurances, security experts and privacy advocates criticized the plan.

“Apple is replacing its industry-standard end-to-end encrypted messaging system with an infrastructure for surveillance and censorship, which will be vulnerable to abuse and scope-creep not only in the US, but around the world,” said Greg Nojeim, co-director of the Center for Democracy & Technology’s Security & Surveillance Project. “Apple should abandon these changes and restore its users’ faith in the security and integrity of their data on Apple devices and services.”

For years, Apple has resisted pressure from the US government to install a “backdoor” in its encryption systems, saying that doing so would undermine security for all users. Apple has been lauded by security experts for this stance. But with its plan to deploy software that performs on-device scanning and shares selected results with authorities, Apple is coming dangerously close to acting as a tool for government surveillance, Johns Hopkins University cryptography professor Matthew Green suggested on Twitter.

The client-side scanning Apple announced today could eventually “be a key ingredient in adding surveillance to encrypted messaging systems,” he wrote. “The ability to add scanning systems like this to E2E [end-to-end encrypted] messaging systems has been a major ‘ask’ by law enforcement the world over.”

Message scanning and Siri “intervention”

In addition to scanning devices for images that match the CSAM database, Apple said it will update the Messages app to “add new tools to warn children and their parents when receiving or sending sexually explicit photos.”

“Messages uses on-device machine learning to analyze image attachments and determine if a photo is sexually explicit. The feature is designed so that Apple does not get access to the messages,” Apple said.

When an image in Messages is flagged, “the photo will be blurred and the child will be warned, presented with helpful resources, and reassured it is okay if they do not want to view this photo.” The system will let parents get a message if children do view a flagged photo, and “similar protections are available if a child attempts to send sexually explicit photos. The child will be warned before the photo is sent, and the parents can receive a message if the child chooses to send it,” Apple said.

Apple said it will update Siri and Search to “provide parents and children expanded information and help if they encounter unsafe situations.” The Siri and Search systems will “intervene when users perform searches for queries related to CSAM” and “explain to users that interest in this topic is harmful and problematic, and provide resources from partners to get help with this issue.”

Apple’s technology for analyzing images

Apple’s technical summary on CSAM detection includes a few privacy promises in the introduction. “Apple does not learn anything about images that do not match the known CSAM database,” it says. “Apple can’t access metadata or visual derivatives for matched CSAM images until a threshold of matches is exceeded for an iCloud Photos account.”

Apple’s hashing technology is called NeuralHash and it “analyzes an image and converts it to a unique number specific to that image. Only another image that appears nearly identical can produce the same number; for example, images that differ in size or transcoded quality will still have the same NeuralHash value,” Apple wrote.
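Apple has not published NeuralHash's internals beyond this description, but the property it describes, near-identical images producing the identical hash, is the defining trait of perceptual hashes generally. The sketch below uses a classic "average hash" (not Apple's NeuralHash, which is neural-network based) purely to illustrate that property; the function and image names are illustrative.

```python
# Toy "average hash" over an 8x8 grayscale thumbnail. This is NOT
# NeuralHash (which uses a neural network); it only illustrates the
# perceptual-hash property that near-identical images hash identically.

def average_hash(pixels):
    """pixels: 8x8 grid of grayscale values (0-255) -> 64-bit int."""
    flat = [p for row in pixels for p in row]
    avg = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p >= avg else 0)  # 1 bit per pixel
    return bits

# A gradient "image" and a slightly brightened copy, as transcoding
# might produce. Every pixel shifts by +3, so each per-pixel comparison
# against the (equally shifted) average is unchanged.
img_a = [[(r * 8 + c) * 4 for c in range(8)] for r in range(8)]
img_b = [[p + 3 for p in row] for row in img_a]

print(average_hash(img_a) == average_hash(img_b))  # True
```

A cryptographic hash such as SHA-256 would behave in the opposite way: the +3 brightness shift would change every bit of the digest, which is exactly why perceptual hashes are used for this kind of matching.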

Before an iPhone or other Apple device uploads an image to iCloud, the “device creates a cryptographic safety voucher that encodes the match result. It also encrypts the image’s NeuralHash and a visual derivative. This voucher is uploaded to iCloud Photos along with the image.”

Using “threshold secret sharing,” Apple’s “system ensures that the contents of the safety vouchers cannot be interpreted by Apple unless the iCloud Photos account crosses a threshold of known CSAM content,” the document said. “Only when the threshold is exceeded does the cryptographic technology allow Apple to interpret the contents of the safety vouchers associated with the matching CSAM images.”
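Apple's document doesn't spell out its construction beyond the name, but "threshold secret sharing" generally refers to schemes like Shamir's, in which a secret is split into shares such that any t of them recover it while fewer than t reveal nothing. A minimal Shamir sketch of that thresholding behavior, with a toy field modulus and parameters:

```python
# Minimal (t, n) Shamir secret sharing over a prime field: the secret is
# the constant term of a random degree-(t-1) polynomial; any t shares
# interpolate it, while fewer than t reveal nothing. Illustrative only;
# Apple's actual voucher construction is not public in this form.
import secrets

P = 2**127 - 1  # prime field modulus (toy choice)

def make_shares(secret, t, n):
    coeffs = [secret] + [secrets.randbelow(P) for _ in range(t - 1)]
    f = lambda x: sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P
    return [(x, f(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    # Lagrange interpolation at x = 0 recovers the constant term.
    total = 0
    for xi, yi in shares:
        num = den = 1
        for xj, _ in shares:
            if xj != xi:
                num = num * -xj % P
                den = den * (xi - xj) % P
        total = (total + yi * num * pow(den, -1, P)) % P
    return total

shares = make_shares(secret=123456789, t=3, n=10)
print(reconstruct(shares[:3]) == 123456789)  # True: threshold reached
print(reconstruct(shares[:2]) == 123456789)  # almost surely False
```

The below-threshold failure is information-theoretic, not just computational: two points of a random quadratic are consistent with every possible secret, which mirrors Apple's claim that vouchers are uninterpretable until the match threshold is crossed.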

While noting the one-in-one-trillion probability of a false positive, Apple said it “manually reviews all reports made to NCMEC to ensure reporting accuracy.” Users can “file an appeal to have their account reinstated” if they believe their account was mistakenly flagged.

User devices to store blinded CSAM database

An iPhone or other device will analyze user photos, compute a NeuralHash, and look up “the entry in the blinded hash table.” The device “also uses the blinded hash that the system looked up to obtain a derived encryption key” and uses that encryption key “to encrypt the associated payload data.”
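The document doesn't specify which KDF or cipher is used for that step, but its shape can be sketched as follows. The `derive_key` label, the placeholder entry, and the XOR stream cipher are all stand-ins, not Apple's actual primitives:

```python
# Sketch of the device-side step: derive a key from the blinded
# hash-table entry and encrypt the voucher payload under it. Whoever
# can recompute the same blinded entry (the server, on a true match)
# re-derives the key. KDF label and toy XOR stream cipher are
# illustrative, not Apple's actual primitives.
import hashlib

def derive_key(blinded_entry: bytes) -> bytes:
    return hashlib.sha256(b"voucher-key:" + blinded_entry).digest()

def xor_cipher(key: bytes, data: bytes) -> bytes:
    # Toy stream cipher: XOR against a SHA-256 counter keystream.
    stream = bytearray()
    counter = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return bytes(d ^ s for d, s in zip(data, stream))

entry = b"\x01" * 32  # placeholder for the looked-up blinded entry
payload = b"neuralhash || visual derivative"  # illustrative contents
voucher_ct = xor_cipher(derive_key(entry), payload)

# Re-deriving the same key inverts the toy encryption:
print(xor_cipher(derive_key(entry), voucher_ct) == payload)  # True
```

The point of deriving the key from the blinded entry, rather than sending it alongside the voucher, is that the payload stays opaque to anyone who cannot independently reproduce that entry.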

As noted earlier, you can read the technical summary. Apple also published a longer and more detailed explanation of the “private set intersection” cryptographic technology that determines whether a photo matches the CSAM database without revealing the result.
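Apple's protocol is elliptic-curve based and one-sided (the device never learns match results), so the classic two-party Diffie-Hellman-style PSI below is only a conceptual stand-in for the primitive; the modulus, hash-to-group mapping, and set contents are all illustrative.

```python
# Classic two-party DH-style private set intersection: each side blinds
# items with its own secret exponent, the other side blinds them again,
# and doubly blinded values H(x)^(ab) coincide only for shared items.
# Apple's actual PSI differs substantially; this shows the core idea.
import hashlib
import secrets

P = 2**255 - 19  # prime modulus for the toy group (not a vetted DH group)

def h2g(item: str) -> int:
    """Toy hash-to-group mapping: SHA-256 reduced mod P."""
    return int.from_bytes(hashlib.sha256(item.encode()).digest(), "big") % P

a = secrets.randbelow(P - 2) + 1  # device's secret exponent
b = secrets.randbelow(P - 2) + 1  # server's secret exponent

server_set = {"hashA", "hashB", "hashC"}  # illustrative database hashes
device_set = {"hashB", "hashZ"}           # illustrative device hashes

# Server publishes its items blinded under b; the device blinds them
# again under a. Neither side ever sees the other's raw items.
doubly_from_server = {pow(pow(h2g(x), b, P), a, P) for x in server_set}
# The device sends its items blinded under a; the server blinds them
# again under b and returns them.
doubly_from_device = {x: pow(pow(h2g(x), a, P), b, P) for x in device_set}

# Items whose doubly blinded values coincide are in both sets.
shared = {x for x, v in doubly_from_device.items() if v in doubly_from_server}
print(shared)  # {'hashB'}
```

The blinding exponents commute, so both routes reach H(x)^(ab) for a shared item, while non-shared items stay hidden behind an exponent the other party doesn't know.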
