
Apple Publishes FAQ for Their New Child Safety Features (PDF)

Apple:

Could governments force Apple to add non-CSAM images to the hash list?

Apple will refuse any such demands. Apple’s CSAM detection capability is built solely to detect known CSAM images stored in iCloud Photos that have been identified by experts at NCMEC and other child safety groups. We have faced demands to build and deploy government-mandated changes that degrade the privacy of users before, and have steadfastly refused those demands. We will continue to refuse them in the future. Let us be clear, this technology is limited to detecting CSAM stored in iCloud and we will not accede to any government’s request to expand it. Furthermore, Apple conducts human review before making a report to NCMEC. In a case where the system flags photos that do not match known CSAM images, the account would not be disabled and no report would be filed to NCMEC.

Can non-CSAM images be “injected” into the system to flag accounts for things other than CSAM?

Our process is designed to prevent that from happening. The set of image hashes used for matching are from known, existing images of CSAM that have been acquired and validated by child safety organizations. Apple does not add to the set of known CSAM image hashes. The same set of hashes is stored in the operating system of every iPhone and iPad user, so targeted attacks against only specific individuals are not possible under our design. Finally, there is no automated reporting to law enforcement, and Apple conducts human review before making a report to NCMEC. In the unlikely event of the system flagging images that do not match known CSAM images, the account would not be disabled and no report would be filed to NCMEC.

This FAQ is good, and addresses most of the misconceptions I’ve seen. The human review step for flagged accounts is key to the trustworthiness of the system.
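
Just to make that gate concrete, here is a toy sketch of the flow as the FAQ and the threshold design describe it: hashes are matched against a fixed set shipped with the OS, nothing happens below a match threshold, and a human reviewer stands between any flag and a report. Every name and number below is a placeholder of mine, not Apple’s actual implementation.

KNOWN_CSAM_HASHES = frozenset()   # placeholder for the fixed hash set shipped with the OS
MATCH_THRESHOLD = 30              # hypothetical value; a stand-in for Apple's unpublished threshold

def review_outcome(image_hashes, reviewer_confirms_match):
    """Toy model: count matches, apply the threshold, then require human review."""
    matches = sum(1 for h in image_hashes if h in KNOWN_CSAM_HASHES)
    if matches < MATCH_THRESHOLD:
        return "below threshold: no flag, no report"
    # Human review happens before any report leaves Apple.
    if not reviewer_confirms_match():
        return "false positive: account not disabled, no report to NCMEC"
    return "confirmed match: report filed with NCMEC"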

I do wonder, though, how prepared Apple is to manually review a potentially staggering number of correctly flagged accounts. Because Apple doesn’t examine the contents of iCloud Photo Library (or local on-device libraries), I don’t think anyone knows how prevalent CSAM is on iCloud Photos. We know Facebook reported 20 million instances of CSAM to NCMEC last year, and Google reported 546,000. For Facebook, that’s about 55,000 per day; for Google, about 1,500 per day. I think it’s a genuine “we’ll soon find out” mystery how many iCloud Photos users will be accurately flagged for exceeding the CSAM match threshold when this goes live. If the number is large, it seems like one innocent needle in a veritable haystack of actual CSAM collections might be harder for Apple’s human reviewers to notice.
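
For anyone who wants to check those per-day figures, the arithmetic is just the annual totals divided by 365; Python is used here purely for illustration.

annual_reports = {"Facebook": 20_000_000, "Google": 546_000}  # figures cited above

for org, total in annual_reports.items():
    print(f"{org}: about {total / 365:,.0f} reports per day")

# Prints roughly 54,795 per day for Facebook and 1,496 per day for Google,
# i.e. the ~55,000 and ~1,500 figures above.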

Read more at Daring Fireball.
