
Joanna Stern Interviews Craig Federighi Regarding Apple’s Controversial New Child Safety Features

Clarifying interview, with at least one bit of news: Federighi says the heretofore unspecified “threshold” for CSAM fingerprint matches that must be reached before an iCloud account is flagged (or even can be flagged, thanks to how the shared-key cryptography is implemented) is “on the order of 30 known child pornographic images”. Right around the 3:00 mark:

“And if, and only if you meet a threshold of something on the order
of 30 known child pornographic images matching, only then does
Apple know anything about your account and know anything about
those images.”
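
That parenthetical above about the shared-key cryptography comes down to threshold secret sharing: the material Apple's servers would need to learn anything is split into shares, and only a sufficient number of matching images yields enough shares to recover it. Here is a minimal sketch of the underlying idea, Shamir's t-of-n secret sharing, in Python. This illustrates the principle only; Apple's actual safety-voucher construction layers it on top of private set intersection, and every name and parameter below is mine, not Apple's.

```python
# Minimal sketch of t-of-n threshold secret sharing (Shamir's scheme).
# Illustrative only; not Apple's implementation.
import random

PRIME = 2**127 - 1  # a large Mersenne prime; secrets must be smaller than this

def make_shares(secret: int, threshold: int, total: int) -> list[tuple[int, int]]:
    """Split `secret` into `total` shares; any `threshold` of them reconstruct it."""
    # Random polynomial of degree threshold-1 whose constant term is the secret.
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(threshold - 1)]
    def f(x: int) -> int:
        return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
    return [(x, f(x)) for x in range(1, total + 1)]

def reconstruct(shares: list[tuple[int, int]]) -> int:
    """Lagrange interpolation at x = 0 recovers the secret from >= threshold shares."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % PRIME
                den = den * (xi - xj) % PRIME
        secret = (secret + yi * num * pow(den, PRIME - 2, PRIME)) % PRIME
    return secret

if __name__ == "__main__":
    account_key = random.randrange(PRIME)  # stands in for a per-account decryption key
    shares = make_shares(account_key, threshold=30, total=1000)
    assert reconstruct(shares[:30]) == account_key  # 30 shares: key recovered
    assert reconstruct(shares[:29]) != account_key  # 29 shares: recovery fails (overwhelmingly likely)
```

Roughly speaking, as Apple has described the system, each uploaded photo's voucher contributes a usable share only when the photo matches a known fingerprint, so below the ~30-match threshold Apple's servers hold too few shares to recover the key and can decrypt nothing about the account's images.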

There’s also a WSJ news story (News+ link), co-bylined by Stern and Tim Higgins, in which Federighi emphasizes that the database of CSAM fingerprints is auditable:

Beyond creating a system that isn’t scanning through all of a
user’s photos in the cloud, Mr. Federighi pointed to another
benefit of placing the matching process on the phone directly.
“Because it’s on the [phone], security researchers are constantly
able to introspect what’s happening in Apple’s [phone] software,”
he said. “So if any changes were made that were to expand the
scope of this in some way — in a way that we had committed to not
doing — there’s verifiability, they can spot that that’s
happening.”

Critics have said the database of images could be corrupted, such
as political material being inserted. Apple has pushed back
against that idea. During the interview, Mr. Federighi said the
database of images is constructed through the intersection of
images from multiple child-safety organizations — not just the
National Center for Missing and Exploited Children. He added that
at least two “are in distinct jurisdictions.” Such groups and an
independent auditor will be able to verify that the database
consists only of images provided by those entities, he said.
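
Federighi's "intersection" point is easy to picture in code. The sketch below captures the rule as he describes it: a fingerprint ships in the on-device database only if organizations in at least two distinct jurisdictions independently supplied it. The function, the organization names other than NCMEC, and the data are hypothetical; Apple has not published this procedure in code form.

```python
# Hypothetical sketch of the "intersection across jurisdictions" rule.
from collections import defaultdict

def build_matching_database(submissions: dict[str, set[bytes]],
                            jurisdiction: dict[str, str]) -> set[bytes]:
    """Keep only fingerprints submitted by organizations in >= 2 distinct jurisdictions."""
    seen_in: dict[bytes, set[str]] = defaultdict(set)
    for org, fingerprints in submissions.items():
        for fp in fingerprints:
            seen_in[fp].add(jurisdiction[org])
    return {fp for fp, places in seen_in.items() if len(places) >= 2}

# Only the fingerprint supplied by both organizations, in different jurisdictions, survives.
db = build_matching_database(
    {"NCMEC": {b"fp-A", b"fp-B"}, "SecondOrg": {b"fp-A", b"fp-C"}},
    {"NCMEC": "US", "SecondOrg": "Elsewhere"},
)
assert db == {b"fp-A"}
```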

Read more at Daring Fireball.
