From “Apple’s New ‘Child Safety’ Initiatives, and the Slippery Slope” posted Friday on Daring Fireball:
In short, if these features work as described and only as described, there’s almost no cause for concern. In an interview with The New York Times for its aforelinked report on this initiative, Erik Neuenschwander, Apple’s chief privacy engineer, said, “If you’re storing a collection of CSAM material, yes, this is bad for you. But for the rest of you, this is no different.” By all accounts, that is fair and true.
But the “if” in “if these features work as described and only as described” is the rub. That “if” is the whole ballgame. If you discard alarmism from critics of this initiative who clearly do not understand how the features work, you’re still left with completely legitimate concerns from trustworthy experts about how the features could be abused or misused in the future.
What happens, for example, if China demands that it provide its own database of image fingerprints for use with this system — a database that would likely include images related to political dissent. Tank man, say, or any of the remarkable litany of comparisons showing the striking resemblance of Xi Jinping to Winnie the Pooh.
This slippery-slope argument is a legitimate concern. Apple’s response is simply that they’ll refuse…
Will Apple actually flatly refuse any and all such demands? If they do, it’s all good. If they don’t, and these features creep into surveillance for things like political dissent, copyright infringement, LGBT imagery, or adult pornography — anything at all beyond irrefutable CSAM — it’ll prove disastrous to Apple’s reputation for privacy protection. The EFF seems to see such slipping down the slope as inevitable.
My take: A good backgrounder by an honest Apple enthusiast who cares about privacy. Recommended.
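To make the slippery-slope concern concrete: the system, as described, simply checks each photo's fingerprint against whatever database of fingerprints it is handed. Below is a minimal, hypothetical sketch of that idea in Python. This is not Apple's NeuralHash or its threshold scheme; the names and numbers are made up for illustration. The point is that the code cannot tell a CSAM database from any other database; whoever supplies the fingerprints decides what gets flagged.

```python
# Hypothetical sketch of fingerprint-database matching; NOT Apple's NeuralHash.
# A real perceptual hash is robust to resizing and re-encoding; a cryptographic
# hash is used here only to keep the example self-contained.
import hashlib

def fingerprint(image_bytes: bytes) -> str:
    """Stand-in for a perceptual image hash."""
    return hashlib.sha256(image_bytes).hexdigest()

def account_flagged(photos: list[bytes], database: set[str], threshold: int = 30) -> bool:
    """Flag the account only if enough photos match the supplied database
    (the threshold value here is illustrative, not Apple's)."""
    matches = sum(1 for p in photos if fingerprint(p) in database)
    return matches >= threshold

# The scanner has no notion of *what* the database contains. Load it with CSAM
# fingerprints and it flags CSAM; load it with "tank man" fingerprints and it
# flags political dissent. That is the whole concern.
db = {fingerprint(b"example-known-image")}
print(account_flagged([b"vacation-photo"], db))  # False: no matches
```

Apple's actual design adds cryptographic machinery on top (blinded hashes, a match threshold, human review), but none of that changes the dependence on what is in the database it is given.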
Gruber refers to the NYT report (Aug 5, 2021), which notes that many other companies flag far more cases to law enforcement: “Apple has historically flagged fewer cases than other companies. Last year, for instance, Apple reported 265 cases to the National Center for Missing & Exploited Children, while Facebook reported 20.3 million.”
FB’s 20M has to include millions of false positives. Thus FB puts millions of people at risk of being incorrectly entered into NCMEC’s database. It may also overload NCMEC’s resources. And one wonders about NCMEC’s own security and privacy, especially with 20 million cases.
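Whether the 20.3 million figure really includes millions of false positives is not something we can verify, but the arithmetic of scanning at that scale is easy to sketch. The numbers below are pure assumptions for illustration, not Facebook's or NCMEC's actual figures: even a very small error rate, multiplied across billions of scanned uploads, produces a large absolute number of wrong reports.

```python
# Back-of-envelope illustration only; both figures below are assumptions,
# not real data from Facebook or NCMEC.
uploads_scanned_per_year = 100_000_000_000  # assumed scanning volume
false_positive_rate = 1e-5                  # assumed 1-in-100,000 error rate

expected_false_positives = uploads_scanned_per_year * false_positive_rate
print(f"Expected incorrect matches per year: {expected_false_positives:,.0f}")
# Expected incorrect matches per year: 1,000,000
```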
Serious child pornographers can turn off iCloud storage. A 2TB drive costs $100.
And…
FB’s head of WhatsApp blasted Apple’s plans as being bad for privacy.
That’s rich
Jerry D says: If we can search for pedophiles, then why can’t we search for terrorists, organized crime, etc.? One deals with the exploitation of children; the others deal with the exploitation of national security and of people’s life savings.
“…. All of these features are fairly grouped together under a ‘child safety’ umbrella, but I can’t help but wonder if it was a mistake to announce them together. Many people are clearly conflating them, including those reporting on the initiative for the news media.”
Jerry D says: There is a reason for writing language in a convoluted, “legalese” style of explanation: it is often done to bury the truth. The explanation could be written as simply as Apple makes its devices “user friendly.” It is not, though. Why?
“…. Which in turn makes me wonder if Apple sees this initiative as a necessary first step toward providing E2E encryption for iCloud Photo Library and iCloud device backups. Apple has long encrypted all iCloud data that can be encrypted, both in transit and on server, but device backups, photos, and iCloud Drive are among the things that are not end-to-end encrypted. Apple has the keys to decrypt them, and can be compelled to do so by law enforcement.”
“…. Apple Inc. dropped plans to let iPhone users fully encrypt backups of their devices in the company’s iCloud service after the FBI complained that the move would harm investigations. … The tech giant’s reversal, about two years ago, has not previously been reported. It shows how much Apple has been willing to help U.S. law enforcement and intelligence agencies, despite taking a harder line in high-profile legal disputes with the government and casting itself as a defender of its customers’ information.”
“…. you’re still left with completely legitimate concerns from trustworthy experts about how the features could be abused or misused in the future. … Will Apple actually flatly refuse any and all such demands? If they do, it’s all good. If they don’t, and these features creep into surveillance for things like political dissent, copyright infringement, LGBT imagery, or adult pornography — anything at all beyond irrefutable CSAM — it’ll prove disastrous to Apple’s reputation for privacy protection. The EFF seems to see such slipping down the slope as inevitable.”
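For readers unclear on the distinction that quote turns on, here is a toy contrast between provider-held keys and end-to-end encryption. It is only a sketch using a generic symmetric cipher (the Fernet class from the `cryptography` package); Apple's real iCloud key management is far more elaborate, and none of the specifics below are Apple's.

```python
# Toy illustration of why key location matters; not Apple's actual design.
from cryptography.fernet import Fernet  # pip install cryptography

# Provider-held key: the service encrypts the backup with a key it keeps,
# so it can decrypt on demand and can be compelled to do so by law enforcement.
provider_key = Fernet.generate_key()
stored_backup = Fernet(provider_key).encrypt(b"device backup contents")
print(Fernet(provider_key).decrypt(stored_backup))  # provider can read it

# End-to-end: the key is generated and kept on the user's device only.
# The provider stores ciphertext it has no way to decrypt, warrant or not.
device_key = Fernet.generate_key()  # never leaves the device in this model
stored_backup_e2e = Fernet(device_key).encrypt(b"device backup contents")
# The provider holds stored_backup_e2e but not device_key.
```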
Jerry D says: Governments operate patiently (China today is a perfect example), taking minuscule steps so as to anesthetize public opinion on controversial issues and lay the foundation for what they want to attain.
But this feature smells too much like Orwell’s 1984. If this feature goes live, I will cancel my iCloud account and start using thumb drives. I would imagine that the market for Apple-compatible thumb drives will skyrocket.
I would also venture that this capability would hurt sales.
The intent is good, but the idea that its future use can be controlled is disingenuous.
“…the idea that its future use can be controlled is disingenuous.”
I think that’s a little harsh. Apple is walking a fine line here, that’s true. But it is Apple’s expressed intent to keep it under control. That may or may not be possible. But I have zero doubt about Apple’s commitment to attempt that.
Maybe “is intended to be controlled” is a better way to put it than “can be controlled”.
This is such a non-issue. Typical FUD anytime Apple makes waves regarding privacy and abiding by government regulations/requirements.
They worked long and hard to find an approach with the least impact on privacy.