Apple’s new CSAM detection features pass John Gruber’s smell test, but…

From “Apple’s New ‘Child Safety’ Initiatives, and the Slippery Slope” posted Friday on Daring Fireball:

In short, if these features work as described and only as described, there’s almost no cause for concern. In an interview with The New York Times for its aforelinked report on this initiative, Erik Neuenschwander, Apple’s chief privacy engineer, said, “If you’re storing a collection of CSAM material, yes, this is bad for you. But for the rest of you, this is no different.” By all accounts, that is fair and true.

But the “if” in “if these features work as described and only as described” is the rub. That “if” is the whole ballgame. If you discard alarmism from critics of this initiative who clearly do not understand how the features work, you’re still left with completely legitimate concerns from trustworthy experts about how the features could be abused or misused in the future.

What happens, for example, if China demands that it provide its own database of image fingerprints for use with this system — a database that would likely include images related to political dissent. Tank man, say, or any of the remarkable litany of comparisons showing the striking resemblance of Xi Jinping to Winnie the Pooh.

This slippery-slope argument is a legitimate concern. Apple’s response is simply that they’ll refuse…

Will Apple actually flatly refuse any and all such demands? If they do, it’s all good. If they don’t, and these features creep into surveillance for things like political dissent, copyright infringement, LGBT imagery, or adult pornography — anything at all beyond irrefutable CSAM — it’ll prove disastrous to Apple’s reputation for privacy protection. The EFF seems to see such slipping down the slope as inevitable.

My take: A good backgrounder by an honest Apple enthusiast who cares about privacy. Recommended.
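
For readers who want a concrete picture of what matching photos against a "database of image fingerprints" looks like, here is a deliberately simplified sketch in Python. It is not Apple's implementation (Apple describes a perceptual "NeuralHash" plus a private set intersection protocol, so the device never learns which photos matched); it only shows the general shape of on-device fingerprint matching with a reporting threshold. The hash_image placeholder and the threshold value are illustrative assumptions, not Apple's parameters.

```python
# Deliberately simplified sketch of fingerprint matching with a threshold.
# NOT Apple's NeuralHash or its private set intersection protocol; the hash
# function and the threshold below are illustrative assumptions only.
import hashlib
from pathlib import Path
from typing import Iterable, Set


def hash_image(path: Path) -> str:
    """Placeholder fingerprint. A real system would use a perceptual hash
    that survives resizing and recompression, not a byte-exact digest."""
    return hashlib.sha256(path.read_bytes()).hexdigest()


def count_matches(photos: Iterable[Path], known_fingerprints: Set[str]) -> int:
    """Count how many of the given photos match the fingerprint database."""
    return sum(1 for photo in photos if hash_image(photo) in known_fingerprints)


def should_flag(photos: Iterable[Path], known_fingerprints: Set[str],
                threshold: int = 30) -> bool:
    """Flag only after the match count crosses a threshold, which is how
    Apple says it keeps false positives rare."""
    return count_matches(photos, known_fingerprints) >= threshold
```

Notice that nothing in the matching machinery cares what known_fingerprints actually contains; that is precisely the "if" Gruber is worried about, because the guarantee rests entirely on who controls the database.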

13 Comments

  1. Fred Stein said:
    Gruber’s article is the best I’ve seen so far. MUST read.

    Gruber refers to the NYT report (Aug 5, 2021) noting that other companies flag far more cases: “Apple has historically flagged fewer cases than other companies. Last year, for instance, Apple reported 265 cases to the National Center for Missing & Exploited Children, while Facebook reported 20.3 million.”

    FB’s 20M has to include millions of false positives. Thus FB puts millions of people at risk of being incorrectly entered into NCMEC’s database. It may also overload NCMEC’s resources. And one wonders about NCMEC’s own security and privacy, especially with 20 million cases.

    4
    August 7, 2021
    • David Emery said:
      Speaking of Facebook and image scanning: FB -has to be able- to detect the common scam where a scammer grabs an account’s primary photos and creates a new fake account with the same name and photos (and then sends friend requests). That’s a case where a ‘perfect pixel match’ would work, since those scammers are unlikely to change the stolen images. FB’s unwillingness to shut this scam down is a classic indication that FB doesn’t care about its users. Zuckerberg and his company are institutional psychopaths.

      2
      August 7, 2021
  2. Mordechai Beizer said:
    This is a bad idea and will inevitably be used to monitor/censor other personal data. I doubt that Apple will be able to say no to additional monitoring once they’ve implemented the feature. Saying “we can’t do what you want because we don’t have that ability” is a far stronger position than “we won’t do what you want because we think it’s wrong”. The slippery slope starts here.

    5
    August 7, 2021
    • Jerry Doyle said:
      @Mordechai Beizer: Oh, you are so correct when it comes to the government!

      0
      August 7, 2021
  3. Fred Stein said:
    And…

    Serious child pornographers can turn off iCloud storage. A 2TB drive costs $100.

    And…

    The head of Facebook’s WhatsApp blasted Apple’s plans as bad for privacy.

    4
    August 7, 2021
  4. Jerry Doyle said:
    “… U.S. law requires tech companies to flag cases of child sexual abuse to the authorities. … The Messages feature for children in iCloud family accounts is doing content analysis to try to identify sexually explicit photos …”

    Jerry D says: If we can search for pedophiles, then why can’t we search for terrorists, organized crime, etc.? One deals with the exploitation of children; the others deal with the exploitation of national security and of people’s life savings.

    “… All of these features are fairly grouped together under a ‘child safety’ umbrella, but I can’t help but wonder if it was a mistake to announce them together. Many people are clearly conflating them, including those reporting on the initiative for the news media.”

    Jerry D says: There is a reason language gets written in convoluted, “legalese” explanations: often it is to bury the truth. The explanation could be written as simply as Apple makes its devices “user friendly.” It is not, though. Why?

    “… Which in turn makes me wonder if Apple sees this initiative as a necessary first step toward providing E2E encryption for iCloud Photo Library and iCloud device backups. Apple has long encrypted all iCloud data that can be encrypted, both in transit and on server, but device backups, photos, and iCloud Drive are among the things that are not end-to-end encrypted. Apple has the keys to decrypt them, and can be compelled to do so by law enforcement. …”

    “… Apple Inc. dropped plans to let iPhone users fully encrypt backups of their devices in the company’s iCloud service after the FBI complained that the move would harm investigations. … The tech giant’s reversal, about two years ago, has not previously been reported. It shows how much Apple has been willing to help U.S. law enforcement and intelligence agencies, despite taking a harder line in high-profile legal disputes with the government and casting itself as a defender of its customers’ information. …”

    “… You’re still left with completely legitimate concerns from trustworthy experts about how the features could be abused or misused in the future. … Will Apple actually flatly refuse any and all such demands? If they do, it’s all good. If they don’t, and these features creep into surveillance for things like political dissent, copyright infringement, LGBT imagery, or adult pornography — anything at all beyond irrefutable CSAM — it’ll prove disastrous to Apple’s reputation for privacy protection. The EFF seems to see such slipping down the slope as inevitable.”

    Jerry D says: Governments operate patiently (China today is a perfect example), taking minuscule steps to anesthetize public opinion on controversial issues and lay the foundation for what they ultimately want.

    1
    August 7, 2021
  5. Gregg Thurman said:
    I absolutely abhor child abuse of any kind. If it were up to me, abusers would be summarily shot upon conviction.

    But this feature smells too much like Orwell’s 1984. If it goes live I will cancel my iCloud account and start using thumb drives. I would imagine that the market for Apple-compatible thumb drives will skyrocket.

    I would also venture that this capability would hurt sales.

    The intent is good, but the idea that its future use can be controlled is disingenuous.

    2
    August 7, 2021
  6. Aaron Belich said:
    There are so many other investigative and forensic tools better suited to the abuses the slippery-slope arguments envision than this hash-scanning system.

    This is such a non-issue. Typical FUD anytime Apple makes waves regarding privacy and abiding by government regulations/requirements.

    3
    August 7, 2021
    • Fred Stein said:
      You nailed it, Aaron. I think Apple did this because they had to, forced by our government.

      They worked long and hard to find an approach with the least impact on privacy.

      3
      August 7, 2021
