Apple’s plan to scan iCloud for child abuse images sets off privacy firestorm

From Madhumita Murgia and Tim Bradshaw’s “Apple plans to scan US iPhones for child abuse imagery” posted Thursday:

Apple intends to install software on American iPhones to scan for child abuse imagery, according to people briefed on its plans, raising alarm among security researchers who warn that it could open the door to surveillance of millions of people’s personal devices.

Apple detailed its proposed system — known as “neuralMatch” — to some US academics earlier this week, according to two security researchers briefed on the virtual meeting.

The automated system would proactively alert a team of human reviewers if it believes illegal imagery is detected, who would then contact law enforcement if the material can be verified. The scheme will initially roll out only in the US.
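
For the mechanically inclined, here is a minimal sketch (in Swift, with made-up names and numbers, not Apple’s code) of the flow being described: compare photo fingerprints against a database of known images, and escalate to human review only after several hits. The multiple-match threshold and the “known images only” detail come up again in the comments below.

```swift
// Minimal sketch (not Apple's code) of the reported flow: compare photo
// fingerprints against a database of known illegal images, and escalate to
// human review only after several hits. All names and numbers are illustrative.
import Foundation

struct ScanResult {
    let matchedPhotoIDs: [String]
    let needsHumanReview: Bool
}

// Hypothetical 64-bit fingerprints of known images, supplied by a
// clearinghouse such as NCMEC.
let knownFingerprints: Set<UInt64> = []

// Hypothetical threshold: one hit alone does not trigger review.
let reviewThreshold = 5

func scan(_ photos: [(id: String, fingerprint: UInt64)]) -> ScanResult {
    let matches = photos.filter { knownFingerprints.contains($0.fingerprint) }
    return ScanResult(
        matchedPhotoIDs: matches.map { $0.id },
        needsHumanReview: matches.count >= reviewThreshold
    )
}
```

The only point of the sketch is that detection, the multiple-match threshold and human review are separate steps; how the fingerprints are computed and compared is the contentious part.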

A few reactions:

  • Matthew Green: This will break the dam — governments will demand it from everyone.
  • Bryan Jones: A vile betrayal of Apple’s very soul.
  • Alec Muffett: A huge and regressive step for individual privacy.
  • Edward Snowden: No matter how well-intentioned, @Apple is rolling out mass surveillance to the entire world with this. Make no mistake: if they can scan for kiddie porn today, they can scan for anything tomorrow. They turned a trillion dollars of devices into iNarcs—*without asking.*

My take: Something about the combination of kids, porn and the Internet seems to make otherwise worldly-wise adults a little crazy. I speak from experience.

21 Comments

  1. This is interesting, from Friday’s New York Times:

    U.S. law requires tech companies to flag cases of child sexual abuse to the authorities. Apple has historically flagged fewer cases than other companies. Last year, for instance, Apple reported 265 cases to the National Center for Missing & Exploited Children, while Facebook reported 20.3 million, according to the center’s statistics. That enormous gap is due in part to Apple’s decision not to scan for such material, citing the privacy of its users.

    https://www.nytimes.com/2021/08/05/technology/apple-iphones-privacy.html

    3
    August 6, 2021
  2. Jerry Doyle said:
    Apple either provides users privacy or it doesn’t. There are no exceptions to be made. If Apple can open its security lock to scan users’ iCloud accounts for kiddie porn, then why can Apple not scan users’ iCloud accounts for terrorists, for organized crime and for all other unsavory illegal acts? Privacy is privacy. There are no exceptions. If there is an exception, then there is no privacy, and one must ask why we do not open the doors to more exceptions, since there are other paramount reasons for doing so, such as national security, crime and personal abuse. Apple has created a problem for itself that is not going away. Other exceptions to its privacy policy now must be considered.

    If U.S. laws require tech companies to flag cases of child sexual abuse to authorities, then Congress must act to pass similar legislation to flag terrorists, organized crime activities, etc.

    4
    August 6, 2021
    • David Emery said:
      I can understand different rules for stuff in iCloud than for stuff on iPhones. Having Apple search my iPhone goes way too far. And there are some interesting legal questions around the “evidence” that Apple would obtain through such a search (obviously conducted without a warrant).

      3
      August 6, 2021
      • Jerry Doyle said:
        @David Emery: Brother David, enlighten me further on your response.

        How is Apple accessing privileged information in one’s iCloud account different from Apple accessing privileged information inside one’s iPhone? Personal information in my home is no different from the same personal information entrusted to my accountant, to my attorney, to my doctor and to my minister. That personal information just happens not to be in my home, but it is the same personal information, held outside my home by others who are supposed to retain its “privileged” status. It all is information that should be held in the strictest level of privacy.

        What Apple may find going forward is buyers forgoing iCloud storage accounts and instead buying hardware with the largest data storage capacities, hoping never to lose their devices.

        0
        August 6, 2021
    • Steven Noyes said:
      It’s as if you are stuck between a rock and a hard place. I can only imagine pics parents take of their kids taking their first bath (my mom still embarrasses me with those) might cause issues. I don’t know how the AI would be able to separate the innocent from the atrocities.

      1
      August 6, 2021
  3. Greg Lippert said:
    Although I get that we want to protect children, this is a very bad overreach. As stated before, this allows Apple and other companies to scan for just about anything.

    Privacy is a key benefit of Apple. Poof, gone. Terrible decision.

    1
    August 6, 2021
  4. Romeo A Esparrago Jr said:
    Did anyone say Apple’s made a decision yet?

    “Apple detailed its proposed system — known as ‘neuralMatch’ — to some US academics earlier this week.”

    3
    August 6, 2021
  5. Joe Murphy said:
    Opening Pandora’s box is a grave mistake. The losses it brings far and away exceed the expected gains.

    1
    August 6, 2021
  6. John Konopka said:
    I read about this in the last few days. It was pointed out that other companies (Google and FB, others?) have been doing this for years, maybe five to ten years. Apple is late to the party.

    I could be wrong, but I believe that iCloud data is not end-to-end encrypted, so no key or “back door” is needed.

    2
    August 6, 2021
  7. Rodney Avilla said:
    Please be aware that Apple is not searching through your iCloud pics and trying to decide if there is anything resembling child porn. People worry that grandma’s pic of them bathing as a baby will get flagged. Please note the following (from iDropNews):

    “Photos are only matched against known CSAM images from the NCMEC database. Apple is not using machine learning to “figure out” whether an image contains CSAM, and a cute photo or video of your toddler running around is not going to get flagged by this algorithm.”

    The odds of an innocent photo being flagged as a ‘match’ are 1 in 1 trillion. And then there have to be several matches before the red flag goes up. After that, a human will review the pics before police notification.

    7
    August 6, 2021
    • David Emery said:
      “The odds of an innocent photo being flagged as a ‘match’ are 1 in 1 trillion.”

      I’m curious how you came up with that number. Furthermore, it doesn’t seem to account for “near match” pattern recognition (whether that is “AI enabled” or just a distance measure), which usually can be tuned (“show me everything with a 95% match”).

      0
      August 6, 2021
  8. This is one of many actions Apple will (must?) take to fend off litigation and potential legislation, worldwide. It’s always important to step out of a US-centric perspective, as Apple execs must do each day. They’ll err on the side of morality and decency. Having any moral compass was once rare among corporate behemoths. Bribes, kickbacks, hiring local henchmen, bulldozing villages in the night: that was just the oil & mining concerns.
    There is very little left of our unfortunately outdated notions of privacy. All aspects of our lives are too lucrative to the data miners, spies & politicians.

    0
    August 6, 2021
  9. David Baraff said:
    I strongly urge anyone interested in this topic to read the Daring Fireball post about this. It explains clearly where that 1 in a trillion figure comes from, and why reports that Apple is analyzing iCloud images for content are just plain wrong. (For iCloud, they are only matching against a known set of existing images, which is not at all what the reports describe.)

    There are two very different things being done here, and as usual, lazy journalists wrote stories without bothering to understand what the hell they were talking about. Shocker.

    3
    August 6, 2021
    • David Emery said:
      Agree, that did clear up some of my concerns, but reinforced others. I still want to know how the “1 in 1 trillion” number is calculated. And I wonder if diddling a couple of pixels on an image that would otherwise match would invalidate the match.

      0
      August 7, 2021
  10. Kirk DeBernardi said:
    As an aside —

    PED. Could you have picked a more freaky pic to head this topic? If you wanted to grab attention and send shivers, hey…it worked.

    0
    August 8, 2021
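
A postscript on the “near match” question raised in the comments: here is a minimal sketch, again illustrative rather than Apple’s actual algorithm, of distance-based matching on 64-bit perceptual hashes. The hash values and the cutoff are invented; the idea is that minor pixel edits usually flip only a few bits, so a small, tunable Hamming-distance cutoff still catches them, which is the kind of knob (“show me everything with a 95% match”) being described above.

```swift
// Illustrative sketch only -- not Apple's algorithm. Compare 64-bit perceptual
// hashes by Hamming distance and call anything within a small, tunable
// distance a match. All hash values here are made up.
import Foundation

// Hypothetical fingerprints of known images.
let knownHashes: [UInt64] = [0x9F3A_5C01_77B2_E4D8, 0x01AA_BBCC_DDEE_FF00]

// Number of differing bits between two 64-bit perceptual hashes.
func hammingDistance(_ a: UInt64, _ b: UInt64) -> Int {
    (a ^ b).nonzeroBitCount
}

// Tunable cutoff, roughly analogous to "show me everything with a 95% match":
// up to 3 of 64 bits may differ.
let maxDistance = 3

func isNearMatch(_ photoHash: UInt64) -> Bool {
    knownHashes.contains { hammingDistance(photoHash, $0) <= maxDistance }
}

// Flipping two low-order bits (a stand-in for minor pixel edits) still
// matches; an unrelated hash does not.
print(isNearMatch(0x9F3A_5C01_77B2_E4DB))  // true  (2 bits differ)
print(isNearMatch(0x1234_5678_9ABC_DEF0))  // false
```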
