Apple's NeuralHash blues: The Congressional angle

From Jack Nicas' "Are Apple’s Tools Against Child Abuse Bad for Your Privacy?" in Thursday's New York Times:

A few years ago, the National Center for Missing and Exploited Children began disclosing how often tech companies reported cases of child sexual abuse material, commonly known as child pornography, on their products.

Apple was near the bottom of the pack. The company reported 265 cases to the authorities last year, compared with Facebook’s 20.3 million. That enormous gap was largely due to Apple’s electing not to look for such images, in order to protect the privacy of its users.

In late 2019, after reports in The New York Times about the proliferation of child sexual abuse images online, members of Congress told Apple that it had better do more to help law enforcement officials or they would force the company to do so. Eighteen months later, Apple announced that it had figured out a way to tackle the problem on iPhones, while, in its view, protecting the privacy of its users.

My take: Don't know yet who those congresspeople were, but this story fits the timeline of Craig Federighi's narrative. That 2019 child sex abuse report in the New York Times, by the way, is worth a revisit. The graphics are stunning. It's ironic, given what flowed from the report, that it does not mention Apple.


  1. This current effort by Apple is directly tied to their stance on device encryption. It’s an effort to give law enforcement what they needed to catch child abusers without building a back door in iOS.
    Here’s a 2019 quote from the Manhattan DA’s testimony to the Senate:
    “Ideally, Apple and Google would do their part to help create a balanced technical and legal solution to the problems caused by their encryption decisions. Absent this contribution, the changing winds of public sentiment around Big Tech, in the wake of Facebook’s Cambridge Analytica and Google’s Project Dragonfly scandals, has recently created a climate that will support a legislative solution.” — Written Testimony for the United States Senate Committee on the Judiciary on Smartphone Encryption and Public Safety. I’ll follow with a link to a rather thorough report, albeit from law enforcement’s POV.

    August 19, 2021
  2. Fred Stein said:
    Wait. FB’s 20M – implications and questions:

    1) Are all 20M still in the NCMEC database? And if local police ask, might they get a match?

    2) Why didn’t FB cull the list instead of passing that job on to NCMEC? Surely FB has better big data skills and tools.

    3) Did FB give users any guidance to prevent being caught in the scoop? Is FB still doing this?

    4) Surely many of these 20M are completely innocent.

    August 19, 2021
