Apple’s unforced error

From Lucas Matney’s “Apple’s dangerous path” posted Saturday on TechCrunch:

In the past month, Apple did something it generally has done an exceptional job avoiding — the company made what seemed to be an entirely unforced error…

In early August — seemingly out of nowhere** — the company announced that by the end of the year they would be rolling out a technology called NeuralHash that actively scanned the libraries of all iCloud Photos users, seeking out image hashes that matched known images of child sexual abuse material (CSAM). For obvious reasons, the on-device scanning could not be opted out of.

Researchers and advocacy groups had almost unilaterally negative feedback for the effort, raising concerns that this could create new abuse channels for actors like governments to detect on-device information that they regarded as objectionable. As my colleague Zack noted in a recent story, “The Electronic Frontier Foundation said this week it had amassed more than 25,000 signatures from consumers. On top of that, close to 100 policy and rights groups, including the American Civil Liberties Union, also called on Apple to abandon plans to roll out the technology.”

The issue — of course — wasn’t that Apple was looking to find ways that prevented the proliferation of CSAM while making as few device security concessions as possible. The issue was that Apple was unilaterally making a massive choice that would affect billions of customers (while likely pushing competitors towards similar solutions), and was doing so without external public input about possible ramifications or necessary safeguards…

Having spent several years in the tech media, I will say that the only reason to release news on a Friday morning ahead of a long weekend is to ensure that the announcement is read and seen by as few people as possible, and it’s clear why they’d want that. It’s a major embarrassment for Apple, and as with any delayed rollout like this, it’s a sign that their internal teams weren’t adequately prepared and lacked the ideological diversity to gauge the scope of the issue that they were tackling. This isn’t really a dig at Apple’s team building this so much as it’s a dig on Apple trying to solve a problem like this inside the Apple Park vacuum while adhering to its annual iOS release schedule.
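
A quick aside on mechanics before my take. The matching Matney describes boils down to comparing a fingerprint (a "hash") of each photo against a database of fingerprints derived from known CSAM images. The sketch below is not Apple's NeuralHash; the hash values, the distance threshold and the function names are invented for illustration. It only shows the basic shape of a hash-against-a-blocklist check:

```swift
// Illustrative sketch only: this is NOT Apple's NeuralHash system.
// Hash values, the threshold, and all names here are invented.

typealias PerceptualHash = UInt64

/// Hamming distance: how many bits differ between two 64-bit hashes.
func hammingDistance(_ a: PerceptualHash, _ b: PerceptualHash) -> Int {
    (a ^ b).nonzeroBitCount
}

/// True if `photoHash` is within `maxDistance` bits of any hash in the known list.
func matchesKnownHash(_ photoHash: PerceptualHash,
                      against knownHashes: [PerceptualHash],
                      maxDistance: Int = 0) -> Bool {
    knownHashes.contains { hammingDistance(photoHash, $0) <= maxDistance }
}

// Made-up example: the candidate hash differs from a known hash by one bit.
let knownHashes: [PerceptualHash] = [0xDEAD_BEEF_CAFE_F00D]
let candidateHash: PerceptualHash = 0xDEAD_BEEF_CAFE_F00F
print(matchesKnownHash(candidateHash, against: knownHashes, maxDistance: 2)) // true
```

Apple's published design, as described in its technical summary, wraps this basic idea in cryptography (a blinded hash database, private set intersection and threshold secret sharing) so that matches are only revealed for human review after an account crosses a set threshold; the sketch above ignores all of that and shows only the comparison step.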

My take: Yup.

10 Comments

  1. Miguel Ancira said:
    I have always said that a trait of a mature person is being able to say “I am sorry. I was wrong.”
    You move on.
    Next.
    No reason a big company cannot do that.
    It is better than doubling down on your mistake.

    9
    September 5, 2021
  2. Romeo A Esparrago Jr said:
    Upvoted everything you said, Mr. Miguel.
    Especially “It is better than doubling down on your mistake.”

    3
    September 5, 2021
  3. Fred Stein said:
    Worth noting: Lucas’ ** refers to his postulation that Apple was responding to a movement the US joined with the UK, New Zealand, Australia, Canada, India and Japan to address the concern that encryption helped offenders, especially those trafficking in child porn.

    2
    September 5, 2021
  4. Jerry Doyle said:
    “…. The issue was that Apple was unilaterally making a massive choice that would affect billions of customers (while likely pushing competitors towards similar solutions), and was doing so without external public input about possible ramifications or necessary safeguards…”.

    Candidly, I do not see how involving external representatives to glean input for necessary safeguards resolves this issue. As I wrote last Friday, the issue never has been about the CSAM features targeted at child safety. It’s all about the “precedent” this establishes: a foray into personal privacy emanating from the phone itself. There is no amount of public input that will resolve that problem. Privacy is privacy for all.

    If Apple selectively decides to erode privacy for one group, then Apple is opening the door for governments demanding new privacy abuse channels for targeting other nefarious actors, whether it be organized crime, known extremist groups, political activist associations, organizational entities working with countries (think China), etc. Who gets to decide which groups pose the most threat, hazard or peril to a people, or to a nation? Apple should not have sole authority to decide.

    Apple has decided appropriately that perpetrators involved in CSAM should be a targeted group. Many, though, believe a concomitant and commensurate threat exists from other nefarious actors. It doesn’t end with child molesters. And many governmental entities and law enforcement agencies would say it shouldn’t.

    “…. it’s a sign that their (Apple) internal teams weren’t adequately prepared and lacked the ideological diversity to gauge the scope of the issue that they were tackling.”

    If Apple has an Achilles heel, it is Apple’s conviction that it has the requisite skills, talents, knowledge and technical prowess to go it alone, believing no one can do it better than Apple.

    2
    September 5, 2021
  5. Gregg Thurman said:
    Wahhhh. You didn’t get included in the discussion about how to stop child exploitation.

    Bunch of children. Apple’s only “mistake” was not making the announcement an event, with lots of explanation about how the feature worked, while simultaneously protecting users’ privacy.

    5
    September 5, 2021
  6. David Drinkwater said:
    My take: Nope. Wrong.

    I have been a sex positivity activist for three decades.

    Sex positivity does not mean free love or good sex for all.

    Sex positivity is about defending the rights of all consenting adults to safe, sane, and consensual sexual relations without regard for: sexual orientation, gender, gender performance, gender identity, race, age, physical (dis)ability, socioeconomic status, religious belief, …

    The key here is consenting adults. CSAM is the opposite of that.

    “Privacy is privacy for all.” I used to believe that, too, but it’s not correct. CSAM crimes happen in the dark. They are hidden by a widespread and erotophobic desire not to put such content in the light. No one in their right mind wants to see such things. And predators thrive on that fact.

    Apple’s methodology seems to be extraordinarily conservative to me. They are asking an AI (not a human) to evaluate images for similarities to “known CSAM”. And they are erring toward only triggering a response on one in a trillion occasions. (I think that’s right. Fairly noted, that was a crowd-sourced data point.) Sadly, I suspect that there are more than one in a trillion CSAM criminals in the world, definitely more than one in a billion, and possibly more than one in a million. When such a scan prompts a question, a human is asked to become involved. I do not envy that human.

    All this being said, Apple is not making an error in trying to detect CSAM.

    3
    September 5, 2021
  7. Yes, Philip, it was definitely an error, like in baseball: “an error is an act, in the judgment of the official scorer, of a fielder misplaying a ball in a manner that allows a batter or baserunner to advance one or more bases or allows a plate appearance to continue after the batter should have been put out.” I’m not quite sure who advanced as a result of Apple’s ‘forced error’ but maybe FB did. They don’t know the rules to anything, much less baseball.
    Disclosure: I went to games at wooden Connie Mack Stadium in the 1960s. I also saw early stock car races at Langhorne Motor Speedway & Flemington, NJ!

    0
    September 5, 2021
