Apple can’t get away with anything

Really? Did it think nobody would notice if it removed references to its controversial child sexual abuse material (CSAM) detection program from one of its webpages?

From Jon Porter’s “Apple scrubs controversial CSAM detection feature from webpage but says plans haven’t changed” posted Wednesday on The Verge:

Apple has updated a webpage on its child safety features to remove all references to the controversial child sexual abuse material (CSAM) detection feature first announced in August. The change, which was spotted by MacRumors, appears to have taken place some time between December 10th and December 13th. But despite the change to its website, the company says its plans for the feature haven’t changed.

Two of the three safety features, which were released earlier this week with iOS 15.2, are still present on the page, which is titled “Expanded Protections for Children.” However, references to the more controversial CSAM detection, whose launch was delayed following backlash from privacy advocates, have been removed.

When reached for comment, Apple spokesperson Shane Bauer said that the company’s position hasn’t changed since September, when it first announced it would be delaying the launch of the CSAM detection. “Based on feedback from customers, advocacy groups, researchers, and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features,” the company’s September statement read.

Crucially, Apple’s statement does not say the feature has been canceled entirely. Documents outlining how the functionality works are still live on Apple’s site.

My take: MacRumors misses nothing.

Comments

  1. Gregg Thurman said:
    MacRumors misses nothing

    Oh yeah. It’s missing the incredible harm that pedophiles inflict on children.
    CSAM detection is a good idea and shouldn’t be abandoned.

    December 15, 2021
  2. Bart Yee said:
    A comment from the MacRumors article regarding iOS 15 adoption sums up the privacy concerns for me:

    “3… 2… 1…
    I’m NOT installing iOS 15 until I know about CSAM because I won’t allow Apple to invade my privacy.

    Meanwhile I’ll keep using Google and Amazon services and posting my entire life and sharing my location on Facebook, Twitter, Instagram, and TikTok.”

    I think CSAM detection will eventually prove to be useful and helpful, and the downsides are a question of trust. IMO, Apple will prove itself to be trustworthy.

    December 15, 2021
