Really? Did it think nobody would notice if it removed references to its controversial child sexual abuse material (CSAM) detection program from one of its webpages?
From Jon Porter's "Apple scrubs controversial CSAM detection feature from webpage but says plans haven’t changed" posted Wednesday on The Verge:
Apple has updated a webpage on its child safety features to remove all references to the controversial child sexual abuse material (CSAM) detection feature first announced in August. The change, which was spotted by MacRumors, appears to have taken place some time between December 10th and December 13th. But despite the change to its website, the company says its plans for the feature haven’t changed.
Two of the three safety features, which released earlier this week with iOS 15.2, are still present on the page, which is titled “Expanded Protections for Children.” However, references to the more controversial CSAM detection, whose launch was delayed following backlash from privacy advocates, have been removed.
When reached for comment, Apple spokesperson Shane Bauer said that the company’s position hasn’t changed since September, when it first announced it would be delaying the launch of the CSAM detection. “Based on feedback from customers, advocacy groups, researchers, and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features,” the company’s September statement read.
Crucially, Apple’s statement does not say the feature has been canceled entirely. Documents outlining how the functionality works are still live on Apple’s site.
My take: MacRumors misses nothing.
Oh yeah. It’s missing the incredible harm that pedophiles inflict on children.
CSAM detection is a good idea and shouldn’t be abandoned.
“3… 2… 1…
I’m NOT installing iOS 15 until I know about CSAM because I won’t allow Apple to invade my privacy.
Meanwhile I’ll keep using Google and Amazon services, posting my entire life, and sharing my location on Facebook, Twitter, Instagram, and TikTok.”
I think CSAM detection will eventually prove to be useful and helpful, and the downsides are a question of trust. IMO, Apple will prove itself to be trustworthy.
First off, no normal adult would be against taking any action to protect children from pedophiles.
That said, when I first heard about this feature, the first thing that went through my head was the extraordinary series of beautifully framed photographs of my older brother – now 72 years old – as a six-month-old baby, naked in the bathtub. They were taken by my late great-uncle Garry Winogrand, who, as anybody who knows anything about the history of photography will know, was an extraordinarily talented and renowned photographer. And I’m lucky to have them.
So Apple’s new scanning software sees those pictures on my iPhone or stored in iCloud, in a folder with all my family photos. And it’s going to report me as a pedophile?
That is pretty much my problem in a nutshell. Not to mention, I also have my own baby picture as a six-month-old in the same family folder.
Now, I understand that is not the purpose of Apple’s nanny-state invention, and that they have no intention of allowing that to happen. But until they convince me otherwise, it certainly opens a Pandora’s box, and it’s a slippery slope toward false positives.
It could end up forcing people to censor perfectly innocent, easily misinterpreted pictures they happen to have.
Am I the only one who has this problem? I don’t think so.
To repeat myself, Apple has done a lousy job of explaining exactly what this new software does. Or doesn’t do.