From Madhumita Murgia and Tim Bradshaw’s “Apple plans to scan US iPhones for child abuse imagery” posted Thursday:
Apple intends to install software on American iPhones to scan for child abuse imagery, according to people briefed on its plans, raising alarm among security researchers who warn that it could open the door to surveillance of millions of people’s personal devices.
Apple detailed its proposed system — known as “neuralMatch” — to some US academics earlier this week, according to two security researchers briefed on the virtual meeting.
The automated system would proactively alert a team of human reviewers if it believes illegal imagery is detected, who would then contact law enforcement if the material can be verified. The scheme will initially roll out only in the US.
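For readers who want the moving parts spelled out, here is a minimal sketch of the escalation flow the FT describes: an automated match goes into a queue for human review, and only verified material is referred to law enforcement. The Python names below are mine, for illustration only, and not anything from Apple's actual system.

```python
# A minimal sketch of the escalation flow described above, not Apple's actual
# implementation. Every name here is made up for illustration.

review_queue: list[tuple[str, str]] = []   # (account_id, image_id) awaiting human review


def scan_image(account_id: str, image_id: str, image_hash: str,
               known_hashes: set[str]) -> None:
    """Step 1: the automated system only compares against known imagery;
    a hit goes to human reviewers, not straight to the police."""
    if image_hash in known_hashes:
        review_queue.append((account_id, image_id))


def refer_verified(verified_by_reviewers: set[tuple[str, str]]) -> list[tuple[str, str]]:
    """Steps 2 and 3: human reviewers verify the material; only verified
    items are referred onward to law enforcement."""
    return [item for item in review_queue if item in verified_by_reviewers]
```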
A few reactions:
- Matthew Green: This will break the dam — governments will demand it from everyone.
- Bryan Jones: A vile betrayal of Apple’s very soul.
- Alec Muffett: A huge and regressive step for individual privacy.
- Edward Snowden: No matter how well-intentioned, @Apple is rolling out mass surveillance to the entire world with this. Make no mistake: if they can scan for kiddie porn today, they can scan for anything tomorrow. They turned a trillion dollars of devices into iNarcs—*without asking.*
My take: Something about the combination of kids, porn and the Internet seems to make otherwise worldly-wise adults a little crazy. I speak from experience.
U.S. law requires tech companies to flag cases of child sexual abuse to the authorities. Apple has historically flagged fewer cases than other companies. Last year, for instance, Apple reported 265 cases to the National Center for Missing & Exploited Children, while Facebook reported 20.3 million, according to the center’s statistics. That enormous gap is due in part to Apple’s decision not to scan for such material, citing the privacy of its users.
https://www.nytimes.com/2021/08/05/technology/apple-iphones-privacy.html
https://appleinsider.com/articles/21/08/06/what-you-need-to-know-apples-icloud-photos-and-messages-child-safety-initiatives
If U.S. laws require tech companies to flag cases of child sexual abuse to authorities, then Congress must act to pass similar legislation to flag terrorists, organized crime activities, etc.
How is Apple accessing privileged information in one’s iCloud account different from Apple accessing privileged information inside one’s iPhone? Personal information in my home is no different from the same personal information entrusted to my accountant, to my attorney, to my doctor and to my minister. That personal information just happens not to be in my home; it is the same personal information being held outside my home by others who are supposed to retain its “privileged” status. It is all information that should be held in the strictest level of privacy.
What Apple may find going forward is buyers forgoing iCloud storage accounts and instead buying hardware with the largest data storage capacities, hoping never to lose their devices.
Privacy is a key benefit of Apple. Poof, gone. Terrible decision.
“Apple detailed its proposed system — known as ‘neuralMatch’ — to some US academics earlier this week”
That’s a HUGE difference between the direction Apple is taking and the one everyone else is taking. Yes, it makes for a “dumber” Siri, but it is a far better compromise than the ones Apple’s competitors are forced to make in the field of personal privacy.
Apple is trying for an R. Daneel Olivaw
And then, I’m not sure how Apple translates “we found bad stuff” to the police, including the legality of evidence obtained without a warrant by a non-police actor. (This is actually a wider problem in US law. Police agencies are buying data from data brokers, such as phone location data after the Jan 6 insurrection, that they are not allowed by law to obtain directly. Not sure that has undergone full legal challenge up to the Supreme Court. But IANAL…)
“And then, I’m not sure how Apple translates “we found bad stuff” to the police…”
Just to be clear, for me, the jury is still out on this move. As I understand it (and I’m a LONG way from being fully educated – I just know I don’t know enough), the information is completely randomized. No actual pictures or messages come back to Apple. They did say, however, that the chance of the program making a mistake was 1 in a trillion. That’s a pretty sophisticated program!
As regards police involvement, Apple still can’t access the iPhone data. That doesn’t mean a “brute force” (i.e. very expensive) attack couldn’t be initiated by law enforcement.
So I imagine Apple would alert that iPhone X was used, for example, to view child porn or to put a child in danger of being used that way. The proof would then need to be acquired independently.
Again, I’m not completely sure it works that way, so for me, the jury is still out. But if it does work that way, I’ll likely have much less of a problem with it.
Then you should be happy. Apple is not doing any analysis on your phone, my phone, or any phone. Only on iCloud. And that is a big difference, because you have 100% control over what is on your phone and what is on iCloud.
AP, who wonders why they have an integrity issue, has the following headline: “Apple to scan U.S. iPhones for images of child sexual abuse”. That is a lie (they could plead ignorance, but I doubt they are that dumb). It will be interesting to see whether AP retracts it or not. My guess is they print the lie to get clicks, then claim they have ‘the truth’ buried in the article.
I could be wrong, but I believe that iCloud data is not encrypted, so no key or “back door” is needed.
“Photos are only matched against known CSAM images from the NCMEC database. Apple is not using machine learning to “figure out” whether an image contains CSAM, and a cute photo or video of your toddler running around is not going to get flagged by this algorithm.”
The odds of an innocent photo being flagged as a ‘match’ are 1 in a trillion. And then there have to be several matches before the red flag goes up. After that, a human will review the pics before police are notified.
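To see why requiring several matches before the red flag matters, here is a rough back-of-the-envelope sketch. The per-image false-match rate, library size, and thresholds below are made-up numbers for illustration, not Apple’s figures.

```python
# Back-of-the-envelope illustration of how a match threshold drives down the
# chance of falsely flagging an account. All numbers here are hypothetical.

from math import exp, lgamma, log, log1p


def log_binom_pmf(n: int, i: int, p: float) -> float:
    """log P(exactly i false matches out of n photos), for Binomial(n, p)."""
    return (lgamma(n + 1) - lgamma(i + 1) - lgamma(n - i + 1)
            + i * log(p) + (n - i) * log1p(-p))


def prob_at_least_k(n: int, p: float, k: int, terms: int = 200) -> float:
    """Upper-tail probability of k or more false matches; for small p the
    terms shrink fast, so summing a few hundred of them is plenty."""
    hi = min(k + terms, n)
    return sum(exp(log_binom_pmf(n, i, p)) for i in range(k, hi + 1))


if __name__ == "__main__":
    p = 1e-6      # hypothetical per-image false-match rate
    n = 100_000   # hypothetical photo-library size
    for k in (1, 5, 10):
        print(f"threshold {k}: {prob_at_least_k(n, p, k):.2e}")
```

Even with a per-image error rate far worse than claimed, raising the required number of matches drives the combined odds down very quickly.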
I’m curious how you come up with that number. Furthermore, it doesn’t seem to account for “near match” pattern recognition (whether that is “AI enabled” or just a distance measure), which usually can be tuned (“show me everything with a 95% match”).
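For what it’s worth, here is a toy sketch of what a tunable “distance measure” could look like, using Hamming distance over 64-bit perceptual hashes. The hash size and the cutoffs are arbitrary assumptions, not anything from Apple’s system.

```python
# Toy sketch of a tunable "near match", not NeuralHash. The 64-bit hashes and
# the similarity cutoffs below are arbitrary assumptions for the example.

def hamming_distance(a: int, b: int) -> int:
    """Number of differing bits between two 64-bit perceptual hashes."""
    return bin(a ^ b).count("1")


def is_near_match(a: int, b: int, bits: int = 64, similarity: float = 0.95) -> bool:
    """True when at least `similarity` of the bits agree; raising the cutoff
    makes matching stricter, lowering it makes it looser."""
    return (bits - hamming_distance(a, b)) / bits >= similarity


h1 = 0xDEADBEEFCAFEF00D
h2 = h1 ^ 0b11                                  # flip 2 of the 64 bits (96.9% agree)
print(is_near_match(h1, h2, similarity=0.95))   # True
print(is_near_match(h1, h2, similarity=0.98))   # False
```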
There is very little left of unfortunately out-dated notions of privacy. All aspects of our lives are too lucrative to the data miners, spies & politicians.
There are two very different things being done here, and as usual, lazy journalists wrote stories without bothering to understand what the hell they were talking about. Shocker.
PED. Could you have picked a more freaky pic to head this topic? If you wanted to grab attention and send shivers, hey…it worked.