“How confident are you that this is a nude image and not a rocket ship?” — Joanna Stern
From the Wall Street Journal’s “Apple’s Software Chief Explains ‘Misunderstood’ iPhone Child-Protection Features (Exclusive)” posted Friday.
Cue the YouTube version:
My take: When the going gets tough, Apple arranges to have its best explainer grilled by the Journal’s best video interviewer.
Yes, I do understand that one can turn off iCloud Photo Library and no match would ever occur. All the more reason I don’t understand why Apple needs access to one’s device for scanning. Apple already has access to the iCloud Photo Library account.
The NCMEC has its known database of CSAM. So Apple could avoid invading one’s personal device and use its algorithm to scan iCloud Photo Library accounts for any match with NCMEC’s known database of CSAM. Why does Apple go to your personal device to scan as well?
The neural hash combination between one’s phone and one’s iCloud Photo Library account is redundant. It seems like just an excuse to invade one’s personal device.
The safety voucher is redundant! All that is happening is the combining of the same two neural hashes coming from the same device with the same iCloud Photo Library account. What is the need, again, to invade the individual’s iPhone?
Apple always has told us the information on our devices is “private.” What has changed?
Apple has used its privileged keys to open my locked front doors and walk inside my home while I am sleeping. It’s unsettling. I’m sorry, folks, but this is a “back door.”
If it looks like a duck, walks like a duck and quacks like a duck, then it must be a duck.
That’s where you are missing it. The feature does not SCAN your iPhone; it only scans photos you upload to iCloud.
Additionally, it only looks at INCOMING photos attached to emails and messages, then alerts the receiver that there MAY be CSAM contained in the communication.
At no time do the features scan for CSAM on your iPhone. Scanning only occurs while the content is in transit to or from your iPhone, and in the case of photos leaving your iPhone, the scan occurs in the cloud.
I’m much more comfortable with the features, as explained, now than I was previously.
I wouldn’t be surprised to find that a lot of the misinformation surrounding these features originated with competitors, who already do the same thing, but without the privacy safeguards Apple has designed into the features: anything to soil Apple’s image/reputation.
Yes, I understand explicitly that we are discussing photographs taken with one’s iPhone and subsequently uploaded to one’s iCloud Photo Library. The issue is the features Apple is using to spot CSAM. Apple is using the features “on your device,” installed through the forthcoming software update.
The NCMEC has a database of known CSAM. Other big tech companies such as Google, Microsoft, and Facebook scan photos users upload to the cloud to find images that match photos in NCMEC’s repository. Remember, the scan spots only known CSAM images in NCMEC’s CSAM repository.
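Conceptually, that server-side matching is nothing more than a hash lookup against a known list. A minimal sketch, with stand-in names and SHA-256 in place of the perceptual hashes (PhotoDNA, NeuralHash) that real systems use so near-duplicates of known images still match:

```swift
import CryptoKit
import Foundation

// Hypothetical sketch of server-side matching against a known-hash list.
// SHA-256 is only a stand-in; real systems use perceptual hashes
// (PhotoDNA, NeuralHash) so resized or re-encoded copies still match.
let knownHashes: Set<String> = []   // placeholder for hashes derived from NCMEC’s database

func hexHash(of imageData: Data) -> String {
    SHA256.hash(data: imageData)
        .map { String(format: "%02x", $0) }
        .joined()
}

func matchesKnownImage(_ imageData: Data) -> Bool {
    knownHashes.contains(hexHash(of: imageData))
}
```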
Apple decided, though, that it wanted to do this scan process beginning with your iPhone, “on your device,” ostensibly for privacy reasons. This identification process can be done (and already is done) on photos in the individual’s iCloud Photo Library. That process already protects the privacy of non-CSAM photos. So Apple decided also to institute this scan “on your device,” and this is where the redundancy arises. Apple is doing on your device the same thing it already does to photos in your iCloud Photo Library. Why the redundancy? It is not necessary. It is not needed. Why is the iPhone even being mentioned?
So, the question begging to be answered for many is: why can’t this “multi-part” algorithm be limited to scanning photos solely in one’s iCloud Photo Library account, instead of scanning both on the personal iPhone as pictures are uploaded and again on photos in the individual’s iCloud Photo Library account? Why is it necessary for the two halves of the fingerprints of specific known child pornographic images to be split as such: (1) on the phone and (2) on the individual’s iCloud Photo Library account? Where is the increased privacy?
Once Apple installs software on your device to assist in this identification process, the iPhone user has less privacy. Apple used its privileged keys to open my front doors and walk inside my home when it created and installed this software update on my device for enhanced-privacy identification purposes.
The identification of CSAM images can be, and already is being, done in iCloud accounts. Again, redundancy for what purpose? Increased privacy is an anemic answer. It is such a fragile answer that it has created a firestorm.
Put CSAM on the trash heap. CSIM, I fully support.
One question raised by Joanna needs more depth: “Who owns your iPhone?”
A more important question is: “Who owns the responsibility for the safety of your iPhone?” When it comes to cars, airplanes, etc., we all know that the makers, the owner/operators, and regulators own this.
Sadly, those who harp on Apple for child protection, or for the safety of the App Store, have thrown out over a century of legal precedent on this topic. (Refer to Lynch v Nurdin, 1841.)
Calculating it in the cloud would mean that the image needs to be decrypted first, using the user’s “secret” key.
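That ordering is the whole point of doing it “on device”: the hash is computed before encryption, so the server never needs the key. A minimal sketch, with a plain symmetric key standing in for the user’s key and SHA-256 standing in for NeuralHash; the names and the voucher format are made up, not Apple’s actual protocol:

```swift
import CryptoKit
import Foundation

// Sketch only: illustrates the ordering, not Apple’s real scheme.
// The hash is computed on the device, before encryption, so the server
// can check for a match without ever holding the user’s key.
let userKey = SymmetricKey(size: .bits256)   // stand-in for the user’s “secret” key

func prepareUpload(photo: Data) throws -> (ciphertext: Data, voucher: String) {
    let digest = SHA256.hash(data: photo)    // computed locally, in the clear
    let voucher = digest.map { String(format: "%02x", $0) }.joined()
    let sealed = try AES.GCM.seal(photo, using: userKey)
    // Only the ciphertext and the voucher leave the device.
    return (sealed.combined!, voucher)       // combined is non-nil with the default nonce
}
```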
So, implementation of this new algorithm for CSAM detection specifically on the iPhone, when Apple always has gone out of its way to say the information on our personal devices is private, opens the back door to personal devices that government long has sought. If Apple can implement such a program for CSAM to accommodate NCMEC’s requests, then Apple can, and should, do so to accommodate federal and national-security agencies’ requests to thwart potential terrorists’ and crime syndicates’ activities involving fraud and abuse of innocent citizens, especially the elderly.
Apple chose to set aside, for a specific group, NCMEC, its strong policy line upholding iPhone users’ personal privacy. While children are a most vulnerable group, they are no more vulnerable than other potential innocent victims at the mercy of perpetrators of domestic terrorist acts and crime syndicates. Consequently, if Apple fails to comply with future government requests for access to information on iPhone devices, then the courts and Congress now have a rational and legal basis for a motion requiring government access to users’ personal data on their iPhone devices.
Absolutely NOT what Alessandro just said. What part of “Even Apple hasn’t got a back door into your iPhone” do you not understand?
On the phone, the “look” at the photo, i.e., the hash computation, is done once, and by code that is auditable and verifiable in terms of what it is doing. Show me another company that is doing even half as much in terms of transparency. The arguments that this system is so easily subverted are not convincing.
Also, unless you believe that terrorists routinely keep the same collections of photos on their phones, using this service to find people who have a particular photo on their phone as a means of identifying terrorists seems quite a stretch (unless terrorists are all taught to keep the same magic picture, to prove they are terrorists to other terrorists, as taught in Terrorism 101).
That is, if there were in fact a way to get pictures that all terrorists keep on their phones, which there is not.
Alessandro Luethi wrote: “… That’s why it is preferable to calculate it at the source, ‘on device,’ (emphasis added) before it is encrypted and sent to the cloud.”
Apple plans to install software on iPhone devices that facilitates its multi-part algorithm, scanning photos both on the iPhone as they are uploaded to the cloud and continuing the analysis in the individual’s iCloud Photo Library account. As Craig Federighi stated in the video, the two halves of the fingerprints of specific known child pornographic images are split between (1) the phone and (2) the individual’s iCloud Photo Library account. You often say you do not read the attachments or view video attachments. Did you not catch that information in Federighi’s explanation?
@Joseph Bland: “…. What part of ‘Even Apple hasn’t got a back door into your iPhone’ do you not understand?”
I don’t quite know what you are getting at. The installation of the forthcoming software onto users’ iPhones, for the purpose of carrying out the second half of the known-fingerprint detection of child pornographic images split between (1) the phone and (2) the individual’s iCloud Photo Library account, is a back door opening that allows other entities to request no less for other unsavory illegal activities. That is the firestorm, brother Joe. It’s not just me noting this opening; it is many other entities, all creating the firestorm.
“The installation of the forthcoming software onto users’ iPhones…is a back door opening that allows other entities to request no less for other unsavory illegal activities.”
Again, wrong. Apple has no way to get at the “multi-part algorithm” data the program produces on the iPhone. Because Apple has no back door. Period. What it sends – IF it sends anything (because if you don’t send info to the cloud, it stays on your iPhone) – is not useful for the purposes you’re literally fantasizing about.
If Apple were to actually create a “back door”, then yes, there’d be a problem. But that’s the whole point. Apple refuses to create one, because as long as it doesn’t have one, folks can’t pressure them to use it.
Also, this is the exact opposite of an “unsavory illegal activity”. It’s designed to help STOP about the most “unsavory illegal activity” out there – and to protect your privacy in the process.
Agree, Joe. You misread my comment. Apple is morally correct to do what it is doing, but sometimes our actions, no matter how well intended, result in unintended consequences. Surely Apple has thought all this out fully, so the question becomes whether Apple will be successful in future high-profile legal disputes with governmental agencies in defending iPhone users’ information.
There wasn’t this much hullabaloo when this kind of feature was added to Android, where, you know, because of Android’s weak design and Google’s evil intent, it will be exploited.
The issue, as I understand it, brother Gregg, is not others’ belief that Apple has an evil intent to do other than what it alleges it is doing. Sometimes good intentions, no matter how morally right, have unintended consequences. It is others’ belief that Apple now has cracked open the door to how the features could be abused or misused in the future. Will Apple be able to withstand government (and public) support for using these features to target surveillance of domestic terrorists, crime syndicates, copyright infringement, adult pornography, groups involved in political dissent, etc.? Apple now has set a precedent for an algorithm specifically on the iPhone. We now have a slippery slope.
After a domestic terrorist event in which hundreds, or even thousands, are killed, it will be difficult for Apple to flatly refuse governmental demands to use this for surveillance. Individuals involved in political dissent that went awry may find Apple under extreme pressure to allow governmental surveillance of the organizations with which those dissenters associate.
But other companies already do scan for whatever data they want. They know the encryption keys and scan whatever they want, quite comfortably, on their servers: to see if you possess CSAM, to present you with targeted advertising, and who knows what else. THIS is the precedent!
What Apple intends to do, in my understanding: together with your photograph, they ask you to upload a security voucher that indicates the probability that it is CSAM. The new software allows your device to calculate and upload this security voucher for you. On their servers they count and evaluate the probabilities of your security vouchers, and if a threshold is exceeded, they intervene by taking a closer look at your security vouchers (not the photographs).
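A toy sketch of that threshold step, under my reading of it; the voucher type, field names, and threshold value here are made up, and in the actual design the vouchers stay cryptographically unreadable until the threshold is crossed (Federighi cites a figure on the order of 30 matches):

```swift
// Toy server-side illustration of the threshold idea; not Apple’s protocol.
// Here the vouchers are plain values a counter can read; in the real design
// they remain cryptographically sealed until the threshold is exceeded.
struct SecurityVoucher {
    let matchesKnownImage: Bool   // hypothetical field standing in for the match signal
}

let reviewThreshold = 30          // made-up value; Federighi cites “on the order of 30”

func accountNeedsCloserLook(vouchers: [SecurityVoucher]) -> Bool {
    let matchCount = vouchers.filter { $0.matchesKnownImage }.count
    return matchCount >= reviewThreshold
}
```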
This in no way is a back door! It is a service to you, to the children, and to society.