Apple’s Child Safety plan explained: 11:45 minutes with Craig Federighi (video)

“How confident are you that this is a nude image and not a rocket ship?” — Joanna Stern

From the Wall Street Journal’s “Apple’s Software Chief Explains ‘Misunderstood’ iPhone Child-Protection Features (Exclusive)” posted Friday.

Cue the YouTube version:

My take: When the going gets tough, Apple arranges to have its best explainer grilled by the Journal’s best video interviewer.

20 Comments

  1. Jerry Doyle said:
    I have no problem with Communication Safety in Messages, since that feature works on the child’s phone (age less than 12), is controlled by the parents, and deals with potential nudes in messages sent to their children. It is CSAM Detection where I still need clarification: why does half of that feature need access to one’s iPhone for “scanning” purposes? Why can’t this specific “multi-part” algorithm be limited to analyzing photos solely in one’s iCloud Photo Library account, instead of doing part of the analysis on the personal iPhone and part on the photos loaded to the individual’s iCloud Photo Library account? Why is it necessary for the matching of the fingerprints of specific known child pornographic images to be split that way between the phone and the individual’s iCloud Photo Library account? That is the issue needing elucidation. Just run the complete algorithm solely on photos in the iCloud Photo Library account to match known CSAM fingerprints. Why invade the individual’s personal device, which Apple always has told us is “off limits”? This is the controversial way Apple is spotting this content, when it seems unnecessary to invade one’s personal iPhone. Do it in the iCloud Photo Library.

    Yes, I do understand that one can turn off iCloud Photo Library and no match would ever occur. All the more reason for my not understanding why Apple needs access to one’s device for scanning. Apple already has access to the iCloud Photo Library account.

    The NCMEC has its known database of CSAM. So Apple could avoid invading one’s personal device and use its algorithm to scan iCloud Photo Library accounts for any match against NCMEC’s known database of CSAM. Why does Apple go to your personal device to also scan?

    The neural hash combination between one’s phone and one’s iCloud Photo Library account is redundant. It seems just an excuse to invade one’s personal device.

    The safety voucher is redundant! All that is happening is combining the same two neural hashes coming from the same device that has the same iCloud Photo Library account. What is the need, again, to invade the individual’s iPhone?

    Apple always has told us the information on our devices is “private.” What has changed?

    Apple has used its privileged keys to open my locked front doors and walk inside my home while I am sleeping. It’s unsettling. I’m sorry folks, but this is a “back door.”

    If it looks like a duck, walks like a duck and quacks like a duck, then it must be a duck.
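    For what it’s worth, here is a minimal sketch of the “on-device half” being debated. It is only an illustration, not Apple’s design: the fingerprint function, the known-hash table, and the voucher are stand-ins (a plain SHA-256 and a set lookup), whereas Apple describes a NeuralHash, a blinded database the device cannot read, and private set intersection cryptography.

```python
# A highly simplified sketch of the "on-device half" under discussion.
# Stand-ins only: SHA-256 of the file bytes replaces NeuralHash, and a plain
# set lookup replaces the blinded database / private set intersection, so the
# real property that the device cannot learn whether a photo matched is lost.

import hashlib
import json

# Hypothetical on-device table of known-CSAM fingerprints (placeholder, empty here).
KNOWN_FINGERPRINTS = set()


def fingerprint(photo_bytes: bytes) -> str:
    """Stand-in fingerprint; a real perceptual hash survives resizing/recompression."""
    return hashlib.sha256(photo_bytes).hexdigest()


def make_safety_voucher(photo_bytes: bytes) -> dict:
    """Build the metadata that would ride along with an iCloud Photos upload."""
    fp = fingerprint(photo_bytes)
    # In Apple's described scheme the device cannot tell whether fp matched;
    # the match only becomes readable server-side, and only past a threshold.
    return {"fingerprint": fp, "possible_match": fp in KNOWN_FINGERPRINTS}


def upload_to_icloud_photos(photo_bytes: bytes) -> str:
    """Pretend upload request: the photo plus its safety voucher."""
    voucher = make_safety_voucher(photo_bytes)
    return json.dumps({"photo_size": len(photo_bytes), "voucher": voucher})


if __name__ == "__main__":
    print(upload_to_icloud_photos(b"not a real photo"))
```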

    2
    August 14, 2021
    • Gregg Thurman said:
      “… I still need clarification: why does half of that feature need access to one’s iPhone for ‘scanning’ purposes?”

      That’s where you are missing it. The feature does not SCAN your iPhone; it only scans photos you upload to iCloud.

      Additionally, it only looks at INCOMING photos attached to emails and messages, then alerts the receiver that there MAY be CSAM contained in the communication.

      At no time do the features scan for CSAM on your iPhone. Scanning only occurs while the content is in transit to or from your iPhone, and in the case of content from your iPhone, the scan occurs in the cloud.

      I’m much more comfortable with the features, as explained, now than I was previously.

      I wouldn’t be surprised to find that a lot of the misinformation surrounding these features originated with competitors, who already do the same thing, but without the privacy safeguards Apple has designed into the features: anything to soil Apple’s image/reputation.

      4
      August 14, 2021
      • Jerry Doyle said:
        @Gregg Thurman: “… That’s where you are missing it. The feature does not SCAN your iPhone; it only scans photos you upload to iCloud. … At no time do the features scan for CSAM on your iPhone. Scanning only occurs while the content is in transit to or from your iPhone, and in the case of content from your iPhone, the scan occurs in the cloud.”

        Yes, I understand explicitly that we are discussing photographs taken with one’s iPhone and subsequently uploaded to one’s iCloud Photo Library. The issue is the features Apple is using to spot CSAM. Apple is using those features “on your device,” installed through the forthcoming software update.

        The NCMEC has a database of known CSAM. Other big tech companies such as Google, MS and FB scan photos users upload to the cloud to find images that match photos in NCMEC’s repository. Remember, the scan spots only known CSAM images in NCMEC’s CSAM repository.

        Apple decided, though, that it wanted to begin this scanning process with your iPhone, “on your device,” ostensibly for privacy reasons. This identification process can be done (and already is done) on photos in the individual’s iCloud Photo Library. That process already protects the privacy of non-CSAM photos. So Apple decided also to run this scan “on your device,” and this is where the redundancy arises. Apple is doing on your device the same thing it already does to photos in your iCloud Photo Library. Why the redundancy? It is not necessary. It is not needed. Why is the iPhone even being mentioned?

        So, the question begging to be answered for many is this: why can’t this specific “multi-part” algorithm be limited to analyzing photos solely in one’s iCloud Photo Library account, instead of doing the analysis both on the personal iPhone as pictures are uploaded and on photos already loaded in the individual’s iCloud Photo Library account? Why is it necessary for the matching of the fingerprints of specific known child pornographic images to be split between (1) the phone and (2) the individual’s iCloud Photo Library account? Where is the increased privacy?

        Once Apple puts software onto your device to assist in this identification process, the iPhone user has less privacy. Apple used its privileged keys to open my front doors and enter my home when it created and installed this software update onto my device for “enhanced privacy” identification purposes.

        The identification of CSAM images can be, and already is being, done in iCloud accounts. Again, redundancy for what purpose? “Increased privacy” is an anemic answer. It is such a fragile answer that it has created a firestorm.
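        For comparison, here is a minimal sketch of the purely server-side approach described above, the way other cloud providers are said to scan uploads against a known-image hash list. The names and the hash function are illustrative stand-ins, not any provider’s actual API; real systems use perceptual hashes such as PhotoDNA rather than SHA-256.

```python
# A minimal sketch of purely server-side scanning: the provider, which can read
# the uploaded photos, hashes each one and compares it against a known-image
# hash list. SHA-256 stands in for a perceptual hash such as PhotoDNA.

import hashlib
from typing import Iterable, List, Set


def server_side_scan(uploaded_photos: Iterable[bytes],
                     known_csam_hashes: Set[str]) -> List[int]:
    """Return the indexes of uploads whose fingerprint matches the database."""
    matches = []
    for i, photo in enumerate(uploaded_photos):
        digest = hashlib.sha256(photo).hexdigest()  # stand-in fingerprint
        if digest in known_csam_hashes:
            matches.append(i)
    return matches


if __name__ == "__main__":
    library = [b"vacation photo bytes", b"receipt scan bytes"]
    print(server_side_scan(library, known_csam_hashes=set()))  # -> []
```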

        1
        August 14, 2021
    • Steven Noyes said:
      100% this.

      Put CSAM Detection on the trash heap. Communication Safety in Messages, I fully support.

      0
      August 14, 2021
  2. Fred Stein said:
    Craig and Joanna at their best. I love them both.

    One question raised by Joanna needs more depth; “Who owns your iPhone?”

    A more important question is: “Who owns the responsibility for the safety of your iPhone?” When it comes to cars, airplanes, etc., we all know that the makers, the owner/operators and the regulators own this.

    Sadly, those who harp on Apple for child protection, or for the safety of the App Store, have thrown out over a century of legal precedent on this topic. (Refer to Lynch v. Nurdin, 1841.)

    0
    August 14, 2021
  3. Alessandro Luethi said:
    What exactly is a neural hash, and what data is it based on? This should be explained much more clearly. Is it based on file names, sizes, other metadata, or on the pixels of the unencrypted photo?

    0
    August 14, 2021
    • Gregg Thurman said:
      It’s based on the string of 1s and 0s that create the IMAGE.

      1
      August 14, 2021
      • Alessandro Luethi said:
        If this is true, the image needs to be in an unencrypted state. That’s why it is preferable to calculate it at the source, on device, before it is encrypted and sent to the cloud.
        Calculating it in the cloud would mean that the image would need to be decrypted first, using the user’s “secret” key.
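        To make the question above concrete: a content-based image hash is computed from the decoded pixels, not from the file name or other metadata. Here is a toy “average hash” as an illustration; it is a generic perceptual-hash technique, not Apple’s NeuralHash (which uses a neural network), but it shows why the image must be unencrypted at the moment the hash is computed.

```python
# A toy "average hash": shrink the decoded image to 8x8 grayscale and set one
# bit per pixel depending on whether it is brighter than the mean. It operates
# on pixels, not file names or metadata, and near-duplicate images land close
# together. This is NOT Apple's NeuralHash, just the general idea of a
# content-based fingerprint.

from PIL import Image  # pip install pillow


def average_hash(path: str, hash_size: int = 8) -> int:
    """Return a hash_size*hash_size-bit fingerprint of the image content."""
    img = Image.open(path).convert("L").resize((hash_size, hash_size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits


def hamming_distance(a: int, b: int) -> int:
    """Similar-looking images produce nearby hashes (small distance)."""
    return bin(a ^ b).count("1")


if __name__ == "__main__":
    # Hypothetical file names; compare an original with a resized copy.
    # h1 = average_hash("photo.jpg")
    # h2 = average_hash("photo_resized.jpg")
    # print(hamming_distance(h1, h2))  # small for near-duplicates
    pass
```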

        1
        August 14, 2021
        • Jerry Doyle said:
          @Alessandro Luethi: “…. That’s why it is preferable to calculate it at the source, on device, before it is encrypted and sent to the cloud.”

          So, implementation of this new algorithm for CSAM Detection specifically on the iPhone, when Apple always has gone out of its way to say the information on our personal devices is private, opens the back door to personal devices that government long has sought. If Apple can implement such a program for CSAM to accommodate NCMEC’s requests, then Apple can and should do so to accommodate federal and national security agencies’ requests to thwart potential terrorists’ and crime syndicates’ activities involving fraud and abuse of innocent citizens, especially the elderly.

          Apple chose to set aside, for a specific group, NCMEC, its strong policy line upholding iPhone users’ personal privacy. While children are a most vulnerable group, they are no more vulnerable than other potential innocent victims at the mercy of perpetrators of domestic terrorist acts and crime syndicates. Consequently, if Apple fails to comply with future government requests for access to information on iPhone devices, the courts and Congress now have a rationale, and a legal basis, for requiring government access to users’ personal data on their iPhones.

          0
          August 14, 2021
  4. bas flik said:
    Processing on the iPhone is free of cost for Apple; processing in iCloud is not.

    4
    August 14, 2021
  5. Gregg Thurman said:
    Why are so many so easily convinced that Apple’s implementation, already in place on competing handsets, (A) hides an evil intent on Apple’s part, and (B) without any examination of the code (embedded in the OS), is readily exploitable to do other than Apple’s stated intent?

    There wasn’t this much hullabaloo when this kind of feature was added to Android, where, you know, because of Android’s weak design and Google’s evil intent, it will surely be exploited.

    2
    August 14, 2021
    • Jerry Doyle said:
      @Gregg Thurman: “… Why are so many so easily convinced that Apple’s implementation, already in place on competing handsets, (A) hides an evil intent on Apple’s part, and (B) without any examination of the code (embedded in the OS), is readily exploitable to do other than Apple’s stated intent?”

      The issue, as I understand it, brother Gregg, is not others’ belief that Apple has an evil intent to do other than what Apple says it is doing. Sometimes good intentions, no matter how right morally, have unintended consequences. It is others’ belief that Apple now has cracked open the door to how these features could be abused or misused in the future. Will Apple be able to withstand government (and public) pressure to turn these features toward surveillance of domestic terrorists, crime syndicates, copyright infringement, adult pornography, groups involved in political dissent, etc.? Apple now has set a precedent for an algorithm running specifically on the iPhone. We now have a slippery slope.

      A domestic terrorist event in which hundreds, or even thousands, are killed will make it difficult for Apple to flatly refuse governmental demands to use these features for surveillance. Individuals involved in political dissent that went awry may find Apple under extreme pressure to allow governmental surveillance of the organizations with which those dissenters associate.

      0
      August 14, 2021
  6. Gregg Thurman said:
    These aren’t rational, knowledgeable objections; they are purely emotional, caused by a lack of knowledge and by fear. If any of the hysteria were found to be true, the hit to Apple’s reputation and goodwill would be immeasurable. Do you honestly believe anyone connected to Apple management would allow that?

    2
    August 14, 2021
  7. Alessandro Luethi said:
    @Jerry: if I understand you well (tell me if not), you say that Apple is creating a sort of precedent, and that once governments see what’s possible they will want more of it.
    But other companies already scan for whatever data they want. They know the encryption keys and scan whatever they want, at their leisure, on their servers: to see if you possess CSAM, to present you with optimized advertising, and who knows what else. THIS is the precedent!
    What Apple intends to do, in my understanding: together with your photograph, they ask you to upload a safety voucher that indicates the probability that it is CSAM. The new software allows your device to calculate and upload this safety voucher for you. On their servers they count and evaluate the probabilities in your safety vouchers, and if a threshold is exceeded, they intervene by taking a closer look at your safety vouchers (not the photographs).
    This in no way is a back door! It is a service to you, to children and to society.
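    A minimal sketch of that threshold idea, under stated assumptions: the class and names below are hypothetical, and the real cryptography (threshold secret sharing, which keeps even the matching vouchers unreadable until the threshold is crossed) is omitted. In the interview above, Federighi put the initial threshold at something on the order of 30 matches.

```python
# A minimal sketch of the threshold step described above: the server tallies
# vouchers that matched known images and only escalates an account for human
# review of those vouchers once the count crosses a threshold. The real
# cryptography (threshold secret sharing, so matching vouchers stay unreadable
# below the threshold) is omitted; the names here are illustrative.

from collections import defaultdict


class VoucherEvaluator:
    def __init__(self, threshold: int = 30):  # Federighi: "on the order of 30"
        self.threshold = threshold
        self.match_counts = defaultdict(int)

    def record_voucher(self, account_id: str, matched: bool) -> bool:
        """Record one upload's voucher; return True if the account now warrants
        human review of the matching vouchers (not the whole photo library)."""
        if matched:
            self.match_counts[account_id] += 1
        return self.match_counts[account_id] >= self.threshold


if __name__ == "__main__":
    evaluator = VoucherEvaluator(threshold=3)  # tiny threshold for the demo
    flagged = False
    for matched in [False, True, True, True]:
        flagged = evaluator.record_voucher("account-123", matched)
    print(flagged)  # True once three matches have accumulated
```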

    2
    August 15, 2021
