
Secure and private AI processing in the cloud poses a formidable new challenge. To support advanced features of Apple Intelligence with larger foundation models, we created Private Cloud Compute (PCC), a groundbreaking cloud intelligence system designed specifically for private AI processing. Built with custom Apple silicon and a hardened operating system, Private Cloud Compute extends the industry-leading security and privacy of Apple devices into the cloud, making sure that personal user data sent to PCC isn’t accessible to anyone other than the user — not even to Apple. We believe Private Cloud Compute is the most advanced security architecture ever deployed for cloud AI compute at scale.
Of course, knowing that the phone is running a specific piece of software doesn’t help you if you don’t trust the software. So Apple plans to put each binary image into a “transparency log” and publish the software.
But here’s a sticking point: not with the full source code. 11/
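The transparency-log idea above boils down to: a client can hash the binary image a server attests to running, and check that hash against a published log of released builds. Here is a minimal sketch of that check; the function names and log format are my own illustration, not Apple's actual PCC interfaces, and a real client would verify a Merkle inclusion proof rather than scan a plain list.

```python
import hashlib

def binary_measurement(image_bytes: bytes) -> str:
    """Hash of a binary image, as it might appear in a transparency log."""
    return hashlib.sha256(image_bytes).hexdigest()

def is_in_transparency_log(measurement: str, log_entries: list[str]) -> bool:
    """Linear membership check for illustration only; production systems
    use Merkle-tree inclusion proofs so clients need not fetch the whole log."""
    return measurement in log_entries

# Hypothetical published log of released server builds.
log = [
    binary_measurement(b"released-pcc-build-1"),
    binary_measurement(b"released-pcc-build-2"),
]

# A client accepts a server only if its attested measurement is in the log.
print(is_in_transparency_log(binary_measurement(b"released-pcc-build-1"), log))
print(is_in_transparency_log(binary_measurement(b"unpublished-build"), log))
```

Note this only proves the server runs a *published* binary, which is exactly why the lack of full source code matters: you can confirm which build is running without being able to audit what it does.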
Back to the substance.
I admit that as I learned about this feature, it made me kind of sad. The thought that was going through my head was: this is going to be too much of a temptation. Once you can “safely” outsource tasks to the cloud, why bother doing them locally? Outsource everything! 16/
As best I can tell, Apple does not have explicit plans to announce when your data is going off-device to Private Cloud Compute. You won’t opt into this, you won’t necessarily even be told it’s happening. It will just happen. Magically.
I don’t love that part. 17/
Wrapping up on a more positive note: it’s worth keeping in mind that sometimes the perfect is the enemy of the really good.
In practice the alternative to on-device is: ship private data to OpenAI or someplace sketchier, where who knows what might happen to it. 19/
@matthew_d_green I wish instead we could focus on getting people to reject it and disable these systems.
(I'd say "I wish we could focus on making it illegal", but probably no government would ever want to make this illegal, since people sending their data to the cloud without even knowing which data they've sent to the cloud provides such a rich vein of things to subpoena and/or covertly hack.)
@matthew_d_green I noticed that Apple's blog said nothing whatsoever about legal issues. I'd really like to see an analysis of what Apple could or could not be compelled to do in various jurisdictions, and how much that's affected by what they attest to be impossible for technical reasons.
Independent of the security issues, I feel that this is another step in the direction of devices that are useless without a network connection and I'm not a fan.
@matthew_d_green
I think they have this in their privacy policy right now for iCloud:
>> Security and Fraud Prevention. To protect individuals, employees, and Apple [...] prescreening or scanning uploaded content for potentially illegal content, including child sexual exploitation material.
Wouldn't they apply the same thing to these servers at some point in the future? If they can't peek, then they won't be able to protect individuals, employees, and Apple.
It might not be the best solution, but I think it’s a big deal!