So Apple has introduced a new system called “Private Cloud Compute” that allows your phone to offload complex (typically AI) tasks to specialized secure devices in the cloud. I’m still trying to work out what I think about this. So here’s a thread. 1/
Apple, unlike most other mobile providers, has traditionally done a lot of processing on-device. For example, all of the machine learning and OCR text recognition in Photos is done right on your device. 2/
The problem is that while modern phone “neural” hardware is improving, it’s not improving fast enough to take advantage of all the crazy features Silicon Valley wants from modern AI, including generative AI and its ilk. This fundamentally requires servers. 3/
But if you send your tasks out to servers in “the cloud” (god, using quotes makes me feel 80), this means sending incredibly private data off your phone and out over the Internet. That exposes you to spying, hacking, and the data-hungry business model of Silicon Valley. 4/
The solution Apple has come up with is to try to build secure and trustworthy hardware in their own data centers. Your phone can then “outsource” heavy tasks to this hardware. Seems easy, right? Well: here’s the blog post. https://security.apple.com/blog/private-cloud-compute/ 5/
Private Cloud Compute: A new frontier for AI privacy in the cloud - Apple Security Research

Secure and private AI processing in the cloud poses a formidable new challenge. To support advanced features of Apple Intelligence with larger foundation models, we created Private Cloud Compute (PCC), a groundbreaking cloud intelligence system designed specifically for private AI processing. Built with custom Apple silicon and a hardened operating system, Private Cloud Compute extends the industry-leading security and privacy of Apple devices into the cloud, making sure that personal user data sent to PCC isn’t accessible to anyone other than the user — not even to Apple. We believe Private Cloud Compute is the most advanced security architecture ever deployed for cloud AI compute at scale.

TL;DR: it is not easy. Building trustworthy computers is literally the hardest problem in computer security. Honestly it’s almost the only problem in computer security. But while it remains a challenging problem, we’ve made a lot of advances. Apple is using almost all of them. 6/
The first thing Apple is doing is using all the advances they’ve made in building secure phones and PCs in their new servers. This involves using Secure Boot and a Secure Enclave Processor (SEP) to hold keys. They’ve presumably turned on all the processor security features. 7/
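To make the boot-chain idea concrete, here is a minimal toy sketch of how a chain of trust works, with a hypothetical `ROOT_KEY` and HMAC standing in for the asymmetric signatures a real boot ROM checks (on Apple hardware the keys involved are protected by the SEP; nothing below is Apple's actual implementation):

```python
import hashlib
import hmac

# Hypothetical stand-in for the hardware root of trust; a real boot
# ROM uses a burned-in public key and asymmetric signatures.
ROOT_KEY = b"demo-boot-rom-key"

def sign_stage(image: bytes) -> bytes:
    # "Sign" a boot stage by MACing its hash (HMAC is an illustrative
    # stand-in for a real signature scheme).
    digest = hashlib.sha256(image).digest()
    return hmac.new(ROOT_KEY, digest, hashlib.sha256).digest()

def verify_chain(stages) -> bool:
    # stages: list of (image, signature) pairs, in boot order.
    # Boot halts at the first stage that fails verification.
    for image, sig in stages:
        if not hmac.compare_digest(sig, sign_stage(image)):
            return False
    return True
```

The point is simply that each stage is verified before it runs, so tampering with any link breaks the whole chain.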
Then they’re throwing all kinds of processes at the server hardware to make sure it isn’t tampered with. I can’t tell whether this actually prevents physical attacks, but it seems like a start. 8/
They also use a bunch of protections to ensure that software is legitimate. One is that the software is “stateless” and allegedly doesn’t keep information between user requests. To help ensure this, each server/node reboot re-keys and wipes all storage. 9/
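The “reboot re-keys and wipes storage” idea can be sketched in a few lines. This is a toy model, not Apple's design: a made-up `StatelessNode` class where all storage is encrypted under a boot-time key that is never persisted, so rebooting cryptographically erases everything (the SHA-256 keystream here is purely illustrative; a real node would use a proper cipher):

```python
import hashlib
import os

def keystream(key: bytes, n: int) -> bytes:
    # Illustrative pseudorandom keystream derived from the key; a real
    # implementation would use AES or another vetted cipher.
    out, counter = b"", 0
    while len(out) < n:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:n]

class StatelessNode:
    """Toy PCC-style node: storage is only ever written encrypted
    under an ephemeral boot-time key."""

    def __init__(self):
        self.reboot()

    def reboot(self):
        # Fresh random key each boot; the old key is discarded, so any
        # prior ciphertext becomes unrecoverable. Storage is also wiped.
        self._key = os.urandom(32)
        self._disk = {}

    def write(self, name: str, plaintext: bytes):
        ks = keystream(self._key, len(plaintext))
        self._disk[name] = bytes(a ^ b for a, b in zip(plaintext, ks))

    def read(self, name: str) -> bytes:
        ct = self._disk[name]
        ks = keystream(self._key, len(ct))
        return bytes(a ^ b for a, b in zip(ct, ks))
```

The design choice worth noticing: even if wiping the disk failed, losing the key alone would render the old data garbage.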
A second protection is that the operating system can “attest” to the software image it’s running. Specifically, it signs a hash of the software and shares this with every phone/client. If you trust this infrastructure, you’ll know it’s running a specific piece of software. 10/
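The attestation flow above can be sketched roughly like this. Everything here is a stand-in: HMAC with a shared `ATTESTATION_KEY` replaces the asymmetric signatures rooted in Apple's attestation PKI, and the function names are made up:

```python
import hashlib
import hmac

# Hypothetical verification key; real PCC attestation uses asymmetric
# keys rooted in the Secure Enclave, not a shared secret.
ATTESTATION_KEY = b"demo-attestation-key"

def measure(software_image: bytes) -> bytes:
    # The "measurement" is a hash of the software the node is running.
    return hashlib.sha256(software_image).digest()

def attest(software_image: bytes):
    # Node side: hash the running image and "sign" the measurement.
    m = measure(software_image)
    sig = hmac.new(ATTESTATION_KEY, m, hashlib.sha256).digest()
    return m, sig

def client_verify(measurement: bytes, sig: bytes, trusted: set) -> bool:
    # Phone side: check the signature over the measurement, then check
    # the measurement is one the client trusts.
    expected = hmac.new(ATTESTATION_KEY, measurement, hashlib.sha256).digest()
    return hmac.compare_digest(sig, expected) and measurement in trusted
```

The key property: the phone never has to trust the node's word about what it is running, only the signature and the list of known-good measurements.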

Of course, knowing that the server is running a specific piece of software doesn’t help you if you don’t trust the software. So Apple plans to put each binary image into a “transparency log” and publish the software.

But here’s a sticky point: not with the full source code. 11/

Security researchers will get *some code* and a VM they can use to run the software. They’ll then have to reverse-engineer the binaries to see if they’re doing unexpected things. It’s a little suboptimal. 12/
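A transparency log is, at its core, an append-only list of image hashes that anyone can audit. Here's a toy version (a made-up class; real logs are Certificate-Transparency-style Merkle trees so clients can verify inclusion without downloading the whole log):

```python
import hashlib

class TransparencyLog:
    """Toy append-only log of published software measurements."""

    def __init__(self):
        self._entries = []

    def publish(self, image_hash: bytes):
        # Entries are only ever appended, never edited or removed.
        self._entries.append(image_hash)

    def root(self) -> bytes:
        # Hash chain over all entries; any tampering with history
        # changes the root that auditors have already seen.
        h = b"\x00" * 32
        for e in self._entries:
            h = hashlib.sha256(h + e).digest()
        return h

    def includes(self, image_hash: bytes) -> bool:
        return image_hash in self._entries
```

A phone would accept an attested measurement only if it appears in a log like this, so Apple can't quietly ship a special image to one user without leaving public evidence.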
When your phone wants to outsource a task, it will contact Apple and obtain a list of servers/nodes and their keys. It will then encrypt its request to all servers, and one will process it. A load balancer makes sure no single server processes all your requests, and they’re even using fancy anonymous credentials and a third-party relay to hide your IP. 13/
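Roughly, the request flow looks like the sketch below. Caveats up front: the node keys here are toy symmetric stand-ins for the nodes' attested public keys (the real system seals requests with public-key encryption), and `seal`/`open_box` are invented names:

```python
import hashlib
import os

def _stream(key: bytes, nonce: bytes, n: int) -> bytes:
    # Illustrative keystream; a real client would use a vetted AEAD.
    out, i = b"", 0
    while len(out) < n:
        out += hashlib.sha256(key + nonce + i.to_bytes(4, "big")).digest()
        i += 1
    return out[:n]

def seal(key: bytes, msg: bytes) -> bytes:
    # Toy stand-in for sealing a request to an attested node key.
    nonce = os.urandom(16)
    ct = bytes(a ^ b for a, b in zip(msg, _stream(key, nonce, len(msg))))
    return nonce + ct

def open_box(key: bytes, box: bytes) -> bytes:
    nonce, ct = box[:16], box[16:]
    return bytes(a ^ b for a, b in zip(ct, _stream(key, nonce, len(ct))))

# Phone side: fetch the node list, then seal the request to each node.
node_keys = {"node-a": os.urandom(32), "node-b": os.urandom(32)}
request = b"summarize my photos"
sealed = {nid: seal(k, request) for nid, k in node_keys.items()}

# Load balancer picks one node; the relay in between sees only
# ciphertext (and, being a third party, not even your IP-to-Apple link).
chosen = "node-b"
recovered = open_box(node_keys[chosen], sealed[chosen])
```

The intermediaries route opaque blobs; only the chosen node can open its copy of the request.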
Ok there are probably half a dozen more technical details in the blog post. It’s a very thoughtful design. Indeed, if you gave an excellent team a huge pile of money and told them to build the best “private” cloud in the world, it would probably look like this. 14/
But now the tough questions. Is it a good idea? And is it as secure as what Apple does today? And most importantly: can users opt out of this thing? 15/
Quick intermission: I also posted this thread on Twitter, and then this happened right about this part. It was… uncomfortable.

Back to the substance.

I admit that as I learned about this feature, it made me kind of sad. The thought going through my head was: this is going to be too much of a temptation. Once you can “safely” outsource tasks to the cloud, why bother doing them locally? Outsource everything! 16/

As best I can tell, Apple does not have explicit plans to announce when your data is going off-device to Private Cloud Compute. You won’t opt into this, and you won’t necessarily even be told it’s happening. It will just happen. Magically.

I don’t love that part. 17/

Finally, there are so many invisible sharp edges that could exist in a system like this. Hardware flaws. Issues with the cryptographic attestation framework. Clever software exploits. Many of these will be hard for security researchers to detect. That worries me too. 18/

Wrapping up on a more positive note: it’s worth keeping in mind that sometimes the perfect is the enemy of the really good.

In practice, the alternative to on-device processing is: ship private data to OpenAI or someplace sketchier, where who knows what might happen to it. 19/

And of course, keep in mind that super-spies aren’t your biggest adversary. For many people, your biggest adversary is the company that sold you your device/software. This PCC system represents a real commitment by Apple not to “peek” at your data. That’s a big deal. 20/
In any case, this is the world we’re moving to. Your phone might seem to be in your pocket, but a part of it lives 2,000 miles away in a data center. As security folks we probably need to get used to that fact, and do the best we can to make sure all parts are secure. //fin
@matthew_d_green thanks for your thoughtful insights as always

@matthew_d_green I wish instead we could focus on getting people to reject it and disable these systems.

(I'd say "I wish we could focus on making it illegal", but probably no government would ever want to make this illegal, since people sending their data to the cloud without even knowing which data they've sent provides such a rich vein of things to subpoena and/or covertly hack.)

@mcc @matthew_d_green Yeah, I am, at least for the foreseeable future, like “this computer in front of me is plenty powerful. How about we see what else it can do before going all ‘cloud compute’? I'm ok with having some limits”
@matthew_d_green Every bad actor in the world just got very excited by this. It makes me hate my awful iPhone even more. I didn’t think that was possible, but hey, if we can’t dystopia, why are we even here? Anyhow, I like the writeup, I hate it, but I do think you covered it very well. This dumb Mary Shelley book keeps on flying off the shelf all by itself, silly book! ;)

@matthew_d_green I noticed that Apple's blog said nothing whatsoever about legal issues. I'd really like to see an analysis of what Apple could or could not be compelled to do in various jurisdictions, and how much that's affected by what they attest to be impossible for technical reasons.

Independent of the security issues, I feel that this is another step in the direction of devices that are useless without a network connection and I'm not a fan.

@matthew_d_green Thank you for the thread. Do you think having this cloud magic work for free is sustainable? Feels like on-device should still be a priority and/or that AI stuff will get tied to a subscription model.

@matthew_d_green
I think they have this in their privacy policy right now for icloud:

>> Security and Fraud Prevention. To protect individuals, employees, and Apple [...] prescreening or scanning uploaded content for potentially illegal content, including child sexual exploitation material.

Wouldn't they apply the same thing to these servers at some point in the future? If they can't peek, then they won't be able to protect individuals, employees, and Apple.

@matthew_d_green Fascinating, and well written. Thanks.
@matthew_d_green great write-up! It might not be the best solution, but I think it’s a big deal!
@matthew_d_green And of course the nation state that could defeat all the protections is the one you're trying to cross the border into. The law generally lets border agents confiscate your devices and ask you to unlock them (or you can be banned from entry, possibly permanently). Next step: "Sir, ask your device to detail everything you plan to do on your visit and who you will be meeting." Or, coming back: "Ask your device to detail everything you've purchased on your trip." And voilà. So don't carry devices that know everything about you across the border! (And can fetch it all quickly with a simple voice prompt!)
@matthew_d_green It doesn't have to be perfect, it only has to be more costly to compromise this new system than to compromise the phone itself (for instance, through an exploit on the update servers). IMO, the biggest risk would be when compromising this "cloud" would compromise many phones at once, since that would divide the cost by the number of phones affected.
@cesarb That’s the fundamental risk of centralizing compute. Apple is doing a lot to try to make this less likely (see statelessness and diffusion) so they know the risks. But the risk is there.
@matthew_d_green Or, will something simply not work because there is no connectivity? Will it quietly work degraded? Or will it work degraded but warn you it's having trouble connecting to a PCC?
@matthew_d_green That sounds like a *fantastic* feature request. Ideally if you can convince a senator to ask for it, maybe.
@matthew_d_green Hi Matthew, looks like Apple feels confident about the security of using their Private Cloud Compute server for “Generative AI” tasks? However, if you want to do something which involves ChatGPT, Apple will ask for your permission first. https://www.youtube.com/watch?v=pMX2cQdPubk
Talking Tech and AI with Tim Cook!
