
Secure and private AI processing in the cloud poses a formidable new challenge. To support advanced features of Apple Intelligence with larger foundation models, we created Private Cloud Compute (PCC), a groundbreaking cloud intelligence system designed specifically for private AI processing. Built with custom Apple silicon and a hardened operating system, Private Cloud Compute extends the industry-leading security and privacy of Apple devices into the cloud, making sure that personal user data sent to PCC isn’t accessible to anyone other than the user — not even to Apple. We believe Private Cloud Compute is the most advanced security architecture ever deployed for cloud AI compute at scale.
Of course, knowing that the phone is running a specific piece of software doesn’t help you if you don’t trust the software. So Apple plans to put each binary image into a “transparency log” and publish the software.
But here’s a sticking point: they plan to publish the binaries, not the full source code. 11/
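To make the transparency-log idea concrete, here is a toy sketch (not Apple’s actual protocol — all names and data here are made up): a client refuses to send data unless the software measurement the server attests to matches an entry in a published log of release hashes.

```python
import hashlib

def measurement(binary_image: bytes) -> str:
    """Hash of a released binary image, as it would appear in the log."""
    return hashlib.sha256(binary_image).hexdigest()

# Hypothetical published transparency log: hashes of every logged release.
transparency_log = {
    measurement(b"pcc-release-1.0"),
    measurement(b"pcc-release-1.1"),
}

def verify_attestation(attested_hash: str) -> bool:
    """Only talk to servers whose attested software appears in the log."""
    return attested_hash in transparency_log

# A server running a logged release passes; an unlogged build is rejected.
assert verify_attestation(measurement(b"pcc-release-1.1"))
assert not verify_attestation(measurement(b"pcc-backdoored-build"))
```

The real system would tie the attested hash to hardware-backed attestation rather than trusting the server’s claim; the point of publishing the binaries is that researchers can audit what a given hash corresponds to, even without source code.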
Back to the substance.
I admit that as I learned about this feature, it made me kind of sad. The thought that was going through my head was: this is going to be too much of a temptation. Once you can “safely” outsource tasks to the cloud, why bother doing them locally? Outsource everything! 16/
As best I can tell, Apple does not have explicit plans to announce when your data is going off-device to Private Cloud Compute. You won’t opt into this, you won’t necessarily even be told it’s happening. It will just happen. Magically.
I don’t love that part. 17/
Wrapping up on a more positive note: it’s worth keeping in mind that sometimes the perfect is the enemy of the really good.
In practice the alternative to on-device is: ship private data to OpenAI or someplace sketchier, where who knows what might happen to it. 19/
@matthew_d_green I completely understand this is not the point but it adds extra irony given how much data Teslas suck up about their users. They don’t seem to allow users to opt out of their data collection…
Excellent thread though. You bring up a lot of good points
@matthew_d_green I read something, somewhat recently, arguing that having the source code was not that important to security researchers. I don’t remember who wrote it, nor the details, but it was well argued and very interesting, especially given how unintuitive the claim was.
Have you read it? Knowing my luck, you probably wrote it and it’s the reason I’m following you. Anyway, I’m not trying to argue, your comment just reminded me of it. 🙂
Thanks for the great thread:)
Just wondering what this will look like in the context of the EU's new ambitions and its blatant surveillance extremism. They want ‘access by design’.
In a classified report dated May 22, the EU expert group, which is dominated by security authorities, lists 42 recommendations:
https://cdn.netzpolitik.org/wp-upload/2024/06/2024-05-22-Recommendation-HLG-Going-Dark-c.pdf
@matthew_d_green Suppose, only for the sake of argument, that these technical measures succeed, and Apple’s system is as secure as we want it to be. Here’s my concern.
Years down the line, will their management still be as strongly committed to those goals? It sounds like this comes at considerable cost and effort. Will they *never* give in to the temptation to cut corners?
While reading the thread, I thought of Boeing. Once a model of engineering and safety; look what happened.
I think a more interesting question to ask about Boeing is, "what can we do differently so that failures like this are less likely?"
It's all very well and good to point out potential future problems. The specific problem of what happens if Apple's corporate culture changes is one that sometimes keeps me up at night too—I don't really understand why it's as good as it is, and I want it to stay that way or even improve.
What can we do to facilitate that?
@abhayakara @matthew_d_green In Boeing’s case, it’s usually attributed to management with a certain worldview that took over after the MD merger. Which makes it feel like there’s always a risk. Who can say what will be in someone’s head, at some unknown point in the future?
Change the incentives, so that it’s sure to stick? So that sacrificing privacy is never more profitable than “shareholder value”?
Meaning: legislative and regulatory mandates for computer security.
@abhayakara @slimhazard @matthew_d_green “I think a more interesting question to ask about Boeing is, ‘what can we do differently so that failures like this are less likely?’”
Make much more thorough use of anti-trust law. U.S. companies can commit all kinds of atrocities and walk away with a slap on the wrist equivalent to a few hours of doing business, because they spend billions to put their friends in government regulatory positions. Boeing was no different. They had friendly people in the FAA who looked the other way.
John Oliver did an excellent episode on this: https://www.youtube.com/watch?v=Q8oCilY4szc&t=1
@Avitus @slimhazard @matthew_d_green Right. I don’t mean theoretically what should have been done, though. I mean what can _we_ do, individually or collectively, to either support a better regulatory regime or do other things to reward good corporate behavior.
This is not a hypothetical question. I think many people have the model of governance as something someone else does, but that’s how we got here.
@slimhazard @matthew_d_green Related parallel: the proposals to identify child sexual abuse materials shared using messaging systems by creating hashes to "fingerprint" them.
The same technology potentially opens the door to fingerprinting ANYTHING an authoritarian govt doesn't want disseminated, chilling free speech or even criminalising it, including political manifestos, evidence of human rights abuses, environmental info etc.
@matthew_d_green thanks for posting this here in the fedi.
I wonder how this will play out when China demands access to those servers; Apple has caved to the CCP before...
@matthew_d_green
Nice thread thanks!
I wonder if they used seL4 (Apple has been a member of the seL4 Foundation for some months now) in their PCC tech stack