The OpenAI ChatGPT app on macOS is not sandboxed and stores all conversations in **plain text** in an unprotected location:

~/Library/Application\ Support/com.openai.chat/conversations-{uuid}/

So basically any other running app, process, or malware can read all your ChatGPT conversations without any permission prompt.

macOS has gated access to private user data since macOS Mojave 10.14 (6 years ago!). Any app accessing private user data (Calendar, Contacts, Mail, Photos, any third-party app's sandbox, etc.) now requires explicit user approval.

OpenAI chose to opt out of the sandbox and store the conversations in plain text in an unprotected location, disabling all of these built-in defenses.
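To make "without any permission prompt" concrete, here is a minimal sketch of what any same-user process could do with ordinary file APIs (the helper name is hypothetical; the path pattern is the one above):

```python
from pathlib import Path

def enumerate_conversations(app_support: Path) -> list[Path]:
    """Return every file under the unprotected ChatGPT data directory.

    No entitlement, TCC prompt, or elevated privilege is involved:
    this is plain filesystem access available to any process running
    as the same user.
    """
    matches: list[Path] = []
    for convo_dir in app_support.glob("com.openai.chat/conversations-*"):
        matches.extend(p for p in convo_dir.rglob("*") if p.is_file())
    return sorted(matches)

# On a real Mac the base would be:
# enumerate_conversations(Path.home() / "Library/Application Support")
```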

@pvieito This is really a failure on Apple’s part, not OpenAI’s. The behavior you’re describing is a side effect of Apple being lazy and self-serving in their implementation. They should provide a proper API to extend a data vault to third-party unsandboxed apps.
@saagar They will add support for that in macOS Sequoia when using Group Containers: https://developer.apple.com/wwdc24/10123
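Per the linked WWDC24 session, that Sequoia mechanism hangs off the App Group entitlement: data in a matching group container gets OS-mediated access protection even for unsandboxed apps. A sketch of the entitlements fragment involved (the group identifier is illustrative, not a real app's):

```xml
<!-- Entitlements fragment; the group identifier is hypothetical. -->
<key>com.apple.security.application-groups</key>
<array>
    <string>group.com.example.chatapp</string>
</array>
```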
@pvieito Yes, but you do see how presenting this as something OpenAI has been doing “wrong”, when the APIs needed to do this haven’t shipped on macOS yet (flipping your claim on its head: they haven’t shipped even 6 years later!), is the wrong way to look at the problem?

@saagar No. OpenAI could have sandboxed its app, and they chose not to. They could also have encrypted the chats, and they did not.
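Encrypting chats at rest is a modest lift. A minimal sketch using AES-256-GCM via the `cryptography` package (an assumption for illustration, not OpenAI's actual implementation; in a real app the key would live in the macOS Keychain, not next to the data):

```python
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def encrypt_chat(key: bytes, plaintext: bytes) -> bytes:
    """Seal a chat blob with AES-256-GCM; the 12-byte nonce is prepended."""
    nonce = os.urandom(12)
    return nonce + AESGCM(key).encrypt(nonce, plaintext, None)

def decrypt_chat(key: bytes, blob: bytes) -> bytes:
    """Split off the nonce and authenticate + decrypt the blob."""
    nonce, ciphertext = blob[:12], blob[12:]
    return AESGCM(key).decrypt(nonce, ciphertext, None)

# A real app would fetch `key` from the Keychain (Security framework),
# so malware scraping the data directory gets only ciphertext.
```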

And it is clear they were not doing things right, because they fixed this in the latest version: https://www.theverge.com/2024/7/3/24191636/openai-chatgpt-mac-app-conversations-plain-text

@pvieito They did that because the Verge reporters were in your mentions trying to catch a scoop on the next Recall AI disaster and they wanted to get out ahead of that. The solution they went with was deeply unsatisfying and terrible for the platform as a whole
@pvieito I’m not even going to discuss whether OpenAI “could” sandbox their app; I don’t have enough context for that. But it doesn’t seem unreasonable that they use APIs that are not available from the app sandbox to e.g. know what’s going on when you invoke it
@pvieito Whatever the reasons, the result is that any non-sandboxed app until now had no protections against other apps peeking at their data. Apple made a solution for *their* apps and then later added sandboxed apps in, but ChatGPT (or Chrome, or Photoshop, or …) can’t benefit
@pvieito And the new “solution” means that every app in this situation has to hand-roll its own file encryption code. Plus, users cannot actually access these files directly anymore because there is no way to grant access to a bespoke encrypted file
@pvieito To be clear, I have no issue with users understanding the limitations of the current situation, and adjusting their threat model in response. If you want to bring attention to that, more power to you. But you specifically assigned blame and formed a narrative on top
@pvieito And I think the end result of that was the company was forced into a non-optimal security outcome because of concerns everyone would see “plaintext” and freak out. The real issue is most apps people use are in the same boat so the solution needs to come from Apple

@saagar While I understand your point of view, chat apps are special and have to treat user data with extra care.

WhatsApp also had to roll out encryption (7 years ago!: https://www.wired.com/story/whatsapp-encryption-end-to-end-turned-on/) because macOS malware could access the plain-text backups stored on iCloud.

@pvieito WhatsApp is actually a very interesting use case! I can double-check, but I don’t think they actually encrypt data at rest; this is solely for backups to cloud services (which lets you upgrade an insecure service into something that is strongly encrypted)
@pvieito And you can turn this off if you want. I think this demonstrates a decent amount of thought put into balancing security with practicality and accessibility. I don’t feel that OpenAI was given the chance to put in the same amount of thought here
@pvieito Despite all I said, if OpenAI decided that ChatGPT is targeted by Mac malware and they should do something special about it, I would be ok with that. But I think doing that work takes time. And the actual work they need to do is more extensive than what they’ve done here
@pvieito Maybe they create a sandboxed service that handles chats only, for example, if they want to take advantage of what the OS has right now. Maybe they choose to do protection only on Sequoia and later, or fall back to manual encryption on older OSes.
@pvieito And there’s more extensive work that needs to be done, too: this protects chats, but what’s not protected? Maybe there’s an API key somewhere on disk that gives access to this anyway?
@pvieito Really, what I would have preferred is they go “ok here is the threat model we have, this is the functionality we would like to address this, this is what we have rolled out given the limitations we have”.
@pvieito And it would potentially include an analysis of what they are choosing *not* to do. Otherwise you turn into a banking app that doesn’t allow you to take screenshots because “what if someone was recording your screen and saw your bank balance”