For the past ~2 years, I’ve been working on a neuroscience startup (called Matter). We have an app that—by its nature—needs to operate on private user data. Some of that data is so private that we never actually want to handle it ourselves. So we don’t.

We can’t leak data we don’t have. We can’t mishandle private info that never leaves the control of the user.

We’re taking a mostly-unique approach to user data at Matter, and I’ve finally written a little bit about it:

https://seancoates.com/blogs/matter-and-privacy

Matter and Privacy

When I was still working at Faculty, we took on a new client that was not yet named Matter. We eventually transitioned from an agency-client relationship to a startup relationship, where I became the…

Sean Coates
Oh, hello top of Hacker News. It’s been a while… let me see… oh wow, still super toxic. Cool cool. Carry on.

Lots over there getting hung up on worst case scenarios. “What if someone buys you and they’re evil and they change the code and I still update?!”

Our goal here is to be better, not to try to be naïvely bulletproof-perfect. If your threat model really is “it doesn’t matter what their intent is, if my data gets out, I’m finished”, you should probably not be using a computer at all. Or at least not running software that anyone else wrote… including the software that’s in your CPU. Good luck.

@sean I dunno man, people give their data to FB, Twitter, Instagram, LinkedIn etc. They watch scrapers and language model builders suck up the rest.

Someone else comes along, not sucking up their data, trying to be good… and *that* is what they choose to be annoyed with.

@sean Not that I would expect that crowd to get it, but we make declarations about the data we collect in order to get into the App Store. Apple reviews our apps for inclusion. I don’t know how seriously they evaluate our privacy practices, but each release *is* audited by a third party prior to distribution.