I would like to send every person who uses user data to train an LLM to jail. Ethics jail, where they all must learn basic ethics of user consent and data privacy.

Latest exhibit: Slack is training a *global* model on conversations in your space. To opt out you need to email them. This is unconscionable and is extremely likely to leak private information.

@jamuraa yeah it's deeply, DEEPLY upsetting
@ireneista @jamuraa I haven't seen this yet. Any links to info?
Privacy Principles: Search, Learning and Artificial Intelligence | Legal — Slack's Terms and Policies, including privacy, Terms of Service, API terms, security, and more. (Slack)

@jamuraa @ireneista

Thank you.

This is getting so ridiculous. I guess I'm sending them an email.

What a hostile user experience.

@jamuraa @ireneista
Or, maybe it's time I finally set up a Mattermost server.

@jamuraa @ireneista

Actually, I probably need to do some searching and reading. We use Slack as a pretty basic channel-based group chat. Mattermost is probably way more feature-rich than we need. I'll need to look and see what simple self-hosted things I could set up instead.

@finner @jamuraa let us know if you find something you like! we've tried all the big tools and aren't that fond of any of them... we were actually just thinking about whether we could just race ahead and implement not-yet-standard IRCv3 features ourselves or something

@ireneista @jamuraa

I've really not used much of anything other than Slack, as far as tools similar to Slack go, anyway. Unless you count things like Skype and Teams (yuck and fucking ugh, respectively).

You've tried Mattermost and didn't like it? I've always thought it was pretty much just an open-source clone of Slack. What didn't you like about it?

@finner @jamuraa oh Mattermost is a very solid Slack clone, if you're happy with Slack you'll love Mattermost

our complaints about it are the same ones we have about Slack. mostly, it's very convenient if you're in ONE instance, but completely impractical to be in dozens of instances.

@jamuraa thanks for the link. this was about as professional as I could manage to be. hopefully it at least gives some poor support person a giggle.
@jamuraa Are we at the stage where it’s worth it to set up a bot army and pull text from LLMs to mass-poison the LLMs scraping data?
@WhiteCatTamer using Reddit & Slack as training material *is* LLM-mass-poisoning itself
@jamuraa

The Enshittification continues apace.

@jamuraa This is going to be fun for everyone who wants to search for private passwords.

Autocompleting passwords from Slack conversations is going to be big …

@jamuraa

I would love to send every person who profits from any model based on scraped copyright data to jail as well, but I think neither of our realities will come to fruition.

@jamuraa The sudden rise of "descriptive image captions for users who are visually impaired" (Mastodon is guilty too) is virtue signaling and a dog whistle for "let us use your images and captions to train AI models and in reCAPTCHA".

They rely on your honesty, though; I'm not sure you can poison a dataset by using wrong captions on purpose.

If it were ever about the visually impaired, they could have an AI create the captions anyway.

@jamuraa I love that that page is called "Privacy Principles". Out of curiosity I looked at their Acceptable Use Policy, and they clearly feel comfortable violating it themselves, in a nice bit of obvious hypocrisy.
@jamuraa why on earth would companies using a business communications service want to keep their data confidential 😐​
@jamuraa "Thanks for the heads up, we will look into this from a GRC perspective and discuss with legal." lol
@jamuraa Is this the time to chat more about how to brew illegal drugs and build bombs in private Slack spaces? 😉