Mad cow disease, also known as Bovine spongiform encephalopathy, is when cows ingest contaminated meal, which can be sometimes made up of other cows. It was considered a disaster for the cattle industry in the 1990s.

Anyway, I bring it up because this is basically how LLM slop is starting to work now. All the nutritional grass has been eaten and now what’s left is the contaminated meal made from other LLMs. The LLMs eat the meal, and soon we will have, like Mad LLM disease or something

@[email protected] see also, pre-1945 steel
@PearlescentFerret @yassie_j I read somewhere that radioactive decay means that problem's reducing rapidly nowadays. (Sort of a shame, as it was a great metaphor / analogy for all sorts of things -- though an end to desecration of war graves at sea would be great, thanks, China.)
@yassie_j the LLMs mostly output things in 1 style that can be detected and filtered though, so it's not that big a deal in practice sadly...
@yassie_j and even still, the LLM slop on the internet is a small percentage of content but is simply recommended greater due to SEO algorithms, so even without filtering it's not an issue... also newer LLMs focus on architecture improvements so they don't need to collect more data as much (though they're hitting limits of the architectures that have been invented, and eventually the AI market will crash and only the practical uses and FOSS stuff will remain)
@yassie_j a lot of LLM models tend to be intentionally trained on ChatGPT conversations and it causes them fuck up and say something is against OpenAI terms of service even though they’re not from OpenAI.
it would be extremely funny if it didn’t cause the internet to be in a near unusable state lol
@yassie_j Does this mean we get to pile them up in fields and set fire to them?
@yassie_j If LLMs are being built by scraping data from my sites without permission them I'll do my best to poison the well for them. Bots always find links no human user would. I can either use those interactions to attempt to block them or just feed them on their own LLM excrement.
@yassie_j So, pics of [red] tories feeding LLMs to young kids when?...
@yassie_j DSE (digital spongiform encephalopathy)
@yassie_j I think it's called "model collapse" but I like Mad LLM disease infinitely better
@yassie_j @swacknificent I've seen this referred to as Hapsburg AI as well
@yassie_j
Anyone familiar with the "generation loss" that happened when you took something that was photocopied and then photocopied *that* copy, and again and again, until the text (and god forbid, any images) would start to distort—well, that's what's going to happen with the proliferation of AI, as LLMs start ingesting other LLMs output as input.
@ColesStreetPothole @yassie_j A good analogy. Another one is the "whispers" game where kid #1 whispers a word to kid #2, who whispers what they heard to kid #3 and so on. By the time it reaches the last kid, the word has changed into gobbledygook and everyone has a good laugh.
@[email protected]

Interestingly enough Mad Cow disease was caused by cutting corners to maximise profits.
@yassie_j I just wish this phenomenon could be particularly problematic in the Netherlands just so we could get Dutch LLM Disease.
@yassie_j how about 'Dutch LLM Disease'? I know it's not in keeping with mechanism but it has a nice ring to it. Perhaps the first person to come up with the idea was from The Netherlands?!
@yassie_j I've heard it described more like inbreeding, with the result being something like a digital Hapsburg jaw.
@yassie_j Don't worry. We just need a little more GPU power and the models will be able to detect false information.
@yassie_j Not only that, but we’re not even started with people gaming the systems for their own benefit. I’ve already seen people creating web pages that show the human one thing and hide a bunch of text that the LLM scrapers will scrape. Deploy enough of that around (hello Putin’s bot farms) and you’ll start skewing LLMs to your point of view, and there is nothing we can do about that save manually verifying everything.
@yassie_j Yup. All hallucinations, all the way down. And it wasn't even that great to begin with.