“AI sceptics” who work on policy and education continue to overestimate the utility of LLMs—portraying them as a potential revolution even as they warn against overhype—simply because they can’t see that generating seemingly coherent text has very little economic value
LLMs are not automation. That would have economic value. The generated text artefacts look coherent and we, as a society, don’t even value the coherent kind. We underpay most forms of writing. Even code has no inherent value (see open source) outside of the associated integration and expertise
Between diffusion models and LLMs, tech has inflated a trillion dollar bubble around automating activities that have immense social and cultural capital—writing, art, photography—but little actual capital. It’s harmful to education, culture, and society with minimal overall benefit

More and more, generative models are looking like productivity tobacco. Promoted by biased research, it’s addictive, harmful, and the little benefit it has (nicotine is a somewhat effective ADHD drug, for example) cannot outweigh the fact that it’s hurting us all, directly and indirectly.

This shit is already turning out to be one of the most harmful tech innovations of the 21st century. It needs to be regulated at least as much as tobacco, if not banned outright from most economic spheres

@baldur good analogy, I think of it like asbestos, myself.
Cheap, easy to use, the CEO loves it, but it's deleterious for the workers in long-term, subtle ways.

@noboilerplate @baldur I don't think there's a lot of subtlety in the harm LLMs do. The way they lie is obvious, several people have already been driven to suicide by them, and all the infrastructure they need is extremely obviously harmful.

At least asbestos was good at what it did - prevent fires - better than any method we had before we had asbestos. There are plenty of better ways to get information or text than LLMs

@noboilerplate @baldur (LLMs are also absolutely not cheap, what with OpenAI losing money with every single ChatGPT prompt, even those by paying users)
@ludovica @baldur no arguments there, but from the CEO's POV, it is cheap for them. For now.

@noboilerplate Oh, absolutely! I used to work for an SEO/content marketing company (I got thrown out for not using LLMs enough to write marketing-related articles), and I'm still in contact with some of my former coworkers... All of their clients now want cheap, fast texts, no matter if they're absolute shit or not 😬

(The day after I was thrown out, they literally had a meeting with the content team being like 'We have gone too far in the direction of 'quality', we need to focus more on 'quantity''... I'm really glad I got thrown off that ship when I did. I even got 2 months of paid leave, because apparently I was enough of a disturbance that they'd rather pay for 2 months of no work than let me work until the end of my contract 😂 )

@noboilerplate oh god, one thing where your asbestos analogy is spot on, though: Can you imagine how difficult it will be to get all that shit OUT of everything again when we finally realize how useless and harmful it actually is??
@ludovica exactly 🫠
@noboilerplate I'm so glad I'm not a programmer 😅
@noboilerplate I mean, I work with databases, but library databases tend to be 10 years behind the trend, so we might be able to escape this one calamity 😬