@glyph I think "purity culture" is the best label for this type of moralism around LLMs, because the moralising comes from people who can afford not to use the tech, vs. Reverse Centaurs who have the tech forced onto them and have to grapple with its effects.
Someone getting on their high horse about not using LLMs and "pushing back" on the narrative or "keeping people in check" has no effect on the material realities faced by most gig workers, who act as de facto slaves, picker-uppers and putter-downers for algorithms.
It's like saying "yeah bro, I'm also against AI, like totally bro, totally morally against its use and deployment" to an Amazon worker who got told by an LLM they could increase their output by getting a colostomy bag.
Doctorow is right: the actual moral way forward is making AI economically unattractive; moralising about AI use is just purity testing.
@glyph I'm not accusing you of that. I'm saying purity testing on AI use (between people who can afford the choice of using AI or not) has no material effect on people who are forced to be Reverse Centaurs, and is mostly a position of privilege to hold.
It's mental onanism disguised as social justice
@glyph my goal isn't to annoy you, but to me this was related to
> That's how we make good tech: not by insisting that all its inputs be free from sin, but by purging that wickedness by liberating the technology from its monstrous forebears and making free and open versions of it
I point at moralising because the core reason AI is being pushed everywhere right now is that it promises growth in an environment of expensive capital (high interest rates). Most of this deployment targets knowledge work, because the West a) has almost completely deindustrialised and b) has a high proportion of highly financialised but ultimately bullshit jobs.
To me, taking the fight to AI means making it economically unattractive: either by enshrining in law that human authorship is required for copyright, or by making models so efficient that large datacentre expenditure becomes foolish.