RE: https://mamot.fr/@pluralistic/116245164337719643

I agree abstractly that technology with bad origins can be used for good. I disagree with the article because framing the leftist critique as some idea of technological "original sin" is incorrect. (1/9, strap in)

The leftist critique is that these technologies are used in harmful ways for real reasons, and that without a major change in material conditions, the harm will continue. (2/9)
I'm going to reiterate that local LLMs are mostly (put a pin in that) fine, but they mostly don't matter (at least right now). Until that local grammar check is integrated as the text editor's default, the uncritical mass is going to keep feeding the text back to whatever data center. (3/9)
To Firefox's credit, its page translation does this right by translating locally. The problem is that Firefox itself also mostly doesn't matter, as it's a tiny sliver of installed browsers. (4/9)
But, it's not the spell checking or translation that matters in the first place. Even if that same compute were done in a data center, it would not boil the oceans any more than it does locally. Local LLMs have fundamental compute constraints, and that is why they are okay compared to non-local LLMs. (5/9)
"The street" will keep using these unconstrained LLMs to vibe code and boil the oceans until the big tech money finally runs out, because why wouldn't they? Only a sliver of "the street" is critical enough to consider ditching Windows, Twitter, or Facebook even as all of those continually worsen. (6/9)
But, let's revisit the way in which local LLMs are only mostly okay. Creating a local LLM relies on that same unconstrained cloud compute for training. That boils the oceans too, though if the model is treated as a durable item, it's mostly okay to do so. But while the compute is subsidized, I doubt it will be treated as such, as companies keep pursuing the birth of a machine god. (7/9)
This is what will look like technological original sin if you don't pin the nuance down correctly. And if we ask ourselves what the "master's tools" that we cannot use are, I'd be more inclined to say they're the large data centers that provide that unconstrained computation. And we already know that we ecologically cannot afford to use them. (8/9)
Local LLMs may have their day to be positive when the money runs out on unconstrained compute, but that's just not the situation right now. When talking about fighting the harms of a technology, saying to use local LLMs is like saying the solution to all the problems of Twitter is the Fediverse, or the solution to all the problems of Windows is Linux. (9/9)

Ugh, but also there are even more issues, related to stealing artists' labor and destroying their livelihoods, that will also look like "technological original sin" when brought up. To which I guess I'd say: we can also talk about local AI models when the artists are no longer starving.

Although, given the costs to operate for video generation, that also circles back into problems that are outgrowths of unconstrained computation.

And sure, that's less relevant to spell checking in particular, but it certainly relates to AI more broadly being harmful, and to that harm being driven by the surrounding context. We most certainly have the resources to give artists a good living, and so long as the context remains as it is, AI will be used to destroy artists' livelihoods.