Writing this up again so I can pin it: AI is literally a fascist project. Friends don't let friends use it.

Before I go into this: there are two types of responses that I have taken seriously so far.

One I'll call HashTagNotAllAI, which yields the obligatory "sure", but has the same smell. I'll leave it at that.

The other is that an anti-AI stance also throws some assistive technology under the bus, making such a stance intrinsically ableist. The easy thing to do is to refer...

... to HashTagNotAllAI above, sans the smell, but I don't think that is fair. To be clear, I don't want those tools to disappear.

My anti-AI stance isn't about tech, or not primarily about tech.

I don't like that it swallows up rainforests and produces unreliable results. Those are valid criticisms, but I actually agree that those are - in principle at least - solvable problems.

A knife is technology. I can use a knife to cut out a cancer, or to disembowel someone. This makes a knife...

... neither good nor bad; it's the usage of the tool that counts.

That same argument cannot be transferred to a gun. The entire point of a gun is to hurt and kill; its intrinsic purpose is evil. That it can be used to hurt, kill, and potentially deter "baddies" doesn't change that. It may justify the use in highly select circumstances, but doesn't magically absolve it.

Generally, tools are neutral. A weapon is a kind of tool that is intrinsically evil.

Back to AI.

AI is a tool. Even...

... so, the balance of cost vs. benefit must be considered. Clearly the benefit of AI used in assistive tech is worth a much higher cost than when applied in many other areas.

But even a high cost doesn't make a tool evil. It just raises the importance of asking questions about the cost/benefit tradeoff.

The thing that bothers me is that some AI is a weapon, and it's a weapon of fascism.

I suppose it's much fairer to restrict this to generative AI/GenAI, but I resist such a restriction,...

... because I just don't know what other AI use will come around the corner with the same issues. At the same time, it's the pattern that matters more than the tech, so it should be more broadly applied than just to AI.

"AI is evil" and "AI is a fascist project", things you'll see me write, are shorthands for this.

What makes GenAI evil?

The intent of GenAI, both implicitly and explicitly, is to replace humans.

Implicitly, because anything that automates does so. This is the more complex...

... part, but not all that complex, either. Automation is great when it automates boring, repetitive or dangerous tasks. It is useful when things need to be replicated precisely over and over.

The problems with GenAI approaches here are a) that they never seem to target the boring, repetitive or dangerous tasks. Generative art? No, that's literally taking the fun out of life.

And b) they're not precise. The whole point of GenAI is that it's a statistical parrot: it produces *likely* results.

Precision simply is not part of the job description, as it were.

So what this does is replace parts of the human experience that should not be replaced, and leaves parts intact that really should go, at least over time.

This should already be enough to make it evil. But what about it is fascist?

Other than the financing? Well, it's how it fits into politics.

A decade or so ago, some folks published a popular science book called "The Dictator's Handbook" (ISBN-13: 978-1610391849). While this...

... gained some immediate notoriety, what fell by the wayside is that it's actually just the popular science *summary* of much deeper work, and based on a thorough analysis of as many forms of government across the globe and history as the researchers could manage.

The picture that emerges is this: natural resources beget tyrannies; a lack of natural resources begets democracy.

This is, of course, a summary of a summary, and shouldn't be taken without comment. But this here is also a social...

... media thread, so I'll skip the fuller explanation, and just provide a brief summary.

No ruler exists without support, and support is essentially bought. This means that the question of who is in power largely relates to where they can raise money from, and how much they need to spend to raise more.

When natural resources exist, the number of people needed to extract them is relatively low. You clearly need to pay those people well, as well as the military. The rest of the...

... population is of lesser importance.

When you do not have natural resources, the only sensible source of income is taxation, for which you need a large population earning well, so that the percentage you skim off the top is enough to pay for essential support.

A lack of natural resources tends to produce service economies, which means the population also needs to be healthy, well fed, able to travel, and well educated.

When your population is well educated, it tends to want a say in how...

... things are done, so spending on individual people or groups of people is significantly less effective than spending on the population at large.

The result is that democracies and service oriented economies go hand in hand, and support each other rather than work in opposition.

Marx would not have used the words "service economy", but would have said "labour". Both are synonyms for "people".

Now cryptocurrencies and AI have one thing in common, other than using insane amounts of resources.

They're supported by the same investors. But actually, that's the same as using insane amounts of resources.

I'll explain.

The thing is this: natural resources in themselves do not matter. Yes, history is clear about where the patterns lie. But "air" is also a natural resource, and so far, there isn't much monetization of that. (Man, was Spaceballs prescient: https://spaceballs.fandom.com/wiki/Perri-Air).

What makes a natural resource monetizable is scarcity. Cryptocurrencies are explicitly systems of artificial...

... scarcity, in which - by whichever proof scheme - those who participate early in the system benefit at the expense of those who come later (aka pyramid schemes). The proof algorithm guarantees scarcity; the whole point of blockchain vs. any other distributed system is that there is a chokehold on resource creation somewhere.
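That "chokehold" can be made concrete with a toy proof-of-work sketch (my own illustration, not any real chain's code): a unit of the resource only exists once someone finds a hash clearing a difficulty target, so supply is gated by compute burned.

```python
import hashlib

def mine(data: str, difficulty: int) -> tuple[int, str]:
    """Find a nonce whose SHA-256 hash starts with `difficulty` zero hex digits.

    Each extra zero digit multiplies the expected work by 16; that tunable
    cost is the artificial scarcity - nothing "exists" until the work is done.
    """
    target = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{data}:{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce, digest
        nonce += 1

nonce, digest = mine("block-payload", difficulty=4)
print(nonce, digest)  # digest starts with "0000"
```

The sketch also shows why owning the most compute wins: whoever can try nonces fastest claims the scarce units first.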

AI is doing much the same thing, but it doesn't advertise this artificial scarcity as part of the solution. Instead, it simply guarantees that those who already own the most...

... compute resources have the edge. And that is not you or me.

In short, AI is a system which a) aims to replace human labour, while b) shifting the means of production into the hands of the few.

This would be "fine" if nobody used it. What matters for this to succeed is that everyone depends on it. At that point, "means of production" becomes the digital equivalent of a "natural resource".

Marx matters, folk.

You can still argue that this makes AI a weapon of capitalism or tyranny, but...

... not outright fascism.

Technically, that's kind of true. But it's also missing an important part of the picture. As the infamous Chad C. Mulligan wrote, "COINCIDENCE: You weren't paying attention to the other half of what was going on."

First, note how Hitler's extermination camps were inspired by Henry Ford's assembly line. Capitalism and fascism have always had a close relationship, and it's not really possible to separate the two. It's no coincidence that the Jews of the time were also...

... associated with the Bolsheviks, in order to justify the application of means for dealing with one supposed threat to the other.

But more importantly, Peter Thiel is a literal fascist, a strong promoter of and heavy investor in AI. The ties are there, right here, right now, and who benefits from an AI takeover - and it's not just Thiel, but all of his Epstein ilk - is abundantly clear.

It's also well documented. This isn't some vague conspiracy shit. They're saying the quiet part out loud.

In short, *as a system* rather than a technology, AI is without any doubt a deeply fascist project. It is a weapon aimed straight at the world population at large.

Caveats that the tech itself can be seen as neutral, and definitely has good applications, remain unaffected by this.

The survival of our democracies - or sufficiently democratic systems around the world - is the thing that concerns me, though. (Also the environment, but arguably less so overall.)

@jens I argue that AI becomes a means of production itself, and also that I really want fully automated communism (nobody is required to work). I don't believe that LLMs alone could achieve this, but they could be an important part of a system that does achieve that.

I have not given up on Marx's assumption that capitalism will eventually destroy itself because of its inner contradictions. AI seems to me to have the potential to drastically increase said inner contradictions. If human labor becomes obsolete because of AI, who is going to buy the goods produced under capitalism? Not the workers, because they can no longer afford said goods. That indicates to me that the automation itself could lead to the end of capitalism.

@condret Your mental model is not my mental model.

In my mental model, hypercapitalists - billionaire oligarchs - have no more need for extra capital. They'll pursue it, but it has absolutely lost meaning other than as a number. This is also what the very few insider views we get suggest: those people care only that their number is bigger than the other person's, not about money as such.

So any model that reduces this to a capitalist need to extract more capital is, IMHO, wrong. 1/n

@condret What the involvement of e.g. Thiel, Musk, Zuck and Bezos in politics instead demonstrates is that those people care about power.

You don't need to amass capital to have power. That's where the game is currently at, sure. But real power is enslavement.

Slaves either do not buy products, or they buy products you tell them to buy, with the money you give them, carefully adjusted so that they will never have enough to break out of enslavement.

This is the game.

And what better... 2/n

@condret ... way to play it than to make your future slaves dependent on something you control entirely? Make them dependent not only for their livelihood, but for their information - their education?

I don't think mere capitalist logic applies here at all.

/3

@jens

Game, yes.

It's a game based upon dopamine addiction, acquired through schemes of devious pillage.

A game only for dopeys that fail to notice the death ahead of ecology that is the base of their own little lives.

They will lose also. Later than us, so they'll have more time to ponder how they miscalculated and so destroyed the game counters, us.

Prime idiots!

@jens The way the global stock market works is an interesting progenitor for cryptocurrencies, too. It used to be traded mostly based on earnings paid for holding the stock, but has in recent decades transitioned into being traded speculatively, which makes each stock into its own little proto-ponzi scheme.

@nielsa Oh, yes.

My understanding of financial products isn't exactly complete, but my take is that they all fall into two categories.

I mean, buying stock is a bet on future earnings. You can lose that bet, so one category is to aggregate things in such a way that - hopefully - losses in one are offset by gains in the other.

The other category is a layer of indirection, i.e. bets on something other people are betting on.

All of this multi-layered to the point where you can't know what...

@nielsa ... you're betting on, which makes Ponzi schemes and insider trading so much more effective, as the costs are externalized to the average shareholder.

And people think this is serious business.

The only thing that seems serious about it is that it seriously affects us.
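The two categories above can be sketched in a few lines (my own framing of the post, with made-up numbers): aggregation that lets losses and gains offset each other, and indirection, a bet on how another bet moves.

```python
def portfolio_return(returns: list[float], weights: list[float]) -> float:
    """Category 1: aggregate bets so that losses in one are
    (hopefully) offset by gains in another."""
    return sum(r * w for r, w in zip(returns, weights))

def side_bet_payout(underlying_return: float, strike: float, stake: float) -> float:
    """Category 2: a layer of indirection - a payout that depends only
    on the outcome of somebody else's bet."""
    return stake if underlying_return > strike else 0.0

# One asset loses 10%, another gains 12%; equal weights net out to +1%.
agg = portfolio_return([-0.10, 0.12], [0.5, 0.5])
print(round(agg, 4))  # 0.01

# A side bet on the aggregate beating 0% pays out the full stake.
print(side_bet_payout(agg, strike=0.0, stake=100.0))  # 100.0
```

Stacking category 2 on top of category 2 a few times gives the multi-layered opacity the post describes: the final bettor can no longer see the underlying asset at all.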

There's an aside here that I sometimes find worth pointing out: "replacing people" doesn't necessarily mean firing people.

It may simply mean lowering their "worth" in salary negotiations, because you can use the threat of replacement with AI.

Sometimes chains of logic are as simple as "A because B", and sometimes there are several intermediary steps.

You can go a step further: even if YOUR job is not threatened by an AI takeover, if the average salary drops (locally), you're also affected.

@jens thanks for the excellent write-up. The last time I tried to make this argument with @davidgerard, he blocked me. I'm guessing I didn't make my position clear enough not to be confused with a GenAI apologist (me, LOL)

@oblomov @jens

I added some features to the nuts & bolts database I wrote for my workshop, like suggesting alternatives if a screw is not available, but I don't think pages of if…then rules make it "intelligent", even if a lot of people I know wouldn't have thought of that.