Writing this up again so I can pin it: AI is literally a fascist project. Friends don't let friends use it.

Before I go into this, there are two types of responses to this that I have taken seriously so far.

One I'll call HashTagNotAllAI, which yields the obligatory "sure", but has the same smell. I'll leave it at that.

The other is that an anti-AI stance also throws some assistive technology under the bus, making such a stance intrinsically ableist. The easy thing to do is to refer...

... to HashTagNotAllAI above, sans the smell, but I don't think that is fair. To be clear, I don't want those tools to disappear.

My anti-AI stance isn't about tech, or not primarily about tech.

I don't like that it swallows up rainforests and produces unreliable results. Those are valid criticisms, but I agree that they are - in principle at least - solvable problems.

A knife is technology. I can use a knife to cut out a cancer, or to disembowel someone. This makes a knife...

... neither good nor bad; it's the usage of the tool that counts.

That same argument cannot be transferred to a gun. The entire point of a gun is to hurt and kill; its intrinsic purpose is evil. That it can be used to hurt, kill, and potentially deter "baddies" doesn't change that. It may justify the use in highly select circumstances, but doesn't magically absolve it.

Generally, tools are neutral. A weapon is a kind of tool that is intrinsically evil.

Back to AI.

AI is a tool. Even...

... so, the balance of cost vs. benefit must be considered. Clearly the benefit of AI used in assistive tech is worth a much higher cost than when applied in many other areas.

But even a high cost doesn't make a tool evil. It just raises the importance of asking questions about the cost/benefit tradeoff.

The thing that bothers me is that some AI is a weapon, and it's a weapon of fascism.

I suppose it's much fairer to restrict this to generative AI/GenAI, but I resist such a restriction,...

... because I just don't know what other AI uses will come around the corner with the same issues. At the same time, it's the pattern that matters more than the tech, so the label should be applied more broadly than just to AI.

"AI is evil" and "AI is a fascist project", things you'll see me write, are shorthands for this.

What makes GenAI evil?

The intent of GenAI, both implicitly and explicitly, is to replace humans.

Implicitly, because anything that automates work replaces the humans who did it. This is the more complex...

... part, but not all that complex, either. Automation is great when it automates boring, repetitive or dangerous tasks. It is useful when things need to be replicated precisely over and over.

The problems with GenAI approaches here are a) that they never seem to target the boring, repetitive or dangerous tasks. Generative art? No, that's literally taking the fun out of life.

And b) they're not precise. The whole point of GenAI is that it's a statistical parrot: it produces *likely* results.

Precision simply is not part of the job description, as it were.
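The "likely, not precise" point can be made concrete with a toy sketch. The following is a deliberately tiny bigram model - an illustration of the statistical-parrot idea, not how any production LLM actually works; the corpus and function names are invented for the example:

```python
from collections import Counter, defaultdict

# A toy "statistical parrot": a word-level bigram model. It learns which
# word tends to follow which, then emits the statistically *likely*
# continuation - it has no notion of a continuation being *correct*.
corpus = "the cat sat on the mat the cat ate the fish".split()

follows = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev][nxt] += 1

def most_likely_next(word):
    # Pick the most frequent successor seen in training - plausible, not precise.
    return follows[word].most_common(1)[0][0]

print(most_likely_next("the"))  # "cat": it followed "the" most often
```

The model will happily continue "the" with "cat" regardless of whether the cat is the right answer; frequency is all it has.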

So what this does is replace parts of the human experience that should not be replaced, and leaves parts intact that really should go, at least over time.

This should already be enough to make it evil. But what about it is fascist?

Other than the financing? Well, it's how it fits into politics.

A decade or so ago, some folks published a popular science book called "The Dictator's Handbook" (ISBN-13: 978-1610391849). While this...

... gained some immediate notoriety, what fell by the wayside is that it's actually just the popular science *summary* of much deeper work, and based on a thorough analysis of as many forms of government across the globe and history as the researchers could manage.

The picture that emerges is this: natural resources beget tyrannies; a lack of natural resources causes democracy.

This is, of course, a summary of a summary, and shouldn't be taken without comment. But this here is also a social...

... media thread, so I'll skip the fuller explanation, and just provide a brief summary.

No ruler exists without support, and support is essentially bought. This means that the question of who is in power largely relates to where they can raise money from, and how much they need to spend to raise more.

When there exist natural resources, the amount of people needed to extract them is relatively low. You clearly need to pay those people well, as well as the military. The rest of the...

... population is of lesser importance.

When you do not have natural resources, the only sensible source of income is taxation, for which you need a large population earning well, so that the percentage you skim off the top is enough to pay for essential support.

A lack of natural resources tends to produce service economies, which means the population also needs to be healthy, well fed, able to travel, and well educated.

When your population is well educated, it tends to want a say in how...

... things are done, so spending on individual people or groups of people is significantly less effective than spending on the population at large.

The result is that democracies and service oriented economies go hand in hand, and support each other rather than work in opposition.

Marx would not have used the words "service economy", but would have said "labour". Both are synonyms for "people".

Now cryptocurrencies and AI have one thing in common, other than using insane amounts of resources.

They're supported by the same investors. But actually, that amounts to the same thing as using insane amounts of resources.

I'll explain.

The thing is this: natural resources in themselves do not matter. Yes, history is clear on where the patterns lie. But "air" is also a natural resource, and so far, there isn't much monetization of that. (Man, was Spaceballs prescient: https://spaceballs.fandom.com/wiki/Perri-Air).

What makes a natural resource monetizable is scarcity. Cryptocurrencies are explicitly systems of artificial...

... scarcity, in which - by whichever proof scheme - those who participate early in the system benefit off those who come later (aka pyramid schemes). The proof algorithm guarantees scarcity; it's the whole point of blockchain vs. any other distributed system that there is a chokehold on resource creation somewhere.
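That "chokehold on resource creation" can be sketched in a few lines. The following is a minimal hash-based proof-of-work toy - an illustration of how scarcity is enforced by making creation deliberately costly, not a real mining implementation; the difficulty and function names are invented for the example:

```python
import hashlib

# Minting a new block requires finding a nonce whose hash clears a
# difficulty target. Raising DIFFICULTY throttles how fast new units
# can appear - scarcity by construction.
DIFFICULTY = 2  # number of leading zero hex digits required

def mine(block_data: str) -> int:
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}:{nonce}".encode()).hexdigest()
        if digest.startswith("0" * DIFFICULTY):
            return nonce  # proof found: creation was made artificially expensive
        nonce += 1

print(mine("block-1"))
```

The work itself produces nothing of use; its only function is to ration creation, which is exactly the scarcity mechanism described above.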

AI is doing much the same thing, but it doesn't advertise this artificial scarcity as part of the solution. Instead, it simply guarantees that those who already own the most...

@jens The way the global stock market works is an interesting progenitor for cryptocurrencies, too. It used to be traded mostly based on earnings paid for holding the stock, but has in recent decades transitioned into being traded speculatively, which makes each stock into its own little proto-ponzi scheme.

@nielsa Oh, yes.

My understanding of financial products isn't exactly complete, but my take is that they all fall into two categories.

I mean, buying stock is a bet on future earnings. You can lose that bet, so one category is to aggregate things in such a way that - hopefully - losses in one are offset by gains in the other.

The other category is a layer of indirection, i.e. bets on something other people are betting on.

All of this is multi-layered to the point where you can't know what...

@nielsa ... you're betting on, which makes ponzi schemes and insider trading so much more effective, as the costs are externalized to the average shareholder.
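The two categories above can be put into toy numbers. All figures here are invented for illustration; "aggregation" is a simple portfolio average, and "indirection" is an option-like payoff on that portfolio rather than on any underlying asset:

```python
# Category 1, aggregation: bundle assets so a loss in one is
# (hopefully) offset by a gain in another.
assets = {"a": -0.10, "b": +0.15}  # invented per-asset returns
portfolio_return = sum(assets.values()) / len(assets)  # net positive

# Category 2, indirection: a bet on the bundle itself - an
# option-like payoff with an invented 1% strike. The holder is now
# two layers removed from anything real.
derivative_payoff = max(0.0, portfolio_return - 0.01)

print(portfolio_return, derivative_payoff)
```

Stack enough of these layers and, as the thread says, the final holder can no longer tell what they are actually betting on.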

And people think this is serious business.

The only thing that seems serious about it is that it seriously affects us.