We really need to start pushing the truth about AI.

The easiest job for AI to replace would be Management.

Uses the most resources while offering the least actual productivity.

Reads something by some idiot on the internet and bases their Management Style on it.

Makes decisions without context.

@RickiTarr

I seem to be the only person on Mastodon who finds AI useful.

@tuban_muzuru @RickiTarr I find it useful, but not nearly as useful as it's hyped to be... which makes me wonder about its long-term viability. Also, whenever I try to discuss this, I get "but one day, it'll be able to do that!" as if it's inevitable.
@eyrea @tuban_muzuru A lot of the issue for me is the lack of regulation and standards involved in it, and the absolutely horrifying amount of energy and water it takes to run it.

@RickiTarr @eyrea

1/

Koomey's Law observes that the energy required to perform a set number of calculations has been cut in half roughly every 2.6 years.
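To put numbers on that, a quick back-of-the-envelope sketch in Python - the 2.6-year doubling period is the one cited above; the rest is just arithmetic:

```python
# Koomey's Law as arithmetic: computations-per-joule doubles every ~2.6 years,
# so efficiency over a span of N years grows by a factor of 2**(N / 2.6).

def koomey_gain(years: float, doubling_period: float = 2.6) -> float:
    """Multiplier on computations-per-joule after `years` years."""
    return 2 ** (years / doubling_period)

print(f"after 10 years: ~{koomey_gain(10):.0f}x more efficient")   # ~14x
print(f"after 26 years: ~{koomey_gain(26):.0f}x more efficient")   # ~1024x
```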

We have GPUs now - zillions of massively parallel cores built for rendering video. That makes them good at calculating AI weights.

So now we're moving to Tensor Processing Units and massive custom ASICs.

Intel has Loihi, a neuromorphic chip that only powers up when a spike is processed - which greatly reduces power consumption.

Reducing costs and improving energy efficiency does not reduce overall consumption. It lowers the barrier to entry, increasing demand and making total use the same as or higher than before - the Jevons paradox. AI will never use less energy overall (until the bubble bursts).
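A toy rebound-effect calculation makes the mechanism concrete (every figure below is invented purely for illustration):

```python
# Toy Jevons-paradox arithmetic: energy per query halves, but cheaper queries
# induce demand that grows faster, so total consumption goes up, not down.
# All figures are invented for illustration.

joules_per_query = 3.0      # assumed energy per AI query, before the upgrade
queries_per_day = 1e9       # assumed demand, before the upgrade
before = joules_per_query * queries_per_day

joules_per_query /= 2       # hardware becomes twice as efficient...
queries_per_day *= 3        # ...but demand triples as the price per query drops

after = joules_per_query * queries_per_day
print(f"before: {before:.1e} J/day  after: {after:.1e} J/day")  # after > before
```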

@michael @RickiTarr @eyrea

I hearken back to the 1980s when I was consulting at a Big Insurance Company. They had a big cooling pond (with swans) to cool their mainframes.

Much talk at the time of the Landauer Limit.

Tell me what I got wrong here.

In 1985, there were around 30M-50M computers worldwide, and today there are an estimated 2B, 40+ times more.

While some computers in 1985 may have been mainframes, a lot were Commodore 64s or Apple IIs, so it's unlikely the average machine back then drew over 40 times more power than the average machine today - which is what it would take to outweigh the factor of 40 in sheer numbers.

That's not even counting that a typical computer around 2000 had a 200W-300W PSU while a typical desktop today has a 500-1000W PSU - and it ignores all mobile devices, even though they are used far more and are way more powerful than a 1985 supercomputer.

When things get more efficient and cheaper, more people buy them - typically in numbers that make the overall cost and consumption go up, not down, as things improve.
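A minimal sketch of that arithmetic, using the post's own count estimates plus assumed average power draws (the wattages are illustrative guesses, not measurements):

```python
# Back-of-the-envelope for the post above: growth in the number of machines
# swamps any plausible per-machine efficiency gain.

computers_1985 = 40e6              # midpoint of the 30M-50M estimate
computers_today = 2e9
print(f"~{computers_today / computers_1985:.0f}x more computers")   # ~50x

# Illustrative average draws (rough assumptions, not measured figures):
watts_1985, watts_today = 150, 200
total_1985 = computers_1985 * watts_1985 / 1e9    # gigawatts
total_today = computers_today * watts_today / 1e9
print(f"then: ~{total_1985:.0f} GW   now: ~{total_today:.0f} GW")
```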

@michael @RickiTarr @eyrea

When things are more efficient, they draw less current. That's the point here.

I am told by various hysterical Chicken Little types that training an AI model is terrible for the environment - where were they when Chinese farmers were (and still are) mining bitcoin with all that power from the Three Gorges Dam?

I've tried to make my point, answering Ricki's concern about energy and water, saying everyone's working on the problem and drastic improvements are coming.

It does not matter that they draw less power when there are more of them.

Pretending I didn't notice your blatant attempt at whataboutism: I also complained about bitcoin wasting energy - and have for a decade now. You're using the bitcoiners' fundamentally flawed argument that more efficient miners will solve the energy consumption problem: bitcoin has anti-efficiency built in (the difficulty adjustment), so more energy-efficient mining just leads to more mining, not less energy use.
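Here's a rough sketch of why the difficulty adjustment has that effect, assuming miners behave as approximate profit-maximizers (every figure is invented for illustration):

```python
# Bitcoin's difficulty adjustment keeps blocks at ~10 minutes no matter the
# total hashrate, so rational miners add hashrate until electricity cost
# roughly equals the block reward. Total energy then tracks the reward,
# not the hardware's efficiency. All numbers are invented for illustration.

BLOCK_REWARD_USD = 300_000            # assumed miner revenue per block
USD_PER_JOULE = 1.4e-8                # assumed power price (~$0.05 per kWh)

for joules_per_hash in (1e-9, 5e-10, 2.5e-10):   # efficiency doubling twice
    hashes = BLOCK_REWARD_USD / (USD_PER_JOULE * joules_per_hash)
    energy = hashes * joules_per_hash             # identical every iteration
    print(f"{joules_per_hash:.1e} J/hash -> {hashes:.1e} hashes, {energy:.1e} J")
```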

AI is exactly the same: while training might be getting more efficient per neuron or per parameter (largely irrelevant metrics, since they grow faster than the efficiency improvements), each subsequent model has taken more energy to train, despite theoretical and technical improvements to the process. There is absolutely no reason to expect that to be different in the future, and history tells us it has not been.

Bitcoiners, btw, do the same thing you do: talk about their scam-tokens in the future tense as if it were the present. They will be so great very shortly, so we might as well pretend they are great now.

@michael @RickiTarr @eyrea

You'll do me the courtesy of paying attention - what is the Landauer Limit?

And following that, why would I make that point?

@michael @RickiTarr @eyrea

[incredulous] You waded in here and didn't realize the Landauer Limit says pretty much everything you've said already?

Closed mouths gather no feet. When a process gets expensive, the boys and girls in Engineering get to work building a better one - which is immediately adopted as the new, more efficient baseline. And on it goes ad infinitum.
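For reference, the Landauer Limit in numbers - this is the standard textbook bound, computed here rather than asserted:

```python
# The Landauer Limit: erasing one bit of information at temperature T costs
# at least k_B * T * ln(2) joules, a hard thermodynamic floor that no
# process improvement can get under.
import math

K_B = 1.380649e-23            # Boltzmann constant, J/K
T = 300.0                     # room temperature, K

e_min = K_B * T * math.log(2)
print(f"minimum energy per bit erased at 300 K: {e_min:.2e} J")   # ~2.87e-21 J

# Today's chips sit many orders of magnitude above this floor, which is why
# there is still headroom for the efficiency gains described upthread.
```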

@tuban_muzuru @michael @RickiTarr @eyrea

The boys and girls in engineering right now are so busy working for the short-term profits demanded by their brutally competing authoritarian sugar daddies that they have stopped thinking ahead.

An awful lot of people actually still believe that at some point someone in this giant game of planetary chicken is going to do the sensible thing and hit the brakes and act to prevent the total apocalyptic collapse of our biosphere and civilization - but no. No, that is an incredibly fucking naive and dangerous idea.

The people in charge right now don't care about anything but winning, even if that means all they win is the nicest possible throne on the nicest possible yacht from which to rule a seared landscape of steaming rubble heaped with skulls.

The people in charge CANNOT bail out now. That would mean writing off the giant investments. As long as the ball keeps rolling, the billions spent remain "an investment in the future," but the second they officially realize that a fancy T9 dictionary is really, really cool, yet ultimately not that useful and probably not the golden goose it was presented as, it is a "loss."

I did read that one company has had a seemingly mostly sensible reaction to the hype. Apple's CEO, Tim Cook, has rejected buying several grifting companies (Tesla and Netflix) despite suggestions from other higher-ups, and SVP of Software Engineering Craig Federighi is hesitant to pay billions for AI grifters (Perplexity and Anthropic). macrumors.com/2025/08/26/apple-discussed-buying-mistral-ai-and-perplexity/