"A typical ChatGPT query uses about 10 times more energy than a Google search. That's just for a basic generative AI function. More advanced queries require substantially more power and have to go through an AI cluster farm, which distributes large-scale computation across multiple machines."

"AI is typically deployed in 20-30 cabinet clusters at or above 40 kW per cabinet. This represents a fourfold increase in kW per cabinet with the deployment of AI. The difference is staggering."

https://www.techradar.com/pro/security/i-sat-down-with-two-cooling-experts-to-find-out-what-ais-biggest-problem-is-in-the-data-center

I sat down with two cooling experts to find out what AI's biggest problem is in the data center

What they told me could force a total rethink of data center design

TechRadar pro
Sure, but usually in my experience I can get the answer I need with one query, which is not possible with one Google search. P.S: I do run a local LLM for small things for your mentioned reason though
@gerrymcgovern but if you add a GPT search to every Google search, boom ratios flipped.
@gerrymcgovern Most times I search via Yacy, MAN-Pages, Manjaro/Arch and most of Stack Overflow is queried. I run yacy local, so no more power used than necessary. (No network traffic, and so on)
@gerrymcgovern doesn't almost every Google query go through AI now as well? Or is Gemini that much more power efficient than ChatGPT?
@jonas Yeah. I think they're talking about pre-AI Google searches.

@gerrymcgovern I do agree that chatbot queries do genuinely require more energy, but I think there's a little more consideration here.

If you're measuring energy consumption, you need to do a kind of "lifecycle analysis" -- if the choice is between using a traditional search engine and asking a chatbot, you should compare the entire workflow with each.

If I do a regular web search for something, I will frequently click three to four of the results and open them in new tabs, because I'm not sure exactly which one will answer my question; I might do another search. Each of those loads a website, with all the accompanying HTML, JS, and so on.

With chatbots, I find it's more common for the response to have exactly what I want. "One and done", as they say.

For example, the other day I wanted to know what command in vim will copy text to the system clipboard. (It's "+y when in visual mode, BTW.)

From that perspective, it's not super obvious that the total energy consumption for using a chatbot is that much more than usual searching.

(I hasten to add that it *is* obvious that the overall energy consumption of AI-related stuff -- training the LLMs, operating the chatbots, and so on -- is absolutely staggering and an enormous environmental concern!)

@gerrymcgovern now I'm veering even farther away, but here's another thing to think about when considering the energy usage of generative AI.

Think about what "deep research" features -- such as Google Gemini's (https://gemini.google/overview/deep-research/) -- can do, for example with medical advice.

Okay, so medical advice is a bit fraught, but think of a person putting in some symptoms, what they know about their health, and so on, and getting a pretty detailed report. That's a *lot* of datacenter energy use.

But we need to compare it with the alternative, which in the US, likely involves driving a car to some clinic, which very well could be miles away.

Think about driving a car, say, 20 miles, along with the implicit energy use of the medical clinic. And your doctor -- again, in the US -- may not be much more helpful than that report.

These two things -- the AI-generated research, and your doctor's advice and comments -- are certainly not always comparable. But if they are? I suspect the AI uses less energy (and it could be cleaner, since the electricity can be generated from renewables or efficient natural gas plants).

I'll admit that this example is real. I actually did this. I have some mysterious medical conditions and symptoms, and what I got from the chatbot's research was, when I compare it to what I've been getting from my doctors, fairly comparable. And I didn't have to make an appointment, to wait, to spend the time driving, to pay for the appointment -- and I got a report with citations and ideas.

Just some more thoughts if we're thinking about the energy usage of AI systems...

@ddrake These are valid points and for years I used them myself to justify more tech use. However, these systems are highly unreliable and fraught with all sorts of problems. And for every positive use case, there are 1,000 uses for cheap entertainment. Meanwhile, AI and all other modern systems contribute to the ever-growing energy, material and water demands. We have become a devouring civilization, no matter how we try to rationalize things. Our environment is collapsing, and we won't see it.

@gerrymcgovern I don't disagree! You have some good observations. But let me, in a friendly way, offer these ideas:

1. Take what you wrote, and replace "AI" with "television". I could imagine someone at the dawn of the tv era saying the exact same thing!

2. I think you could say we have *always* been a devouring civilization.

Hell, we humans have been devouring since before civilization! Look at the fossil record of how the extinction of megafauna like, well, mastodons, closely tracks with the expansion of early human populations around the world. We've been devouring resources -- whether ecological ones, or mineral ones, or whatever -- for as long as we've been a species.

@gerrymcgovern that's actually way better than I thought. With the amount of compute LLMs require, I thought it would be like 1000x more
@matipolit That's for basic text interaction. If you're talking about video or animation, then those figures would really jump.

@gerrymcgovern
Thanks for sharing this. I appreciate the level of detail provided here, from folks working in the field.

One of the most interesting things here to me is the suggestion that we already have one strong model for drastically reducing power and water requirements. The question is, how long will it take for it (or another approach) to be taken up by the industry?

@NearerAndFarther Thanks. I've been hearing about these sorts of efficiency improvements for so many years. And it nearly never impacts total consumption figures, which always keep rising drastically, because there's always such an explosion in use. You just keep getting these huge increases in water and electricity.

@gerrymcgovern
AI Chatbots are not the problem, if we're talking about energy use/effects on the climate.
10x a Google search is still an extremely small amount (as in three-minutes-of-a-light-bulb small!), and the 10x figure is also quite outdated (we're approaching 1x recently, it seems)

The real AI-electricity offenders are apparently ad/recommender systems etc. and industrial applications.

I can recommend this post for a detailed discussion:
https://andymasley.substack.com/p/a-cheat-sheet-for-conversations-about

Using ChatGPT is not bad for the environment - a cheat sheet

The numbers clearly show this is a pointless distraction for the climate movement

Andy Masley
@nightoo This is such a tired argument that is used again and again by promoters of technology. It's how we get the term 'clean energy' when it's not clean, 'renewable energy', when it's not renewable. So, AI adtech is intensely damaging. It doesn't make ChatGPT good. The whole world and culture of AI is intensely environmentally destructive.
@gerrymcgovern there are massive environmental problems thanks to AI, but that's not because of ChatGPT.
There are many good arguments for why ChatGPT etc. are bad, but electricity consumption isn't one of them. Let's focus on the good arguments and not weaken our position with misinformation.
Sounds like the old corporate strategy of getting the consumer to blame themselves for gross industrial excesses. The problem isn't disposable packaging manufacturers... the problem is you littering! Clearly! We should all boycott ChatGPT that would totally fix AI! And meanwhile, the behavioral analysts are quietly burning the planet trying to get their massive computers to figure out how to enslave us all. (Ad/recommender systems)

As this Andy Masley says:
The data implies that, at most, chatbots account for only 1-3% of the energy used on AI.
CC: @[email protected]
Power-hungry data centers are prompting new gas plant proposals. Critics say that would lock in pollution and higher costs for decades.

We Energies is building new gas-fired power plants to meet growing energy demand that largely stems from data centers. As developers pursue data centers in Wisconsin, that’s raising concerns about environmental impacts tied to their operations.

WPR
@gerrymcgovern I love how a lot of people think it's either Google search or some crap 'AI' tool. I still do web searches myself, using DuckDuckGo (*not* its 'AI' assistant). I never used an 'AI' tool, never felt the need to, and I've always been able to find what I was looking for without feeling it took me too much time. And even if it takes some time, I learn more in the process. I train *myself*, and I trust my method & judgement more than a packaged 'AI' response/result.
@morrick Exactly. I used ChatGPT once for a specific research project I had substantial expertise in. I found it mediocre at best. Searching and research are skills.
@gerrymcgovern I don't use Google search anymore, and have never used ChatGPT. Nor do I use any of the AI doodads all the already-enshittified BE EVIL/BURN THE PLANET/Silicon Valley AI co-religionist techlords are sticking on their front ends. And their ultimate ends are antihuman. I am a human, and thus their enemy. Just sayin' how I really feel.

@tinydoctor @gerrymcgovern

Right there with ya.

@[email protected] @gerrymcgovern Not that I believe the evangelical screed AI becoming sentient and taking over the destruction of a livable planet while they enjoy their new indestructible robot bodies is anything but a fantasy/grift. The danger is the poisonous stupidity of these gormless nimnulls, promoted and protected by their money, and their desire to not save humanity from itself, but save themselves from humanity.
@tinydoctor I think you've hit the nail on the head: the tech bros want to save themselves from humanity and nature. They want to live inside the screen.

@gerrymcgovern

"A ChatGPT query uses 10x as much energy as a Google search" works out to roughly 3 Wh per prompt, equivalent to:

- leaving an 11 W LED light on for 16 minutes,
- streaming video on a MacBook Air for about 30 minutes,
- running a 600 W microwave for about 20 seconds,
- a 230 W refrigerator running for about 47 seconds,
- driving an EV that uses 190 Wh/km about 16 meters.

The cost of 100 queries per day is similar to leaving the lights on in a room for a day.
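These comparisons are straightforward energy-over-power arithmetic; here is a quick sketch that reproduces them, taking the thread's ~3 Wh-per-prompt figure as a given assumption (the 12.5 W "room light" and device wattages are likewise illustrative):

```python
# Back-of-envelope equivalences for one ~3 Wh chatbot prompt.
# The 3 Wh figure is the thread's assumption, not a measurement.
PROMPT_WH = 3.0

def runtime_seconds(power_watts: float, energy_wh: float = PROMPT_WH) -> float:
    """Seconds a device drawing `power_watts` can run on `energy_wh` of energy."""
    return energy_wh / power_watts * 3600

print(f"11 W LED bulb:      {runtime_seconds(11) / 60:.0f} min")
print(f"600 W microwave:    {runtime_seconds(600):.0f} s")
print(f"230 W refrigerator: {runtime_seconds(230):.0f} s")
# EV at 190 Wh/km: distance one prompt's energy covers
print(f"190 Wh/km EV:       {PROMPT_WH / 190 * 1000:.0f} m")
# 100 prompts/day vs. a ~12.5 W room light left on for 24 h
print(f"100 prompts/day:    {100 * PROMPT_WH:.0f} Wh vs. light {12.5 * 24:.0f} Wh")
```

Note that the EV line comes out to roughly 16 meters, not hundreds: 3 Wh divided by 190 Wh/km is about 0.016 km.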

@maxpool We can do all these sort of calculations and they're interesting. Meanwhile, AI data centers grow with country-size hungers for electricity and water and for all the materials needed to make these hot machines. Digital is physical.

@gerrymcgovern

Not just interesting, but essential.

It's important to put things into proportion so that we can make wise policy, business and consumer choices. Depending on what kind of activity those data centers replace or enable, they can save or waste energy.

For example, does a help center with AI consume more or less energy per call than humans sitting in air-conditioned rooms in front of computers? How much?

A datacenter using 1 GWh of energy equals 333 million individual 3 Wh uses. Just because lots of energy is used in one place does not make it inherently worse.

Instead of "country-sized", we should think per worker, per consumer, per hour of work or play, and see where it fits in energy use.
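The help-center question can be made concrete with a rough per-call comparison. Every number below is an assumption chosen for illustration, not measured data, apart from the ~3 Wh-per-prompt figure quoted earlier in the thread:

```python
# Illustrative per-call energy comparison; all figures below are assumed.
PROMPT_WH = 3.0            # per-prompt figure quoted in the thread

# AI path: assume ~5 prompts to resolve one support call.
ai_wh_per_call = 5 * PROMPT_WH

# Human path: assume a 150 W workstation-plus-HVAC share per agent,
# amortized over an assumed 6 calls handled per hour.
agent_draw_w = 150.0
calls_per_hour = 6.0
human_wh_per_call = agent_draw_w / calls_per_hour

print(f"AI chatbot:  {ai_wh_per_call:.0f} Wh/call")
print(f"Human agent: {human_wh_per_call:.0f} Wh/call")
```

Under these (entirely assumed) inputs the two come out the same order of magnitude, which is the point: the per-unit framing, not any particular answer, is what makes the comparison meaningful.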

@gerrymcgovern

One could save a lot of money by saying NO to AI.

One could gain a lot of customer goodwill by making this public.

#AI #Insanity

@gerrymcgovern
Please please please let this White Elephant bubble burst and take out all the companies that thought it would be a good idea.
@gerrymcgovern @pluralistic "AI‘s biggest problem" is that it’s a solution to a problem no-one had except for some Silicon Valley billionaires that needed a new license to print even more money.
Normally, I’d be cool with that, except they are so successful at selling their snake oil that public money is rerouted on a worldwide level from solving actual problems to the coffers of the 0.1%.
So, instead of fighting climate change, we take the sparse resources we have to build more radiators?!?
@gerrymcgovern doesn't Grok's datacenter run diesel generators to supplement power?
@Scorpion_Byte13 absolutely. And many, many more than it's officially allowed. And with zero abatement. And they're planning another huge one in the same place.
@gerrymcgovern fortunately and unfortunately, AI data centers will help push forward SMR facilities.
@Scorpion_Byte13 yes, but these are fraught with all sorts of problems. I read something by some senior data center execs last week saying nuclear is 10 years away and will always be 10 years away.

@gerrymcgovern are you thinking of nuclear fusion reactors?

I am talking about small modular nuclear reactors that use existing nuclear fission process. They use safer fuels and cooling methods.

The sales pitch is that they are super cheap and scalable. I believe the only thing in the way is regulation.

Five Things the “Nuclear Bros” Don’t Want You to Know About Small Modular Reactors

A realistic understanding of their costs and risks is critical.

The Equation