@gerrymcgovern now I'm veering even farther away, but here's another thing to think about when considering the energy usage of generative AI.
Think about what "deep research" tools -- such as Google Gemini's (https://gemini.google/overview/deep-research/) -- can do, for example with medical advice.
Okay, so medical advice is a bit fraught, but think of a person putting in some symptoms, what they know about their health, and so on, and getting a pretty detailed report. That's a *lot* of datacenter energy use.
But we need to compare it with the alternative, which in the US, likely involves driving a car to some clinic, which very well could be miles away.
Think about driving a car, say, 20 miles, plus the clinic's own energy overhead. And your doctor -- again, in the US -- may not be much more helpful than that report.
These two things -- the AI-generated research, and your doctor's advice and comments -- are certainly not always comparable. But if they are? I suspect the AI uses less energy (and it could be cleaner, since the electricity can be generated from renewables or efficient natural gas plants).
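For what it's worth, the driving side of that comparison can be sketched numerically. The gasoline energy figure is the standard EPA equivalence; the trip length, fuel economy, and especially the per-query AI figure below are placeholder assumptions (published per-query estimates vary by orders of magnitude):

```python
# Back-of-envelope energy comparison: one 20-mile round trip by car
# vs. one "deep research" AI query. All inputs except the gasoline
# energy content are assumptions, not measured values.

GASOLINE_KWH_PER_GALLON = 33.7  # EPA gasoline energy-equivalence figure

trip_miles = 20                 # assumed round trip to a clinic
mpg = 25                        # assumed average car fuel economy
drive_kwh = trip_miles / mpg * GASOLINE_KWH_PER_GALLON

ai_query_kwh = 0.5              # placeholder; real estimates are very uncertain

print(f"drive: {drive_kwh:.1f} kWh, AI query (assumed): {ai_query_kwh} kWh")
# The drive works out to roughly 27 kWh of primary energy -- far above
# even pessimistic per-query estimates for the AI run.
```

Even if the per-query figure is off by 10x in either direction, the drive dominates -- which is the point of the comparison, not any precise number.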
I'll admit this example is real: I actually did this. I have some mysterious medical conditions and symptoms, and what I got from the chatbot's research was fairly comparable to what I've been getting from my doctors. And I didn't have to make an appointment, wait, spend time driving, or pay for the visit -- and I got a report with citations and ideas.
Just some more thoughts if we're thinking about the energy usage of AI systems...