Every time someone talks to ChatGPT for 20 exchanges or so ...

... Microsoft's servers use a half-liter of freshwater to cool down

AI is *thirsty*

My essay on some implications of this: https://clivethompson.medium.com/ai-is-thirsty-37f99f24a26e

A "friend" link, in case you don't subscribe to Medium: https://clivethompson.medium.com/ai-is-thirsty-37f99f24a26e?sk=f5b2ea10c649a34236577139fecfd86a

@clive how much water would asking the same number of questions of a "normal" search engine require? 500 ml sounds like a lot, but without a reference value it's hard to judge.

@pkraus

Good question! I didn’t see that figure in the paper — either they didn’t include it or I missed it

My *suspicion*, for which I have no evidence, so take it with a grain of salt: running inference on a model is far more computationally intensive than a traditional search query, if only because search benefits from two decades of engineering and computer science that have chased efficiencies at scale.
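
For a rough sense of scale, the only arithmetic the headline figure supports directly is the per-exchange amount. A minimal back-of-envelope sketch, using only the numbers quoted in the thread (no comparison figure for conventional search, since none is given):

```python
# Back-of-envelope: per-exchange water use implied by the figure above.
# Assumes ~500 ml of cooling water per conversation of ~20 exchanges,
# i.e. the numbers quoted in the thread; no search-engine comparison data.
water_per_conversation_ml = 500   # roughly half a liter
exchanges_per_conversation = 20

per_exchange_ml = water_per_conversation_ml / exchanges_per_conversation
print(f"~{per_exchange_ml:.0f} ml of water per exchange")  # ~25 ml, about a large sip
```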