"Data centers must coöperate with local electric utilities to manage these training runs. The water coursing above CoreWeave’s microchips enters at room temperature but leaves warmer than a hot bath. It is cooled in a storage tank before being recycled into the system. The temperature, humidity, and particulate count of the air inside the room are also carefully monitored. “Condensation is our enemy,” Conley said, gravely.
All these microchips, all this electricity, all these fans, all this money, all these data, all these water-cooling pumps and cables—all of it is there to tune the weights, this little file of numbers, which is small enough to fit on an external hard drive. A great deal depends on this well-tempered collection of synthetic neurons. The money spent to develop it, and others like it, represents one of the largest deployments of capital in human history.
When the finished product is ready, clones of the weights are distributed to data centers around the country, where they can be accessed through the internet, a process known as “inference.” Users ask questions, prompting the A.I. to produce individual units of intelligence called “tokens.” A token might be a small square of pixels or a fragment of a word. To write a college term paper, an A.I. might produce about five thousand tokens, consuming enough electricity to run a microwave oven at full power for about three minutes. As A.I. fields increasingly complex requests—for video, for audio, for therapy—the need for computing power will increase many times over."
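The article's microwave comparison implies a per-token energy figure, which is easy to back out. A minimal sketch, assuming a typical ~1,100-watt full-power microwave (the wattage is my assumption; the three minutes and five thousand tokens come from the article):

```python
# Back-of-the-envelope check of the article's energy claim.
MICROWAVE_WATTS = 1100   # assumed typical full-power microwave rating (not stated in the article)
MINUTES = 3              # from the article: microwave runtime equivalent
TOKENS = 5000            # from the article: roughly one college term paper

energy_wh = MICROWAVE_WATTS * MINUTES / 60   # watt-hours for the whole paper
per_token_wh = energy_wh / TOKENS            # implied energy per token

print(f"~{energy_wh:.0f} Wh total, ~{per_token_wh * 1000:.0f} mWh per token")
# → ~55 Wh total, ~11 mWh per token
```

Under that assumed wattage, a single token costs on the order of ten milliwatt-hours, which is why the article's point stands: multiplying that by video, audio, and long-form requests scales the load dramatically.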
https://www.newyorker.com/magazine/2025/11/03/inside-the-data-centers-that-train-ai-and-drain-the-electrical-grid
#AI #GenerativeAI #DataCenters #Energy #ElectricalGrid