"To get Colossus up and running fast, xAI built its own power plant, setting up as many as 35 natural-gas turbines—railcar-size engines that can be major sources of smog—according to imagery obtained by the Southern Environmental Law Center. Pearson coughed as we drove by the facility. The scratch in my throat worsened, and I rolled up my window.
xAI’s rivals are all building similarly large data centers to develop their most powerful generative-AI models; a metropolis’s worth of electricity will surge through facilities that occupy a few city blocks. These companies have primarily made their chatbots “smarter” not by writing niftier code but by making them bigger: ramming more data through more powerful computer chips that use more electricity. OpenAI has announced plans for facilities requiring more than 30 gigawatts of power in total—more than the largest recorded demand for all of New England. Since ChatGPT’s launch, in November 2022, the capital expenditures of Amazon, Microsoft, Meta, and Google have exceeded $600 billion, and much of that spending has gone toward data centers—more, even after adjusting for inflation, than the government spent to build the entire interstate-highway system.
(...)
Even conservative analyses forecast that the tech industry will drop the equivalent of roughly 40 Seattles onto America’s grid within a decade; aggressive scenarios predict more than 60 in half that time. According to Siddharth Singh, an energy-investment analyst at the International Energy Agency, by 2030, U.S. data centers will consume more electricity than all of the country’s heavy industries—more than the cement, steel, chemical, car, and other industrial facilities put together. Roughly half of that demand will come from data centers equipped for the particular needs of generative AI."
https://www.theatlantic.com/magazine/2026/04/ai-data-centers-energy-demands/686064/
#AI #GenerativeAI #BigTech #DataCenters #Energy