> ... large-scale AI models are indeed big water consumers. For example, training GPT‑3 in Microsoft’s state-of-the-art U.S. data centers can directly consume 700,000 liters of clean freshwater (enough to produce 370 BMW cars or 320 Tesla electric vehicles), and the water consumption would have tripled if training were done in Microsoft’s data centers in Asia. These numbers do not include the off-site water footprint associated with electricity generation.

> ChatGPT needs a 500-ml bottle of water for a short conversation of roughly 20 to 50 questions and answers, depending on when and where the model is deployed. Given ChatGPT’s huge user base, the total water footprint for inference can be enormous.

> ... if we only consider carbon footprint reduction (say, by scheduling more AI training around noon), we’ll likely end up with higher water consumption, which is not truly sustainable for AI.

> ... the vast majority of data centers still use potable water and cooling towers. For example, even tech giants such as Google rely heavily on cooling towers and consume billions of liters of potable water each year. Such huge water consumption has put stress on local water infrastructure; Google’s data center used more than a quarter of all the water in The Dalles, Ore.

> ... some AI conferences have requested that authors declare their AI models’ carbon footprint in their papers; we believe that with transparency and awareness, authors can also declare their AI models’ water footprint as part of the environmental impact.

- The Markup: The Secret Water Footprint of AI Technology
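As a rough back-of-the-envelope check on the inference figure quoted above, the 500-ml-per-conversation estimate implies roughly 10–25 ml of water per question-and-answer exchange. The sketch below only does that division; the 100-million-exchanges-per-day figure is a purely hypothetical scale for illustration, not a number from the article:

```python
# Back-of-the-envelope estimate of water use per ChatGPT response,
# based on the quoted figure of a 500-ml bottle per 20-50 Q&A exchanges.
BOTTLE_ML = 500          # water per short conversation (quoted estimate)
MIN_EXCHANGES = 20
MAX_EXCHANGES = 50

# Per-exchange water use spans roughly 10-25 ml.
high = BOTTLE_ML / MIN_EXCHANGES   # 25.0 ml when conversations are short
low = BOTTLE_ML / MAX_EXCHANGES    # 10.0 ml when conversations are long

# Scale to a hypothetical 100 million exchanges per day (illustrative only).
DAILY_EXCHANGES = 100_000_000
daily_liters_low = low * DAILY_EXCHANGES / 1000    # ml -> liters
daily_liters_high = high * DAILY_EXCHANGES / 1000

print(f"per exchange: {low:.0f}-{high:.0f} ml")
print(f"at {DAILY_EXCHANGES:,} exchanges/day: "
      f"{daily_liters_low:,.0f}-{daily_liters_high:,.0f} liters/day")
```

Even at the low end, a million-plus liters per day shows why the authors argue inference, not just training, deserves water-footprint reporting.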
- A conversation with Shaolei Ren and Nabiha Syed

#TheMarkup #NabihaSyed #ShaoleiRen #AISalami #ChatGPT #CarbonFootprint #WaterFootprint #California #Oregon #DallesOregon #Virginia #DataCenterCapital #VirginiaLoudoun #LoudounCounty
