AI systems, especially large-scale models, have significant energy needs across two main phases: training and inference.
Training a model like GPT-3 consumed an estimated 1,287 MWh, enough to power roughly 120 average U.S. homes for a year. But that is a one-time cost: the energy spent on inference, running the model for every user query, recurs continuously and is rapidly becoming the dominant share of a model's lifetime energy use.
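
To make the comparison concrete, here is a back-of-the-envelope sketch in Python. The 1,287 MWh training figure comes from the estimate above; the per-query energy (0.3 Wh) and the daily query volume (100 million) are purely illustrative assumptions, chosen only to show how quickly a steady inference load can outgrow a one-time training cost.

```python
# Back-of-the-envelope comparison of training vs. inference energy.
# The training figure is from the text; the per-query energy and
# query volume are hypothetical round numbers, not measured values.

TRAINING_ENERGY_MWH = 1_287    # estimated GPT-3 training energy (from the text)
US_HOME_ANNUAL_MWH = 10.7      # ~10,700 kWh/year for an average U.S. home

ENERGY_PER_QUERY_WH = 0.3      # assumed energy per inference query (hypothetical)
QUERIES_PER_DAY = 100_000_000  # assumed daily query volume (hypothetical)

# Household-years equivalent of the one-time training run.
home_years = TRAINING_ENERGY_MWH / US_HOME_ANNUAL_MWH

# Daily and annual inference energy at the assumed load (Wh -> MWh).
inference_mwh_per_day = ENERGY_PER_QUERY_WH * QUERIES_PER_DAY / 1e9
inference_mwh_per_year = inference_mwh_per_day * 365

# Days of inference needed to match the entire training cost.
days_to_match_training = TRAINING_ENERGY_MWH / inference_mwh_per_day

print(f"Training ~ {home_years:.0f} home-years of electricity")
print(f"Inference ~ {inference_mwh_per_day:.0f} MWh/day, "
      f"{inference_mwh_per_year:,.0f} MWh/year")
print(f"Inference matches the training cost in ~ {days_to_match_training:.0f} days")
```

Under these assumed numbers, the inference fleet would match the entire training cost in roughly six weeks and consume several times the training energy every year, which is why recurring per-query energy tends to dominate over a deployed model's lifetime.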


