As someone with experience in industrial power systems, datacenter design, and a little bit of computing, I find the discussion around CoreWeave's (bad) earnings outlook fascinating. All of it measures compute capacity in megawatts and gigawatts rather than in any actual unit of compute (FLOPS, tokens/sec, etc.). In 5 years, when the hardware underpinning AI is 8x or 16x faster per watt than today's, the 800 MW of compute deployed today will be worth substantially less than 800 MW of compute built with that day's technology. Or is measuring in watts a tacit admission that operations per watt aren't really improving, and that the growth is mostly about power density (how many watts you can pack into a 2U server)? I don't feel like it's the latter. Maybe it isn't supposed to make sense? 🤷‍♂️
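
A back-of-the-envelope sketch of that depreciation argument (all numbers here are my illustrative assumptions, not anything from CoreWeave's filings; `perf_per_watt_gain` is the hypothetical 8x-16x figure from above):

```python
# Rough depreciation of a fixed-power AI buildout as perf/watt improves.
# All figures are illustrative assumptions, not CoreWeave data.

site_mw = 800           # today's deployed capacity, in megawatts (assumed)
perf_per_watt_gain = 8  # assumed hardware efficiency gain over 5 years (8x-16x)

# If the market prices compute by useful work rather than by watts,
# today's 800 MW delivers only 1/gain of the work that 800 MW of
# future silicon would, so its relative value shrinks accordingly.
future_equivalent_mw = site_mw / perf_per_watt_gain
print(f"{site_mw} MW of today's hardware ~= {future_equivalent_mw:.0f} MW "
      f"of hardware that's {perf_per_watt_gain}x more efficient")
# -> 800 MW of today's hardware ~= 100 MW of hardware that's 8x more efficient
```

In other words, pricing in watts only holds its meaning if ops/watt stays flat, which is exactly the question being raised.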