what if instead of investing $500 billion in GPUs and data centers, we'd invested $500 billion in like... HyperCard
ridiculous premise, obviously. because if computers were extensible, efficient, easy-to-use machines that people could use to make things and solve problems, people wouldn't need to buy new devices every year to keep up with the system requirements of their software subscriptions

@aparrish Moore's Law of capitalist production.

@haitchfive well yeah exactly https://gauthierroussilhe.com/en/articles/how-to-use-computing-power-faster makes a convincing argument that software devs are essentially in the business of "wasting transistors" to keep up demand for compute, in order to justify bigger and bigger investments in chip manufacturing (which in turn drive other lucrative excess infrastructure development)

@aparrish It's worse than that. I've worked out the econometrics of it. Most of the redundant work software devs do can be characterised as integration work, which wouldn't really be needed if companies committed to semantic standards. Ballpark $1.46T-$2.2T annually. That is, the entire output of a midsize western country burnt every year. Graeber is not around to write a new book about it, but somebody else should.
@aparrish At least some of us are beginning to notice this is not "normal".