IBM's new processor-in-memory (which is, incidentally, not a new idea, and for AI it is mostly multiply-accumulate (MACC) in memory) will reduce the energy consumption per computation for LLMs.
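To make "MACC-in-memory" concrete, here is a minimal sketch (not IBM's design) of the multiply-accumulate kernel that dominates LLM inference; a processor-in-memory chip computes these dot products inside the memory array instead of shuttling every weight to a CPU/GPU:

```python
# Sketch only: the multiply-accumulate (MACC) kernel behind LLM layers.
# Function names are illustrative, not from any real PIM API.

def macc_dot(weights, activations):
    """One output element = a chain of multiply-accumulates."""
    acc = 0.0
    for w, a in zip(weights, activations):
        acc += w * a  # the MACC: one multiply, one add
    return acc

def layer(weight_rows, activations):
    # Every output reads a whole weight row from memory; that data
    # movement is what PIM designs aim to eliminate.
    return [macc_dot(row, activations) for row in weight_rows]

print(layer([[1.0, 2.0], [3.0, 4.0]], [10.0, 1.0]))  # [12.0, 34.0]
```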

But if energy efficiency gains reduced emissions, we would not have climate change. The entire history of the industrial revolution, starting with the steam engine, is one of energy efficiency gains.
#FrugalComputing

@wim_v12e

1. Didn't Mythic already deploy this idea? (That was analogue multiply-accumulates in flash memory.) Which is why I incorporated it into my hypothetical early on!

2. I'm sceptical about the life-cycle emissions: do these chips, like most, have higher embodied emissions from manufacturing than operational emissions from use?

@alcinnz Yes, they did. And Groq also uses a systolic array. I have also seen work with ReRAM.

I think the embodied carbon will likely be of the same order as that of a GPU or CPU of the same die area. It depends of course on the technology used, but it is unlikely to be much lower. Also, these chips are accelerators, so the host CPU's power consumption will still be there.
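The embodied-vs-operational trade-off above can be sketched with a back-of-envelope calculation. Every number here is an illustrative assumption, not a measurement for any real chip:

```python
# Back-of-envelope sketch: amortising an accelerator's embodied carbon
# against its operational carbon. ALL figures below are assumptions.

EMBODIED_KGCO2E = 150.0      # assumed: manufacturing one accelerator
POWER_W = 75.0               # assumed: draw under load
UTILISATION = 0.5            # assumed: average duty cycle
GRID_KGCO2E_PER_KWH = 0.3    # assumed: grid carbon intensity
LIFETIME_YEARS = 4.0         # assumed: service life

hours = LIFETIME_YEARS * 365 * 24
operational_kwh = POWER_W / 1000 * UTILISATION * hours
operational_kgco2e = operational_kwh * GRID_KGCO2E_PER_KWH

print(f"embodied:    {EMBODIED_KGCO2E:.0f} kgCO2e")
print(f"operational: {operational_kgco2e:.0f} kgCO2e")
# With these assumptions operational dominates, but note that halving
# the per-computation energy leaves the embodied share untouched.
```

The point of the sketch: efficiency gains only shrink the operational term, so the embodied term sets a floor on life-cycle emissions per device.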

@wim_v12e
Sounds a lot like that million-core OIC I remember hearing about being proposed back around 2010(?), with all the memory and CPU in the same block? Or am I misunderstanding?