One of my reasons for a design decision in #soupault that lets it process arbitrarily large websites in limited memory, at the cost of significantly more complicated internal logic, was "but what if someone wants a static version of Wikipedia?".
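For illustration only (soupault is written in OCaml, and this is not its actual code): the difference between the two approaches is whether memory scales with the largest single page or with the entire site. A streaming pass over the page tree, sketched here in Python with a hypothetical `process_page` transform, keeps the footprint bounded:

```python
import os

def process_page(path):
    # Placeholder transform: a real generator would parse and rewrite
    # the page here. Only one page is held in memory at a time.
    with open(path, "rb") as f:
        return len(f.read())

def process_site_streaming(root):
    # Walk the site tree and handle pages one by one, so peak memory
    # is bounded by the largest page, not by total site size.
    total = 0
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            if name.endswith(".html"):
                total += process_page(os.path.join(dirpath, name))
    return total
```

The naive alternative, loading every page into one in-memory site model before emitting output, is simpler to write but needs RAM proportional to the whole site, which is exactly what becomes a non-issue once 512GB instances are cheap to rent.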
A complete dump of Wikipedia is about 86GB uncompressed.
Looks like you can rent an AArch64 cloud instance with 512GB of RAM from Amazon now for about $3/hr. The problem I solved (rather creatively, I'd like to think) doesn't really exist anymore.
And even without clouds, swapping to NVMe is probably a viable solution to that problem, now that it's difficult to find a machine with any spinning rust in it.
I'm going to drop that logic in the next release.