There's always going to be a trade-off between speed, practicality, and energy efficiency. But for some long-duration tasks, it's probably best to use a microcontroller that doesn't need active cooling.

Commodore 64 systems were used in industrial settings for decades. Some are probably still running.

If the heat and power budget is bigger, something like a basic computer built from TTL (or modern CMOS equivalent) chips can probably keep running just as long. And still be user-serviceable.

@cypnk
We were doing realtime graphics on 100 MHz computers; now we have 3 GHz computers with multiple cores, but somehow the web browser pauses.

If you design your code properly you can get away with much smaller hardware.

@Binder @cypnk it's virtually impossible for anything that's been growing by accretion for 25 years to be designed - or even redesigned - in any kind of sensible fashion
@thamesynne @cypnk
That's the secret: don't let it grow by accretion.

@Binder @thamesynne @cypnk Ironically, the hardware is rarely the issue here. Yes, there's still 8088 XT compatibility bits here and there, but on the whole, the core of modern PC architecture is relatively clean.

The problems as I see them stem from:

1) standards bodies (like PCI, USB, etc.) which imbue their projects with "features" designed solely for vendor lock-in (and I'm not even getting into DRM yet), and,

2) Software. It's always the f####ing software.

@vertigo @Binder @cypnk indeed, i was really only thinking of software when i wrote that

@vertigo @thamesynne @cypnk The software isn't too big a deal as long as you plan ahead.

i.e., which technologies are you sure will still be around in 20-30 years?
C++, TCP, Ethernet, SQL

@Binder @thamesynne @cypnk The examples you list are exemplars of things you can plan for, yet each and every one of them is a dumpster fire individually, much less taken together.