what if instead of investing $500 billion in GPUs and data centers, we'd invested $500 billion in like... HyperCard
ridiculous premise, obviously. because if computers were extensible, efficient, easy-to-use machines that people could use to make things and solve problems, people wouldn't need to buy new devices every year to keep up with the system requirements of their software subscriptions

@aparrish

Moore's Law of capitalist production.

@haitchfive well yeah exactly https://gauthierroussilhe.com/en/articles/how-to-use-computing-power-faster convincing argument that software devs are essentially in the business of "wasting transistors" to keep up demand for compute, in order to justify bigger and bigger investments in chip manufacturing (which in turn drive other lucrative excess infrastructure development)
How to use computing power faster: on the weird economics of semiconductors and GenAI | Gauthier Roussilhe

On the economics of the semiconductor industry and its new variations with GenAI development.

@aparrish It's worse than that. I've worked out the econometrics of it. Most of the redundant work software devs do can be characterised as integration work, which is not really needed if companies commit to semantic standards. Ballpark $1.46T - $2.2T annually. That is, the entire output of a midsize western country burnt every year. Graeber is not around to write a new book about it, but somebody else should.
@aparrish At least some of us are beginning to notice this is not "normal".
@aparrish these days the web's conceptual lineage from hypercard does feel like one of recent history's crueler jokes.
@brennen @aparrish the mistake was connecting to other people/computers :)
@aparrish also, that would not help billionaires to solve the problem where, once every couple/few years, they need to buy a new superyacht that is marginally more luxurious than their current & previous superyachts

@aparrish God, if only.

14-year-old me is nodding vigorously.

@aparrish imagine the public transport infrastructure 500bn could buy!
@Thebratdragon @aparrish that'd get you almost 4 California high speed rails
@aparrish I was just having this exact conversation yesterday! We're all supposed to nod and go "Okay I can see that use case" when people say they use LLMs for boilerplate code or whatever...but is it worth spending $500B so that you don't have to like, configure your IDE's snippets?
@aparrish let the record state that it's 2025 and there's no official Delphi-like #OpenSource RAD IDE for, say, #python. (Python's official GUI toolkit is, laughably, TCL/Tk.) Not even $5 million can be scrounged for funding this by, say, the #Linux Foundation, a sum of money which is next to nothing for them.
Lazarus Homepage

Lazarus is a professional open-source cross platform IDE powered by Free Pascal

@david_chisnall @aparrish I think that's awesome, but it hasn't gained any mainstream traction to speak of (like python has). Just saying: I've coded in all of Hypercard, Pascal, Delphi, python, and TCL/tk, btw.

@d1 @aparrish

Yup, I've grumbled a lot about this. The Free Software movement largely missed the point, by almost never creating tools that made end users care about the rights that they had. MS Office has more end-user programming support than almost anything from the GNU project (with the exception of EMACS, and that's only because EMACS was a reimplementation of a Lisp Machine editor). You don't make Free Software successful by using licenses that require a law degree to understand, you make Free Software successful by writing software that makes end users exercise their rights to modify and distribute software and then complain when they don't have those rights in other software that they use.

@david_chisnall @aparrish great post. I agree. The Unix philosophy (which is to make small, modular CLI tools that you can pipe into each other) holds back such a comprehensive vision from appearing: a complete RAD GUI IDE - the sort that only a fairly large budget and team, headed by a benevolent dictator, not a committee - could create (like Delphi).

The #Unix philosophy falls on its face when it's time to create an OpenSource Delphi *that everyone would want to use, and gains traction*, a sizeable and formidable undertaking (not to mention any credible threat to Active Directory). This failing does not go unpunished: along comes Microsoft with their Visual Studio Code, filling the gap, and leading impressionable, naive #OpenSource newcomers astray to #Microsoft technologies like Azure, C#, etc.

@aparrish almost as though on cue, look what the #Linux Foundation just released:
https://www.phoronix.com/news/LF-Networking-Essedum-1.0
"Essedum covers data ingestion, pipeline orchestration, and model deployment for AI-powered networking solutions."
#AI #infosec #OpenSource
Linux Foundation Networking Releases Essedum 1.0 For AI-Native Network Apps

LF Networking, the networking group within the Linux Foundation, announced from the Open-Source Summit Europe today the release of Essedum 1.0

@aparrish I genuinely thought that HyperCard was that one slot on my ThinkPad that I've literally never used and don't even know what plugs into it...

but yeah, having other multimedia or hyperlinked stuff would be great? I always hoped we'd move away from the desktop model

@aparrish what about the electricity grid? next time there's bad weather there will be an outage...
@aparrish I remember HyperCard from the late 80s early 90s, everyone around me was pushing me to get into it, saying “Ian, this is perfect for you” etc. Months and months turned into years, I couldn’t get it to do one single thing at all – it just never worked.
@aparrish we could get like... HyperHyperCard!
@aparrish it makes me want to reactivate the FoxReplace browser extension so I can replace "ai"/"artificial intelligence" with "HyperCard" on webpages :D
@yhancik "can hypercard stacks suffer" lol
@aparrish it really does a good job highlighting the absurdity of it all
@aparrish Investing $500M into HyperCard would have done far more to make programming accessible than investing $500B into GenAI.
@aparrish if you define the latter as broadly as is permissible, then what we got for that is the world wide web, which of course the former is trying very hard to kill off right now
@aparrish CERN invested way less than that and turned HyperCard into the WWW. Or "networked hypermedia"...
@aparrish
I haven't laughed this hard in a while. Thank you.
@aparrish maybe we’d be able to get colour?

@aparrish Hmm... we kinda have seen that with browsers.

Software grows to the point that it can no longer be managed. Having limitations, for example in workforce, forces projects to keep complexity to a minimum and find "smart solutions" instead of "obvious solutions".
The result is that more and more resources are spent just to keep it running.

@aparrish If I had the resources, I would like to make a startup using all the tech keywords (containerized, infinite growth, fully unsupervised automation, and so on), but the product would be farms in shipping containers with robots doing the watering and picking. All the VC money would end up producing food, and if the startup failed there would still be these small farms everywhere
@aparrish @foone We took the wrong step years ago …

@aparrish

Better question - what if instead of spending $500 billion on GPUs and datacenters we invested in people?

@aparrish This makes me want to write a modern HyperCard (something I lack the skills to do).