As much as I'm all in favour of blaming AI bros for the RAM shortage, there was always going to be a squeeze on RAM prices for some reason eventually, and we really did make sure it would be maximally painful for ourselves when we started treating RAM as functionally free.

RAM prices in 2026 are about the same as they were in 2008 when the original MacBook Air launched. It had 2GB of RAM and it ran like a goddamn dream. There is absolutely zero technical reason we couldn't ship an even better computer with 2GB RAM today — except everything these days is a JavaScript app running in a dedicated Chromium instance and needs at least its own gigabyte to run around in.

That said, it's probably pretty safe to blame AI bros for everything being a Chromium instance — I'm willing to bet it's largely the same people trying to ship something that just about works with the absolute minimum of time and effort now as it was then

@andrewt "Absolute minimum of time and effort"

Yup, that describes most LLM users I know.

@andrewt while I generally agree with you about our RAM hogging problem, OpenAI literally cornered the market on RAM wafers:
https://www.tomshardware.com/pc-components/dram/openais-stargate-project-to-consume-up-to-40-percent-of-global-dram-output-inks-deal-with-samsung-and-sk-hynix-to-the-tune-of-up-to-900-000-wafers-per-month

It is absolutely correct to blame the AI bros for this one.

@rysiek @andrewt The beginning of Andrew's post is a bit clickbaity, so to speak, but I don't think he meant to deny that the AI bros caused the current squeeze; he was more pointing out that some kind of issue like this was bound to happen eventually and we should've been better prepared.
@flesh @rysiek @andrewt Do you have a personal bunker?
@bunny @rysiek @andrewt That's classified and not really what was meant by "prepared".
@flesh @andrewt @rysiek It's the same mindset, though
@bunny @rysiek @andrewt I mean, not really? Like, the underlying idea is to not take resources (in this case, RAM) for granted and be ready in case of a shortage. It's less "personal bunker" and more "don't cover your room in redundant low-efficiency lightbulbs running all the time".
@flesh @andrewt @rysiek warmmm
@andrewt What's less bad: an application built as a Chromium instance, or a native application that ends up unavailable to most users because Linux and Windows don't implement the Cocoa API of macOS?
@PinoBatch I mean yes, cross-platform releases are a great thing — but the day before Electron came out we already had the JVM and .NET/Mono, so I'm not sure what Electron contributed there except that now we can make our cross-platform apps in a language with no native support for integers, with UIs that diverge wildly from native OS conventions
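
For anyone who hasn't bumped into it, "no native support for integers" here means JavaScript's only Number type is a 64-bit float, so exact integers run out at 2^53 - 1 (BigInt exists, but as a separate bolt-on type). A minimal sketch, nothing Electron-specific assumed:

```typescript
// JavaScript/TypeScript: every Number is an IEEE 754 double.
const big = Number.MAX_SAFE_INTEGER;   // 9007199254740991, i.e. 2^53 - 1
console.log(big + 1 === big + 2);      // true: integer precision is gone
console.log(0.1 + 0.2 === 0.3);        // false: the classic float artefact

// BigInt (ES2020) gives arbitrary-precision integers, but it's a distinct
// type and can't be mixed with Number in arithmetic.
console.log(9007199254740993n + 1n);   // exact
```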

@andrewt The JVM has been owned by Oracle since 2010, and the ASF was frustrated by its inability to license the Technology Compatibility Kit (TCK) for Harmony. Mono's reimplementation of .NET Windows Forms always felt janky and crash-prone in my experience as a user.

I see WebAssembly as a successor to the JVM, except less opinionated about a particular object-oriented paradigm and not encumbered by One Rich American Called Larry Ellison.

#java #dotnet #wasm

@PinoBatch Oh yeah, there are some very good reasons JVM and .NET didn't catch on like Electron did — but they do at least prove that cross-platform app development can be done with reasonably sized dependencies shared between apps.

Heck, Tauri manages that, and that's just Electron but using the OS' native webview. Even if we accept that the web is the best shared abstraction we have, we don't need to ship a whole browser with every app

@andrewt I think one of the reasons that Electron caught on as opposed to Tauri's approach of using the system HTML renderer is that Apple WebKit was missing so much compared to Blink and Gecko. See "Progress Delayed Is Progress Denied" by Alex Russell.
https://infrequently.org/2021/04/progress-delayed/
@andrewt @PinoBatch and additionally, I'm pretty sure it's possible to make web apps that don't require absurd amounts of memory, even if you're using JS and not WASM.
@andrewt @PinoBatch I'm not sure it's a good reason, but most 'applications' today don't really have a protocol; they're intended to run an up-to-date collection of minified JavaScript directly off the platform provider's cloud server, and an out-of-date collection of JavaScript is not supported.

It's possible to repackage that collection of JavaScript as an Electron 'application', but it's not really feasible to replicate and maintain it in a different language.
@PinoBatch @andrewt The least bad is a project that doesn't have absurdly OS-specific requirements and can be compiled for any reasonable platform without needing to rely on a massively bloated abstraction.
@flesh @andrewt I don't know what counts as a "reasonable platform" given that Microsoft, Apple, and the free desktop community haven't agreed on a GUI API other than the Web.
@andrewt I fear the hidden issue remains the US/China trade war. AI tech has been around for a while, e.g. in Nvidia GPUs for DLSS (AI frame generation), so the past year's boom isn't so unexpected for anyone who has really worked in the field. Yet it's as if producers didn't upgrade their production lines much.
I think they fear that some sudden US or China policy could make part of that international trade collapse and put these investments at risk, and the risk is very high.
@andrewt I love that the game developers Mega Crit are making sure Slay the Spire 2 will be playable on the worst potato PC they can find. We need more of that state of mind (meanwhile, Path of Exile is hammering the f*** out of the CPU? Come on!)
@BrKloeckner @andrewt Yeah, I feel like we should have a list of modern apps that don't waste resources, and because they're pretty scarce, they'd get promoted just by doing that. And most people who aren't fans of buying a new computer every 5 years would flock to them first. And rightly so. Having an optimized program should be a major thing.

@andrewt moore's law's greatest contribution to computing is in allowing software developers to optimize their code less and less as the years have passed

who cares if it takes 10x the RAM if current hardware is 20x as powerful? what, some people are using old hardware? well they should just upgrade it then! i hardly see how that's my problem

it's a symptom of overconsumption, the attitude of not doing more with less, but just throwing endless resources at any roadblock you run into

@andrewt I hate that every major app is now a massive 500MB of RAM behemoth, because every app includes a Chromium instance plus an entire UI framework, etc.

@andrewt "there was always going to be a squeeze on RAM prices for some reason eventually,"

it's like saying "we ran out of water on Earth, but we were going to exhaust it for some reason eventually" xD

Sorry, but it's clear. Big data centers require RAM. This memory was bought with money that doesn't exist. I don't have that kind of money, as I need to work every day to afford a basic meal, but Sam Altman and Microslop could afford it by fucking up the economy.

It's clearly the work of big tech.

@andrewt Also, 32GB of RAM is still not really needed for most personal-use computers.

If you don't play games, make water simulations, run your own server or learn how to allocate big chunks of memory in C, 16 gigs seems like a fair amount.

If you just scroll Instagram or TikTok, having 32GB of RAM seems like a waste of memory XD

Idk man. It's sad that something which used to cost pennies now fucks up my finances, only because big tech wants to get rid of personal computers in the first place.

@andrewt I feel insulted.
I am not an AI bro, but I am a person who hates writing C code for GUIs and stuff; all non-browser GUIs suck to write.
@patricus To be fair, the browser GUI APIs are hot garbage as well
@andrewt better to write than most though
@andrewt May I temper this with two counter-arguments:
1/ the quality of visual media has exploded (4K video has roughly nine times the pixels of 720p, for example; quick pixel math below), needing more RAM to process
2/ virtualisation (namely Docker) has become a thing, and you need RAM to spin containers up on the fly (and Docker is cool for leaving the AI bros behind, did I hear self-hosting?)
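
For the pixel math in point 1, a rough sketch of raw frame sizes (this ignores codecs, which change the real-world numbers a lot):

```typescript
// Raw (uncompressed) frame size: width x height x 4 bytes (RGBA).
const frameBytes = (w: number, h: number) => w * h * 4;

const hd720 = frameBytes(1280, 720);    // ~3.7 MB per raw frame
const uhd4k = frameBytes(3840, 2160);   // ~33 MB per raw frame
console.log(uhd4k / hd720);             // 9: 4K is nine times the pixels of 720p
```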

@skro Yeah, definitely high-def video is a big driver — the last model of Chromecast has as much RAM as the OG MacBook Air, and Google generally like to go "cheap hardware with fast software" where they can.

Docker's a tricky one. Certainly the industry has leaned heavily into distributing containers with all the dependencies — Electron, Flatpak, Docker... Everything's very sandboxed, which is mostly good, but it's also a bit inconvenient and not terribly efficient. In theory we should move everyone to an OS that can sandbox _without_ all this extra stuff, but in real life that's less practical than simply manufacturing an unlimited supply of RAM 🙃

That said, my understanding is that Docker is pretty light, all things considered. On my dev machine right now, I have nine containers running, and Docker is using about the same amount of RAM as VSCode. Then again, I have no idea what running all the same services natively would use. Or, indeed, what they'd use if everything but the database weren't written in Node (which is of course just Extremely Headless Chromium)
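
If you want to check the same thing on your own machine, here's a minimal sketch; it assumes the Docker CLI is on your PATH, only handles MiB/GiB figures, and tallies what the containers themselves use rather than the daemon's own overhead:

```typescript
import { execSync } from "node:child_process";

// Ask Docker for each running container's current memory usage.
const out = execSync(
  'docker stats --no-stream --format "{{.Name}} {{.MemUsage}}"',
  { encoding: "utf8" }
);

// Each line looks like "my-db 512.3MiB / 7.67GiB"; we want the part before the slash.
const toMiB = (s: string) =>
  s.endsWith("GiB") ? parseFloat(s) * 1024 : parseFloat(s);

const totalMiB = out
  .trim()
  .split("\n")
  .filter((line) => line.length > 0)
  .map((line) => toMiB(line.split(" ")[1]))
  .reduce((a, b) => a + b, 0);

console.log(`Containers are using roughly ${Math.round(totalMiB)} MiB in total`);
```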