Hold on to Your Hardware
I don't buy the central thesis of the article. We won't be in a supply crunch forever.
However, I do believe that we're at an inflection point where DC hardware is diverging rapidly from consumer compute.
Most consumers are using laptops, and laptops are not keeping pace with the frontier of a single compute node. Laptops are increasingly just clients for someone else's compute that you rent, or buy a time slice of with your eyeballs, much like smartphones pretty much always have been.
I personally dropped $20k on a high-end desktop - 768 GB of RAM, 96 cores, a 96 GB Blackwell GPU - last October, before RAM prices spiked, on the logic that datacenter hardware had moved on while consumer compute was basically stagnant, and if I wanted to own my computing hardware, I'd better buy something now that would last a while.
This way, my laptop is just a disposable client for my real workstation, a Tailscale connection away, and I'm free to do whatever I like with it.
I could sell the RAM alone now for the price I paid for it.
The thing is, other than AI stuff, where does a non-powerful computer limit you?
My phone has 16 GB of RAM and a terabyte of storage, and laptops today are ridiculous compared to anything I studied with.
I'm not arguing mind you, just trying to understand the usecases people are thinking of here.
> other than AI stuff, where does a non powerful computer limit you?
Running Electron apps and browsing React-based websites, of course.
Try Quickweather (with OpenMeteo) if you're on Android. I love it.
I'm giving up on weather-app bullshit at this point. Right now (literally this moment) I'm writing myself a Tasker script to feed hourly weather predictions into a calendar, so I can see them displayed inline with events on my calendar and, most importantly, on my watch[0] - i.e. in the context where it actually matters.
--
[0] - Having https://sectograph.com/ as a watch face is 80%+ of the value of a modern smartwatch to me. Otherwise, I wouldn't bother. I really miss Pebble.
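For anyone who wants to try the same trick without Tasker, here's a rough Python sketch of the calendar half (my own approximation, not the commenter's script): it turns a list of hourly temperatures into iCalendar events you can import into any calendar app. The data would come from an hourly forecast API such as Open-Meteo; the coordinates and query parameters in the comment are illustrative.

```python
from datetime import datetime, timedelta

def forecast_to_ics(hours):
    """Turn [(datetime, temp_celsius), ...] into a minimal iCalendar
    string: one short event per forecast hour, so the temperature shows
    up inline between real events on a calendar or watch face."""
    lines = ["BEGIN:VCALENDAR", "VERSION:2.0", "PRODID:-//weather-to-ics//EN"]
    for start, temp in hours:
        end = start + timedelta(minutes=30)
        lines += [
            "BEGIN:VEVENT",
            f"UID:wx-{start:%Y%m%dT%H%M%S}@local",
            f"DTSTART:{start:%Y%m%dT%H%M%S}",
            f"DTEND:{end:%Y%m%dT%H%M%S}",
            f"SUMMARY:{temp:.0f}°C",
            "END:VEVENT",
        ]
    lines.append("END:VCALENDAR")
    return "\r\n".join(lines)

# The hourly data itself could come from e.g. Open-Meteo (parameters
# assumed, adjust to taste):
#   https://api.open-meteo.com/v1/forecast?latitude=59.9&longitude=10.7&hourly=temperature_2m
# then zip the returned "time" and "temperature_2m" arrays into `hours`.
```

Writing the resulting .ics to a file and importing it (or serving it as a subscribed calendar) gets you the "weather inline with events" view without any app at all.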
Yr.no [1] is free and available in English, thanks to Norway. Apps are available as well.
Fun fact: you can kill all Firefox background processes, basically hand-crashing every tab, and just reload the pages in the morning. I do this every evening before bed. `pkill -f contentproc` and my CPU goes from wheezing to idle, releasing ~8 GB of memory on busy days.
("Why don't you just close Firefox?" No thanks - I've lost tab state too many times on restart to ever trust its sessionstore. In-memory is much safer.)
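Note that `pkill -f contentproc` pattern-matches against the command line of every process on the box. If you'd rather be more surgical, a small Linux-only Python sketch (my own, not what the commenter runs) can pick out just Firefox's content processes by scanning /proc; the `proc` parameter exists so the scanner can be pointed at a fake tree for testing.

```python
import os
import signal

def firefox_content_pids(proc="/proc"):
    """Return PIDs whose cmdline marks them as Firefox content
    (per-tab) processes: those carry '-contentproc' in their argv."""
    pids = []
    for entry in os.listdir(proc):
        if not entry.isdigit():
            continue  # not a PID directory
        try:
            with open(os.path.join(proc, entry, "cmdline"), "rb") as f:
                argv = f.read().split(b"\x00")  # NUL-separated args
        except OSError:
            continue  # process vanished or is unreadable
        if b"-contentproc" in argv:
            pids.append(int(entry))
    return pids

def crash_tabs(proc="/proc"):
    """SIGTERM each content process; the parent Firefox survives and
    every tab becomes a reload-in-the-morning placeholder."""
    for pid in firefox_content_pids(proc):
        os.kill(pid, signal.SIGTERM)
```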
Yeah, I found this out the other day when my laptop was toasting. In hindsight, probably related to archive.today or some Firefox extension.
You have to close Firefox every now and then for updates, though. The issue you describe seems better dealt with at the filesystem level with a CoW filesystem such as ZFS: versioning and snapshots become a breeze, and your whole homedir could benefit.
I kind of hate how the web has become this lowest-common-denominator software SDK. Web applications are almost always inferior to what you could get from an actual native application built for your platform, but we end up with web apps because the web is more convenient for software developers and easier to distribute. Everything is about developer convenience. We're also quickly running out of developers who even know how to build and distribute native apps.
And when, for whatever reason, having a "desktop application" becomes a priority to developers, what do they do? Write it in Electron and ship a browser engine with their app. Yuuuuuuck!
It seems like as hardware gets cheaper, software gets more bloated to compensate. Or maybe it’s vice versa.
I wonder if there’s a computer science law about this. This could be my chance!
Sorry to burst your bubble:
https://en.wikipedia.org/wiki/Wirth%27s_law
Not exactly the same (it's about power rather than price), but close enough that when you said it, I thought, "oh, there is something like that." There are also more fundamental economic laws at play - supply and demand of a resource, efficiencies of scale, etc. Given our ever-increasing demand for compute compared to the increasing supply (cheaper, more powerful compute), I expect supply will bottleneck before demand does.
Ah, so you think there’s a point where actually bloat slows because we eventually can’t keep up with demand for compute?
I guess this might be happening with LLMs already
If only. At work I've got a new computer, replacing a lower-end 5-year-old model. The new one has four times the cores, twice the RAM, a non-circus-grade SSD, and a high-powered CPU as opposed to the "U"-series chip the old one had.
I haven't noticed any kind of difference when using Teams. That piece of crap is just as slow and broken as it always was.
"chrome uses 2gb of ram"
These days individual _tabs_ are using multiple GB of RAM.
I think it's a correlation vs. causation type thing. Many Electron apps are extremely, painfully slow. Teams is pretty much the poster child for this, but even Spotify sometimes finds a way to lag, when it's just a freaking list of text.
Are they slow because they're Electron? No idea. But you can't deny that most Electron apps are sluggish for no clear reason. At least if they were pegging a CPU, you'd figure your box is slow. But that's not even what happens. Maybe they would've been sluggish even using native frameworks. Teams seems to do 1M network round-trips on each action, so even if it was perfectly optimized assembly for my specific CPU it would probably make no difference.
The issue isn't usage, it's waste. Every byte of RAM that's used unnecessarily because of bloated software frameworks used by lazy devs (devs who make the same arguments you're making) is a byte that can't be used by the software that actually needs it, like video editing, data processing, 3D work, CAD, etc. It's incredibly short-sighted to think that any consumer application runs in a vacuum with all system resources available to it. This mindset of "but consumers have so much RAM these days" just leads to worse and worse software design instead of programmers actually learning how to do things well. That's not a good direction, and it saddens me that making software that minimizes its system footprint has become a niche instead of the mainstream.
tl;dr, no one is looking for their RAM to stay idle. They're looking for their RAM to be available.
I dunno man, I have 32 GB and I'm totally fine playing games with 50 browser tabs open along with Discord and Spotify and a bunch of other crap.
I'm not trying to excuse crappy developers making crappy, slow, and wasteful apps; I just don't think Electron itself is the problem. Nor do I think it's a particularly big deal if an app uses some memory.
I have no issues with browsers specifically using a bunch of resources. They are complicated as fuck - basically their own operating system. Same for video games or programs that do heavy data processing.
The issue is with applications that have no business feeling entitled to large amounts of resources. A chat app is a program that runs in the background most of the time and is used for sporadic communication. Same for music players, etc. We've had these sorts of things since the '90s, when high-end consumer PCs had 16 MB of RAM.