All the "memory is cheap" developers can truly shut the fuck up now.
@cancel those were good years ;_;
@cancel @fwaggle we forgot to mention to them that “memory is cheap” doesn’t mean “don’t worry about memory leaks” or “you can load seven different copies of Chromium in memory”.
@mavetju @cancel @fwaggle only seven? those are rookie numbers!
@profan @mavetju @cancel @fwaggle
I'm sure they can vibe-code their way to less memory usage real quick.
/s
@jaykass @profan @mavetju @cancel @fwaggle there's gotta be an Electron-based MCP that runs an instance per request, right?
@jaykass @profan @mavetju @cancel @fwaggle if the computer crashes that brings memory usage to a nice clean 0b
@cancel how do you expect them to remember with how cheap their memory was? /s
@cancel no more electronJS
@louisrcouture @cancel I'm down to only two electron apps I still use without being told I have to.
@cancel
And just as I was fixing to upgrade mine. I've been putting it off, and now I'm sorry. I'm the queen of procrastination.

@Puck @cancel

Same. Guess I'll stick with what I've got until it falls down.

@cancel Wait, did something happen? My news intake is erratic...

(Update: I have now been informed about the LLM-driven demand for RAM, and Crucial in particular closing their consumer division.)

@woozle @cancel I am also confused.

@Infrapink @woozle @cancel the "AI" bubble has inflated RAM prices, and Micron is exiting the consumer RAM market to chase those AI dollars instead.

https://futurism.com/artificial-intelligence/ai-data-centers-ram-expensive

AI Data Centers Are Making RAM Crushingly Expensive, Which Is Going to Skyrocket the Cost of Laptops, Tablets, and Gaming PCs

On top of devouring energy, water, and graphics cards, the AI industry is now swallowing the world's supply of precious RAM.

@koorogi @Infrapink @woozle @cancel first it was GPUs and now this

@tragivictoria @koorogi @Infrapink @cancel

Hmm, is this maybe why MSFT went "plonk" 2 days ago? 🧐

@koorogi @Infrapink @woozle @cancel i genuinely find it amazing how badly hype cycles hurt actual people. You wanted to upgrade your machine? "Haha fuck you, pay more"

@tragivictoria @koorogi @Infrapink @cancel

In compensation, there will probably be a lot of really cheap high-end hardware hitting ebay when the bubble finally pops... (...assuming it doesn't take civilization down with it 🧐).

[plea for AI bubble to pop before destroying civilization, please sir may we still have some civilization left when you're all done, sir... 🧎‍♀️🙏]

@woozle @tragivictoria @koorogi @cancel

Unlikely. I'd love it if that were to happen, but the way companies work, they're more likely to just chuck their old computers in a dump.

@Infrapink @tragivictoria @koorogi @cancel

Hopefully there will be a thriving dumpster-dive initiative...

@woozle @Infrapink @tragivictoria @koorogi @cancel

Unfortunately they shred the entire server.

This has been going on for years. All the big data center operators get their machines at a special low price in exchange for shredding them after three years. It's a huge contributor to the poorly-documented "embodied energy" use of the industry.

@publius @Infrapink @tragivictoria @koorogi @cancel

Okay, add that to the list of things that need to be illegal after the revolution.

@publius @woozle @tragivictoria @koorogi @cancel

~Capitalism is SO efficient!~

@Infrapink @publius @tragivictoria @koorogi @cancel

Efficient at extracting wealth from the rest of us (and burying it in a pit), yup. 

@woozle @Infrapink @tragivictoria @koorogi @cancel

It's worth observing that even the radical market economists only claim that the market economy is the most efficient possible (by a completely circular definition of efficiency). Much as they might like to, they cannot claim this about capitalism, which inherently and inevitably involves what they call "market non-idealities", that is, violations of the basic assumptions of free enterprise.

@publius @Infrapink @tragivictoria @koorogi @cancel

Yah -- I've had "discussions" with a few capitalists, especially back in the day on G+, and... it's an awful lot like a religion.

@woozle you'd probably end up with some peeps in a random overseas factory transplanting RAM ICs from weird NVIDIA form factors onto regular DIMMs, like how there are laptop CPUs available with desktop-compatible interposer boards
@Infrapink @woozle @cancel RAM is becoming more expensive and some manufacturers have stopped selling to consumers directly because they're selling to data centers to use in gen AI/LLMs instead

@woozle @cancel One of the big makers of RAM, Micron, announced that they are only going to sell to AI data centers from now on. Their consumer products, like their "Crucial" brand, are all going away forever.

It'd be nice to open a new RAM factory in response, but the capital outlay for that is something like a billion dollars.

@madengineering @cancel I can't imagine that the LLM-driven demand will last more than another year; the whole thing is a financial shell-game -- companies giving each other free access, and the recipients counting it as "investment" which they can use to further overvalue their stocks...

(I feel like some metaphorical heads will eventually metaphorically roll over this; aren't those sorts of numbers supposed to be vetted by certified accounting firms?)

@woozle @cancel While this is my hope, the market can remain irrational longer than most people can remain solvent.

@madengineering @cancel

Yeah, it's definitely going to cause issues for any non-billionaires out there. 

@cancel

Thankfully Linux people don't seem to have that attitude. Even the big bells-and-whistles desktop environment will run comfortably on 4GB.

@argv_minus_one@mastodon.sdf.org I wouldn't be so sure. Have you ever seen malloc() return NULL in Linux?

Its memory manager writes cheques that it can't cash. This is why you see so many oom_kill problems.
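
A minimal sketch of the behaviour being described here, assuming a 64-bit Linux box left on the default heuristic overcommit (vm.overcommit_memory = 0): each individual allocation looks plausible on its own, so malloc() keeps succeeding even though the running total is far beyond RAM plus swap.

```c
/* overcommit_demo.c — a sketch of the "cheques it can't cash" point.
 * Under Linux's default heuristic overcommit, each of these 1 GiB requests
 * is judged individually, so the loop can usually "reserve" far more virtual
 * memory than the machine has RAM + swap without malloc() ever returning
 * NULL. The bill only comes due when the pages are written, at which point
 * the OOM killer picks a victim.
 * Build: cc -O2 overcommit_demo.c -o overcommit_demo */
#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    const size_t chunk = (size_t)1 << 30;   /* 1 GiB per allocation */
    size_t reserved = 0;

    for (int i = 0; i < 256; i++) {         /* try to "reserve" 256 GiB */
        void *p = malloc(chunk);
        if (p == NULL) {
            /* With strict accounting (vm.overcommit_memory = 2) you'd land
             * here long before 256 GiB. */
            printf("malloc() finally said no after %zu GiB\n", reserved >> 30);
            return 0;
        }
        reserved += chunk;
    }
    printf("kernel promised %zu GiB it may not be able to deliver\n",
           reserved >> 30);
    /* Intentionally not touching the pages: writing to them would force the
     * kernel to back its promises with real memory and likely wake the
     * OOM killer. */
    return 0;
}
```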

@xconde

I have a laptop next to me with 4GB and it runs KDE Plasma without incident, so yes, I'm quite sure.

@argv_minus_one @xconde yea, 4GB goes a very long way with even an un-tuned install. mint + xfce + disabled swap = my crappy chromebook runs great, I can launch a couple browsers and an IDE or two simultaneously, and I've only had it die from that when intentionally pushing the limits.

@xconde overallocation is a design decision, not a bug/problem. it's safer to kill apps with heuristics than to trust developers to handle malloc failures properly. a fatal malloc failure kills whatever process happens to be allocating at that moment, not necessarily the heaviest or most problematic one(s). treating allocation failure as fatal is a common pattern across languages that don't make the developer explicitly call malloc().

virtually no developer (and that includes Vel) wants to think about malloc failures outside of embedded/critical systems.

@argv_minus_one
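
A sketch of the pattern being alluded to, treating allocation failure as fatal; the xmalloc() name is just the usual convention for such a wrapper, not anything specific from this thread.

```c
/* The common "don't think about malloc failure" pattern: wrap the allocator
 * so that failure aborts the program instead of propagating NULL everywhere.
 * Many codebases call this xmalloc(); higher-level languages bake the same
 * behaviour into their runtimes. */
#include <stdio.h>
#include <stdlib.h>

static void *xmalloc(size_t n)
{
    void *p = malloc(n);
    if (p == NULL) {
        /* Nobody upstream wants to handle this, so bail out loudly. */
        fprintf(stderr, "out of memory allocating %zu bytes\n", n);
        abort();
    }
    return p;
}

int main(void)
{
    char *msg = xmalloc(64);
    snprintf(msg, 64, "allocation failure handling? never heard of it");
    puts(msg);
    free(msg);
    return 0;
}
```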

@xconde @argv_minus_one doesn't linux have an option to disable memory overcommitting? (the vm.overcommit_memory sysctl)
Overcommit Accounting — The Linux Kernel documentation

@unnick true, it does! I’ve never come across a distribution that enables it by default.
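
For reference, the current mode can be read straight out of /proc; the snippet below assumes procfs is mounted at the usual place and prints the same value as running sysctl vm.overcommit_memory.

```c
/* Print the current overcommit mode: 0 = heuristic (the usual default),
 * 1 = always overcommit, 2 = strict accounting. Equivalent to
 * reading /proc/sys/vm/overcommit_memory by hand. */
#include <stdio.h>

int main(void)
{
    FILE *f = fopen("/proc/sys/vm/overcommit_memory", "r");
    if (f == NULL) {
        perror("fopen");
        return 1;
    }
    int mode;
    if (fscanf(f, "%d", &mode) != 1) {
        fclose(f);
        fputs("couldn't parse overcommit mode\n", stderr);
        return 1;
    }
    fclose(f);

    const char *desc =
        mode == 0 ? "heuristic overcommit (the usual default)" :
        mode == 1 ? "always overcommit" :
        mode == 2 ? "strict accounting (no overcommit)" : "unknown";
    printf("vm.overcommit_memory = %d (%s)\n", mode, desc);
    return 0;
}
```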

@xconde @argv_minus_one you’re making a different argument than the one you’re responding to

argv is saying common linux desktop environments use a modest amount of RAM, and you’re saying, at a technical level, the allocator doesn’t reject requests

but that's unrelated: you can have a malloc() that is careful not to run out of real memory and still have user applications on top of it that are super disrespectful memory hogs, using every page they can get

@argv_minus_one @cancel back 20 years ago, 4 GB was f'ing HUGE

@usul while Vel wasn't alive 20 years ago, it's quite certain that low-end devices were not driving 1920x1080@60 24-bit displays back then. triple-buffering wasn't used, since there were no fancy smooth animations, etc.

using more RAM is often for the purposes of speeding things up and ultimately reducing the load on the CPU/GPU/disk.

@argv_minus_one @cancel

@risc @usul @argv_minus_one

macs were triple-buffered by default in 2001 (mac os x); windows was triple-buffered by default in 2006 (vista)

@cancel got it, apologies. should have checked.
@usul @argv_minus_one

@risc @usul @argv_minus_one

don't worry it's not that big of a deal.

@cancel
Something is different though. On our laptop almost half the RAM is used by the iGPU driver's RAM spillover. This was not the case 5+ years ago.

That (and websites getting fatter) is the main reason it regularly OOMs.

Got any info/hypothesis on that?
@risc @usul @argv_minus_one

@curiousicae sounds like either a bug or strange settings... vel's iGPU only takes a significant amount of RAM when it's actively running something heavy

@cancel @usul @argv_minus_one

@risc @curiousicae @usul @argv_minus_one yeah, usually you allocate a small fixed amount of memory for the iGPU in the BIOS, for software that can't dynamically allocate it for some reason (certain old APIs?), and then let the GPU driver carve out more memory for graphics from shared RAM as needed using the proper APIs.
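
A rough way to watch that carve-out happen, assuming the card is driven by the amdgpu kernel driver, which exposes per-device memory counters in sysfs; the card0 path is an assumption, and the older radeon driver may not provide these files at all.

```c
/* Print how much dedicated VRAM and shared "GTT" memory the GPU is currently
 * using, via the counters the amdgpu driver exposes in sysfs
 * (mem_info_vram_used / mem_info_gtt_used). The card0 path is an assumption;
 * adjust for your system. */
#include <stdio.h>

static long long read_counter(const char *path)
{
    FILE *f = fopen(path, "r");
    if (f == NULL)
        return -1;
    long long bytes = -1;
    if (fscanf(f, "%lld", &bytes) != 1)
        bytes = -1;
    fclose(f);
    return bytes;
}

int main(void)
{
    const char *base = "/sys/class/drm/card0/device/";
    const char *names[] = { "mem_info_vram_used", "mem_info_gtt_used" };

    for (int i = 0; i < 2; i++) {
        char path[256];
        snprintf(path, sizeof path, "%s%s", base, names[i]);
        long long bytes = read_counter(path);
        if (bytes < 0)
            printf("%s: not available (different driver or card index?)\n",
                   names[i]);
        else
            printf("%s: %lld MiB used\n", names[i], bytes >> 20);
    }
    return 0;
}
```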

@cancel
Most of the used memory is “GTT” (horrible Linux kernel jargon for “RAM dynamically allocated by the GPU” - took me years to understand that radeontop line).

I think the static assignment is actually for the iGPU, so that it can function without the graphics driver being present.

Seems like it's feeling modest right now though (usually GTT is easily twice that) – really doubt AMD suddenly fixed anything for a "legacy" GCN 1.0 card in their driver:
@risc @usul @argv_minus_one

@cancel

That is a good example of using more RAM to make things faster and smoother. It's just that it became an example earlier than @risc thought (because RAM was already fairly plentiful in 2001).

Like, you know how gamers these days say to run video games in windowed mode instead of full screen to make them run smoother? That's because the window compositor is triple-buffering the game. In full-screen mode, this doesn't happen (unless the game itself does triple buffering).

@usul

@risc
why in the everloving name of god did we ever design for more than 1024x768 anyway
@usul @argv_minus_one @cancel

@ozzelot

As someone who is very much enjoying his new 3840×2160 screen, it's because I can see *everything* on this huge damn screen. I love it so much.

Now if apps could go ahead and stop wasting space on unnecessary padding, that'd be great. I'm using a desktop with a mouse, not a phone!

Although I do have a touchscreen laptop, so… 🤷‍♂️

@risc @usul @cancel

@argv_minus_one
You can stuff so many 1024x768s in that
@risc @usul @cancel

@ozzelot

10.546875 of them, to be precise: 3840×2160 is 8,294,400 pixels, 1024×768 is 786,432, and 8,294,400 ÷ 786,432 = 10.546875.

@risc @usul @cancel