Tempted to write a post arguing that software development lost the plot a long time ago, and that the recent LLM developments are merely the icing on that cake. Software these days is not the painstaking work of people like @bagder or @hyc or @vitaut, who write the best code they possibly can. Over the past decade, "the software world" has been developing in a very different way than that.
@bert_hubert Please write that post!
@bert_hubert I hope you do write it! I would read it.

@bagder @hyc @vitaut @bert_hubert

Hear hear.
Write it.

For me, software development went to shit when "software engineers" started blinking their eyes in confusion at third normal form in relational databases... And it's only gotten worse.
I don't think they even teach Knuth's TAOCP anymore.
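(For anyone blinking right now, a minimal illustration with entirely made-up data: third normal form just means a non-key column must not depend on another non-key column, so each fact is stored exactly once.)

    # Hypothetical order records, denormalized: customer_city depends on
    # customer_id (a non-key column), not on the order_id key -- a 3NF
    # violation, and the reason the city must be updated in many rows at once.
    orders_flat = [
        {"order_id": 1, "customer_id": 42, "customer_city": "Delft", "total": 9.99},
        {"order_id": 2, "customer_id": 42, "customer_city": "Delft", "total": 4.50},
    ]

    # Normalized to 3NF: the transitively dependent column moves to its own table.
    customers = {42: {"city": "Delft"}}
    orders = [
        {"order_id": 1, "customer_id": 42, "total": 9.99},
        {"order_id": 2, "customer_id": 42, "total": 4.50},
    ]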

@bert_hubert Dunno. We complained about carelessly flung-together software back in the nineties. There might be more of it now, but there is also much, much more software now.

Similarly, people complain that they can’t find a plumber who cares. Maybe software is just a regular craft now?

@partim @bert_hubert

Did we have npm in the nineties? I think that's an example of what Bert is pointing to. We were certainly moving in that direction, but the days of "wget -O - http://www.trustme.org/install-malware.sh | sh" hadn't really come yet.

@abhayakara @bert_hubert But then, the early oughts were the heyday of email viruses and people slapping together snippets of PHP they found on the Internet without understanding what they did.

The groundwork for OpenSSL becoming somewhat problematic was laid during that time as well.

@partim @bert_hubert

I also wonder why the "software was so much better in the past" trope resonates with so many devs. Of course it ran on hardware that was a lot weaker - it was bound by what was available.

But "all" programs, up to OS crashed constantly and it was perceived as absolutely normal. The required RAM doubled with every Windows release for 20 years. Configuring LAN was a nightmare for non experts. Every hardware needed a custom driver. The web was slow and full of crazy IE tweaks

@ghost_letters @partim @bert_hubert I remember the crashes, but I also remember (and have re-experienced) how running e.g. an older version of Windows on contemporary hardware feels so much more responsive than modern versions (or often modern Linux, or macOS, or similar).

I don’t necessarily think they were actually faster, but they were better at giving feedback.

@ghost_letters @partim @bert_hubert It really depends on what you do/did.

I'd say the web was better before those absurd JS UI frameworks really picked up (driving efficiency and usability into the ground), but then I also just didn't use the nightmarish Flash/Java proprietary malware sites either, so those aren't relevant as a counter-point for me.

Games did somewhat improve in their stability, though for all that they're still not amazing.

(There is no good reason for a game to ever crash, absent some manner of hardware failure like memory corruption.)

@bert_hubert You are not wrong. In many ways, this is a natural result of the "race to the bottom" that software has been put on by the "move fast and break things" mentality it got from Silicon Valley.

That said, this accelerates it in such a way that I have the feeling we might finally hit rock bottom.

Also, I wish more people acknowledged the ethical hell that LLMs represent in code, but I guess not enough people in software care about ethics for that to really make a difference.

@ainmosni @bert_hubert

I feel like taking refuge in the idea of hitting rock bottom is the modern equivalent of imagining that the apocalypse is imminent and there's therefore no point in trying to fix things (which I think is why armageddonism is so popular).

@abhayakara @bert_hubert Fair, although that is not my intention. I am fighting it and trying to fix things, but that doesn't stop me from acknowledging that it feels more than a little quixotic.

@ainmosni @bert_hubert

I hear you. I guess I'm arguing that imagining that this work is quixotic is unnecessarily self-deprecating. This work is essential. It's just that not everybody understands that yet. The future is here now, just not evenly distributed.

@bert_hubert I think (internet) speed and "unlimited" storage have been a major factor in the decline of code quality.
You don't get punished (enough) for writing WET or sub-optimal code.
Reminiscing about the days when entire games and software packages fit on a floppy disk, and when we made websites under 20KB (including images).
@regularlabs @bert_hubert Exactly! Developers should develop on/for 10-year-old hardware. If it doesn't run smoothly, it is badly written or bloated.

@bert_hubert

Tired: LLMs for ‘vibe coding’
Wired: AI for zero-day hunting?

Wouldn’t mind the as-yet-virtual post touching on that. Testing is software engineering too, and it's ripe for more automation.

https://securityaffairs.com/189131/ai/anthropic-claude-opus-ai-model-discovers-22-firefox-bugs.html

Anthropic Claude Opus AI model discovers 22 Firefox bugs

Anthropic used Claude Opus 4.6 to identify 22 Firefox flaws, most of which were high severity, all of which were fixed in Firefox 148.

Security Affairs

@bert_hubert @bagder @hyc @vitaut

When I started the Varnish Cache project, I explicitly tried to dial code quality up to 11, as an experiment to see if that was a feasible strategy.

With fewer than 20 CVEs in 20 years, I think we have provided an existence proof that "artisan code" is a valid way to produce high-consequence software (see also: SQLite).

But at the same time, we are very far from "install and forget" when you have to patch once a year.

@bert_hubert @bagder @hyc @vitaut

The downside of having so few CVEs is that they are useless for statistics, which is why I'm so glad @bagder is doing that work in #Curl

@bert_hubert it rhymes with flooding the zone with shit. Billionaires win if users and product owners stop expecting quality, because then there's no longer a point in becoming a good dev. In Silicon Valley they at least gave those folks a sense of ownership and pride, but now that is threatening their businesses, because high performers can leave if they don't agree with the company politics. If poor quality is the norm, they can instead hire poor, mediocre devs who won't complain.

@bert_hubert

I think it would be incredibly useful to have a curated list of projects striving for a similar level of code quality.

@bert_hubert I can summarize the post and point the finger at the responsible party in one sentence: middle management ruined software development.

@elricofmelnibone @bert_hubert let me add: the big consulting smoke-sellers ruined the whole profession.

Yeah, I’m looking at you, Accenture. And you, McKinsey. And at everyone else in that ‘trade’.

@elricofmelnibone @bert_hubert Like basically everything they touch.
@bert_hubert @bagder @hyc @vitaut I mean yes, but mostly because the outcome doesn't matter...
@Di4na @bert_hubert @bagder @vitaut if the outcome truly doesn't matter then it's probably software that didn't need to be written in the first place.

@hyc @bert_hubert @bagder @vitaut Well, it depends. For the user, yes.

For the people being paid, and having to provide plausible lies to investors to keep being paid, no.

Value judgements are rarely that absolute. Do I think we would be better off in a world where we didn't need so many plausible lies as the main way to pay software devs? Yes.

But the (inefficient) byproduct is a lot of paid software devs doing FOSS so...

Software Quality Collapse: When 32GB RAM Leaks Become Normal

Apple Calculator leaks 32GB RAM. VS Code leaks 96GB. CrowdStrike crashes 8.5M computers. How we normalized catastrophe—and the $364B spent avoiding the fix.

From the Trenches
@MortonRobD @bert_hubert @bagder @hyc @vitaut I'm a bit uncomfortable with the fact that this article sounds like the output of an LLM, though (complete with weird diagrams, tell-tale sentence constructions, and some technical arguments that a knowledgeable engineer probably wouldn't have made 😅)
@jpetazzo @bert_hubert @bagder @hyc @vitaut
Interesting. It looked to me like it was originally done as a presentation - bullet points etc. Can you identify some of those technical arguments please? Genuine interest.

@MortonRobD @bert_hubert @bagder @hyc @vitaut

Ah, if it was initially a talk, that might explain the weird pacing.

Technically, the part that made me wince was this:

“Today’s real chain: React → Electron → Chromium → Docker → Kubernetes → VM → managed DB → API gateways.

Each layer adds “only 20–30%.” Compound a handful and you’re at 2–6× overhead for the same behavior.

That's how a Calculator ends up leaking 32GB. Not because someone wanted it to—but because nobody noticed the cumulative cost until users started complaining.”

That "real chain" looks like a weird salad of unrelated tech, totally out of place in the context of the memory leaks mentioned in the rest of the article.

I personally think that the problem isn't so much the compounding of layers in itself (we already had embarrassing memory leaks in the 90s, when the stack was much simpler) as the lack of investment in quality and customer support; and that comes (imho!) from enshittification and the desire to extract as much value as possible for as little cost as possible.
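(An aside on the arithmetic in that quote: taking the article's "20-30% per layer" at face value, a quick back-of-the-envelope check of my own, not from the article, shows the quoted 2–6× range does roughly work out for a handful of layers.)

    # Rough sanity check of "each layer adds 20-30%" compounding
    # (hypothetical per-layer overheads, chosen only to match the quote).
    for layers in range(4, 8):
        low = 1.20 ** layers   # every layer adds 20%
        high = 1.30 ** layers  # every layer adds 30%
        print(f"{layers} layers: {low:.1f}x to {high:.1f}x overhead")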

𝔰𝔥𝔦𝔟𝔠𝔬 (@[email protected])

Attached: 1 image In 2015 I was on a beach in Hawai'i helping build the prototype of what became Signal. I argued that the app needed pseudonyms because abusers know their victims' phone numbers. I lost the fight that day. History proved me right, and Signal would move to usernames under @[email protected]'s stewardship. In this new essay, I trace the line from Barlow's Declaration of Independence of Cyberspace through smart-home forensics, metadata killings, and Archive Team's non-consensual Tumblr scrape to ask: when did we decide that a jpeg is a photograph, that a profile is a person, that storage is memory? The answer involves a boat off Honolulu, the early days of Signal, Iran's missiles over Amazon's Dubai AWS facilities, and the communities already building for a world where the server goes dark. This is an essay about infrastructure, memory, archiving without consent, and what we lose when we mistake the filesystem for memory. It is also the angriest and most personal text I've ever written. I'm furious, and you should be too. We bet an entire civilisation on a brutal and unreliable stack. Now, fate has come to collect that wager. California has a lot to fucking answer for. https://newdesigncongress.org/en/pub/who-will-remember-us-when-the-servers-go-dark/

post.lurk.org
@bert_hubert fair to say we lost the plot not long after the dot-bombs. "Ship it" was more important than whether it even functioned as described. People continually talk shit about dot-bombs from a place of ignorance.
But they cared about building something that worked. Customers wouldn't come if it didn't work.
Now the model is "fuck you, you have to use shitware, so why would we give a damn? Here's a worthless 'feature' nobody wanted so a manager could hit a LoC metric."
@bert_hubert @bagder @hyc @vitaut I always suspected as much, but lack the insight. So a post would be welcome.
@bert_hubert @bagder @hyc @vitaut yes please do or I'll have to get an llm to do it 👻
@bert_hubert @bagder @hyc @vitaut I call that process the "crapularity". It's a play on the "singularity", which proposes that technology will propel itself to become better and better... but the other way round. Technology gets worse and worse while consuming more and more resources (both computers and people).