| Codeberg | https://codeberg.org/nils-ballmann |
| GitHub | https://github.com/nils-ballmann |
All the devs saying that Anthropic’s code quality is “normal” are telling on themselves and everybody they’ve worked with
(Also supports what many have been saying about software quality being a crisis that precedes LLMs, but that’s another story)
Rubber Ducking
WOW! This is insanity!
Who’s responsible for rising RAM prices? One culprit: OpenAI.
They locked up 40% of global DRAM supply — with no obligation to actually buy any of it! Now that they've scrapped plans to expand their Texas data centre, prices are falling fast!
As a result, memory suppliers like Micron are in free fall!
https://thedeepdive.ca/openai-locked-up-40-of-global-ram-with-no-obligation-to-buy-any-of-it/

In October 2025, OpenAI CEO Sam Altman flew to Seoul and signed letters of intent with Samsung Electronics and SK Hynix — the world’s two largest memory chipmakers — targeting 900,000 DRAM wafer starts per month. Analysts estimated that volume at roughly 40% of global supply. South Korean President Lee Jae-myung stood alongside the chipmakers […]
People keep assuring me that LLMs writing code is a revolution, that as long as we maintain sound engineering practices and tight code review they're actually extruding code fit for purpose in a fraction of the time it would take a human.
And every damned time, every damned time, any of that code surfaces, as Anthropic's flagship offering just did, it's somehow exactly the pile of steaming technical debt and fifteen-year-old Stack Overflow snippets we were assured your careful oversight had weeded out.
Can someone please explain this to me? Is everyone but you simply prompting it wrong?
It's a good thing programmers aren't susceptible to hubris in any way, or this would have been so much worse.