All the devs saying that Anthropic’s code quality is “normal” are telling on themselves and everybody they’ve worked with
(Also supports what many have been saying about software quality being a crisis that precedes LLMs, but that’s another story)
@baldur the software crisis is definitely what enabled vibe coding, I feel.
When making software was an artisan process, it was hard for corporations to scale it and treat programmers like cogs in a machine.
So they've been trying more and more to fit software development into a neat mold, essentially dumbing down the process, to the point where making an app has become "just slap together some libraries and hey presto".
That has 100% laid the foundation for LLM-generated code, I feel.
@hostia @hyc @raffaella @baldur > Also a job is a job, no need to frame coding as something "super special that only special people should do".
I agree coding shouldn't be regarded as some kind of special job only special people can or should be doing.
But that being said, I hope everyone cares about the job they're doing and tries to understand how and why they're doing it the way they are, regardless of what the actual job is!
That "anyone" looks pretty elitist to me as I care for a blind 91 year old with dementia and severe recall deficits.
@dalias @baldur Yeah this is fair imho!
What I was mainly trying to convey was a sense of caring about what it is you're building as a developer.
That's the thing that's lacking imho from corporate coding practices. The workers don't have to care about the software, they don't have to understand the whole thing, they just need to solve the ticket and move on.
This mindset fundamentally undermines the quality of any software project. And also perfectly lays the foundation for LLM-generated code.
@Tamtam @Tijn @baldur The reasons for caring are different. I care for both reasons in my work, but I think it's somewhat elitist to demand that, in order to work in this field, someone has to view it as an artisanal craft.
Someone can reasonably view it as purely a job, but still respect that it's a job where people's safety is on the line if they fuck it up.
The reason I bring this up is that too often, when we focus only on the artisanal aspect, the pro-AI and AI-curious crowd sneer at it the way they would if we were expecting everyone to buy handmade furniture or hand-sewn clothing: fields where there is certainly a reason to respect the artisanal element, but where nobody's safety is on the line when you don't, and where most business-minded people aren't going to respect it.
@McNeely @Tamtam @Tijn @baldur Not every software project is, but basically any public-end-user-facing software deployed commercially is, because it's dealing with private data pertaining to the user that could endanger the user if it gets in the wrong hands.
This *shouldn't be* how it is. None of this software should be phoning home, outsourcing operation to cloud services, etc. But that's the way it is now, and as long as it's like that, it needs to be regulated as load-bearing, safety-critical software.
@Tijn @baldur I’d actually disagree. Treating engineering as artisanal activity is what led to rot in every instance I witnessed it.
It’s once you remove engineering from development that building software devolves into “slap libraries together and call it steampunk”.
And corporates love that. Something about being able to bullshit your way every direction makes managers so damn happy. Not having engineers ask hard questions is a cherry on top.
@slotos @baldur I agree perhaps artisanal was a bad word to use.
As I've tried to explain in another reply to this, the main thing I'm trying to convey is a sense of care that I'd like to see from the developer.
It's that care that slows things down, which is why corporate coding practices don't focus on it at all. "Just make sure it passes the tests" is what we get instead.
As I see it, LLM-generated code is just the next logical step down a line that was misguided in the first place.
@Tijn @baldur @angiebaby
Other industries are trying to replace skilled information workers with bots too.
I’m an automotive parts interpreter who specialises in crash repairs, and the number of BS generated estimates that come across my desk that are completely wrong is astonishing!
And when you provide the correct estimate to the repairer, they are in denial. “The computer said I need these parts, *you* must be wrong”.
Then they call back a week later for me to order and supply the correct parts.
We also refuse credit returns on parts orders that haven’t been vetted by us.