I used AI. It worked. I hated it.

https://lemmy.world/post/45089084


cross-posted from: https://programming.dev/post/48191305

> Or maybe that’s just me. I’ve been writing code for a good chunk of my life now. I find deep joy in the struggle of creation. I want to keep doing it, even if it’s slower. Even if it’s worse. I want to keep writing code. But I suspect not everyone feels that way about it. Are they wrong? Or can different people find different value in the same task? And what does society owe to those who enjoy an older way of doing things?
>
> If I could disinvent this technology, I would. My experiences, while enlightening as to models’ capabilities, have not altered my belief that they cause more harm than good. And yet, I have no plan on how to destroy generative AI. I don’t think this is a technology we can put back in the box. It may not take the same form a year from now; it may not be as ubiquitous or as celebrated, but it will remain.
>
> And in the realm of software development, its presence fundamentally changes the nature of the trade. We must learn how to exist in a world where some will choose to use these tools, whether responsibly or not. Is it possible to distinguish one from the other? Is it possible to renounce all code not written by human hands?
>
> https://taggart-tech.com/reckoning [web archive: https://web.archive.org/web/20260402210313/https://taggart-tech.com/reckoning/]

I want to keep doing it, even if it’s slower. Even if it’s worse.

You haven’t tried AI long enough. It’s faster than you, but the code you create is way better.

AI is only superficially good. If you ask it to piss out some code that does something, sure, it’ll do it. The code will even be readable, well-formatted, and decently correct. Where AI fails is when the code lives in a particular environment, has compatibility constraints, or solves a particular problem in a complex system. Then AI will fail you spectacularly, and if you get lulled into thinking it’s cleverer than the idiot savant it truly is, you’ll get bitten hard.

Yeah, that’s cope. You can use it poorly, sure. Vibing will absolutely bite you in the ass.

But you can also feed it smaller, manageable chunks that you know are possible and straightforward, and then you can review those results and tweak and revise as needed. This is not vibe coding, but it is using AI in a way that will make you faster and more efficient.
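That chunked, review-gated workflow can be sketched in a few lines. This is a hypothetical illustration only: `generate_patch` is a stub standing in for whatever model API you actually use, and the slugify task and validation check are made up for the example.

```python
# Sketch of a review-driven AI-assisted workflow. `generate_patch` is a
# hypothetical stand-in for a real model call; in practice it would hit
# your LLM provider with a small, well-scoped task.

def generate_patch(task: str) -> str:
    """Stubbed model call: returns candidate code for one small chunk."""
    # Hard-coded plausible output so the sketch is self-contained.
    return "def slugify(s):\n    return '-'.join(s.lower().split())"

def validate(code: str) -> bool:
    """The reviewer's own checks gate every candidate before it lands."""
    namespace = {}
    exec(code, namespace)  # only safe here because the input is our own stub
    slugify = namespace["slugify"]
    return slugify("Hello World") == "hello-world"

task = "Write slugify(s): lowercase s and join words with hyphens."
candidate = generate_patch(task)
accepted = validate(candidate)  # human review plus tests, not blind trust
```

The point is the shape, not the stub: each chunk is small enough that you know what correct looks like, and nothing is merged until your own tests and review say so.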

For 95% of applications, it’s appropriate and efficient to use some level of AI. Maybe things like the Linux kernel don’t need it, but those applications are few and far between.

I’m not worried about being replaced. I’ve seen what it can do, for better and worse, and someone who knows what they’re doing needs to be in the loop. I don’t see that going anywhere anytime soon.

Really what I see happening is that projects that would take two months now take six weeks, but also end up more polished and feature complete than they would have otherwise. Maybe with that kind of efficiency gain you can cut a few people, but not that many. And most places can just use the extra 20% productivity.

If you’re doing it right and not vibing, it’s not the 2000% gain the techbros promise, but it is some gain.

Really what I see happening is that projects that would take two months now take six weeks, but also end up more polished and feature complete than they would have otherwise.

Which is not at all what we’re seeing in reality. Stuff is actively getting worse when AI code is used heavily (take GitHub, for example), and companies using AI for code aren’t producing any productivity gains or increased profits.

There is no sign of benefit on either side of the equation.

Well, those companies are pushing AI and vibe coding so hard it’s actually stupid. CEOs are on this bandwagon that AI is going to replace 80% of their labor costs.

It ain’t gonna happen. And trying it that way is going to do a lot of damage, both to people and to the companies.

It’s a lot like the dotcom bubble, not perfectly, but close enough that we should’ve learned more from it. There’s a lot of absolutely dumb shit going around, but eventually some of it is going to stick. I expect the stuff that sticks won’t be from people humping the hype train.

This is way way worse than the dotcom bubble. The dotcom bubble didn’t burn a fraction of the cash this shit has. And of course it’s all debt, so we all know what happens when that happens.

I’ve been debating all week whether to instruct my advisors to go to cash on my investment accounts. I think it’s a little early for that, but the closer we get to that September–October period, the more I’m going to consider it. I’m naturally a skeptic, and by habit I see catastrophe everywhere, but I think this one might actually be the big one. Maneuvering out of these bad ones takes a lot of government coordination, and, well, that worries me the most here.

Well, the other danger is hyperinflation, in which case having cash under your mattress is pretty terrible. I went with a Euro index fund.

The problem is that you can do it yourself in 8 weeks, AI assisted equivalent in 6 weeks… Or pure AI YOLO in a few hours.

Management prefers to take the risk, especially when they’re already hiring the cheapest people they can find, who won’t be great anyway.

My mindset: understand the task, prompt in detailed steps, validate each change, test thoroughly.

Management prefers to take the risk because they quite literally don’t give a toss. They understand exactly 0% of what it is that you do; if they don’t walk by you every morning, they don’t even know your name. And even if they get caught absolutely asleep at the wheel producing absolute slop, oh well, they pull the golden parachute cord and it’s off to the next gig.

Literally no one at my main place of employment can point at anything AI has produced that actually works, or does, well, anything. Yet we are all in. Everything is “Oh, Michael has something great that was created and we can’t wait for you to see it,” and Michael’s like either a) “Sure thing, Bob, we’re just fine-tuning it, can’t wait to show it off,” or b) “Yes, I can’t wait to share it with everyone,” and then literally never does.

It’s literally a circus. A woman on my team wanted to try it and wrote a pretty good prompt asking for something pretty basic. About an hour later she said, you know what, I’ll just do it myself; I could have had this done 45 minutes ago. It’s literal crap. Pages and pages of absolute walls of text. All fancy writing that feels insightful, but of course nothing actually useful.