[this post/thread is about how I feel, I'm not looking for arguments that I'm "wrong", other thoughtful interactions are welcome]

A coworker with years of industry experience just gave my team a demo of Cursor, an AI-powered code IDE. It was beyond impressive. He added two new features entirely by prompting, in minutes -- work that would have taken hours manually. It was astounding.

And I'm profoundly sad.

1/x

It was clear that he had previously invested time on this project with the AI tooling and had developed the skills to engineer the prompts, but he was doing the demo on the fly. And it's clear to me that there are whole swaths of software that can be "written" this way.

Eliding the work of code reviews and testing and understanding the pedigree of the modules it's using, etc: the future of some software development is going to be prompt engineering.

I don't want to be a prompt engineer.

2/x

I'm oddly grateful that this has all come about on the trailing edge of my career rather than in the middle. Indeed it might even accelerate my retirement from it.

For all that I hate it, the demo today convinced me that we can't ignore AI as a software development tool. It would be like telling someone we'd rather use a stone to drive in nails rather than a hammer.

But it's a tool that will remove any semblance of craft and artistry and creativity from software development.

3/x

An interesting part of the demo today was that the user had to know how to solve the problem and tell the AI what he wanted done to be really successful. This is something many junior devs struggle with, which tells me many of them won't be as successful with the new tool. But can they learn the skills while still using the tool or does using the tool hobble the learning?

What does this ultimately mean for how we grow new developers?

4/x

But ultimately I'm sad. I'm sad that I hate the new tool and everything about it (energy use, how it's trained, how it's used) and I don't want to use it -- and likely won't.

And that means that my time in the industry is probably coming to an end sooner rather than later.

I was not prepared for an existential crisis today.

5/5

I'm inordinately pissed how much this stupid demo (and the resulting existential crisis) has ruined my day. I'm almost tempted to go back to the fucking gym and throw* heavy things around to see if that helps.

* figuratively throw, let's be real.

@gairdeachas
As long as you post another post-workout pic, we support this approach.

(And I am genuinely sorry. I experience some of that with academic science research.)

@gairdeachas I feel you, I've been pretty mad about it myself for a while now. I wouldn't be if it were not in the context of neoliberal capitalism. Who cares if there are machines to do what you like to do, if you didn't rely on it to put food on your plate? Unfortunately, we do.

Time to grow more vegetables in the yard and get ready for an even hotter climate I guess.

@gairdeachas I share many of your feelings! But I wonder if we will still think it efficient to generate code via AI when we end up spending so much more time on the back-end debugging. When every debugging session begins with learning code no human has ever really worked with. And when we realize that the entry-level skill building never occurs and we lack the senior engineers who can think big picture and solve problems.

@disappearinjon I have these same questions about the actual benefit vs the downstream costs. My fear is that we (as an industry) will learn this lesson too late, and at great cost.

It almost feels like software will be disposable -- it's "easier" to have the AI write a new module than figure out why the old one was broken. That's clearly a bad idea, but is it unrealistic to see it going that way?

@gairdeachas I’ve seen so many supposedly senior developers make the same mistake, so it’s not unrealistic. And it’s true that ignoring the downstream costs is a way of life in tech…

Perhaps we can help rebuild the industry from its soon-to-be ashes with companies that think long term? Or is that delusional?

@gairdeachas I've got a ways to go until I can even conceive of retirement, but a change is likely. I'm not going to clean up robot vomit for a career.

@gairdeachas I told an online buddy late one night recently that I was looking unsuccessfully for a user-friendly way to do some multivariate statistics under linux. He got back to me after breakfast the next day with a fully functional online app. Through prompt engineering, as you say. Astonishing!

@gairdeachas It’s comforting to know I’m not alone in also feeling this way, for what that’s worth.

@gairdeachas 🫂

This is depressing to read, but I'm glad I did. I'm probably only a little past midway through my career and I've been mostly ignoring AI code writers, expecting that they'll never really be good enough to actually take good jobs away from software developers. I may have to reexamine that some day soon. I've seen enough from you on here to value your thoughts on these topics.

@gairdeachas I’ve had very similar feelings with software design. The “AI” generated stuff is ok. Not great, but not bad either. The future is here and it is meh.

@gairdeachas I feel this so, so hard. My feelings on AI are complicated, mixed, and not internally consistent — but I know that I don’t want to rely on it for writing code. It feels unethical and unprofessional. But that seems to be where the industry is headed.

I have no solutions for you, only sympathy and shared concern for the future.