My impression is that the first field that AI will massively disrupt is not medicine, not writing, and not education, but software engineering. The thing about software engineering is that the computer can often check its own answer, and iterate to a better one. Not so easy in other fields. So progress in AI writing software will be very, very fast.

I'm guessing there will still very much be a job for software engineers, but it's going to change fast.

@ben I’d put marketing slightly ahead of software, but agreed, it’s early on the list of disrupted.
@ben Yes and no. From what I see now, AI can handle tactical questions ("implement quicksort" or "write a regular expression"), but I doubt very much it can come up with a higher-level architecture, because there won't be such patterns out there on the Internet. And testing? It's not always easy even for humans to convert a requirement into a set of test cases.
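For context, this is the kind of "tactical" snippet being discussed — a textbook quicksort that today's models reproduce easily (an illustrative sketch, not any particular model's actual output):

```python
def quicksort(xs):
    """Classic recursive quicksort over a list; returns a new sorted list."""
    if len(xs) <= 1:
        return xs
    pivot, rest = xs[0], xs[1:]
    # Partition around the pivot, then recurse on each side.
    return (quicksort([x for x in rest if x < pivot])
            + [pivot]
            + quicksort([x for x in rest if x >= pivot]))
```

Tasks like this are well represented in training data, which is exactly the point: the pattern exists on the Internet thousands of times over.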
@SteveBellovin I agree that architecture is likely to be an area that resists AI a little longer but... I'm not sure how long. Check this out, via GPT4:
@SteveBellovin I'd add that, even if you're 100% right, there are a lot of software engineering jobs that are strictly tactical, with little architecture.
@ben @SteveBellovin There is probably a lot of software writing and testing that could be automated. Look back at the impact of static analyzers or sanitizers; it's not hard to project that forward with better tech. But "writing clear specs of what you want the software to do"?
@huitema @ben There is certainly a lot that can be automated—but there's a lot that can't. To me, it looks like a productivity enhancer, not a replacement, for today's programmers.

@SteveBellovin

100% agreed! For those using GPT variants to develop code, the nature of the job changes slightly, but it still takes someone who understands the problem domain to specify what is needed.

These LLMs seem like a tool like an IDE is a tool, only more so. And I love it!

@huitema @ben

@SteveBellovin @ben This, and the fact that it will never be able to debug anything, are exactly why I think its main use in programming will effectively be to augment Stack Overflow.
@vathpela @SteveBellovin never be able to debug anything? I'm not so sure.
@ben @SteveBellovin I think it'll be able to spot typos and even find copy-paste errors, and that may be nice but it's not debugging in any real sense. I haven't seen anything that makes me think it'll ever have a sense of what the task at hand is and where it's going wrong.
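An illustrative example (mine, not from the thread) of the kind of copy-paste slip a tool can flag mechanically without understanding the task:

```python
def bounding_box(points):
    """Return (min_x, min_y, max_x, max_y) for a list of (x, y) pairs."""
    xs = [p[0] for p in points]
    # A classic copy-paste slip duplicates the line above as
    #   ys = [p[0] for p in points]   # still indexes [0]!
    # A linter can flag the suspicious duplication; knowing *why* it is
    # wrong requires knowing the intent. Corrected version:
    ys = [p[1] for p in points]
    return (min(xs), min(ys), max(xs), max(ys))
```

Spotting the near-duplicate line is pattern matching; knowing that the second list was supposed to collect y-coordinates is understanding the task.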

@vathpela @ben @SteveBellovin
The main problem remains: either a task/problem is specified exactly, in which case all we need is a code generator.
Or there are vague requirements, inconsistencies, etc., which require a kind of "understanding" to resolve and to create solutions that fit the requirements.

Current LLMs are just not able to do the second thing.

And checking code for internal inconsistencies is something that belongs more in the area of formal verification.

These models might replace "copy&paste" programmers, so they might accelerate a trend of "writing code that doesn't work and nobody knows why".
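As a toy illustration of what "checking code against a spec" means (bounded exhaustive checking, far short of real formal verification; all names here are mine):

```python
from itertools import product

def clamp(x, lo, hi):
    """Candidate implementation under test: restrict x to [lo, hi]."""
    return max(lo, min(x, hi))

def meets_spec(x, lo, hi, y):
    """Spec: result lies in [lo, hi], and equals x whenever x already does."""
    return lo <= y <= hi and (y == x or not (lo <= x <= hi))

# Exhaustively check every small input combination with lo <= hi.
for x, lo, hi in product(range(-3, 4), repeat=3):
    if lo <= hi:
        assert meets_spec(x, lo, hi, clamp(x, lo, hi))
```

Real formal verification tools (model checkers, proof assistants) establish this for *all* inputs rather than a small finite set, but the shape of the question — does the code satisfy a stated property? — is the same.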

@wakame @vathpela @SteveBellovin Maybe, but my sense is that the ability to iterate, tweak, tell the LLM "no, not quite, more like that," with that iteration being effortless... could be hugely powerful.

@ben @vathpela @SteveBellovin
I think that's a very good point. Maybe in concert with more "old-school" refactoring tools and unit tests.

Ideally, over time, the "very bad" cases (bugs that take a week or more to find and fix) could be removed (or at least accounted for). We would (in a sense) be working on the same "super software repository", applying a bug fix not to a single application/library, but to a body of knowledge.

@wakame @ben @SteveBellovin we *already have that*, and it's not through ML: https://lwn.net/Articles/315686/
Semantic patching with Coccinelle [LWN.net]
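Coccinelle's semantic patches operate on C source. A rough Python analogue of the same idea — mechanically applying one fix across a whole codebase — using the stdlib ast module to rewrite every call to a hypothetical old_api into new_api:

```python
import ast

class RenameCall(ast.NodeTransformer):
    """Rewrite old_api(...) calls to new_api(...), wherever they appear."""
    def visit_Call(self, node):
        self.generic_visit(node)  # recurse into nested calls first
        if isinstance(node.func, ast.Name) and node.func.id == "old_api":
            node.func.id = "new_api"
        return node

src = "result = old_api(x) + old_api(y)"
tree = RenameCall().visit(ast.parse(src))
print(ast.unparse(tree))  # result = new_api(x) + new_api(y)
```

Coccinelle's SmPL rules are considerably more expressive (they match control flow, not just syntax), but this captures the "patch a pattern, not a file" idea.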

@ben You're assuming here there's a limit to what can be done with software. I don't think there is, which is why I expect the number of engineers to rise, and for the pace of change to further accelerate.
@DanaBlankenhorn No, I'm not making that assumption; I'm only saying that the *job* of the software engineer is likely to change faster than other jobs. You're absolutely right that this may require *more* software engineers over time; I'm not sure. It will just be a fairly different job.
@ben Software engineering is always changing. My dear heart started out 40 years ago as an Assembly programmer. Then COBOL, then C+, now a system architect.
@DanaBlankenhorn Agreed. I just think this change is going to be much bigger, much faster.
@ben Change has always been exponential for software engineers, but now it seems to be accelerating completely unchecked.

@ben There's the more obvious bit, where GPT can cobble together code that performs the task asked of it by cribbing bits and pieces from its training data. In that branch, the role of SWEs becomes that of validators and, in some ways, architects.

The longer-term evolution I'm thinking about is if/when the computer no longer needs code to deterministically instruct it what to do, but can take natural language as the "code" itself to execute. That scenario feels almost sci-fi: we'd be relying on systems with no understanding of how any of it works underneath.

@ben
I have been trying to integrate it into my workflow, and the current iterations still have a ways to go.

It has replaced Stack Overflow for me, and it is very helpful when I want it to explain how to interact with a new API. Beyond that, it's like an employee with a negative return on value: it kind of does the right thing but mucks it up enough that I waste more time than I would have spent just doing it myself.