Last year Mark Russinovich and I wrote a paper on how AI will redefine the Software Engineering profession, and how early-in-career engineers may see an "AI Drag" while seniors see a significant boost. The paper was published today in Communications of the ACM (Association for Computing Machinery).

The real story, though, is that we propose a program in which companies invest deeply in the early-in-career (EiC) pipeline: not just hiring juniors, but giving them formal *preceptors* (a model borrowed from nursing) so that we build a strong pipeline of new senior engineers ready to take on the coming industry challenges.

Please read and share if it resonates with you!

https://dl.acm.org/doi/10.1145/3779312

@shanselman
Scott, the paper frames the problem well but undersells one angle: the mid-tier. There is a huge population of devs who are not senior by title or years but have deep domain knowledge from living inside a product or industry for a long time. Those people get just as much of a boost as seniors, sometimes more, because they know exactly what to ask for and what to reject. The binary of "senior vs EiC" misses that.
The preceptorship model is solid in theory, but I am not sure it addresses the deeper problem. My wife teaches first-semester procedural programming and second-semester OOP at university level. She switched to a flipped-classroom model: students are engaged in lectures, homework comes back looking good, no questions, no pushback. Then exam time hits and the majority fails. My suspicion is that LLM-generated homework mimics what a professor would expect closely enough that it does not flag as plagiarism, but the students never actually internalized the material. The "cognitive debt" the paper describes is already showing up in education before these people even enter the workforce.
And from what I see with EiC colleagues who are already working: they use LLMs heavily on a daily basis with the same prompt-and-accept pattern. If that habit is already baked in by the time they start their first job, I am not confident a preceptorship program changes the dynamic. The mentor can explain and guide, but if the junior has no instinct to question AI output because they never built that muscle in school, you are fighting uphill.
The core thesis is right: stop growing juniors and you run out of seniors. But the pipeline problem might start earlier than the paper assumes, at the education level, not the first job.
@shanselman
Following up on the "senior vs EiC" framing: I think domain experts deserve a closer look here. Someone with deep domain knowledge who can produce specs that verify outcomes in a consistent, non-gameable way (think end-to-end acceptance criteria where the testing logic is not leaked to the LLM so it cannot optimize for passing while missing the point) is arguably doing the hardest part of the job. They are defining what correct looks like under real conditions.
That is the exact skill the paper identifies as the bottleneck: verification. But it roots that skill in engineering seniority, when in practice it often lives with the person who knows the domain cold. Implementation is what AI is getting good at. Knowing whether the result actually solves the real problem is not an engineering judgment call; it is a domain judgment call.
The concession is that for things like concurrency, security architecture, and systems design, domain knowledge alone is not enough. But for a large share of actual product work, the person who can say "here is what done looks like, prove it without seeing my rubric" is more valuable than the person who can write the code. The paper's hierarchy flips in those contexts.
So the talent model is not a pyramid with seniors at the top and EiCs at the bottom. It is more like a matrix where domain depth and engineering depth are separate axes, and AI compresses the engineering axis while making the domain axis more important than ever.
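The "non-gameable spec" idea above can be sketched in code. This is a minimal illustration, not anything from the paper: the function names (`slugify`, `acceptance`) and the properties are my invention. The point is that the domain expert's acceptance checks are phrased as properties over randomized inputs and kept out of the model's prompt, so an implementation cannot be tuned to pass a fixed answer key while missing the point.

```python
import random
import string

# Hypothetical implementation under test; in practice this would be the
# AI-generated code, which never sees the checks below.
def slugify(title: str) -> str:
    out = "".join(c.lower() if c.isalnum() else "-" for c in title)
    while "--" in out:
        out = out.replace("--", "-")
    return out.strip("-")

# The domain expert's hidden acceptance properties: they define what
# "done" looks like under real conditions without leaking a rubric.
def acceptance(impl, trials: int = 500) -> bool:
    rng = random.Random(42)  # seeded so failures are reproducible
    alphabet = string.ascii_letters + string.digits + " _.!?"
    for _ in range(trials):
        title = "".join(rng.choice(alphabet) for _ in range(rng.randint(0, 40)))
        slug = impl(title)
        # Property 1: output uses only URL-safe characters.
        assert all(c.islower() or c.isdigit() or c == "-" for c in slug), slug
        # Property 2: no leading, trailing, or doubled separators.
        assert not slug.startswith("-") and not slug.endswith("-")
        assert "--" not in slug
        # Property 3: idempotence; slugifying a slug changes nothing.
        assert impl(slug) == slug
    return True

print(acceptance(slugify))  # randomized properties, not a leaked answer key
```

An implementation that merely memorized a handful of example inputs would fail here, which is what makes this style of spec harder to game than a fixed test file handed to the model.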

@shanselman hey Scott, it's a very good analysis, and a solution that might work well in a different society. In fact, that's the society I would prefer to live in. I would love to help the EiCs along, as I already try to do, but there are very few of them in our society.

I know you two are the academics in this discussion, but we have to address the real-world problem the EiCs face. It's not a lack of support from experienced engineers, it's a lack of support from employers.

Our bosses tell us they would love to hire more help, but they don't have the money. They want us to use AI *instead* of hiring less efficient EiCs and mentoring them.

Mentoring may be a requirement for long-term success and stability, but the venture and private-equity owners of just about everything are not concerned with anything long-term. Make it big and brash so that some other poor slob, or maybe Wall Street, will buy the company; if the products die afterward, fine. Get us to IPO or sale, and no further.

Sure, a small company without the VC/PE overlords could build infrastructure to support the future of Software Engineering but they would be smashed, stolen, or bought out long before they could be helpful.

I weirdly long for the days before the automated trading I cut my teeth on, back to the before times, when you invested in a company for more than a few hundred microseconds.

When the requirement was solid returns rather than constant growth, and the owners at least appeared to care about customers.

We live in a society that gets stuck every few Tuesdays because a vibe-coding aesthetic lets big companies rush updates without testing for the wide swath of the landscape they take up.

I'm sorry. I'm afraid we don't live in a society that can hear a good solution to an existential problem and think, "yes, let's spend more and work slower so that someone else can do this when we retire." Instead, the idea will, at best, be given quiet lip service and ignored during the next five quarters of layoffs, by which time the books will look good enough to hand the whole thing over to someone else.
[Please note, I don't think I'm the kind of person who usually walks in and shits on everyone else's fun. I just feel really strongly that as good as your ideas are, they have to be sold to a small number of people who likely don't care about the problem you are trying to solve. I think you have to add a crazy amount of economic result for the powers to take it seriously. Without their buy-in, ... well, I've got to tell the AI to hit the backlog now, there's no one below me to do it.]
@shanselman What about enabling senior mentorship without any AI instead? Seems like a best of both worlds approach.