Michael J. Nicholson

@michjnich

> Where do those of us who enjoy the coding part of development go from here?

I've been thinking about the same thing these days. The answer may be to somehow join forces and open slop-free shops with like-minded people. I don't know if there's a market for that, though, as it seems all the decision makers drank the Kool-Aid.

@decibyte @michjnich

I'm privileged enough to work at a small company where we get to choose our own tools. This means that nobody is asking me to use AI at all.

My guess is that companies that stick to making solid systems with no AI code in them will end up doing better on all measures and hence be more competitive in the end. AND, when the bubble bursts, they won't lose anything. I'm certainly not going into "vibe coding", but those who are may be in for a rough awakening when it all comes tumbling down, as described in https://www.wheresyoured.at/the-enshittifinancial-crisis/

The Enshittifinancial Crisis (Ed Zitron's Where's Your Ed At)

@agger @decibyte I listen to most of Ed's stuff on his podcast :) I'm 99% sure there's a crash coming, and hopefully sooner rather than later ... but AI is here to stay. My suspicion is it may be more local models doing things once it's all calmed down, because the whole Claude/GPT hosted stuff just doesn't seem to be financially self-sustaining. But we'll see, and I don't think I'm clever enough to predict what's left over after the crash :)

@michjnich @decibyte

I do believe some good can come of locally running, single-GPU and specially trained LLMs doing very specialized tasks. Translation is one of the things they're actually good at. What I don't believe is that the ChatGPTs and Copilots of this world will ever become truly useful, let alone profitable. So I agree 🙂

@agger @decibyte There's one thing I don't believe LLM activities will ever be, and that is 100% reproducible. This is a feature, not a bug, and to me, it makes them unreliable for a whole lot of use cases.

But yes, they definitely have their uses. Many of which are related to fraud, scams, harassment, propaganda and other illegal activities. Along with a few genuinely useful cases too of course :)

@agger @decibyte Actually, what I'd like to try at some point in the future (personal energy permitting) is to set up a home-automation JARVIS kind of thing, but just on my home net: some speech recognition, integrated with other smart stuff, carefully selected to be stuff that doesn't insist on phoning home and being constantly connected in order to work (if that even exists any more ...). So, a home Alexa without all the spying on me :) I feel that should be possible ...
@decibyte That's a nice idea - I had wondered if there'd be a place for people who could sell themselves as "fixers" for the mess AI had made of a codebase :) Fact is, though, it is getting a lot better, and I read predictions from people like Simon Willison (https://simonwillison.net/2026/Jan/8/llm-predictions-for-2026/) and it does worry me ...
LLM predictions for 2026, shared with Oxide and Friends (Simon Willison's Weblog)

@michjnich I honestly don't know if I'd enjoy that, as it means exposure to the slop. They'd have to pay good money for it, at least :)

But I also don't think it's going to happen. Managers everywhere are so invested in this now that backtracking like that will be professional suicide.

My idea is more like your local, biodynamic farm producing vegetables people know are nutritious and of good quality. But for code.

@decibyte I think you just summed up rather eloquently why I'm looking forward to the crash/bubble popping. I'm hoping a lot of the "AI in all the things, whether users want it or not" will go away after that. I _suspect_ we'll be left with a lot of smaller, local AI usage over the big players of today. I do think we're stuck with it in some form now though.

I like your "biodynamic code" analogy ... and maybe, just maybe, there will be a place for "artisanal coders" once all the dust settles.

@decibyte Here's hoping anyway.

I just spent today making a fairly small change using an agent, since we're being pushed that way at work, and it took me longer with the agent than it would have taken to do manually. Some of that is probably about me needing to teach it to run linting/formatting/type checking/tests etc. and react accordingly, but even with all that, this was only a handful of files, and I reckon I could have done it in about 30 minutes ...

@decibyte @michjnich maybe in the future, a low-code solution will mean "using less code" as in "less stuff can break, less stuff to maintain, faster and more performant, cheaper to scale"

And "less code" and complexity doesn't point towards AI (although maybe for "post-processing").

@benjaoming @decibyte How do you mean, @benjaoming? We've seen low-code tools come and go for years. Or do you mean that complex code will become "prompts as code"?

@michjnich @decibyte "low-code" as in "low amount of code" 😁

The actual low-code systems are an income generator for future developers, as they tend to need a lot of help. And now we're seeing the same for AI.

AI will produce more complexity because it favors self-contained changesets that don't integrate very nicely. An example from the Python world: unnecessary local imports:

```
def view_written_by_ai(request):
    from .models import TheModel  # <= why?
    ...
```
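For contrast, the conventional shape hoists the import to module level, assuming no circular-import problem is forcing it into the function. A minimal sketch with illustrative names, using a stdlib module in place of a Django model:

```
# Module-level import: resolved once when the module is loaded, visible
# to every function in the file, and easy for readers and linters to find.
import json


def view_written_by_hand(payload):
    # No per-function import needed; json is already in scope.
    return json.loads(payload)
```

A function-local import is occasionally the right call (breaking an import cycle, deferring a heavy dependency), but AI-generated code tends to reach for it by default, which is what makes the pattern a smell.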

@benjaoming @decibyte Ah, yes. I see a lot of that. And the "need a lot of help" part is a big thing, I think. It's hard to unpick how much the folks really loving the AI stuff are actually becoming Doctorow's "reverse centaurs" here ... maybe it's more smoke and mirrors than it seems?

@michjnich @decibyte while reading your great blog post (thanks for sharing!!) I was btw also wondering:

We're all aspiring to improve ourselves, as developers.

Industry/society wants better developers and better software.

Then instead of spending the resources on improving the developers (we've always been improving btw), we get this ENORMOUS diversion of resources into AI. It's like theft.

That money should have been spent on education for developers.

@michjnich @decibyte My IDE, PyCharm, for instance, is also suffering from this: at the end of the day, they're building AI features I didn't want, while other areas are neglected.
@michjnich @decibyte Someone tooted the other day (sorry, I don't remember who) that the worst thing about AI is how it steals the narrative. That's also true. We forget to develop our own methods and tools and keep doing what we did, because there's this whole AI issue to tackle (which is obviously threatening to pull the rug out from under us).