I first learned how to program in 1984 at 14. The tech press said I'd be obsolete by 25, due to age.

Around 1990, the tech press said the Japanese were building fifth-generation computers that would make me obsolete.

In 2000, the bursting of the dot-com bubble was said to make me obsolete.

Since then there have been neural networks, no-code, and more, all set to make me obsolete.

Now it's LLMs.

Excuse me while I sit here and don't panic.

#rant

EDIT: This blew up. Muting the thread for some peace and quiet.

@liw There were things that pulled work away from programmers, though. Much of that was hidden at the time by the growth in demand.
Excel, BASIC, some expert systems, HyperCard, dBase and friends all enabled an army of not-really-programmer people to get real work done without having to become programming experts of any kind.

None of them hallucinated or ate entire data centres for lunch. Their output was predictable, if slow, and they generally kept working across upgrades.

@etchedpixels That is very true, of course.
@liw The really sad thing is that the tools that were heading towards making some programming obsolete, or at least much easier - tools built on formal proofs and graph theory - all got thrown under the bus when the shiny "AI" stuff came along.
@etchedpixels @liw Programmers who think they need LLM coding assistants really just need better languages and libraries.
@mason @etchedpixels @liw Both, often enough, are not the coder's choice...
@mason
So true. I saw this on here a couple of weeks ago.
@etchedpixels @liw

@markotway @mason @liw I actually don't think anyone has been gatekeeping programming

The problem for a lot of use cases is, IMHO, different. It's really hard to build a way for people to express a problem so that a tool can do the job properly. In narrow fields, or with constraints, we're OK-ish at it (e.g. HyperCard), but with complex problems, not so much.

Whether you can take a tiny bit of 'AI' and combine it with machinery that really knows how to compose patterns is a really interesting question.

Better slushpiles and slushpile indexes might be enough! Stack Overflow briefly overwrote that skill.

@mason @etchedpixels @liw

@clew Would you rephrase that? I've missed your meaning and I'm curious what you meant.

“Slushpile” used to be all the code one had written, with employer secrets removed, whether it had ever seen use or not, organized so that you could reuse work. Data and UI structures that were not a good fit for earlier features might be useful now. Shake them out, improve while trying, put them away a little better.

I worked on a big team that had a nascent group slushpile for its giant long term software. This was way before git so harder than it would be with modern merging.

@mason

@clew Fascinating. I've never encountered the term. Re-reading your reply with that new understanding. Thank you.

background: I believe programmers borrowed “slushpile” from publishers. Publishers use it to mean all the submissions that come in without an agent, so, derogatory. For programmers it kind of means the manuscripts the publisher sent back, I guess.

@mason

@clew Makes sense. Most of my life I've been programming for myself so I've rarely had to maintain a concept for rejected work.

One time was funny. It was my first pure software engineering job. We broke up a task and I had a piece to write. We were all told to use Hungarian notation. I hated it but I did it.

I was the only person who did it. No one else had paid attention. So despite some well-deserved embarrassment from the others, I was the one who had to rewrite all my code, as the combined effort to rewrite everyone else's code would have been too much.

I didn't know it at the time, but this was to be a signpost for my entire career to date. :P

Do you have a place or concept for code you tried and put aside?

(“Nobody else did the obligatory thing” - argh, familiar.)

@mason

@clew It lives forever in a forgotten corner of my CVS^H^H^HSubversion^H^H^H^H^H^H^H^H^H^HGit server.

yeah, I SAY they were organized, but “one directory, grep, and hope” was probably more common.

@mason

@clew I want to favourite this harder, but there's only the one button.

@etchedpixels @liw Formal proofs? You mean, something that requires writing a clear, well specified definition of what you want a system to do?

An LLM (the equivalent of a pub conversation on requirements) is always going to be more attractive to most.

@mbpaz @liw Most of your proofs are implied before the project - like not scribbling on things. For a lot of other stuff you then have a standard interface definition so that guides most of the rest of it.

Imagine a Linux driver of a given class. If there's a formal description of that interface, then a formal-methods-based tool can verify that you meet the interface, that you meet the general rules for the kernel, and that you meet the language's don't-scribble rules.

and it's way cheaper than debugging!

@etchedpixels @liw
I spent a healthy time in a university dept surrounded by maths types whose version of "vi vs emacs" was "Coq vs QuickCheck" (aka theorem proving vs model checking) - but both meant, you know, studying things and writing arcane spells. Effort, time. The unthinkable.

@mbpaz @etchedpixels @liw

I saw very expensive CASE tools become "shelfware" because the prevailing business culture was to wing it. Also known as the Why the Hell Isn't Somebody Coding Yet? (WHISCY) methodology.

@mbpaz @etchedpixels @liw don't even try to do EAL5+ on this. The result will be dramatic, at best.

@etchedpixels @liw

I think the serious control systems people (FBW aircraft, industrial controls etc.) would disagree with you there.

I'd like to think the automotive people would be the same, but from what I've seen recently I fear not. 😠

@etchedpixels @liw

The problem with formal proof systems is messy requirements and change control. Which is why

Sociologists can be Surprisingly Useful in Interactive Systems Design (October 1997)

https://archive.cs.st-andrews.ac.uk/STSE-Handbook/Papers/SociologistscanbeSurprisinglyUsefulinInteractiveSystemsDesign-Sommerville.pdf

AI doesn't help with the Requirements Engineering problem.

@etchedpixels @liw

BASIC was around long before that. The others were designed and written by programmers.

@liw I'm not going to lie; to me this time *feels* different.

I learned to program at about the same time you did, though I was younger then. And it might not *be* different; it might just be that I'm more prone to worry now.

@datarama @liw I feel the difference is in the power structures that keep pushing this "new paradigm". We're talking about some of the worst and most powerful people on this planet; it ultimately doesn't matter to them if they make the world worse for everyone while doing it - they have an incentive to do so and they will profit from it.

@jarizleifr @datarama @liw when did we run out of rope? Can't remember how to build a guillotine?

We all know how to deal with those people. Hint: January 1793, France.
We just need to start doing it.

@jarizleifr @datarama @liw

This.

Companies are getting away with being ever more rapacious, and their managers have become lazier, more ignorant, and more reckless.

American business disregards legitimate issues unless they come in the form of a court order.

@jarizleifr @datarama @liw The older changes were not predicated on the normalisation of theft.
It's been said that implementation is what matters, not ideas; the promise of this tech is that the implementation is also now worth little.

@datarama @liw I'm of three minds myself: maybe LLMs will take over (whether they're any better or not, the history of programming is full of such mistakes); maybe they'll fall by the wayside; maybe they'll become a useful tool, in effect the next step in the evolution of programming languages, but with skilled programmers still needed.

I'm also at the point where I can retire whenever I want, and perhaps that is just as well.

@oclsc @liw I'm middle-aged, not wealthy enough to retire early, and unfortunately can't do much of anything anyone would pay me for aside from developing software - so I'm very worried that I'll soon be pretty much fucked.

@datarama @liw Yeah, were I 20 years younger I'd be worried too.

(But I'm an unusual case--no kids, no mortgage, high savings rate. A dozen years ago I was concerned with the stability of my job, checked with my financial advisor, and discovered I'd have been OK even had I stopped working for keeps then. Would that everyone were in that position.)

@oclsc @liw No kids, no car, and I've paid off most of my apartment. I could live on a considerably lower wage if necessary; I'm mostly concerned that there won't be any work someone like me can do at all. I'm on the autism spectrum, have severe asthma, and I'm in terrible shape.

@datarama @liw I forgot to mention no car! I do have three cats, but they don't cost nearly as much as kids.

It will not surprise you that I am on the spectrum too. My late wife was probably borderline.

It was really freeing to me when I wised up and asked the financial advisor. (And was glad I already had one, courtesy of my credit union.) Heartily recommended if available. Fed him my current assets and amounts of the several pensions I had pending, highballed a guess of how much I'd want to spend per month, he plugged the numbers into a program that made reasonable assumptions about inflation and computed I'd be OK until at least age 90. Likely much longer now.

That leaves the problem of boredom/feeling useful--I enjoy my work and that I am helping people who deserve it. (I work at a university.) But I've developed enough other interests that I ought to be OK, and tested it some years back when I got burned out and took several years off working.

In any case I hope you find a more comfortable solution.

@oclsc @liw I don't have a lot of savings (currently I've focused everything on paying off the rest of my apartment as soon as at all possible), and won't be able to draw on my pensions for a long time (retirement age in my country is 70).

If I can latch on to my current job for just 3 more years I should be able to finish paying off my apartment, and then my expenses will be very low.

(I have one lizard, and he costs even less than a cat.)

@oclsc @liw But I dunno. Perhaps, as per OP, there isn't really anything to be worried about.

@datarama @liw I hope there isn't.

Suggestion re financial projections is because I was worried, for other reasons, and realized I could get an objective opinion to tell me whether the worry was justified. As a lifelong Aspie with anxiety problems I recommend that tactic.

In the earlier burnout, I felt trapped in a job that was burning me out; no energy to look for another. What can I do, I wondered? I can't just quit, can I? Then I realized that was a question I could answer, by looking at my then-current savings and my spending rate. Turns out I could. So I did.

Not recommending you quit, of course, just trying to find objective measures of whether and how much to worry.

@oclsc @liw I never really made "big bucks". Devs where I live aren't paid anywhere near what the Internet tells me American ones are (I'm not complaining; I'm comfortably middle class and make more than I spend). I spent ten years teaching CS at a community college rather than working in industry (with much less pay) - so, I don't have a large pile of savings.

But, well, I know I can live where I currently do on less than half of what I currently make (because I've done that in the past). I could also remortgage the apartment if need be.

To be completely honest, it's more the loss of identity that bothers me than it's economic anxiety alone. I spent pretty much my whole life on programming.

@datarama @liw I can understand that. I managed to develop other pieces of identity out of other interests, not that that's trivial. Also set up an idiosyncratic home lab where I can pursue my own idiosyncratic projects.

@oclsc @liw (I've spent the last three years in deep depression, because not just programming but *every* creative interest of mine feels like it's being rendered pointless by AI.

Yes, I am in therapy.)

@datarama @liw I hear you. Felt somewhat in that direction during my burnout, though not as strongly as it sounds like you do. And my wife was subject life-long to depression, occasionally severe and multi-year.

The big thing to watch out for is inertia--just sitting around being unable to start things. If you can get up and start doing anything at all it will help. I'm in moderate burnout right now (spent all of 2025 caring for my dying wife) so this is top of mind for me.

If it helps, I am convinced that really creative pursuits will survive, as long as you're doing it for its own sake and not because you want others to be impressed. A good friend is a really skilled amateur photographer as well as a now-retired programmer and sysadmin, and he's not worried at all.

But it depends on the setting. You need to find what works for you. A therapist is a good resource.

@oclsc @liw Aside from programming, I've played bass and modular synths, I've made pixel art, I've drawn with pencils on paper, I've written short fiction - but at the moment I can't shake the overarching dread that AI is making it all pointless. So I deleted all my personal-project code and destroyed my drawings (which was perhaps a dumb thing to do, but - well, deep depression.) The inertia phase is what I've dealt with these last few years. I've *mostly* kept myself able to work my actual job (though I've had to go on sick leave a couple times), but everything else feels pointless now.

I think I'm just fundamentally not cut out for the world they're building now, to be honest.

(And programming has sort of had a special place for me. It's the only activity I've ever known that kept enough of my brain active that some stupid hindbrain process doesn't spin off into a loop that goes to terribly dark places.)

@oclsc @datarama @liw If you feel the urge to program, download a game engine. That will keep you busy, lol.
@steter @datarama @liw I always used to say the only computer game I really liked is operating systems. My home lab reflects that. (And some of my odd history, though the old stuff is in temporary abeyance while I bring the current infrastructure back into reasonable shape.)
@oclsc @datarama @liw Kernel work is much easier, overall, lol. These things are huge. Unreal Engine is gigs of C++ to the max. They have a nice blueprint system where you can "write" C++ using images representing blocks of code. It's kind of fun, when it's not frustrating.
@steter @datarama @liw My interests have long centred on making things smaller and simpler. Even OSes these days barely qualify.

@liw
@cstross

Found this earlier today.

https://ruby.social/@[email protected]al/116248191876455615

"Sorry, can't use AI. I'm afraid I'll void my professional indemnity insurance."


@liw My concern isn't being made obsolete, at my age I don't really care.

My concern is the environmental and social harm caused by the needless waste LLMs produce.

Got an already trained model running on your laptop? More power to you, it's no worse than Visual Studio.

Your company is laying off 30,000 people to build datacenters in the desert with illegal fossil fuel plants that use all the groundwater? That's bad.

And I've got a problem especially when it's used to create fakes that scam or harm people.

@Longplay_Games @liw Then it's the government's job to regulate AI. It's not the AI's fault.
@liw You should, in fact, be awaiting "Please don't retire. No one else knows what we're doing."

@log @liw

I'm in that boat right now.

@liw oh i'm not worried about us still being needed

i'm worried about the state the world will be left in when we come back after being fired

@SRAZKVT @liw

I got a fair amount of work cleaning up behind tyro VBScript programmers that were set to work on developing enterprise software after the dot com bust. Come to think of it, that's what I'm doing now.

@liw I don't really think it was like that. Maybe way before: when computers were still mostly analog, digital computers emerged together with programming languages, and those quite quickly made the older analog computers and their operators obsolete. But after that there was no strong claim that anything would make programmers and developers obsolete (until now, with AI, that is).

I'd say it was rather the opposite for a long time. Everyone was strongly encouraged to learn some coding skills, because that was supposed to be a necessity in most future jobs. Younger generations and educational programs especially leaned that way - from well before 2010 until after COVID, at least right up until the AI hype exploded a couple of years ago, with ChatGPT passing some "high-level knowledge" exams formerly out of reach for any computer program.

@raulinbonn @liw It's a fact that most people don't have the kind of thinking required to write code, let alone good code.

All the LLM BS is doing is taking away entry-level positions, and when we are dead and gone there just won't be a replacement at the same level. This is going to be a cumulative process.

@TheOneDoc I feel it's not that most people can't code; they can do it just fine. It's just that most people, in the pursuit of writing code and earning money, don't assign any thought or importance to things like the politics behind free software, the issues surrounding copyright law, and how code gives an individual immense power and freedom that can rival the impact of a corporation.

One of my major sources of inspiration for choosing computer science and becoming interested in free software was the documentaries TPB AFK and The Internet's Own Boy: The Story of Aaron Swartz - not money or lines of code or cool software features or "blazing fast" performance or memory safety.

One of the (flawed) arguments I've read from people who cheer for LLMs is that they "democratize" access to knowledge and "liberate" it from the shackles of copyright.

It's unsurprising that people don't realise or care where these LLMs get their data from (by DDoSing websites), but it is surprising to see them not realise who ultimately controls these models, this software, and the training data behind it. Claude can't "democratize" knowledge because it's limited by the number of tokens one has access to, and the same goes for all the other LLMs. Access may be generous and subsidized at present, out of a desire to make people dependent on it, but it won't stay that way forever. We have seen this pattern before with streaming platforms, but somehow the temptation in this case is too strong, and the wisdom people gained back then gets ignored? It's baffling.

The argument that LLMs liberate knowledge from copyright restrictions is hilarious. Apparently the actions of corporations and firms that lobby for copyright and DRM aren't enough to make people realise that copyright can always be used by those in power to oppress individuals when it suits them, even at present. I mean, good luck defending yourself in court if a corporation decides that your LLM-generated software infringes on its copyright. Individuals never had and never will wield such power. All LLMs do is pit individuals against each other and give corporations even more power to violate copyright when it suits them and harass people when it doesn't.

@raulinbonn @liw


@ayushnix @raulinbonn @liw No, you say most coders can code, but what I said was that most humans can't, because they can't break a problem down into logical steps.

Copyright is all well and good if you have the money to actually assert your rights in the legal system.

@raulinbonn @liw I remember the adverts in the computer press saying that companies wouldn't need programmers to write applications - that was probably the late 80s/early 90s, whilst I was still at school.
@raulinbonn @liw I also remember having to learn COBOL in the late 90s!