"I used AI. It worked. I hated it." by @mttaggart https://taggart-tech.com/reckoning/

This is a really good blogpost. And I'm sure it'll make some people unhappy to read, whether they're pro or anti genAI. What's good about @mttaggart's blogpost is that he talks honestly about how using Claude Code did actually solve the problem he set out to solve. It needed various guardrails, but they were possible to set up, and the project worked. But the post is also completely clear and honest about how miserable it was:

- It removed the joy from the process
- If you aim to do the right thing and carefully evaluate the output, your job eventually becomes "tapping the Y key"
- Ramifications for people learning things
- Plenty of other ethical analysis
- And the nagging wonder whether to use it next time, despite it being miserable.

I think this is important, because it *is* true that these tools are getting to the point where they can accomplish a lot of tasks, but the caveat space is very large (cotd)


What I think is also good about the piece is that it shows how using this tech eventually funnels people down a particular direction. This is also captured by this exchange on lobste.rs: https://lobste.rs/s/7d8dxv/i_used_ai_it_worked_i_hated_it#c_7jirfk

The story that people start with vs where they go is very different:

- They're really just for experts, and they're assistants; they don't write the code for you
- Okay, they write a lot of the code for me, but I personally don't commit anything without reviewing it
- YOLO mode

Which eventually leads you to becoming the drinky bird pressing the Y key from that Simpsons episode. (Funnily enough, I wrote that in my comment on lobste.rs in reply to someone else before I had even gotten to the point in the post where I saw that @mttaggart literally had that gif.)

And at that point, you're checked out. All that's left is vibes.

And unfortunately, these systems don't survive that point very well. And neither do you, in your skills and abilities.

There are a lot of other concerns, but I think since a lot of people on the fediverse are opposed to these tools, they might not be very familiar with where these tools currently are ability-wise. @mttaggart provides a good description of how they *are* capable of solving many problems you put in front of them... and that doesn't remove the other problems they generate or that are involved in their process.

The slop part isn't just the individual outputs, but the accumulation, and the effect on society itself.

Is that moving the goalposts? It may be. I think "slop" used to be easier to dismiss when it came to code because it was obviously bad. Now when it's bad, it's non-obviously bad, which is part of its own problem. And cognitive debt, deskilling, etc. don't get factored into the quality-of-output aspect.

But unfortunately, the immediate reward aspects of these things are going to make it hard for society to recognize.

@cwebber

It kind of feels like it's going to take something big happening in the press to get people to stop.

I was thinking an AI-caused Therac-25, but maybe a Copilot worm that wipes all Windows 11 computers might get some legislation outlawing AI code.

@alienghic @cwebber

The thing most likely to get people to stop is the end of the massive subsidies for its use that the VCs are currently pouring in.

Already firms are starting to panic a little about token use for things like Claude Code, and are putting limits on their workers that really defeat the purpose of all of the "YOU MUST USE THIS OR BE FIRED" diktats. But operating indefinitely at those prices will bankrupt Anthropic soon.

So at some point the private equity love affair with everything AI will dry up (possibly because of an Iran war-induced financial crisis), and at that point it's going to be "my org can spend $50k annually on my personal Claude tokens to make me 20% more productive . . . or it could just hire a junior dev?"

There's a chance they manage to optimize this, or get it to work using a lighter-weight model. But I think it's unlikely.

@MichaelTBacon @alienghic @cwebber Yeah, I think there are some use cases that work okay, but they only make sense at the current financially unsustainable prices. As soon as the prices go up/everyone has to pay the piper, the cases where LLMs are useful won't be financially justifiable.

@ocdtrekkie @alienghic @cwebber

And I think when that happens, there's enough sunk capital into the models and built data centers that there will be a desperate search for some way to put those to effective use, and I think they'll find something. But I don't know what it's going to be (if I did I could make myself very, very rich, probably).

But I think the downsizing/de-skilling of this period that we're in the middle of is going to leave a gaping hole in the US's tech sector capacity, and I'm not sure it's going to recover.

@MichaelTBacon @alienghic @cwebber We are already past this point: When schools started giving kids iPads and Chromebooks instead of Windows PCs we ushered in a huge generation of people who don't understand the technology they use.
@MichaelTBacon @alienghic @cwebber Maybe we'll pivot all those GPUs back to crypto, lol.

@ocdtrekkie @MichaelTBacon @alienghic @cwebber

I'm no Windows fan by far (for a list of reasons rather than just one), and I would love to see the next generation learning Unix scripts and ssh and more... but I agree that an iPad is not the gateway to a proper shell. :(

If they never even learn what a folder structure is, then we have a problem.

@ChristianRiegel @ocdtrekkie @alienghic @cwebber Most people already don't know what a directory tree is, in a large part thanks to Apple.

IMHO this is fundamental computer knowledge that should be taught at grade school level, maybe early high school.

edit: Then again @MichaelTBacon makes a good point here: https://social.coop/@MichaelTBacon/116349207112344811. Perhaps it's more about being aware of where you store files and that one has agency in that action that should be taught.

@ChristianRiegel @ocdtrekkie @MichaelTBacon @alienghic @cwebber yeah, the best thing would have been to give those kids Linux laptops
@LunaDragofelis @ChristianRiegel @ocdtrekkie @cwebber @alienghic For those that want to explore the technology, yes. For those who want to use it and not think about it too much (which is FINE), an iPad or a Neo is a fine place to start.

@MichaelTBacon @LunaDragofelis @ChristianRiegel @ocdtrekkie @cwebber

In thinking about this thread some, I think the most fundamental question for whether an OS is good for discovery is: can you build applications for that OS on the OS itself?

I think that's really why iOS and Android are inferior to Windows, Mac, and Linux.

It looks like ChromeOS can build Android apps at least.

@alienghic @LunaDragofelis @ChristianRiegel @ocdtrekkie @cwebber

Yeah, and thankfully I think the Neo heralds the end of "let's try to make iPadOS into a real notebook OS!" It's crippleware on the hardware at this point, and they could spend lots of time and money continuing to try to warp iOS into something functional . . . or they can do what they did and just let you run macOS on iPad-level hardware.

@ocdtrekkie @alienghic @cwebber

I agree with a lot of your posts on this thread but not this one. There's nothing inherently superior about a Windows PC over a Chromebook in terms of understanding what's happening underneath, just a ton of headaches, mysterious error messages, and unnecessary software compat errors.

If we want kids to understand the tech, give them something where they have to use a shell. Most kids and most adults don't need to know that. But the ones who do will find their way anyway.

I was given an account on an AIX workstation when I was 16, and shortly thereafter we got a bunch of Sun workstations. That was when I learned the first of the tech skills I'm still using today at age 49. I've forgotten most of what I knew before then, because who the hell still needs to write a .BAT file, manage a TSR program, or track down an IRQ conflict?

@MichaelTBacon @ocdtrekkie @cwebber

Teachers have noticed that people who grow up on just mobile OSes never learn how to use hierarchical file systems on their own.

The File -> open/save convention of older desktop software is completely unfamiliar to them.

Does this need more than for someone to realize they need to teach how to use a hierarchical file system?

That I don't know

@alienghic @MichaelTBacon @cwebber Yeah that's exactly the stuff I'm talking about. We abstracted basic competency out of our software! Stuff people used to learn as a kid we're gonna end up having to train people about at the start of computer science in college!

@ocdtrekkie @MichaelTBacon @cwebber

How important is it to understanding a computer though?

It's not that important these days to understand how the CP/M, Apple ][, or C64 filesystems worked.

@alienghic @ocdtrekkie @cwebber

Yeah, if you look at how the big object stores are managed, folders are just a naming convention, not anything that represents the actual way the data are stored on disk.

I like folders and still use them of course. But I'm not sure they're a critical element of understanding computing the way they used to be.
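That point can be sketched in a few lines. The bucket contents and the `list_dir` helper below are invented for illustration, loosely mimicking the `delimiter="/"` listing behavior of stores like S3: the store itself is just a flat map from key to blob, and a "folder" only exists as a computed view over key prefixes.

```python
# Hypothetical flat object store: keys map straight to blobs.
# There is no folder object anywhere; "/" is just a character in the key.
bucket = {
    "photos/2024/cat.jpg": b"...",
    "photos/2024/dog.jpg": b"...",
    "photos/readme.txt": b"...",
    "notes.txt": b"...",
}

def list_dir(bucket, prefix=""):
    """Emulate a directory listing by grouping flat keys on '/' prefixes."""
    entries = set()
    for key in bucket:
        if not key.startswith(prefix):
            continue
        rest = key[len(prefix):]
        head, sep, _ = rest.partition("/")
        # Append "/" when there was a deeper component, so it reads as a "folder".
        entries.add(head + ("/" if sep else ""))
    return sorted(entries)

print(list_dir(bucket))            # ['notes.txt', 'photos/']
print(list_dir(bucket, "photos/")) # ['2024/', 'readme.txt']
```

Deleting every `photos/...` key makes the "folder" vanish, because it was never stored — it was only ever a naming convention.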

@alienghic @MichaelTBacon @ocdtrekkie @cwebber

I try to un-learn hierarchical file systems for certain things 🙂 I grew up with them, but e.g. the days when I had emails in subfolders or a tree of browser bookmarks are over. Using tags now.

@MichaelTBacon @ocdtrekkie @alienghic The fun part is, despite all the calls that "you're gonna be left behind!", there's a good chance the industry is going to desperately need those who have retained their skills, so actually not switching to genAI tools might be a better way to not be left behind
@cwebber @MichaelTBacon @alienghic We're gonna be like COBOL programmers are today. Few in number and vital to the functioning of modern society.
@ocdtrekkie @cwebber @MichaelTBacon @alienghic Gonna use the "hard skills renaissance" to fund my retirement.

@cwebber @MichaelTBacon @ocdtrekkie

What I want to know is: can an agent fill out my expense report, order more toner for the lab printer, or hunt my boss down for an account number?

I want it to do the boring sucky parts of my job, not the parts I like.

Oh god, the most horrifying idea.

An agent that pings you to say, "I just wanted to mention that your actions look a lot like a micro-aggression, perhaps you should take a break and calm down".

(Clippy for office behavior)

@alienghic @cwebber @MichaelTBacon As an IT professional, *reviewing gigabytes of log files I generate daily to look for attackers* is absolutely something I want AI tools to do. But also that *has* to be local because of the potential sensitivity of the content.

@ocdtrekkie @cwebber @MichaelTBacon

Though I'm not sure a transformer LLM is the best tool for analyzing logs.

Maybe there's a way to get one to act as a more forgiving parser for logs? But analyzing logs for something exceptional that needs to be responded to is probably a job for some other technique.

@alienghic @ocdtrekkie @cwebber @MichaelTBacon

That sounds like the *ideal* application for machine learning techniques; patterns associated with an attacker should be unusual and therefore have relatively high Shannon entropy, but in order to detect them you'd need to develop a reliable model of benign access patterns.
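A toy sketch of that idea, nowhere near a real IDS (the log lines, token model, and thresholds here are all invented): fit token frequencies on known-benign lines, then score new lines by their average surprisal (-log2 p per token), so lines full of never-seen tokens stand out.

```python
import math
from collections import Counter

def train_benign_model(lines):
    """Count token frequencies over known-benign log lines."""
    counts = Counter(tok for line in lines for tok in line.split())
    total = sum(counts.values())
    vocab = len(counts) + 1  # +1 reserves mass for unseen tokens (Laplace smoothing)
    def prob(tok):
        return (counts[tok] + 1) / (total + vocab)
    return prob

def surprisal(line, prob):
    """Average bits of surprise per token: higher = more unusual line."""
    toks = line.split()
    if not toks:
        return 0.0
    return sum(-math.log2(prob(t)) for t in toks) / len(toks)

# Made-up benign traffic to fit the model on.
benign = [
    "GET /index.html 200",
    "GET /style.css 200",
    "GET /index.html 200",
    "POST /login 200",
]
prob = train_benign_model(benign)

normal = surprisal("GET /index.html 200", prob)
weird = surprisal("GET /etc/passwd%00../../ 404", prob)
assert weird > normal  # the attack-looking request scores higher
```

Real anomaly detection products do far more than token counts, but the shape is the same: model the benign distribution well, then flag the high-surprisal tail for a human to look at.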

@krans @alienghic @cwebber @MichaelTBacon There's a company called Darktrace that will sell you an incredibly expensive on-premise server to do machine learning anomaly detection on every single packet in your network. It's very cool. It's also very, very expensive.
@krans @alienghic @cwebber @MichaelTBacon The problem is it is very hard to convince bean counters to spend massive amounts of money on looking for things that might or might not be there. The use case that's most interesting is very hard to sell the cost of!
@ocdtrekkie @alienghic @cwebber @MichaelTBacon This is exactly my forlorn hope: that they’ll need us old folks to keep the lights on when the crash is over and the kids can’t use any reliable tools.
@cwebber @MichaelTBacon @alienghic There's a steady chant of "Roman steel" in my head whenever these discussions come up, like how they need low-background steel unexposed to the first nuclear detonations of the 1940s and 50s for certain projects like particle detectors, and this pristine steel is sourced from ruins as ancient as Roman shipwrecks. The widespread use of LLMs is the nuclear detonation of skillsets. Fortunately devs don't need to sleep under the ocean for millennia like some grognard Cthulhu to keep their brains low-radiation, they just need to not be okay with the million ills of these models while being okay with being called technophobic and behind the times.
@ocdtrekkie @alienghic @cwebber @MichaelTBacon Yes, this is a strange technology that eats its own. It can only exist because there is a body of well-written code. As more of its output becomes its input, it will degrade. As users rely more on it, their skills erode. Talk about killing the goose that lays the golden eggs
@cwebber @MichaelTBacon @ocdtrekkie @alienghic after 25 years as a programmer, and having shifted to a different career, I am finally trying to learn about what I have been doing all this time. That is, focusing on learning programming instead of just intuitively building web apps.