This is a very eloquently written adventure into what most of us know as AI fuckery. I don't want to ruin it for you. Read it yourself, please.

I'm always learning something from @mttaggart , and this is no exception.

https://taggart-tech.com/reckoning/

I used AI. It worked. I hated it.

I used Claude Code to build a tool I needed. It worked great, but I was miserable. I need to reckon with what it means.

I feel as though I'm in a similar position. I detest that this technology exists. The same way that I detest that web3 exists. The same way that I detest how absolutely abysmal the cloud/CDNs are when we had an opportunity to make REAL CHANGE in internet-scale computing. The same way we all hate every major change in tech that is extremely poorly thought out.

I've actively avoided it, except in cases in which I used it to check for, and I'm not making this up, plagiarism/cheating in a take-home exam for an internship position on our security team. Let me tell you how disheartening it is to find five different candidates from five different prestigious universities from around the country, all regurgitating the exact same incorrect answers from ChatGPT/Copilot.

I weep for critical thinking. I weep for I don't know. But I want to learn more.

@da_667 Thank you, my friend. I'm glad you found something in it. I'm sorry we're all in this mess together.

@mttaggart It is what it is. I still find it very scary that, even with all of the extremely tight guardrails you put up, it still hallucinated functionality out of thin air, and if not for the fact that Rust is MILITANT about, well, everything... that hallucinated code would've just shipped if it were another programming language.

I'm gonna live with that being a parasite itching in my mind that I can't scratch.

@da_667 Nobody's really talking about that part but I really can't imagine how much more annoying this would have been without all the safeties of the Rust toolchain.
@mttaggart that's the one part that really blew my mind. How many unintentional logic bombs or Heisenbugs are people just... dropping into a codebase that will explode 10+ years later?
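A quick aside on the point above (all names here are hypothetical, not from the article): in a dynamic language like Python, a hallucinated function call loads fine and only explodes when that exact line finally runs, which is how a bug can hide for years.

```python
# A module with one real function and one AI-style hallucination.

def parse_config(text):
    """The function that actually exists."""
    return dict(line.split("=", 1) for line in text.splitlines() if "=" in line)

def rarely_taken_path(text):
    # `validate_schema` was never defined anywhere -- a hallucinated helper.
    # Python happily imports this file; the error waits until this line runs.
    return validate_schema(parse_config(text))

# The happy path works, so tests that never hit the rare branch all pass:
print(parse_config("host=localhost"))  # {'host': 'localhost'}

# Only exercising the rare path surfaces the latent bug:
try:
    rarely_taken_path("host=localhost")
except NameError as e:
    print(f"latent bug: {e}")
```

In Rust, the equivalent call to an undefined `validate_schema` is a compile-time error (E0425, "cannot find function"), so the hallucination can't even build, let alone get merged.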

@da_667 At the risk of logrolling, the Lobste.rs conversation was surreal.

https://lobste.rs/s/7d8dxv/i_used_ai_it_worked_i_hated_it

Here is Simon Willison, one of the preeminent advocates, saying that I should let the thing make more changes at once and review in bulk, which sounds more efficient but also more prone to missing something. And then contending that it's all about building a sufficient safety structure around your process.

That could well be so, but I think we know that onerous best practices become rare finds in reality.

@mttaggart jesus christ THAT is nightmare fuel.

@mttaggart In my mind, it feels like letting an intern take the wheel on a major project, and having to repeatedly code review their work.

Code review is a very arduous task. Even with one's own code. I can't imagine taking a bulk block of code that was generated all at once and having to nitpick it all in one go, and find EVERY single problem all at once.

We're human beings. We're not machines. We don't work that way. If I had to do it, and as painstaking as it was, your way was the best way: iteration and constant back-and-forth.

@da_667 Independent of everything else, advocates will tell you that the hardest shift in mindset is from creator to reviewer/manager. Pretty famously, the two kinds of people tend not to like each other's work. For my part, I do not want to be a machine's editor. I want to build.

But that raises the question of when code is sculpture and when code is drywall. Must all software be handcrafted if it functions and sufficient guardrails are in place? On that narrow question alone, I'm leaning toward probably no, and less so as those safeties improve.

None of that takes away from the original and ongoing sins of the technology, on which the best opposition case is built. I said elsewhere part of why I wrote this is that I was bothered by the "It doesn't work" discourse ignoring the many, many developers who are using the thing successfully.

So like, yeah, it can get the job done. But for us security folks, there's a level above "functionality" that is so, so much harder to obtain. And the baseline is nowhere near that level yet.

@mttaggart and as far as I can tell, it never will be, if this is the best we can do with decades of stolen code and, well, literally most of the internet having been downloaded. Everybody always talks about how the next model will be AGI, the panacea, the great miracle. And it never is. Petabytes of data, datacenters running on gas turbines due to a lack of power. Kicking global warming into high gear just to produce nebulous slop.
@mttaggart like any tool, it has its uses. I concede that wholeheartedly, but it feels like making a blood sacrifice to use it.
@da_667 My position at the start was that the harms absurdly outweigh the benefits. That has not changed a jot. Speed of code creation—hell, "productivity" at all—is not a moral value to pursue. And by pursuing it, we do indeed sacrifice far too much.