The most surreal thing about AI coding shit taking off is the revelation that so many people who do this thing that I love seem to have no care for the craft at all. Even people I would have pointed at, years ago, as those who clearly care. And I know it has always been Just A Job for many people, but holy shit, do you even care a little bit?
We humans are not merely bad at it; we have people who have been doing the work with no desire to be good at it in the first place.
If you don’t like programming, I invite you to do something else with your time instead of promoting the machine that lets you not do it. Some people actually like it, you jerks.
I know a non-trivial number of people who cannot code, but who use AI to make programs. They get excited about being able to create. And I find that great.

Their annoying mistake is assuming that, since AI enabled them to go from being unable to make programs to being able to, it will also make a developer who already can code better at making programs.

It's like offering a wheelchair to somebody who walks: just because it allows somebody with broken legs to move, it does not make somebody who can walk better at walking. But it has a place in aiding those who cannot move without it.

@michael Thing is, it doesn't make you "able to create."
It provides a convenient way of getting the thing.

This, to me, is the second most telling delusion around these things, after seeing them as sentient.
@pikhq

It absolutely does. It does not allow you to create the same way a developer does, but it allows creating anyway. Coding is a small part of software development: design and analysis are a much larger part for most projects, and while LLMs can be used to spar about that (to some more or less meaningful extent), you still need to do it.

Saying code made using an LLM isn't legitimate is just as meaningful as saying that code made using a high-level language is less legitimate than code made in assembler.

@michael @pikhq Sorry, but I'm not buying this argument.

"Coding is a small part of software development: design and analysis is a much larger part for most projects" - with this, I do agree.

However, if you sub out your coding to another person by saying "I need a chunk of code here, that fits these parameters," surely you wouldn't be making the same argument that you personally created it.
The difference is only that you're subbing out to a (deeply problematic in an impressive range of ways) software service. You still didn't create it; you commissioned it. It's just that you commissioned it from a mechanised service instead of from a person.

It's qualitatively not the same thing as writing code in a higher-level language, or even using a Lisp macro to generate the code for you. If it really were equivalent, then it would increase your level of skill over time, rather than causing it to atrophy.

I also didn't say anything about the resulting code being "legitimate," so I'm merely pointing that out as a straw man, and not otherwise engaging with it.

I didn't mean the code being legitimate or not, but that making programs using an LLM can be a legitimate way of making programs.

Using an LLM definitely doesn't teach you programming. But it does increase your skill in making programs using LLMs. They are different skills that accomplish something similar.

Using a high-level language allows you to write programs without having to worry about register allocation, calling conventions, or memory models. Many modern developers don't even know what a register is, what a calling convention is, or the difference between the stack and the heap. But that is OK: the compiler takes care of all that, and you're just delegating to it the details that don't matter at your level.

For non-developers, the LLM can serve some of the same roles. It can write the code they don't understand, and they can create by describing it, just as you or I create programs by describing them in our programming language of choice. It's fair if you don't see describing the program to an LLM as creating, but I see it as creating, just on another level. And I hate using LLMs for coding myself, or when people try pushing me to use them.

The comparison with a compiler is a little flawed. It should be a comparison with compilers in the 1960s: back then, compilers were pretty garbage and frequently made mistakes. The fact that compilers eventually became great, and better than humans, should not be taken as an argument that the same will happen for LLMs. But either way, both allow making programs at a higher abstraction level.

@michael @KatS @pikhq

Vibe coding goes a step further than that, though. Not only do you not understand how the code gets from your editor to being executed, you don't understand the code itself. A programmer in a high level language can explain how the program they wrote works, predict its behavior in a case they didn't think of while writing it, and know exactly what they have to do to modify it to do something else. A vibe coder cannot do the last thing except by forwarding the request to their LLM and hoping it understands, and arguably cannot do the first two at all.

I disagree that what vibe coders do is creation for the same reason I disagree that what AI artists do is creation, and it largely boils down to the understanding of the work. A portrait artist knows how to draw a head and knows how to move the nose just a little to the left if they didn't get the shading quite right. An AI artist has to feed the image back into the generator, ask it to do that, and hope.

I completely agree that somebody creating using an LLM doesn't understand the program and that is a problem in many cases. To expand on the wheelchair analogy, a wheelchair might be faster for going downhill, just as using an LLM might be faster for writing certain code. That still doesn't make it great to YOLO downhill in thick fog towards a busy road. And a wheelchair is still unable to go up stairs. There are absolutely places where LLMs are not appropriate and cannot do the task.

I would still argue that using them can be viewed as creating. Take for example the YouTube channel NeuralViz. They make videos using generative AI for video, images, and voices. They are not claiming they created the images, video, and voices the way a painter or voice artist would, but they are IMO a writer and director, and they impose a visual style on the creations. They are creating art – an entire world with a mythos and many recurring characters – but at a different level.

Somebody using LLMs to create programs similarly works at another level. The problem is more that non-developers have a hard time telling the difference, and try using unqualified people armed with an LLM to replace developers where it is inappropriate. Those areas definitely include important logic and critical systems. I am on the fence about whether LLMs can be used for writing GUIs – my experience says no, but people I respect find it works for them.

But it is still a delight to see people who previously weren't interested in software development creating programs or other automations because they find it fun. They do not claim to be developers, but they are now able to see their half-wonky, low-risk ideas realized. If that's not creating, I don't know what is.

@michael @KatS @AVincentInSpace @pikhq Correct. You do not know what creating is.

It is not taking the work of your slave and presenting it as your own. Even if the slave isn't human, and especially when it's bad at its work.

That's an incredibly unnuanced take.

Is a photographer then also not creating, since the camera is their slave? Should they use a paintbrush and canvas like a real creator? In fact, forget the paintbrush: use mud on cave walls like a real creator.

LLMs are just tools. You're anthropomorphizing them to a degree they do not deserve. A photographer is not a painter, but still creates at a different level.
@michael @KatS @AVincentInSpace @pikhq LLMs are a scam, and you're helping promote the scam.
@michael @KatS @AVincentInSpace @pikhq Creativity consists of using tools, not issuing orders. The entire bullshit point of LLMs is the promise of just being able to issue orders.

@michael @KatS @AVincentInSpace @pikhq LLMs are not tools. A tool is something to assist you in performing a task, not something to do your task for you.

They are the antithesis of tools. They are literally an attempt to create mechanical slaves.