We humans are not merely bad at it; we have people who have been doing the work with no desire to be good at it in the first place.
@michael @pikhq Sorry, but I'm not buying this argument.
"Coding is a small part of software development: design and analysis are a much larger part for most projects" - with this, I do agree.
However, if you sub out your coding to another person by saying "I need a chunk of code here that fits these parameters," surely you wouldn't argue that you personally created it.
The difference is only that you're subbing out to a (deeply problematic in an impressive range of ways) software service. You still didn't create it; you commissioned it. It's just that you commissioned it from a mechanised service instead of from a person.
It's qualitatively not the same thing as writing code in a higher-level language, or even using a Lisp macro to generate the code for you. If it really were equivalent, then it would increase your level of skill over time, rather than causing it to atrophy.
I also didn't say anything about the resulting code being "legitimate," so I'm merely pointing that out as a straw man, and not otherwise engaging with it.
Vibe coding goes a step further than that, though. Not only do you not understand how the code gets from your editor to being executed, you don't understand the code itself. A programmer in a high level language can explain how the program they wrote works, predict its behavior in a case they didn't think of while writing it, and know exactly what they have to do to modify it to do something else. A vibe coder cannot do the last thing except by forwarding the request to their LLM and hoping it understands, and arguably cannot do the first two at all.
I disagree that what vibe coders do is creation for the same reason I disagree that what AI artists do is creation, and it largely boils down to the understanding of the work. A portrait artist knows how to draw a head, and knows how to move the nose just a little to the left if they didn't get the placement quite right. An AI artist has to feed the image back into the generator, ask it to do that, and hope.
@michael @KatS @AVincentInSpace @pikhq Correct. You do not know what creating is.
It is not taking the work of your slave and presenting it as your own. Even if the slave isn't human, and especially when it's bad at its work.
@michael @KatS @AVincentInSpace @pikhq LLMs are not tools. A tool is something to assist you in performing a task, not something to do your task for you.
They are the antithesis of tools. They are literally an attempt to create mechanical slaves.
Quoting myself from another reply elsewhere in the thread (made after your post):
I completely agree that somebody creating with an LLM doesn’t understand the program, and that is a problem in many cases. To expand on the wheelchair analogy, a wheelchair might be faster for going downhill, just as using an LLM might be faster for writing certain code. That still doesn’t make it great to YOLO downhill in thick fog towards a busy road. And a wheelchair is still unable to go up stairs. There are absolutely places where LLMs are not appropriate and cannot do the task.
The problem is that non-developers have a hard time telling the difference, and try using unqualified people armed with an LLM to replace developers where it is inappropriate. Those areas definitely include important logic and critical systems. I am on the fence about whether they can be used for writing GUIs – my experience says no, but people I respect find it works for them.
I completely agree there are places, many, where LLMs are not the right tool. That’s why I said it is annoying when people claim that, because an LLM enabled them to make programs, it will make me a better or faster coder.
And you are right, it’s a huge risk – that is already happening on a less serious but wider scale. I see many pieces of software I use getting worse: not just the regular enshittification, but more regressions introduced than there used to be.