The question is not whether you can create software using LLMs - you can (most software is just boring CRUD shit).
But you do pay a hefty price: in lower quality (security issues, poorer maintainability), in skill decay among the people "guiding" the stochastic parrots, and so on.

It's not "can AIs create software?" but "are we willing to accept worse software running more and more of our lives?"

@tante @map That's only part of the truth. You can make wonderful, creative, unique software using AI. The thing is, you have to specify what you want to achieve. If you don't give the AI those goals, it will come up with a mediocre, generic solution.
@gklka @tante @map
I've a bridge over the River Shannon you can buy.
"make wonderful, creative, unique software using AI."
No. An LLM can't create at all, and if its output actually works and meets the spec, it's likely copied.
@raymaccarthy @tante @map AI can't create. You create; AI just implements it. I know it's hard to digest, but this is everyday work for a lot of us now.
@gklka @tante @map
A compiler implements it. The LLM/generator is a rubbish search engine, database, and statistical engine. It regurgitates based on prompts, not formal specifications.
@raymaccarthy @tante @map Ok, feel free to think whatever you want.
@gklka @raymaccarthy @tante @map I agree with GK on this. Not all AI is the same, and it's definitely not black and white. With the right expertise and detailed specs, you can achieve great results while keeping the code maintainable and retaining ownership. I really dislike the mindset that everything has to be either absolutely good or 100% bad.