The question is not whether you can create software using LLMs - you can (most software is just boring CRUD shit).
But you do pay a hefty price: in lowered quality (security issues, less maintainable code), in skill decay among the people "guiding" the stochastic parrots, etc.

It's not "can 'AIs' create software" but "are we willing to accept worse software running more and more of our lives?"

@tante Thinking a lot about this. To me it boils down to code ownership. Which is yet another kind of responsibility/liability being offloaded to machines that, by definition, can't carry it.
@map exactly. In a way, accepting responsibility for the code one puts in front of people means accepting the associated duties of care towards those people.
@tante @map LLMs are another way to avoid putting skin in the game. Which is the whole point, if you’re a sociopath (or play one on TV). Privatize gains, socialize losses.