The question is not whether you can create software using LLMs - you can (most software is just boring CRUD shit).
But you pay a hefty price: lower quality (security issues, less maintainable code), skill decay in the people "guiding" the stochastic parrots, etc.

It's not "can AIs create software?" but "are we willing to accept worse software running more and more of our lives?"

Creativity will suffer in the long run. AI isn't creating anything new.
I could see a world where coders stop sharing code online, retreat, and lock their work away in a new "Internet" - leaving the old, open Internet dead.