All it would take for the AI industry to completely collapse is a ruling in the US saying these companies have to license the content they used to train these tools.

They simply would never reach a sustainable business model if they had to fairly compensate all the people who wrote, drew, edited, sang, or otherwise created the content they use.

Simply being forced to respect attribution and licenses would kill them. Will that ruling ever happen? Maybe not. Should it? I think so.

@thelinuxEXP The first problem you'd face in a legal sense is proving that your work was used to train a model. There is pretty much no way to trace individual training samples back through a transformer model, so you lose right there… Even if a law existed saying licenses had to be respected, it would be unenforceable.
@vartak The NYT proves otherwise pretty competently already: ChatGPT can spit out entire passages of their articles verbatim ;)
@thelinuxEXP Nope. Almost all language is built from common phrases, and we all use language the way someone else used it before us. That's how we learnt it. And proving provenance is much, much harder with image generators like Stable Diffusion.