All it would take for AI to completely collapse is a ruling in the US saying these companies have to licence the content they used to train these tools.

They simply would never reach a sustainable business model if they had to fairly compensate all the people who wrote, drew, edited, sang or just created the content they use.

Simply being forced to respect attribution and licenses would kill them. Will that ruling ever happen? Maybe not. Should it? I think so.

@thelinuxEXP To play devil's advocate a bit here: people also learn in a similar way. You have to read to learn how to write. You have to listen to music to learn how to make your own, etc.

I think there are at least two main differences. The first one is that a human can only produce so much work on their own, while AI can mass-produce.

@ivt @thelinuxEXP People do not learn in a similar way. People do not need to listen to music to make their own. People do not need to read thousands of books to learn how to read. People do not need to remember everything that was ever said to them to learn a language. A child does not need to see a single drawing, let alone thousands, before they can learn how to draw. A person consumes a minuscule amount of energy to function compared to "AI". Nothing about the process is remotely similar.

@TapiocaPearl @thelinuxEXP I must be living in an alternative universe. AFAICT this is a big part of how people learn new skills.

I agree about the energy, but that's another topic. Also, the amount of data people need to learn something is generally smaller than what AI needs. And the process of learning is similar, but not the same. But this doesn't invalidate my points.