If your corporation's business model relies on lawbreaking, your corporation has no legal legitimacy.

We don't let narcotics cartels and trafficking rings list themselves on the stock exchange: why should OpenAI or Facebook be any different?

@cstross it seems to me that there is a very simple choice. Kill AI by enforcing existing copyright laws, or take the long way around by letting AI trash copyright law, killing all the creative industries AI needs to steal content from and thereby slowly starving the AI industry of the content it needs to improve.

One choice at least leaves us with something valuable. The other will just take longer.

@pmb00cs @cstross it's not that binary I think, many other variants/scenarios are possible

@ErikJonker @cstross the choice the government has is binary. Enforce copyright laws, or carve out an exception.

The consequences are not so binary. I don't actually think enforcing copyright will kill the AI industry. It'll reduce the amount of profit, certainly, but that's hardly an existential threat. On the other hand, I do think letting AI firms get away with ignoring copyright will be devastating to the creative industries, and, in the long term, to AI training as well.

@pmb00cs @ErikJonker @cstross
What about the third option: delete all IP law?
Then FOSS devs get access to all the AI companies' code!

@pmb00cs @ErikJonker @cstross
Or at least, are able to legally reverse engineer it.

@light @ErikJonker @cstross while I don't agree with the current length of copyright terms, I don't think scrapping it is a good idea either. FOSS software depends on strong copyright too. And if we scrap copyright, the large companies will take steps to protect their secrets.

@pmb00cs @ErikJonker @cstross
FOSS software only depends on copyright in self-defense.
If there were no copyright, all software would be effectively FOSS.
> And if we scrap copyright the large companies will take steps to protect their secrets.
Like what?

@light @ErikJonker @cstross like not making it available in any readable form.

Keeping private information private is something companies already do. Granted, not very well, given the prevalence of ransomware with data exfiltration.

@pmb00cs @cstross there's already some movement in that direction: content deals are being struck between large media companies and Big Tech.
@ErikJonker @cstross because media companies are moving to protect their content now that they've seen AI companies stealing it.
@pmb00cs @cstross perhaps real creativity will need to go underground. Or is that an extreme response? Cos I believe that we will start to leave the internet: we will have to, for our privacy and our sanity.