Hmm, and presumably anyone operating a general-purpose chatbot that could conceivably be prompted to give such advice (e.g. as the conversational interface to a regular web page) is also plausibly at risk?
@cstross @davidgerard Who will you imprison? The CEO? The programmers? The QA team?
One of the big draws of tech is the ability to turn human error (and malfeasance) into "computer error". And society has been trained to believe software errors aren't anyone's fault, so there's no one to hold accountable.
That needs to change. Companies need to be held accountable for their "computer errors" - especially when they're baked into the design and not actually errors.
@cstross @wronglang @Jer @davidgerard
Exactly.
When the board votes out a CEO, they lose all unvested stock; all of the salary they’ve received and all stock that has already vested remains theirs. For a moderately large company, this is normally enough money to live comfortably for the rest of their lives without working.
I would happily endure this ‘punishment’.
@david_chisnall @cstross @Jer @davidgerard yes but we only do mock punishments
Edit: my point is that both are options; if we're talking about modifying the law, either mechanism could work iff actually applied.
@wronglang @cstross @Jer @davidgerard
You could possibly treat all compensation paid to the CEO while the company knowingly engaged in illegal activity as proceeds of crime. There are existing laws that allow such money to be confiscated.
@cstross @wronglang @Jer @davidgerard
The laws in most places allow prosecuting individual members of a company; the difficulty is proving who, in a diffuse group where everyone signed off on part of something, is actually responsible. Targeting the company as well is intended to act as a disincentive, applying financial penalties that change the cost:benefit calculation. Sadly, the costs are rarely high enough to matter.
The only de jure liability shield that incorporation gives is for shareholders. And this can go away in some cases. Both the UK and USA have a legal notion of ‘piercing the corporate veil’ that can, in extreme cases, make the owners of a company liable.
@cstross @wronglang @Jer @davidgerard
And the minimum-wage person who actually did the illegal thing, but was threatened with being fired and losing their home if they didn’t? And the paper trail that says everyone on the committee voted against it, but this rogue employee did the illegal thing unsupervised?
Whistleblower protections would need to be orders of magnitude stronger for this to be enforceable (something I would be very much in favour of).
@david_chisnall @cstross @wronglang @Jer @davidgerard
> The only de jure liability shield that incorporation gives is for shareholders.
Any shareholders that had voting rights and voted for doing illegal shit should also be hit with the same legal liability.
Benefiting from the proceeds of crime, especially crime one ordered, is not a protection from liability for it.
@Jer @davidgerard That's a broader corporate liability question. Personally I'd LIKE to see the C-suite and boards of corporations that kill people sentenced to serious prison time. (Lower level staff too, but only if it's found that they made decisions that led to deaths on their own initiative. The directors *are responsible for the company's actions*.)
Going further: the current privileged legal status of corporations is an obscenity and needs to be de-legitimized.
Well, it's also just as absurd that, in states like New York, even offering ibuprofen to a friend who has a headache is a felony of "practicing medicine without a license".
And a garbage doctor with multiple malpractice suits is still a doctor.
And, if I did exhaust all other avenues, yeah, I would investigate an unknown medical condition with an LLM. I wouldn't trust a US AI, though.
I have a friend whose wife had exactly that. Test after test gave no answer to what was wrong. They got the medical records, fed them into ChatGPT, and it produced a differential diagnosis. They brought the output to a human doctor, who validated the top disease as correct.
It's way too easy to oversimplify it to "slop machine" or "do-everything machine". It's neither, but something much more complex and weird.
In some parts of the world, it is an offence to give legal advice without being an actual lawyer. But that doesn't seem to stop some lawyers from using LLMs to generate legal documents full of slop.