TIL that saying "holy shit don't use ChatGPT for medical advice" is a "purity test". i didn't know that before. in fact I still don't.
@davidgerard I am pretty sure that OpenAI do not have a license to practice medicine and are not a (human) member of the BMA, so by giving medical advice they (the humans responsible for the software) are potentially committing an imprisonable offence ...

@cstross @davidgerard

Hmm, and presumably anyone operating a general-purpose chatbot that could conceivably be prompted to give such advice (e.g. as the conversational interface to a regular web page) is also plausibly at risk?

@dwm @davidgerard Yes, although it all depends on whether the GMC (and the Police) have the guts to go after a large foreign corporation with deep pockets. It probably won't happen unless there's a major death-related scandal and/or one of the aforementioned corporations decides to go after the competition, i.e. small locally run and/or open source models with broad training sets.
@cstross @davidgerard Any coin can give medical advice. I just ask the coin: should I take this medicine? Say "heads." Then I flip the coin. I hope the people at the coin minting facility get imprisoned for that.
@waffelhard @cstross @davidgerard …as coins are often claimed to be able to replace doctors by the coin minting industry and its adherents.
@waffelhard @cstross @davidgerard this is not the clever response you think it is.
@tedmielczarek I just love it when other people claim to know what I think!
@cstross @davidgerard needs an IANAD subroutine.
@cstross @davidgerard
I think that applies to veterinary advice, but not to human. Hence Chiropractic, Homeopathy, assorted woo.
When people complain to the GMC that someone ^^^ is giving bad advice, the GMC says that they only have powers over registered medical practitioners.
But there are laws about animals.

@cstross @davidgerard who will you imprison? The CEO? The programmers? The QA team?

One of the big draws of tech is the ability to turn human error (and malfeasance) into "computer error". And society has been trained to believe software errors aren't anyone's fault so there's no one to hold accountable

That needs to change. Companies need to be accountable for their "computer errors" - especially when they're baked into design and not actually errors

@Jer @cstross @davidgerard it's the CEO's job to manage legal risk. Imprison the CEO.
@wronglang @cstross @davidgerard I actually agree. It would certainly justify the vast amounts of money they make if they had to take personal responsibility for their harmful decisions. Might make them think a little harder about their decisions
@Jer @cstross @davidgerard I'm into it, and I'm also not sure it's necessary. A corporation is just a bunch of greedy people in a trench coat. If you hurt the board with financial consequences for the company, that CEO is going to get hurt in the way they care about the most. The broader problem is that we don't properly enforce consequences for companies at all, even when the law is pretty clear.
@wronglang @Jer @davidgerard No, the CEO is only hurt *very indirectly* and usually they'll have moved on to another job (with better pay/options) before the pigeons come home to roost. Consider it took more than two decades for the OxyContin scandal to lead to court verdicts, and the Purdue owners still escaped most liability for thousands of deaths by declaring bankruptcy. How many CEOs did Purdue have during that period?

@cstross @wronglang @Jer @davidgerard

Exactly.

When the board votes out a CEO, they lose all unvested stock. All of the salary that they’ve received and all stock that they have that has vested remains theirs. This is normally (for a moderately large company) enough money to live comfortably for the rest of their lives without working.

I would happily endure this ‘punishment’.

@david_chisnall @cstross @Jer @davidgerard yes but we only do mock punishments

edit: my point is that both are options, if we're talking about modifying the law either mechanism could work iff actually applied.

@wronglang @cstross @Jer @davidgerard

You could possibly treat all compensation paid to the CEO while the company knowingly engaged in illegal activity as the proceeds of crime. There are existing laws that allow such money to be confiscated.

@david_chisnall @cstross @Jer @davidgerard I really wouldn't mind making those laws stronger... and the fact we prosecute shoplifting food but fail to enforce these laws is a bigger problem
@wronglang @david_chisnall @Jer @davidgerard I think a more urgent need is to globally abolish corporate personhood and apply criminal liability law for corporate harms to the individuals who caused the harm. Cut back companies to being a money shelter again, but not a responsibility shelter.

@cstross @wronglang @Jer @davidgerard

The laws in most places allow prosecuting individual members of a company, the difficulty is proving who in a diffuse group that all signed off on part of something is actually responsible. Targeting the company in addition is intended to act as a disincentive by applying financial penalties that make the cost:benefit calculations different. Sadly, the costs are rarely high enough to matter.

The only de jure liability shield that incorporation gives is for shareholders. And this can go away in some cases. Both the UK and USA have a legal notion of ‘piercing the corporate veil’ that can, in extreme cases, make the owners of a company liable.

@david_chisnall @wronglang @Jer @davidgerard That right there is where we need to lean hard into applying the "joint enterprise" doctrine in prosecution. *Everybody* who signed off on it is responsible. If it's a committee? Fine, the committee goes to prison unless they can individually point to a paper trail documenting their objections.

@cstross @wronglang @Jer @davidgerard

And the minimum-wage person who actually did the illegal thing, but was threatened with being fired and losing their home if they didn’t? And the paper trail that says everyone on the committee voted against it, but this rogue employee did the illegal thing unsupervised?

Whistleblower protections would need to be orders of magnitude stronger for this to be enforceable (something I would be very much in favour of).

@david_chisnall @wronglang @Jer @davidgerard Yep, we need stronger whistleblower protections. An assumption that "blame the messenger" is the default company response to whistle-blowing should be baked-in and determine the outcome of wrongful dismissal cases for *any cause whatsoever* for several years after the incident.

@david_chisnall @cstross @wronglang @Jer @davidgerard

"The only de jure liability shield that incorporation gives is for shareholders."

Any shareholders that had voting rights and voted for doing illegal shit should also be hit with the same legal liability.

Benefiting from the proceeds of crime, especially crime one ordered, is not a protection from liability for it.

@cstross @Jer @davidgerard no, *actually* hurt the company enough to hurt the board, make it clear that the CEO's judgement makes them a bad hire. We do this too little so of course CEOs just float around on golden parachutes.

@Jer @davidgerard That's a broader corporate liability question. Personally I'd LIKE to see the C-suite and boards of corporations that kill people sentenced to serious prison time. (Lower level staff too, but only if it's found that they made decisions that led to deaths on their own initiative. The directors *are responsible for the company's actions*.)

Going further: the current privileged legal status of corporations is an obscenity and needs to be de-legitimized.

@cstross @Jer @davidgerard
We already have exactly this for some regulations like PCI-DSS. It's funny how we can get that sort of thing when it protects an industry like the credit card industry.
@Jer @cstross @davidgerard The fun fact is that liability is then shared among everyone who touched it or enabled it to be in that position.

I see no downsides to applying the liability just like that, with proportional responsibility based on decision power.
@Jer @cstross @davidgerard Imprison the company itself as a legal entity. Freeze all of its assets and cease all activity for the duration of the sentence.
@mikeash @Jer @davidgerard Depending on the scale and type of company, congratulations: you just laid off tens to thousands of uninvolved people *and* fucked over their suppliers and customers because a handful of dipshits broke the law. (This is why corporations can get away with these stunts in the first place.)
@cstross @Jer @davidgerard Such a system would strongly discourage companies from growing beyond a certain size. Not sure what the economic effects would be, but it’s interesting to consider.
@Jer @cstross @davidgerard "society has been trained to believe software errors aren't anyone's fault so there's no one to hold accountable" See also: Horizon / Fujitsu

@cstross @davidgerard

Well, it's also just as absurd that, in states like New York, even offering ibuprofen to a friend who has a headache is a felony of "practicing medicine without a license".

And a garbage doctor with multiple malpractice suits is still a doctor.

And, if I did exhaust all other avenues, yeah, I would investigate an unknown medical condition with an LLM. I wouldn't trust US AI, though.

I have a friend whose wife had that exact thing. Test after test had no answer to what was wrong. They got the medical record and fed it into ChatGPT, and it provided a differential diagnosis. They took the output, brought it to a human doc, and validated the top disease as correct.

It's way too easy to oversimplify to "slop machine" or "do everything machine". It's neither, but something much more complex and weird.

@cstross @davidgerard

In some parts of the world, it is an offence to give legal advice without being an actual lawyer. But that doesn't seem to stop some lawyers from using LLMs for generating legal documents full of slop.

@cstross @davidgerard I feel like somebody should have learned from the tons of made up citations that lawyers are experiencing. AI doesn't have a law degree, either.