"An essential part of being able to say ‘I told you so’ is in fact having told you so" is perhaps the best opening sentence of an article I've ever read.

https://berthub.eu/articles/posts/an-ai-premortem/ by @bert_hubert

The AI-collapse pre-mortem - Bert Hubert's writings

An essential part of being able to say ‘I told you so’ is in fact having told you so. Here goes. In April 2023, I wrote an article titled AI: Guaranteed to disrupt our economies. In this piece I also announced I was going to make a fool of myself by making some AI predictions. I have singularly failed to do so. In retrospect this was all spot on, except for perhaps missing the sheer magnitude of the madness that was about to ensue.

"selling to clever people is not a trillion dollar opportunity" is also an absolute cracker of a sentence.

@Floppy I also liked the FT's

"Ten loss making artificial intelligence start-ups have gained close to $1tn in valuation over the past 12 months, an unprecedented increase that adds to fears about an inflating bubble in private markets that could spill over into the wider economy." https://on.ft.com/43opmxZ


@Floppy Mostly good and mostly right, but he slips in one area: people routinely compare AI results to nothing as if potentially wrong answers are -always- better than no answer. Medicine is an area where that is least true. When he says:

‘AI-radiology is most definitely going to beat “we don’t have a radiologist available right now”-results.’

He is wrong. Doing nothing will never tell someone they have a problem when they don't have a problem. In medicine, all treatments carry risks, and some of those risks are serious. AI-invented findings will cost money and time, and may lead to treatments that harm the patient when nothing was wrong.

Doctors are susceptible to the same laziness as the rest of us: they can start simply accepting the results rather than skeptically interrogating them.

So yes: AI radiology can be worse than no radiology.

We accept vaccines because the negatives are on the order of 1 in 100,000. That's public health. AI radiology is likely to be wrong more like 1 in 10.

@bert_hubert