@davidgerard this one hit close to my heart because I’ve had two family members die in large part because their caretaker ignored medical advice and used awful alternative medicine information from the internet to try to treat them.
an LLM can’t do critique. as you’ve said, truth is not a data type in an LLM. all of these models suck in every form of medical crankery available on the internet, mix it with words from authentic medical sources, and present the whole lot as credible.
Carer: What the fuck? I did what you told me and they’re dead!
AI: You’re right, that one’s on me. When I said you should give them a gram of arsenic, what I should have said was *not* to give them a gram of arsenic. I’ll do better and work harder…
So, after not giving them a gram of arsenic, it’s now time to give them a relaxing cup of tea, and then read their tea leaves – I see good things happening for them on my treatment regime today.