Ask HN: How do you deal with people who trust LLMs?

A lot of people use LLMs as the source of their objective truth. They have a question that would be very well answered with a search leading to a reputable source, but instead they ask some LLM chat bot and just blindly trust whatever it says.

How do you deal with that? Do you try to tell them about hallucinations and that LLMs have no concept of true or false? Or do you just let them be? What do you do when they do that in a conversation with you or encounter LLMs being used as a source for something that affects you?

Is this any different than people who believe random things they read on sketchy news sites or social media?
Yes, somehow. I have been dealing with an awful lot of people who basically have what are theoretically logic degrees who suddenly just take LLMs at face value, or quote them to me like that actually means anything. People I formerly thought were sane.
I don't mean to put words in your mouth but from what I've seen, in person but mostly online, but the "problem" (and I put that in quotes because I don't even know what to call it... it seems deeper than a mere "problem") is that they quote them as if they are autonomous, sentient beings.
Yes, I think AI bots are more compelling to some people. They break the concept of judging information by its source because they obscure the source. But at the same time they are trained on a lot of reputable sources and can say a lot of very smart things, just at other times they say complete BS. But they are really good at making things sound plausible, that's essentially how they work after all.
Absolutely. These things are marketed from virtually everyone, from people that are historically considered experts and/or authoritative, as such.
Are you talking about people who will still insist the LLM was correct even after being presented with evidence to the contrary, or people who don't EVER bother double checking answers they get out of said software since they assume it to be true?

I'm going to hold them to the same standard no matter if they use crappy sources, plagiarize, or hallucinate on their own. If someone asked, when and if I am in a position where I have to tell them, I would remind them that LLMs prioritize their own confidence over correctness.

LLMs aren't a special case to me. Glue doesn't belong on pizza and you shouldn't eat one rock a day but we've been giving and getting bad advice forever. The person needs to take ownership for the output and getting it right, no matter the source, is their responsibility.

same way as i deal with people who trust other people.
Ask them to tell the LLM it's wrong... then when it goes "You are absolutely right!" to challenge it and say that it was a test. Then when it replies, ask it if it's 100% sure. They'll lose faith pretty quick.