Stop pretending LLMs are some kind of worldbrain. They're not. They're a collection of stuff that one group posted online (because they could), cherry-picked and mined by another group (because they could).

I know it's ridiculous to even have to say this, but everyone has to treat LLM "intelligence" as fraught with biases and errors. Always. Forever. It's not BETTER than traditional research or search. It's EASIER. But it's also WORSE.

Also, stop using yourself as an example. Most of us here are hardcore nerds. Think instead about the average internet user asking, say, ChatGPT questions about health, world events, science, self-care, caregiving, etc. etc.

These are people who will enter one-sentence queries. THEY WILL NOT CHANGE THIS BEHAVIOR.

And no, I'm not saying current search is great. It sucks. It amplifies biases and crap content.

But LLMs introduce another layer of abstraction on top of that, lending even more credibility to utter dross.