Whoops! Google's reached a new level of JAQing off and decided to answer the questions instead of just leading to them

https://awful.systems/post/2704329


> Hermansson logged in to Google and began looking up results for the IQs of different nations. When he typed in “Pakistan IQ,” rather than getting a typical list of links, Hermansson was presented with Google’s AI-powered Overviews tool, which, confusingly to him, was on by default. It gave him a definitive answer of 80.
>
> When he typed in “Sierra Leone IQ,” Google’s AI tool was even more specific: 45.07. The result for “Kenya IQ” was equally exact: 75.2.

Hmm, these numbers seem very low. I wonder how these scores were determined.

I don’t understand the title. LLM hallucinations have nothing to do with JAQing off.
Just asking questions

> Just asking questions (also known as JAQing off, or as emojis: "🤔🤔🤔"[1]) is a way of attempting to make wild accusations acceptable (and hopefully not legally actionable) by framing them as questions rather than statements. It shifts the burden of proof to one's opponent; rather than laboriously having to prove that all politicians are reptoid scum, one can pull out one single odd piece of evidence and force the opponent to explain why the evidence is wrong.

— RationalWiki
Problem is, it wasn’t a hallucination: it was referencing a paper that has been debunked. These aren’t made-up numbers; they’re VERY specific numbers that come from a VERY specific paper.
Okay, but it’s still got nothing to do with the dishonest rhetorical technique called “JAQing off” (a.k.a. “Just Asking Questions,” a.k.a. “sealioning”).
It’s kind of a … symptom … of the community we’re in. I wouldn’t read into it too deeply.