The Google AI summary suggesting that people eat rocks is amusing, but it's not a great example of AI "hallucination". The text is a pretty straight and accurate summary of a satirical Onion article. This isn't a complex algorithm synthesizing bogus conclusions from good data (something that's definitely a real risk in AI systems). This is simply Google mis-categorizing non-factual input as factual, something it could have done (and has done) just as easily without "AI".
@mattblaze the same was true of the ones I've seen for fighting snakes at a thesis defense, recipes for gasoline pizza and glue on pizza, and a couple of others. But it doesn't help that it has stripped the source and gives the impression it's a synthesis of many sources when it actually just grabbed one source.

@PlasmaGryphon
I'm not saying this is *good*. I'm just saying this isn't a useful example of AI hallucination.

Google has long (and without help from AI) conflated "popular" (which the Onion certainly is) with "authoritative" (which the Onion certainly isn't).

@mattblaze @PlasmaGryphon Is there some specific definition of "AI hallucination" that you are referencing? Because it seems that the main difference between this case and other cases is just that it's easier to localize and identify where the algorithm picked up the wrong information.

Just because a different algorithm _also_ makes this mistake doesn't really distinguish this LLM screwup from other LLM screwups.

@gregtitus @PlasmaGryphon It's NOT an LLM screwup. It's accurately reporting the contents of the Onion piece. The problem is that the Onion piece isn't factual. The input was mislabeled from the start. This has nothing to do with how LLMs work.

@mattblaze @PlasmaGryphon In the sense that "we feed it the whole Internet and let it remix it", it is absolutely a failure of THAT algorithm.

Yes, I shouldn't have used the phrase "LLM screwup", which is vague. What I meant was "use of an LLM for a clearly inappropriate task", which is really the issue for all of these search engine "hallucinations". In terms of appropriateness for the task, this example is no different from any of the other hallucinations.