The Google AI summary suggesting that people eat rocks is amusing, but it's not a great example of AI "hallucination". The text is a pretty straight and accurate summary of a satirical Onion article. This isn't a complex algorithm synthesizing bogus conclusions from good data (something that's definitely a real risk in AI systems). This is simply Google mis-categorizing non-factual input as factual, something it could have done (and has done) just as easily without "AI".
@mattblaze the same was true of the ones I've seen for fighting snakes at a thesis defense, recipes for gasoline pizza and glue in pizza, and a couple of others. But it doesn't help that it has stripped the source and gives the impression it's a synthesis of many sources when it actually just grabbed one source.

@PlasmaGryphon
I'm not saying this is *good*. I'm just saying this isn't a useful example of AI hallucination.

Google has long (and without help from AI) conflated "popular" (which the Onion certainly is) with "authoritative" (which the Onion certainly isn't).

@mattblaze yeah, I was just trying to add that there are a wide variety of these popping up in the last couple of days, and all the ones I looked at followed the same pattern of repeating a single search result or Reddit post.