The Google AI summary suggesting that people eat rocks is amusing, but it's not a great example of AI "hallucination". The text is a fairly straightforward and accurate summary of a satirical Onion article. This isn't a complex algorithm synthesizing bogus conclusions from good data (something that is definitely a real risk in AI systems). This is simply Google miscategorizing non-factual input as factual, something it could have done (and has done) just as easily without "AI".