đź’Ż agree. I wrote this expressing the same opinion:
A hallucination is “an experience involving the apparent perception of something not present” according to the OED.
An LLM neither experiences nor perceives anything. It’s lazy to anthropomorphise LLMs.
I wrote ebook-notes.el, an Emacs Lisp package, to streamline importing highlights and notes from an Amazon Kindle’s “My Clippings.txt” file directly into Org mode files. It automatically associates notes with their corresponding highlights and prevents duplicate entries from being imported twice.

To make life interesting, I decided to try using an LLM to “help”. I used Google’s Gemini 2.5 Flash model. Don’t judge me. This was research!
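For context, the “My Clippings.txt” format the package has to deal with looks like this: entries separated by lines of ten `=` characters, each entry being a title line, a metadata line (highlight or note, location, date added), a blank line, and the clipping text. Here is a minimal sketch of parsing that format in Python, purely for illustration (ebook-notes.el itself is Emacs Lisp, and this is not its code; the function name is hypothetical):

```python
def parse_clippings(text):
    """Return a list of {'title', 'meta', 'body'} dicts from My Clippings.txt text.

    Assumes Unix newlines; real files may use CRLF and start with a BOM.
    """
    entries = []
    for chunk in text.split("==========\n"):
        lines = chunk.strip("\ufeff\n").split("\n")
        if len(lines) < 3:
            continue  # trailing empty chunk after the last separator
        entries.append({
            "title": lines[0].strip(),
            # e.g. "- Your Highlight on Location 100-101 | Added on ..."
            "meta": lines[1].strip(),
            "body": "\n".join(lines[3:]).strip(),
        })
    return entries
```

Associating a note with its highlight then comes down to matching entries for the same title and location, and dedup is just keeping a set of (title, meta, body) keys already seen.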