I don't like the term "hallucinations" when we talk about AI. Sure, LLMs can get things wrong, but a hallucination is an error in perception, and you can't have an error in perception when there's no one there to perceive. The only hallucinations that are happening are on your side of the keyboard.

@maxleibman

💯 agree. I wrote this expressing the same opinion:

A hallucination is “an experience involving the apparent perception of something not present” according to the OED.

An LLM neither experiences nor perceives anything. It’s lazy to anthropomorphise LLMs.

https://stewart123579.github.io/blog/posts/emacs/importing-kindle-clippings-in-emacs/#incorrect-information

Importing Kindle Clippings in Emacs

I wrote ebook-notes.el, an Emacs Lisp package, to streamline the process of importing highlights and notes from an Amazon Kindle’s “My Clippings.txt” file directly into Org mode files. It automatically associates notes with their corresponding highlights and prevents the import of duplicate entries. To make life interesting, I decided to try using an LLM to “help”. I used Google’s Gemini 2.5 Flash model. Don’t judge me. This was research!
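For flavour, here is a minimal sketch of the core parsing step, not ebook-notes.el’s actual code: the function name `my/kindle-clippings-to-org` is made up, and it assumes the usual “My Clippings.txt” layout (entries separated by a line of ten equals signs, book title on the first line, clipping metadata on the second, then the highlight or note text). It deliberately skips the note/highlight pairing and duplicate detection that ebook-notes.el handles.

```emacs-lisp
(defun my/kindle-clippings-to-org (clippings-file org-file)
  "Append each entry in CLIPPINGS-FILE to ORG-FILE as an Org heading.
A rough illustration only: no note/highlight pairing, no duplicate
detection."
  (let ((text (with-temp-buffer
                (insert-file-contents clippings-file)
                (buffer-string))))
    (with-temp-buffer
      ;; Entries are delimited by a line of ten equals signs; the
      ;; \r? tolerates Windows-style line endings in the file.
      (dolist (entry (split-string text "^==========\r?$" t "[ \t\n\r]+"))
        (let* ((lines (split-string entry "\n" t "[ \t\r]+"))
               (title (car lines))           ; book title (author)
               (meta  (cadr lines))          ; page/location/date line
               (body  (mapconcat #'identity (cddr lines) "\n")))
          ;; One Org heading per clipping: title, metadata, then text.
          (insert (format "* %s\n%s\n%s\n\n" title meta body))))
      (append-to-file (point-min) (point-max) org-file))))
```

Called as, say, `(my/kindle-clippings-to-org "~/My Clippings.txt" "~/org/kindle.org")`, it appends one heading per clipping; everything smarter than that is where the real package (and the LLM “help”) comes in.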
