Hey we solved software development, no need to learn programming anymore (Claude's source code leak)
https://lemmy.world/post/45282644
I’ve literally seen someone include “Don’t hallucinate” in an agent’s instructions.
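For the curious, this is roughly what that looks like in practice — a minimal sketch assuming the Anthropic Python SDK, with a placeholder model name:

```python
import anthropic

# Assumes ANTHROPIC_API_KEY is set in the environment.
client = anthropic.Anthropic()

response = client.messages.create(
    model="claude-sonnet-4-20250514",  # illustrative model id
    max_tokens=1024,
    # The instruction being mocked: it's just another string in the
    # system prompt, with no mechanism behind it to enforce anything.
    system="You are a coding agent. Don't hallucinate.",
    messages=[{"role": "user", "content": "Refactor this module for me."}],
)
print(response.content[0].text)
```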
Asking Claude to not hallucinate is like telling a person to not breathe. It's gonna happen, and happen consistently.
I think the important bit to understand here is that LLMs are never not hallucinating. They just sometimes happen to hallucinate something correct.