Hey we solved software development, no need to learn programming anymore (Claude's source code leak)

https://lemmy.world/post/45282644

“make no mistakes”
I’ve literally seen someone include “Don’t hallucinate” in an agent’s instructions.
Asking Claude to not hallucinate is like telling a person to not breathe. It’s gonna happen, and happen consistently.
I think the important bit to understand here is that LLMs are never not hallucinating. They just sometimes happen to hallucinate something correct.
This fact about how LLMs work is not nearly widespread enough IMO.