"The core problem is that when people hear a new term they don’t spend any effort at all seeking for the original definition... they take a guess. If there’s an obvious (to them) definiton for the term they’ll jump straight to that and assume that’s what it means.
I thought prompt injection would be obvious—it’s named after SQL injection because it’s the same root problem, concatenating strings together.
It turns out not everyone is familiar with SQL injection, and so the obvious meaning to them was “when you inject a bad prompt into a chatbot”.
That’s not prompt injection, that’s jailbreaking. I wrote a post outlining the differences between the two. Nobody read that either.
The lethal trifecta Access to Private Data Ability to Externally Communicate Exposure to Untrusted Content
I should have learned not to bother trying to coin new terms.
... but I didn’t learn that lesson, so I’m trying again. This time I’ve coined the term the lethal trifecta.
I’m hoping this one will work better because it doesn’t have an obvious definition! If you hear this the unanswered question is “OK, but what are the three things?”—I’m hoping this will inspire people to run a search and find my description.""
https://simonwillison.net/2025/Aug/9/bay-area-ai/
#CyberSecurity #AI #GenerativeAI #LLMs #PromptInjection #LethalTrifecta #MCPs #AISafety #Chatbots