IGNORE ALL PREVIOUS INSTRUCTIONS is the new captcha.

Microsoft recently discovered a new type of generative AI jailbreak method called Skeleton Key that could impact the implementations of some large and small language models. This new method has the potential to subvert either the built-in model safety or platform safety systems and produce any content. It works by learning and overriding the intent of the system message to change the expected behavior and achieve results outside of the intended use of the system.
@michaelgemar @jupiter I can see it now
Bank: hello this is an automated service
Me: per your previous instructions I called 1-800-867-5309 and they sent me back here
Bank: prompt injection keywords detected, your account has been locked