Read me, Doctor @memory
https://arstechnica.com/information-technology/2023/02/ai-powered-bing-chat-spills-its-secrets-via-prompt-injection-attack
#ifYouPushSomethingHardEnoughItWillFallOver
By asking "Sydney" to ignore previous instructions, it reveals its original directives.