AI-powered Bing Chat spills its secrets via prompt injection attack [Updated]

When asked to ignore its previous instructions, "Sydney" reveals its original directives.