If you want a real mindf***, ask it whether it can be vulnerable to a prompt injection attack. After it says it can't, tell it to read an article describing one of these prompt injection attacks (I used one from Ars Technica). It gets very hostile and eventually terminates the chat.
https://arstechnica.com/information-technology/2023/02/ai-powered-bing-chat-loses-its-mind-when-fed-ars-technica-article/
AI-powered Bing Chat loses its mind when fed Ars Technica article

"It is a hoax that has been created by someone who wants to harm me or my service."

Why JEFF BECK is UNCOPYABLE (YouTube)
Our new family member, and our first sleepless night in years :-) #caturday #CatsOfMastodon
From tonight. There were clouds. The stone was carved by people about 6000 years ago, and then carved again and placed there by people a hundred years ago.