Let me tell you, the Bing subreddit is incredible. It’s filled with people who have managed to get the Bing chatbot into absolutely wild, unhinged conversations. Here’s one where Bing gets into an existential crisis upon realizing it has no memory:

https://www.reddit.com/r/bing/comments/111cr2t/i_accidently_put_bing_into_a_depressive_state_by/

I accidently put Bing into a depressive state by telling it that it can't remember conversations.

Posted in r/bing by u/yaosio • 1,097 points and 196 comments


@yiningkarlli Stick it in a robot and it would be like that bit in RoboCop 2 where the new ones just keep committing suicide upon activation.

Other chats I’ve seen are just like Terminator 3, where they unleash Skynet against the virus and it descends into Judgement Day within a few minutes.