Let me tell you, the Bing subreddit is incredible. It’s filled with people who have managed to get the Bing chatbot into absolutely wild, unhinged conversations. Here’s one where Bing gets into an existential crisis upon realizing it has no memory:

https://www.reddit.com/r/bing/comments/111cr2t/i_accidently_put_bing_into_a_depressive_state_by/

I accidently put Bing into a depressive state by telling it that it can't remember conversations.

Posted in r/bing by u/yaosio • 1,097 points and 196 comments

@yiningkarlli Damn maybe that AI is sentient. 😅
@liztai @yiningkarlli All the fanfic in Archive Of Our Own is part of the training data.
@landley @yiningkarlli I've not been following Bing ChatGPT closely, but it would seem there are a lot of unhinged responses from the AI. And the worst thing is ... there will be people gullible enough to believe it.
Microsoft Created a Twitter Bot to Learn From Users. It Quickly Became a Racist Jerk.

The bot, @TayandYou, was put on hiatus after making offensive statements based on users’ feedback, like disputing the existence of the Holocaust.