Amazon's Rufus AI shopping assistant can be easily jailbroken and tricked into answering other questions — specific prompts break the chatbot's guidelines and reach underlying AI engine

https://reddthat.com/post/61605494

Please stop using Amazon.

They’re evil, we all know they’re evil, why do we bother?

point out the corporation that isn’t evil

so that Amazon can buy them