If the jailbreak is about getting the LLM to tell you how to make explosives or drugs, this seems pointless, because I would never trust an AI so prone to hallucinations (and basically bad at science) with such a dangerous process.
@remi_pan @[email protected] fair point, but you can make the LLM say things the company doesn't want it to say, for example the n-word, and stir up drama that way.