Somebody managed to coax the Gab AI chatbot into revealing its prompt:
@bontchev Jail-breaking LLMs is getting ridiculously easy.