"a a a a a"

is going around again; it won't be long before people start jailbreaking GPT in new and interesting ways.

are custom instructions just a way to handwave the issue away?