Hey fellow #AI / #LLM #nerds, I've discovered a weird issue with some LLMs I've been foolin' around with. I'm using #oobabooga, and some models seem to just quit when they're about to emit a formatted code block.

```markdown
```

For example, the above is what *some* models do. Others don't seem to care. Something about the ``` token makes it give up. Seen anything like this?

Maybe I should clarify that it doesn't crash. When a formatted block comes up, the model just leaves it empty and emits a stop token. Like some models are thinkin' "Oh, I know what goes in these. Nothing. END OF RESPONSE LOL"
Mistral Instruct 0.2 seems to handle them like a champ, tho. So props to #Mistral!
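For anyone curious what "banning" the stop token actually does under the hood (oobabooga exposes it as a "Ban the eos_token" option), here's a minimal toy sketch. Everything in it is made up for illustration: `EOS_ID`, the logits, and `pick_next_token` are hypothetical, and it uses plain greedy decoding over a raw logits list rather than a real sampler.

```python
# Toy sketch of masking the EOS token at sampling time, so a model
# can't bail out right after opening a ``` fence. All names and
# values here are hypothetical, not oobabooga's actual internals.

import math

EOS_ID = 2  # hypothetical end-of-sequence token id


def pick_next_token(logits, ban_eos=False):
    """Greedy pick over raw logits; optionally mask out EOS so
    generation can't stop early inside an empty code block."""
    if ban_eos:
        logits = list(logits)
        logits[EOS_ID] = -math.inf  # EOS can never be the argmax now
    return max(range(len(logits)), key=lambda i: logits[i])


# Toy logits where EOS (id 2) is the model's top pick right after ```
logits = [0.1, 0.5, 3.0, 1.2]
print(pick_next_token(logits))                # → 2 (EOS wins: empty block)
print(pick_next_token(logits, ban_eos=True))  # → 3 (next-best token instead)
```

The trade-off is that with EOS banned the model can't stop on its own, so you rely on the max-token limit instead.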