@Miriamm it might be difficult to train a model not to produce images of women with more than two legs, because there are no images of three- or four-legged women in the training data, so the model never learns that more than two legs isn't a thing. Instead we see the model extrapolating outside the dataset it was trained on, and in this case it's obvious because the result is visually, clearly wrong. With natural language, it's much easier to train a model on what bad grammar looks like, so it won't make obvious grammatical mistakes. But when it comes to factual mistakes and predictions outside the dataset, all bets are off: we simply can't tell without checking the content. Three legs is a great example of why we need to be careful with machine learning.
PS where were the miniskirts?