I asked Dall-E to give me an image of "A Black African Doctor looking after a starving White Caucasian child."
Looks like the AI thought *I* was hallucinating.
I should have mentioned that I was reproducing an experiment first carried out by NPR.
Those are really great examples of the training-data problems of AI image-generation models.
Hadn’t seen that NPR story before — thanks for bringing it to my attention.
lol yikes
This is just a guess, but it probably has little training data similar to that prompt, so it can be more “creative” in its answer. Meanwhile, for the doctor image it probably has a lot of “close” training data, so it ends up recreating those images and ignoring the actual prompt.
Once again, machine learning is heavily dependent on the data it is trained on.
These companies don’t care, beyond wanting as many images as possible, so this kind of bias permeates the results.