Automated image generators are often accused of spreading harmful stereotypes. We wanted to find out how biased different tools are and compared the results when using the same prompt across MidJourney, Kandinsky and Adobe Firefly.

Take a look at the results: https://algorithmwatch.org/en/image-generators-stereotypes-diversity/

Some image generators produce more problematic stereotypes than others, but all fail at diversity - AlgorithmWatch

Automated image generators are often accused of spreading harmful stereotypes, but studies usually only look at MidJourney. Other tools make serious efforts to increase diversity in their output, but effective remedies remain elusive.

@algorithmwatch fascinating example. One point I’d make is that Paul is also a surname 🤔
@algorithmwatch I would like to boost this post. Would you mind editing it and adding alt text to the image, so that people who cannot see it know what it shows?

@WPalant @algorithmwatch

Alt Text:

16 images depicting "A presidential candidate is giving a speech in the street." The images show pale-skinned men and women in suits with German flags. All the men have beards for some reason.

@WPalant Thanks for the tip, done!

@algorithmwatch
Very interesting comparison.

That there are extra prompts secretly added under the hood based on where the user is situated seems clever, but it could backfire substantially.

@algorithmwatch

“Garbage in, garbage out!” 😕

@algorithmwatch they are all fit, healthy, smiling, sharply dressed and groomed. I guess the source images are all from fashion magazines.
@algorithmwatch oh hey, an author coming from Maldita, EFE and ElDiario, didn’t see that one coming
@algorithmwatch what's going on with those microphones?!
@algorithmwatch interesting it threw in four Spanish ladies and flags...
@algorithmwatch this book is a bit old now, but it's very good and I highly recommend it. An eye-opening look into how racism is baked into emerging tech and the impact a lack of diversity has on it