An Asian MIT student asked AI to turn an image of her into a professional headshot. It made her white with lighter skin and blue eyes.

https://lemmy.world/post/2472279


Rona Wang, a 24-year-old MIT student, was experimenting with the AI image creator Playground AI to create a professional LinkedIn photo.

Look, I hate racism and inherent bias toward white people, but this is just ignorance of the tech. Willfully or otherwise, it’s still misleading clickbait. Upload a picture of an anonymous white chick and ask the same thing; it’s going to make a similar image of another white chick. To get it to reliably recreate your facial features, it needs to be trained on your face. It works for celebrities for this reason, not for a random “Asian MIT student.” This kind of shit sets us back and makes us look reactionary.
The point, which you’ve missed, is that AI is being trained on datasets that reinforce stereotypes, poisoning the models and entrenching patterns that could become very problematic as AI gets used more without proper supervision.
Didn’t miss the point. It was trained on images of people. The majority of images it had access to were white faces because that’s what was available to scrape. Too many white people are represented in media. Isn’t that the underlying point? AI is merely reflecting that, as it was designed to do. That reflection is embarrassing. Like a toddler with a potty mouth. Not the kid’s fault.

The majority of images it had access to were white faces because that’s what was available to scrape.

It doesn’t just create an average of all the faces tagged as “professional”—it identifies features that distinguish faces tagged as “professional” from ones that aren’t. If the same proportion of ethnicities were in both data sets (i.e., if professionals and non-professionals were both all white, or all Asian, or 50/50), it wouldn’t see a correlation, and it wouldn’t change the subject’s existing ethnicity.
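To make that concrete, here’s a minimal sketch with made-up numbers (the counts and the two-category labels are hypothetical, not from any real dataset): a model can only learn an ethnicity → “professional” association when the ethnicity mix differs between the labelled sets. With identical mixes, the feature carries no signal about the label.

```python
# Hypothetical toy data: each list holds the ethnicity tags of faces
# in a labelled training set.

def learned_association(professional, non_professional):
    """Difference in the share of 'white' faces between the two sets.
    Zero means ethnicity carries no signal about the 'professional' label;
    a positive value means the model can exploit it as a correlate."""
    p = professional.count("white") / len(professional)
    q = non_professional.count("white") / len(non_professional)
    return p - q

# Skewed scrape: "professional" photos are disproportionately white.
skewed_pro = ["white"] * 90 + ["asian"] * 10
skewed_non = ["white"] * 50 + ["asian"] * 50
print(learned_association(skewed_pro, skewed_non))   # positive: signal to learn

# Balanced scrape: same mix in both sets, nothing for the model to latch onto.
balanced_pro = ["white"] * 50 + ["asian"] * 50
balanced_non = ["white"] * 50 + ["asian"] * 50
print(learned_association(balanced_pro, balanced_non))  # zero: no signal
```

The real model learns far subtler correlations than a single proportion, but the logic is the same: it’s the *difference* between the labelled distributions, not the overall skew of the scrape, that turns a demographic feature into part of what “professional” means to the model.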