“It’s not actually you”: Teens cope while adults debate harms of fake nudes
Most kids know that deepfake nudes are harmful, Thorn survey says.
Google fucking sucks more and more every day:
https://www.404media.co/google-search-includes-paid-promotion-of-nudify-apps/
Come on Google, get it together!
#google #ai #artificialintelligence #fakenudes #nonconsensualNudes
#SocialMedia #AI #GeneratedImages #DeepFakes #FakeNudes:

"Slate: Fake nude images aren’t an entirely new issue. What’s the history of this problem?
Sophie Maddocks: There’s a historian, Jessica Lake, and she’s done some really interesting research tracing the potential origins of the creation of fake nude images. She talks a lot about the rise of photography in the late 19th century, and writes about an example of face-swapping in late-19th-century photography where images of the faces of high-society women were pasted onto nude bodies and then circulated. And not only is that one possible starting point when thinking about the history of fake nudes, it’s also an interesting starting point for how we see the creation of A.I.–generated fake nudes. Fake nudes first went viral in the online sense in 2017 with the creation of the DeepNude app where the faces of individuals were digitally pasted onto the bodies of adult film actors, almost exactly mimicking what had been done in the late 19th century with photography.
So there is a long history to this harm, and there is a long-standing desire to produce fake nude images—almost exclusively of women. With the rise of the internet, we’ve seen ways of creating and sharing ever more photorealistic images—until we get to the last year with the rise of video- and image-generation models that create extremely realistic imagery and A.I. tools trained on millions of images of girls and women scraped from the internet without their consent. You can either use a text prompt or an existing image to produce a very realistic fake nude.
So A.I. has increased the volume and severity of this problem on the internet.
Absolutely. In 2017, when activists and the first people affected by A.I.–assisted deepfakes, like famous actors and singers, started to raise the alarm about this issue, they really gave us a roadmap for what would happen."
#LisaAI #fakenudes: An increase of more than 2,400 percent in one year