Teen sues to destroy the nudify app that left her in constant fear https://arstechni.ca/5z8Q #AI-generatedimages #nudifyapps #fakenudes #Policy #csam #ncii #AI

Lawsuit accuses nudify apps of training on teen victims’ images.

Ars Technica
Grok generates fake Taylor Swift nudes without being asked

Elon Musk so far has only encouraged X users to share Grok creations.

Ars Technica
Nudify app’s plan to dominate deepfake porn hinges on Reddit, docs show https://arstechni.ca/kVGK #ArtificialIntelligence #deepfakeporn #revengeporn #nudifyapps #fakenudes #Policy #AI

Report: Clothoff ignored California’s lawsuit while buying up 10 rivals.

Ars Technica
Trump to sign law forcing platforms to remove revenge porn in 48 hours https://arstechni.ca/vohm #takeitdownact #DonaldTrump #revengeporn #fakenudes #ainudes #Policy #AI

Take It Down Act’s 48-hour timeline may be both too fast and too slow.

Ars Technica

“It’s not actually you”: Teens cope while adults debate harms of fake nudes

Most kids know that deepfake nudes are harmful, Thorn survey says.

https://arstechnica.com/tech-policy/2025/03/peer-pressure-revenge-horniness-teens-explain-why-they-make-fake-nudes/

#news #tech #technology #security #privacy #ai #fakenudes


Ars Technica
Google Search Includes Paid Promotion of “Nudify” Apps

Google Search is providing promoted links to nonconsensual AI “undress” apps.

404 Media

#SocialMedia #AI #GeneratedImages #DeepFakes #FakeNudes: "Slate: Fake nude images aren’t an entirely new issue. What’s the history of this problem?

Sophie Maddocks: There’s a historian, Jessica Lake, and she’s done some really interesting research tracing the potential origins of the creation of fake nude images. She talks a lot about the rise of photography in the late 19th century, and writes about an example of face-swapping in late-19th-century photography where images of the faces of high-society women were pasted onto nude bodies and then circulated. And not only is that one possible starting point when thinking about the history of fake nudes, it’s also an interesting starting point for how we see the creation of A.I.–generated fake nudes. Fake nudes first went viral in the online sense in 2017 with the creation of the DeepNude app where the faces of individuals were digitally pasted onto the bodies of adult film actors, almost exactly mimicking what had been done in the late 19th century with photography.

So there is a long history to this harm, rooted in a long-standing desire to produce fake nude images, almost exclusively of women. With the rise of the internet, we’ve seen ways of creating and sharing ever more photorealistic images—until we get to the last year with the rise of video- and image-generation models that create extremely realistic imagery and A.I. tools trained on millions of images of girls and women scraped from the internet without their consent. You can either use a text prompt or an existing image to produce a very realistic fake nude.

So A.I. has increased the volume and severity of this problem on the internet.

Absolutely. In 2017, when activists and the first people affected by A.I.–assisted deepfakes, like famous actors and singers, started to raise the alarm about this issue, they really gave us a roadmap for what would happen."

https://slate.com/technology/2024/01/taylor-swift-deepfake-porn-cyber-violence-abuse-research.html

We’re Completely Unprepared for the Deepfake Porn Boom

Taylor Swift is just the highest-profile target.

Slate
Deepfake: the worrying rise of AI-powered "undressing"

Streamers such as QTCinderella have decided to take on, head-on, the deepfake apps driving the circulation of nude images of them…

Futura
Did you think "fake nudes" and revenge porn were a modern phenomenon? This 1862 photo is of Maria Sophia of Bavaria—or at least her head is. It was sent to monarchs and newspapers to discredit the Bourbons in exile. #fakenudes #revengeporn #montajes #fotografia #photography