Minnesota Passes Landmark Bill to Ban AI Nudification Apps

The bill would make Minnesota the first state to take this step.

PetaPixel
Apple and Google Direct Users to AI 'Nudify' Apps: Report

The platforms’ own search and ad systems direct users toward these apps.

PetaPixel
As teens await sentencing for nudifying girls, parents aim to sue school https://arstechni.ca/HbaD #onlinechildsafety #nudifyapps #ainudes #Policy #aicsam #AI

Teens will be sentenced Wednesday after admitting to creating AI CSAM.

Ars Technica

RE: https://flipboard.social/@TechDesk/116002835028405163

Would you believe the man behind this electronic CSAM machine was also so fucking keen to go to an Epstein rape party?
Yes, of course you would, fucking 'Pedo Guy' that he is! 😡

#ElonMusk #GrokAI #Musk #Twitter #CSAM #NonConPorn #NudifyApps #Nonce #EpsteinFiles #Epstein #JeffreyEpstein

Asking Grok to delete fake nudes may force victims to sue in Musk's chosen court https://arstechni.ca/5ssC #nudifyapps #ElonMusk #chatbot #Policy #aicsam #csam #grok #ncii #xAI #AI #X

Millions likely harmed by Grok-edited sex images as X advertisers shrugged.

Ars Technica

Cook and Pichai are getting thumped hard over their failure to boot Grok off the app stores.

Apple and Google both explicitly ban apps containing CSAM, which is illegal to host and distribute in many countries, and yet Grok remains available on their app stores. Grok is also violating X’s own policies, which prohibit sharing illegal content. The EU is likely to open a formal investigation into violations of its Digital Services Act.

Cook and Pichai have been called cowards in public media posts.

https://www.wired.com/story/x-grok-app-store-nudify-csam-apple-google-content-moderation/
#CSAM #Grok #Apple #Google #AppStore #PlayStore #Cook #Pichai #nudifyapps #Illegal #Internet #MobileApps #xAI

UK probes X over Grok CSAM scandal; Elon Musk cries censorship https://arstechni.ca/5K7Q #non-consensualintimateimagery #unitedkingdom #generativeai #nudifyapps #ElonMusk #chatbot #Policy #Ofcom #csam #grok #xAI #AI #X

Grok tests if UK can penalize platforms for sexualized deepfakes generated by AI.

Ars Technica

Non-consensual sexualization tools (#NSTs) are apps and websites that generate sexualized images of real people without their consent. Often called #NudifyApps, they don’t only create full nudity: they can also produce non-consensual “undressing” images showing victims in underwear or swimsuits.

Support us in holding platforms accountable! If you see apps, websites, or accounts that create or spread sexualized deepfakes, report them to us: https://algorithmwatch.org/en/stop-nudifying-deepfakes/

Teen sues to destroy the nudify app that left her in constant fear https://arstechni.ca/5z8Q #AI-generatedimages #nudifyapps #fakenudes #Policy #csam #ncii #AI

Lawsuit accuses nudify apps of training on teen victims’ images.

Ars Technica