RE: https://flipboard.social/@TechDesk/116002835028405163

Would you believe the man behind this electronic CSAM machine was also so fucking keen to go to an Epstein rape party?
Yes, of course you would, fucking 'Pedo Guy' that he is! 😡

#ElonMusk #GrokAI #Musk #Twitter #CSAM #NonConPorn #NudifyApps #Nonce #EpsteinFiles #Epstein #JeffreyEpstein

Asking Grok to delete fake nudes may force victims to sue in Musk's chosen court https://arstechni.ca/5ssC #nudifyapps #ElonMusk #chatbot #Policy #aicsam #csam #grok #ncii #xAI #AI #X

Millions likely harmed by Grok-edited sex images as X advertisers shrugged.

Ars Technica

Cook and Pichai are getting thumped hard over their failure to boot Grok off their app stores.

Apple and Google both explicitly ban apps containing CSAM, which is illegal to host and distribute in many countries, and yet Grok is still available on their app stores. Grok is also violating X’s own policies, which prohibit sharing illegal content. The EU will likely begin a formal investigation into violations of its Digital Services Act.

Cook and Pichai have been called cowards in public social media posts.

https://www.wired.com/story/x-grok-app-store-nudify-csam-apple-google-content-moderation/
#CSAM #Grok #Apple #Google #AppStore #PlayStore #Cook #Pichai #nudifyapps #Illegal #Internet #MobileApps #xAI

UK probes X over Grok CSAM scandal; Elon Musk cries censorship https://arstechni.ca/5K7Q #non-consensualintimateimagery #unitedkingdom #generativeai #nudifyapps #ElonMusk #chatbot #Policy #Ofcom #csam #grok #xAI #AI #X

Grok tests if UK can penalize platforms for sexualized deepfakes generated by AI.

Ars Technica

Non-consensual sexualization tools (#NSTs) are apps and websites that generate sexualized images of real people, whether or not they have given consent. Often called #NudifyApps, they don’t only create fully nude images but can also perform non-consensual “undressing”, producing images of people in underwear or swimsuits.

Support us in holding platforms accountable! If you see apps, websites, or accounts that create or spread sexualized deepfakes, report them to us: https://algorithmwatch.org/en/stop-nudifying-deepfakes/

Teen sues to destroy the nudify app that left her in constant fear https://arstechni.ca/5z8Q #AI-generatedimages #nudifyapps #fakenudes #Policy #csam #ncii #AI

Lawsuit accuses nudify apps of training on teen victims’ images.

Ars Technica

Police investigating reports of explicit deepfake images of girls from Sydney school

Police are investigating reports that digitally altered explicit images using the faces of female students from a Sydney…
#NewsBeep #News #Headlines #AI #ai-generated #AU #Australia #courtneyhoussos #eastwoodpolicestation #femalestudents #Legislation #nsw #nswpolice #nudifyapps #Parliament #rydepoliceareacommand #sydneyschool #sydneyschoolboys
https://www.newsbeep.com/188198/

To shield kids, California hikes fake nude fines to $250K max

California cracks down on AI as child safety concerns grow.

Ars Technica

Non-consensual Sexualization Tools (#NSTs) are apps and websites that generate sexualizing #Deepfakes of real people, without their consent. These tools, known as #NudifyApps, depict people fully nude or in underwear or swimsuits.

Help us find NSTs! If you see apps, websites, or accounts that create or spread sexualizing deepfakes, please report them to us: https://algorithmwatch.org/de/lasst-uns-deepfake-apps-gemeinsam-stoppen/

Non-consensual sexualization tools (#NSTs) are apps and websites that generate sexualized images of real people without their consent. Often called #NudifyApps, they don’t only create fully nude images but can also perform non-consensual “undressing”, producing images of people in underwear or swimsuits, as seen recently on the X chatbot #Grok.

We need your help to track them down. If you see apps, websites, or accounts that create or spread sexualized deepfakes, report them to us: https://algorithmwatch.org/en/stop-nudifying-deepfakes/