StopNCII.org, run by the Revenge Porn Helpline (part of SWGfL), uses innovative tech to prevent non‑consensual intimate image abuse — empowering adults online. 90%+ removal rate; 300,000 images removed. Collaborate or support: https://stopncii.org/ 🔒📲 #NCII #OnlineSafety #DigitalRights
Stop Non-Consensual Intimate Image Abuse | StopNCII.org

StopNCII.org is operated by the Revenge Porn Helpline which is part of SWGfL, a charity that believes that all should benefit from technology, free from harm.

After X’s new Grok feature led to a flood of non‑consensual deepfakes, mostly targeting women and minors, our new brief explores how EU laws - from the AI Act to the DSA - can curb AI‑generated #NCII & #CSAM.

A multi‑pronged, enforced approach is crucial.

👇🏻 Read the full brief on our website: https://cdt.org/insights/from-risk-mitigation-to-app-bans-assessing-eu-legislations-potential-to-combat-ai-generated-image-abuse/

From Risk Mitigation to App Bans: Assessing EU Legislation’s Potential to Combat AI-generated Image Abuse

In late December 2025, X rolled out a new AI-based picture-editing feature in its interface. Users could now edit pictures posted to the website by simply addressing X’s AI-chatbot Grok in a response to the posted picture. What ensued was an avalanche of non-consensual sexualised deepfakes of women and girls, created and shared directly on […]

Center for Democracy and Technology
🔬 «Stripped: After the Deepfake» Research on Nudify Apps, Technology-Facilitated GbV, and NCII. ✍️&ℹ️ First-stage survey. 👈 📄 Data anonymization and GDPR compliance plan. @[email protected] CC: @[email protected] [🧵(3/15) ESWA Monthly, Feb2026, research.] #Research #NCII #GbV #Deepfakes
Survey A - Stripped

Stripped (After the Deepfake) is a research project on Nudify Apps, Technology-Facilitated Gender-Based Violence, and Non-Consensual Intimate Image Sharing (NCII).

This survey is part of a research project examining the impact of nudify apps and non-consensual intimate image sharing (NCII), with a focus on how these technologies affect people’s lives and how reporting, redress, and support systems are working in practice, especially how victims are treated after the fact. The goal is to better understand these experiences in order to inform research, advocacy, and policy discussions.

This survey is the first stage of the research. Some participants may be invited to take part in a confidential follow-up interview. Interviews are optional, conducted separately, and compensated with €50 as a thank-you for your time. Interview participants will be selected solely based on the needs of the research to ensure diverse and broad representation; selection or non-selection is not a reflection of your experience or its importance.

Your survey responses will be used only in anonymized form. If you choose to share contact details for a possible interview, they will be used only for this research. If you would like more details about how your data is handled, stored, and protected, you can access the full data privacy and anonymization plan here: https://drive.google.com/file/d/15q1mNtFhPZV-Z3-_rCWRfI6Uh_NM_K1T/view

You may skip any question or stop at any time. Thank you for considering participating.

Google Docs
Asking Grok to delete fake nudes may force victims to sue in Musk's chosen court https://arstechni.ca/5ssC #nudifyapps #ElonMusk #chatbot #Policy #aicsam #csam #grok #ncii #xAI #AI #X
Asking Grok to delete fake nudes may force victims to sue in Musk's chosen court

Millions likely harmed by Grok-edited sex images as X advertisers shrugged.

Ars Technica
State Department Threatens UK Over Grok Investigation, Because Only The US Is Allowed To Ban Foreign Apps

So let me get this straight. The United States government spent years championing a ban on TikTok, rushed it through the Supreme Court with claims of grave national security threats, got a 9-0 ruli…

Techdirt

The Grok AI Non-Consensual Deepfake Crisis: Why Regulators Worldwide Have Stepped In

X’s AI chatbot Grok mass-generated non-consensual sexual deepfakes of women and minors, prompting urgent responses from regulators worldwide. This piece analyzes the severity of the AI safety guardrail failure.

https://aisparkup.com/posts/8152

Masterful Gambit: Musk Attempts to Monetize Grok's Wave of Sexual Abuse Imagery

https://fed.brid.gy/r/https://www.404media.co/x-premium-grok-paywall-images-ai-generator/

Legal experts are probing whether the child‑undressing images generated by Grok’s AI violate US CSAM and NCII statutes. The Department of Justice’s “Take It Down” unit is reviewing the case, with political figures including Donald Trump weighing in. What could this mean for AI policy? Read the full analysis. #GrokAI #CSAMLaw #NCII #DoJReview

🔗 https://aidailypost.com/news/legal-review-asks-if-groks-child-undressing-images-breach-us-csam

'Men Against Violence and Abuse' created this awesome Instagram post from my longform article on image-based abuse published last year in FactorDaily.

https://www.instagram.com/p/DSCQ8aZjCJG/

Edit: Sharing a PDF for those who can't access Instagram: https://drive.google.com/file/d/1hujZYJi3T7paDauDWuNXLv5w9YRyR2Nt/view

#IBA #IBSA #NCII #TFGBV #MAVA #VAW #OGBV #GBV

Men Against Violence & Abuse on Instagram: "Image-based sexual abuse is not a scandal or a mistake. It is violence. Survivors deserve justice, dignity and support, not shame. Join the movement to end image-based abuse and build a culture of consent and respect. @saharsh267 #EndIBSA #16DaysOfActivism #BreakTheStigma #SurvivorSupport #GenderBasedViolence #DigitalRights #PrivacyIsARight #NoMoreShame #SupportSurvivors #EndImageBasedAbuse #TechnologyFacilitatedGBV #TFGBV #GenderJustice #DigitalSafety #FeministInternet #SafeDigitalSpaces #EndViolenceAgainstWomen #SocialAwareness #RespectandDignity #TechEquality"

29 likes, 0 comments - mava.india on December 9, 2025.

Instagram