
Submission made by Sapni GK and yours truly in response to the call for comments issued by the Ministry of Electronics and Information Technology (MeitY), Government of India, on the draft amendment to the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021.

The draft amendment pertains to the regulation of #deepfake content.

We elucidate aspects of image-based abuse. https://zenodo.org/records/17680230

#imagebasedabuse #iba #ibsa #tfgbv #ncii #ogbv #gbv #vaw

Comments on the draft amendment to the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021

Submission made in response to the call for comments issued by the Ministry of Electronics and Information Technology (MeitY), Government of India, on the draft amendment to the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021. The draft amendment pertains to the regulation of deepfake content. This submission is divided into three parts. The first part, ‘Preliminary’, introduces the document; the second part, ‘About the authors’, contains brief bios of the authors; and the third part, ‘Submissions on the issues’, contains our comments on the draft amendment. Submission prepared by Rohini Lakshané and Sapni G K and submitted on 13 November 2025. We elucidate aspects of image-based abuse, among other topics.

Zenodo
Italy Confronts the Digital Violence of AI-Nudified Images

Italy Confronts the Digital Violence of AI-Nudified Images November 5, 2025 Digital manipulation is transforming im...

Blogger
I often get requests to help survivors of image-based abuse with these takedowns. As much as I want to support each one, I don't always manage to do so in my personal time. I request survivors to make use of this avenue. #IBA #NCII #IBSA #imagebasedabuse #TFGBV #OGBV #GBV #VAW

Have you, or someone you know, been trying to get an intimate image or video of yourself removed from a website? | CHAYN

Have you, or someone you know, been trying to get an intimate image or video of yourself removed from a website?

➡️ We’re currently looking for survivors with ongoing cases to help us test our new tool. If you're trying to remove content from platforms like Facebook, TikTok, Instagram, Pornhub, or OnlyFans, we’d love to support you through the process.

At Chayn, we’re launching our Survivor AI — a free, anonymous tool that helps you generate formal takedown letters to websites hosting non-consensual intimate images or videos of you. Using AI, the tool asks you a series of questions about the content you're trying to take down and the impact the sharing has had on you. It then creates a letter based on your answers, tailored to the rules of each platform.

While we’re confident Survivor AI is safe and private, we haven’t yet tested it with live cases — so we don’t know how companies will respond. Your experience can help us understand its impact and make it even better!

We can support you by providing:
🌱 A guided session using our Survivor AI
💰 £100 for your time
💛 A free session with a trauma-informed therapist (optional)

If you’d like our support, send us a DM and someone will get back to you 💌

#SurvivorSupport #ImageBasedAbuse #Deepfake

https://www.linkedin.com/feed/update/urn:li:activity:7358648107938533376/



Readout of White House State Legislative Convening on Non-Consensually Distributed Intimate Images | The White House

Today, Jennifer Klein, Assistant to the President and Director of the Gender Policy Council chaired a working meeting with bipartisan state legislators from 12 states, alongside survivors, legal experts and practitioners, on proposed actions to strengthen protections for survivors of non-consensually distributed intimate images (NDII) and hold offenders accountable. As online harassment and abuse have…

The White House