Grok’s ‘spicy’ video setting instantly made me Taylor Swift nude deepfakes
Gross.
Someone make it do this to Trump so that we can summon a lawsuit ouroboros.
That’s satire though.
Under any reasonable interpretation (big caveat for American courts right now), that’s free speech.
Under what law?
The Take It Down Act: “On April 28, 2025, Congress passed S. 146, the TAKE IT DOWN Act, a bill that criminalizes the nonconsensual publication of intimate images, including ‘digital forgeries’ (i.e., deepfakes), in certain circumstances.”
Definitely not convicted. That’d be some crazy speed.
However, your insistence that because it hasn’t happened yet it can’t happen is insane. There has to be a first case; by definition, it hadn’t happened before that.
your insistence that it hasn’t happened yet so can’t happen is insane
It would be insane if that was what I had insisted, but that didn’t happen. You just made it up.
Based on what? Who have you seen be convicted of making deepfake porn? Under what law?
Then you were provided a law under which it’d be illegal:
Hmm, interesting, thanks. Has anyone been charged or convicted under this law yet?
This seems to heavily imply you don’t believe it’s illegal until someone’s been convicted.
Is providing it over a private channel to a singular user publication?
I suspect that you will have to directly regulate image generation
you will have to directly regulate image generation
It’s already being done to help prevent fake CSAM.
That should have been standard from the start.
It absolutely is private insofar as it is a channel between the software running on their end and the user operating that software. The lack of end-to-end encryption doesn’t make it not private; it makes it insecure, which doesn’t speak at all to the issue raised: the creation of an image by a user isn’t likely to be considered publication until they share it.
It’s highly probable that keeping people from generating deepfake nudes will require additional law.
Uhm, there have been plenty of cases of people getting in trouble for sharing deepfake porn, yes. It’s sexual harassment.
Well, at least over here in Europe, and it’s mostly been with teenagers; I don’t know the situation in the US.
But generally, making and sharing porn of real people can… well… that can very easily count as sexual harassment.