Grok’s ‘spicy’ video setting instantly made me Taylor Swift nude deepfakes
Gross.
Sometime make it do this to Trump so that we can summon a lawsuit ouroboros
That’s satire though.
Under any reasonable interpretation (big caveat for American courts right now), that’s free speech.
Under what law?
The TAKE IT DOWN Act: “On April 28, 2025, Congress passed S. 146, the TAKE IT DOWN Act, a bill that criminalizes the nonconsensual publication of intimate images, including ‘digital forgeries’ (i.e., deepfakes), in certain circumstances.”
Definitely not convicted. That’d be some crazy speed.
However, your insistence that because it hasn’t happened yet it can’t happen is insane. There has to be a first case of everything.
your insistence that it hasn’t happened yet so can’t happen is insane
It would be insane if that was what I had insisted, but that didn’t happen. You just made it up.
Based on what? Who have you seen be convicted of making deepfake porn? Under what law?
Then you were provided a law under which it’d be illegal:
Hmm, interesting, thanks. Has anyone been charged or convicted with this law yet?
This seems to heavily imply you don’t believe it’s illegal until someone’s been convicted.
Is providing it over a private channel to a singular user publication?
I suspect that you will have to directly regulate image generation
you will have to directly regulate image generation
It’s already being done to help prevent fake CSAM.
That should have been standard from the start.
It absolutely is private insofar as it is a channel between the software running on their end and the user operating it. The lack of end-to-end encryption doesn’t make it not private; it makes it insecure, which doesn’t speak at all to the issue raised: a user’s creation of an image isn’t likely to be considered publication until they share it.
It’s highly probable that keeping people from generating deep fake nudes requires additional law.
Uhm, there have been plenty of cases of people getting in trouble for sharing deepfake porn, yes. It’s sexual harassment.
Well, at least over here in Europe, and it’s mostly been with teenagers; I don’t know the situation in the US.
But generally, making and sharing porn of real people is… well… that can very easily count as sexual harassment.
Honestly, from my understanding, Tay is pretty badly misrepresented. The headlines basically read as if the AI had absorbed Twitter posts and the overwhelmingly negative content led the algorithm to say really horrible stuff.
But the actuality of it was dumber: to my knowledge, the AI side of it never said anything offensive on its own. They gave the damn thing a “Say” command, which the trolls learned about in two seconds and used to instruct it to repeat racist things.
Yup. Everything negative it said was intentionally triggered by a troll.
Now if one were to suggest everything negative Grok has said was also triggered by a troll named Elon Musk, well…
Jokes aside. They are very different situations and have very different implications for society.
I mean, I get what you are saying, but at the same time this does need to be attempted with every image-generation AI, and reported on if successful. If this capability existed but wasn’t general knowledge, it could cause serious issues.
Better that it’s made public so that the information is in the public consciousness.
Don’t get mad at me because you’ve decided to be angry about something that isn’t happening. How many women do you know who have 10 kids, how many men do you know would want to have 10 kids? That is no one’s fantasy.
If you want to make being the victim your whole identity that’s fine but the rest of the world doesn’t need to hear about it.