X’s deepfake porn feature clearly violates app store guidelines. Why won’t Apple and Google pull it?

https://lemmy.zip/post/56681237


> Once you’ve traded your principles for proximity to power, do you even run your own company? Archived version: https://archive.is/20260109213037/https://www.theverge.com/policy/859902/apple-google-run-by-cowards

Why won’t X be held liable for distribution of child pornography?
Pedophile in chief probably personally intervened.

Manufacturing consent is the name of the game. The bottom line is money, nobody gives a FUCK.

System of a Down -

4,000 hungry children leave us per hour from starvation, while BILLIONS are spent on BOMBS

CREATING DEATH SHOWERS

You can’t blame a computer for what it does. Only the user who asks for the content is to blame. /s
I think it’s disgusting that X probably doesn’t see a problem with it, but it still wouldn’t be legally classified as CSAM, no?

In some places it already counts as CSAM, and in others legislation to classify it that way is being worked on.

I think the issues are:

  • It can pass as real
  • Unlike run-of-the-mill cartoon porn, real photos of children (even if not CSAM themselves) are involved, either in the training data or as input for the generation
According to this article, it doesn’t actually put out porn or child porn.
I couldn’t even get it to output nudity.
The difference between the two is that, while a browser can be used to access child porn just as X can, X actively generates the porn.