RE: https://mastodon.social/@verge/116408411824430108
"Despite Apple’s approval and xAI’s claims it has tightened safeguards, Grok still appears to be able to generate sexualized deepfakes with relative ease. Cybersecurity sources told me they have been able to create explicit images of celebrities and political figures using the tool, and I have been able to produce similar images of myself and other consenting adults. NBC also reported similar findings yesterday."
So Apple has let a CSAM generator slide on the App Store for four months.