South Korea has jailed a man for using AI to create sexual images of children in a first for country's courts

https://sh.itjust.works/post/6220815


(Apologies if I use the wrong terminology here, I’m not an AI expert, just have a fact to share)

The really fucked part is that Google, at least, has scraped a whole lot of CSAM, as well as things like ISIS execution videos, and they have all this stuff stored and use it for things like training AI models. They refuse to delete this material, claiming that they just find the stuff and aren’t responsible for what it is.

Getting an AI image generator to produce CSAM means it knows what to show. So why is the individual in jail and not the tech bros?

That’s a fundamental misunderstanding of how diffusion models work. These models extract concepts and can effortlessly combine them into new images.

If it learns woman + crown = queen

and queen - woman + man = king

it is able to combine any such concepts together.
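The analogy above can be sketched in a few lines. This is a toy illustration with made-up 3-dimensional vectors (real models learn embeddings with hundreds of dimensions from data, so the names and values here are purely hypothetical), but it shows the mechanism: concepts are directions you can add and subtract.

```python
# Toy sketch of "concept arithmetic": hand-made embeddings where
# queen = woman + crown and king = man + crown. Illustrative only.
import numpy as np

emb = {
    "woman": np.array([1.0, 0.0, 0.0]),
    "man":   np.array([0.0, 1.0, 0.0]),
    "crown": np.array([0.0, 0.0, 1.0]),
}
emb["queen"] = emb["woman"] + emb["crown"]
emb["king"]  = emb["man"] + emb["crown"]

# queen - woman + man lands exactly on king: the model can reach a
# combination it was never directly shown, just by composing concepts.
composed = emb["queen"] - emb["woman"] + emb["man"]
print(np.allclose(composed, emb["king"]))  # True
```

The point is that the output concept never has to appear in the training data; it only has to be reachable by combining concepts that do.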

As Stability has noted, any model that contains both the concept of “naked” and the concept of “child” can be used like this. They tried to remove “naked” from Stable Diffusion 2, and nobody used it.

Nobody trained these models on CSAM.

“This can be used by pedophiles” is the same argument used to ban cryptography… I wonder if someone will apply it to generative AI.

Depends how profitable it is.

If it can replace workers, no; if it threatens big players like Disney, yes.