I think there are two arguments going on here, though
Most people arguing point 1 would be willing to concede point 2, especially since you linked evidence of it.
I don't believe you're fully arguing in good faith here.
I'm assuming you've seen a naked adult, and if you had never seen a naked young person, I don't believe for one second you would be unable to infer what a naked young person might look like. You might not know for certain, but your best guess would likely be very accurate.
Generative AI can absolutely make those same inferences, so it does not need inappropriate training material to generate such images.
The AI knows what a young person looks like.
It knows what a clothed adult looks like.
It knows what an unclothed adult looks like.
An AI trained on 100% legal material could make that inappropriate inference without even trying.
Now, have all the popular AI models actually been trained on 100% legal material? I have no way of knowing that, but you're incorrect to assume that the mere fact a model can output inappropriate images proves that such material was included in its training data.