Calling code generated with a large language model 'clean room' is just silly. These are trained on everything, including the code you're claiming isn't in the room.
@marijn “clean” as in “laundered”
@marijn Also very important, as a last resort, is the human-vs-LLM comparison: "the human also has to eat, so evaporating the ocean is fine"; "the human also had to be trained, so big tech selling you your own labour is fine"