Calling code generated with a large language model a 'clean room' implementation is just silly. These models are trained on everything, including the very code you're claiming isn't in the room.
@marijn “clean” as in “laundered”