LLMs model outputs; they don't replicate a process. As a result, the output looks the same, but it isn't made of the same stuff.

It is a plastic banana.

There is nothing inherently wrong with a plastic banana, but as soon as you claim you can use it to solve world hunger, people are going to be upset, no matter how right it looks.

@Vrimj In that case, what about chain-of-thought prompting? It's true that LLMs only model an input/output distribution, but they can mimic as many convincing intermediate artifacts along the way as necessary.
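For anyone unfamiliar with the term: chain-of-thought prompting just means showing the model a worked example with explicit intermediate reasoning before asking the new question, so it produces those step-by-step "artifacts" in its output. A minimal sketch of how such a prompt might be assembled (no model is actually called, and `build_cot_prompt` is a hypothetical helper, not any library's API):

```python
def build_cot_prompt(question: str) -> str:
    """Assemble a few-shot chain-of-thought prompt for an arithmetic question."""
    # One worked example with explicit intermediate steps, in the style
    # popularized by chain-of-thought prompting papers.
    worked_example = (
        "Q: Roger has 5 tennis balls. He buys 2 cans of 3 balls each. "
        "How many balls does he have now?\n"
        "A: Roger started with 5 balls. 2 cans of 3 balls is 6 balls. "
        "5 + 6 = 11. The answer is 11.\n"
    )
    # The trailing cue nudges the model to emit its own reasoning steps.
    return worked_example + f"Q: {question}\nA: Let's think step by step."

prompt = build_cot_prompt("A baker makes 4 trays of 6 muffins. How many muffins?")
print(prompt)
```

Whether those emitted steps reflect anything like the computation that produced the answer is exactly the plastic-banana question.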