@chris_evelyn @mjg59 @greg I think "LLMs are like a compiler" isn't the right analogy. A better way to put it is that "LLMs move median user attention [and by extension, deep understanding] elsewhere in the stack, just like interpreted languages did to compiled languages, and compiled languages did to hand-written assembler".
My point about computers being non-deterministic wasn't that the bytecode typically gets executed out of order (as you say, we fix those bugs when we find them), but that the environment a program executes in very quickly DOES become non-deterministic: message queues, anything involving networking, scheduling, anything involving changing wall clocks, etc. The bytecode still executes one instruction after the next, but will the filesystem still be there? Will 30 minutes of wall-clock time have passed between instructions? (The latter once completely hosed a cloud for me.)
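To make that concrete, here's a minimal Python sketch (my illustration, not anything from the thread): two adjacent statements execute in order, but nothing bounds how much wall-clock time the environment lets pass between them.

```python
import time

# Two adjacent statements: the bytecode runs strictly in order, but the
# environment promises nothing about the world in between them.
t0 = time.monotonic()
# The scheduler may preempt us here for an arbitrarily long stretch:
# VM migration, laptop suspend, a stop-the-world pause, a noisy neighbor.
t1 = time.monotonic()

elapsed = t1 - t0
# Usually microseconds, but there is no guaranteed upper bound. Code that
# assumes "no meaningful time passes between two instructions" is trusting
# the environment, not the program.
print(f"elapsed between adjacent statements: {elapsed:.9f}s")
```

Deterministic instruction order, non-deterministic elapsed time: that's the distinction I'm drawing.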
Anybody claiming LLMs produce the same output every time either has the temperature set to zero (and is still mistaken even then), or is only asking for trivial work products. They're a tool, but they certainly don't replace compilers or (relatively) deterministic code execution.
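For anyone unfamiliar with what temperature does, here's a toy sampler (my own sketch, not any particular model's implementation): temperature 0 collapses to greedy argmax, while any positive temperature samples from a softmax over the logits. Even the T=0 case is only deterministic *given identical logits*, and in practice batching, kernel scheduling, and floating-point reduction order can perturb the logits themselves.

```python
import math
import random

def sample(logits, temperature, rng):
    """Toy token sampler: temperature 0 means greedy argmax."""
    if temperature == 0:
        # Greedy decoding: deterministic for fixed logits, but real
        # serving stacks don't always produce bit-identical logits.
        return max(range(len(logits)), key=lambda i: logits[i])
    # Softmax over temperature-scaled logits (shifted by max for stability).
    scaled = [l / temperature for l in logits]
    m = max(scaled)
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]
    return rng.choices(range(len(logits)), weights=probs)[0]

logits = [2.0, 1.0, 0.5]
rng = random.Random(0)
greedy = sample(logits, 0, rng)          # always the top logit's index
warm = {sample(logits, 1.5, rng) for _ in range(50)}
print(greedy, warm)
```

With a positive temperature, repeated calls yield different tokens; the "same output every time" claim only survives for trivial prompts where one continuation dominates.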