Observing a debug/troubleshooting session of a probabilistic system, approached with a deterministic mindset/mental model, is fascinating.
The main question being asked: “For the exact same input text, why is the response NOT the exact same output?”
As a Content Designer, this “conundrum” exploded my existing mental models 7 years ago & introduced the concept + UX challenge of “producing variable content, probabilistically, personalised to the individual, in a specific runtime system configuration & situation.”
Related - They built a child they won’t raise by Abi Awomosu - https://abiawomosu.substack.com/p/they-built-a-child-they-wont-raise
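On the nondeterminism question, a minimal Python sketch may help. It uses a toy hand-written next-token distribution (purely illustrative, not a real model): with greedy decoding (roughly temperature 0) the same prompt always yields the same continuation, while sampled decoding draws from the distribution, so identical prompts can legitimately produce different outputs.

```python
import random

# Toy stand-in for a model's next-token distribution (illustrative only).
next_token_probs = {"blue": 0.6, "grey": 0.3, "overcast": 0.1}

# Greedy decoding (roughly temperature 0): deterministic, always "blue".
greedy = max(next_token_probs, key=next_token_probs.get)

# Sampled decoding: draw from the distribution, so repeated identical
# prompts can produce different continuations across runs.
sampled = random.choices(
    list(next_token_probs), weights=list(next_token_probs.values()), k=5
)

print(greedy, sampled)
```

The user-visible "same input, different output" behaviour is largely this sampling step (plus serving-stack details), not the model forgetting what it said.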
@dahukanna
Interesting thoughts in there, but ultimately I don't buy any of the arguments posed here. The major theme, that an LLM is like a child that must be taught and form relationships, is fundamentally incorrect, because LLMs don't have any long-term memory. The network is trained once, and it keeps an interaction state like short-term memory, but it can't learn the way a human does (at least, not with current architectures).
@dahukanna
There are other flaws too: it regards LLMs as relational, and contrasts that with the alphabet causing serialization of thought. But LLMs don't work holistically; they don't wait to understand an entire sentence. They compose one word at a time, each word following only from the words that precede it, without planning ahead. That's serialization taken to the extreme. It's why they can't tell jokes.
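For readers unfamiliar with the "one word at a time" claim: decoding in current LLMs is autoregressive at the output layer, with each token chosen conditioned on everything emitted so far. A toy Python loop, using a hand-written bigram table as a stand-in for a model (purely illustrative), sketches that shape:

```python
import random

# Hand-made bigram table standing in for a model (illustrative only).
bigram = {
    "<s>": ["the"],
    "the": ["cat", "dog"],
    "cat": ["sat", "slept"],
    "dog": ["barked"],
    "sat": ["</s>"], "slept": ["</s>"], "barked": ["</s>"],
}

tokens = ["<s>"]
while tokens[-1] != "</s>":            # emit one token at a time,
    prev = tokens[-1]                  # conditioned only on what precedes it
    tokens.append(random.choice(bigram[prev]))

print(" ".join(tokens[1:-1]))          # e.g. "the cat sat"
```

Whether the network internally "plans ahead" despite this token-by-token output interface is exactly what the replies below this post dispute.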

@ThreeSigma @dahukanna

On that last bit: not at all. LLMs write hierarchically, simultaneously composing words, sentences and paragraphs. Yes, statistical parrots, but not one word at a time.

As for waiting to parse complete sentences, the prompt is digested together with prior prompts and responses to them, again not one word at a time. There is indeed limited per-session memory, like someone with anterograde amnesia (can't form any new long-term memories after training is complete, only short-term ones).
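A sketch of that "anterograde amnesia" point, assuming a stateless chat API (the `chat_complete` callback here is a hypothetical placeholder, not any real SDK): the weights never change between calls, and the only session "memory" is the conversation history the client re-sends with each turn.

```python
# The model itself learns nothing between calls; the client replays the
# whole conversation, so each prompt is digested together with prior turns.
history = []

def ask(user_msg, chat_complete):
    history.append({"role": "user", "content": user_msg})
    reply = chat_complete(history)          # full history sent every time
    history.append({"role": "assistant", "content": reply})
    return reply

# Stub model call for demonstration (a real API call would go here).
echo = lambda msgs: f"I saw {len(msgs)} message(s) so far."

print(ask("hello", echo))   # the session "remembers" only via `history`
print(ask("again", echo))
```

Drop the `history` list and the "amnesia" is total: every request starts from the frozen post-training state.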

@albertcardona @dahukanna

That wasn’t my understanding wrt output. Citation?