Hallucinations, or confabulations, are the key weakness of a fluently articulate artificial system like GPT-4. This weakness has an analogy in the human brain: in split-brain patients, where the connection between the two hemispheres has been severed, the left hemisphere will confabulate explanations for actions it did not initiate.
If we take this analogy further, GPT-4 lacks an analogous right hemisphere. This is often interpreted as a missing "world model" (see: @ylecun), but that explanation is too reductionist for my taste. https://medium.com/intuitionmachine/the-quaternion-process-model-of-human-cognition-cd1feeb0ab9d
The Quaternion Process Theory of Human Cognition - Intuition Machine - Medium

@ceperez So the right hemisphere here is something like a physical/grounded "map" that learns to do the job of constraining the symbol/representation system?