@simon_brooke Heh. Having spent the better part of a decade working on the Semantic Web (Metaweb's Freebase) and building what is quite literally the biggest and most comprehensive encoding of semantic knowledge in the world (Google's Knowledge Graph), I'm pretty confident I have some idea of what "semantic" means.
But sure, let's debate the semantics of "semantic" — as a LISPer, I'm sure you'll find that fun as well.
If I'm understanding your argument, you're saying that since LLMs lack any referent to the real world, they don't have true semantics, at least in the classical philosophical sense of the term. They can encode knowledge, but they can't "know" what it means, because they don't experience the world and so can't map those terms to real-world objects.
If that's the case, sure ... no argument from me. I don't believe LLMs are conscious entities. They have no lived experience, and so by definition can't map their embeddings to the "real world". They are simply statistical encodings of human knowledge.