A thought experiment in the National Library of Thailand—or why #ChatGPT (or any other language model) isn't actually understanding.
@emilymbender I wanted to ask about something I haven't seen discussed before.
The article posits that all meaning grounds in external reference, and that in language acquisition this grounding must be either direct or indirect (via a prior language).
It seems to me that *some* meaning can be grounded in self-reference: basic arithmetic, for example (see "Contact"), which yields meanings of truth and falsity.
Self-referential meaning could also arise from language-instructional texts, such as those written for children.
@emilymbender For a concrete example, one could derive a concept of the number 3 from observation of a "counted list" pattern:
"Here are three examples of reptiles: snakes, lizards, and turtles."
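To make the "counted list" idea concrete, here's a minimal sketch (hypothetical code, not anyone's actual model) of a learner that pairs the number word in "Here are ___ examples of ..." with the length of the list that follows, grounding number words in nothing but text-internal structure:

```python
import re
from collections import Counter, defaultdict

# Hypothetical sketch: ground number words by pairing the word in
# "Here are <word> examples of X: a, b, c." with the length of the list.
# The pattern and corpus are illustrative assumptions, not real data.
PATTERN = re.compile(r"Here are (\w+) examples of \w+: (.+)\.")

def observe(corpus):
    """Map each number word to the list lengths it co-occurs with."""
    counts = defaultdict(Counter)
    for sentence in corpus:
        m = PATTERN.match(sentence)
        if m:
            word, items = m.groups()
            # Split the enumerated list on commas and "and", drop empties.
            n = len([x for x in re.split(r",| and ", items) if x.strip()])
            counts[word][n] += 1
    return counts

corpus = [
    "Here are three examples of reptiles: snakes, lizards, and turtles.",
    "Here are two examples of stars: the Sun and Sirius.",
    "Here are three examples of primes: 2, 3, and 5.",
]

meanings = observe(corpus)
# After observation, "three" co-occurs only with lists of length 3,
# and "two" only with lists of length 2.
```

Across enough such sentences, "three" would correlate with exactly the lists of length 3, which is at least a candidate for self-referential grounding of the concept.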