@zero_gravitas @mttaggart
Yes. Is it that different?
As has been pointed out, LLMs operate using linear algebra, matrix math, to make predictions for the "next token". That is all. Any appearance of bullshitting, bloviation, lying, confabulation, hallucination, etc. is just us anthropomorphizing math. I am guilty of it because it is much easier to talk about these systems when you do it. I'm also guilty of anthropomorphizing my car when it won't start and I beg it to please start this one time, I'll get the good gas next time. Neither system has any understanding of what is being asked of it; one of them just happens to have been trained on enough human language that it can spit out something that makes it look like it does.
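To make that concrete, here is a toy sketch of what "matrix math predicting the next token" means at the final step of a model. Everything here (the shapes, the random weights, the names `hidden` and `W_out`) is illustrative and not taken from any real LLM; real models do this with billions of learned parameters, but the shape of the operation is the same: multiply, softmax, pick a token.

```python
import numpy as np

# Illustrative only: random stand-ins for a model's learned values.
rng = np.random.default_rng(0)

vocab_size = 10   # pretend vocabulary of 10 tokens
hidden_dim = 4    # pretend hidden-state size

# A final hidden state (what the earlier layers produced) and an
# output projection matrix, both just random numbers for the sketch.
hidden = rng.normal(size=hidden_dim)
W_out = rng.normal(size=(hidden_dim, vocab_size))

# One matrix multiply gives a score (logit) per vocabulary token...
logits = hidden @ W_out

# ...and a softmax turns those scores into a probability distribution.
probs = np.exp(logits - logits.max())
probs /= probs.sum()

# "Predicting the next token" is just selecting from that distribution.
next_token = int(np.argmax(probs))
print("next token id:", next_token)
```

There is no understanding anywhere in that pipeline, just arithmetic over arrays; the fluency comes from how the weights were fit to human text, not from any model of meaning.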
"Is language the same as intelligence? The AI industry desperately needs it to be."
I had a good article that summarized historical metaphors for how the human mind works. They tend to crop up around whatever the hottest new technology is: "mind as steam engine" was popular for a while, and we've all heard about keeping the humors in balance. I can't find the article now.
There are entire books written about the idea. Our minds are not simply in our brains. Our minds are our entire body, every cell and nerve impulse, every sense and all the information that is processed back and forth. All of that is our "mind". A machine cannot create that or simulate that. LLMs aren't even built to do that. They are built to produce something that looks like our language as if we are having a conversation.