Seriously, the sheer number of people who equate coherent speech with sentience is mind-boggling.

All jokes aside, I have heard some decently educated technical people say “yeah, it’s really creepy that it put a random laugh in what it said” or “it broke the 4th wall when talking”… it’s fucking programmed to do that and you just walked right into it.

And people are programmed to talk like that too. It’s just a matter of scale.

The difference is knowledge. You know what an apple is. An LLM does not. It has training data in which the word apple is associated with the words red, green, pie, and doctor.

The model then uses a random number generator to mix those words up a bit and checks whether the result looks like the training data; if it does, the model spits out a sequence of words that may or may not be a sentence, depending on the size and quality of the training data.
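That “mix the words up with a random number generator” picture can be sketched as weighted sampling from word associations. This is a deliberately toy illustration of the comment’s simplified view, not how a real transformer works; the association counts are made up.

```python
import random

# Hypothetical word-association counts, standing in for "training data".
associations = {
    "apple": {"red": 4, "green": 3, "pie": 2, "doctor": 1},
    "red": {"apple": 2, "pie": 1},
}

def sample_next(word, rng):
    """Pick a next word with probability proportional to its association count."""
    choices = associations.get(word)
    if not choices:
        return None  # no associations seen for this word
    words = list(choices)
    weights = [choices[w] for w in words]
    return rng.choices(words, weights=weights, k=1)[0]

rng = random.Random(0)
sequence = ["apple"]
for _ in range(3):
    nxt = sample_next(sequence[-1], rng)
    if nxt is None:
        break
    sequence.append(nxt)
```

Each step picks the next word by weighted chance rather than by meaning, which is the point the comment is making.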

At no point is any actual meaning associated with any of the words. The model is just trying to fit different shaped blocks through different shaped holes, and sometimes everything goes through the square hole, and you get hallucinations.

Our brains just get signals coming in from our nerves that we learn to associate with the concept of an apple. We have years of such training data, we use more than words to tokenize thoughts, and we have much more sophisticated state / memory; but it’s essentially the same thing, just much, much more complex. Our brains produce output that is consistent with their internal models and constantly use feedback to improve those models.
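The “use feedback to improve the model” loop can be shown with the smallest possible example: a one-parameter internal model nudged toward incoming signals by its own prediction error. The target value and learning rate here are made up purely for illustration.

```python
# Toy feedback loop: predict, measure the error, adjust the internal model.
target = 5.0        # the "true" signal coming in from the senses
estimate = 0.0      # the internal model's current belief
learning_rate = 0.1

for _ in range(100):
    error = target - estimate           # feedback: how wrong was the prediction?
    estimate += learning_rate * error   # adjust the belief toward the signal

# After many feedback cycles the estimate converges on the target.
```

The same predict-compare-adjust cycle, scaled up enormously, is what the comment is gesturing at in both brains and trained models.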

You think you are saying things which prove you are knowledgeable on this topic, but you are not.

The human brain is not a computer. And any comparisons between the two are wildly simplistic and likely to introduce more error than meaning into the discourse.

The human brain is exactly like an organic, highly parallel computer system using convolution, just like AI models. It’s just way more complex. We know how synapses work. We know the form of grey matter. It’s too complex for us to model it all artificially at this point, but there’s nothing indicating it requires a magical function to make it work.
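For readers unfamiliar with the term, a convolution is just a small weighted sum slid along an input. A minimal 1-D sketch, with a made-up smoothing kernel:

```python
def conv1d(signal, kernel):
    """Slide the kernel along the signal, taking a weighted sum at each step."""
    k = len(kernel)
    return [sum(signal[i + j] * kernel[j] for j in range(k))
            for i in range(len(signal) - k + 1)]

# Each output value mixes a neighborhood of inputs, loosely analogous to a
# neuron pooling signals from nearby synapses.
smoothed = conv1d([1, 2, 3, 4, 5], [0.25, 0.5, 0.25])
```

Whether this is a good model of cortex is exactly what the thread is arguing about; the example only shows what the operation itself is.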