It’s basically impossible to create consciousness when we don’t even fully understand what consciousness is or how it works.
If we don’t understand it, how can we say whether something is conscious or not?

You don’t need a culinary degree to tell whether your cake is burned, or whether it was frosted with feces instead of actual frosting.

We’re nowhere near that being a remotely valid concern.

Sure, because we understand cake, and we can construct one from scratch. We know what makes a cake a cake; we don’t know what makes something conscious.

To be clear, I absolutely believe LLMs do not have consciousness. They are statistical prediction machines.

But then, animals are also just really complex chemical processes. I don’t know what the differentiating factor is.

To be fair to Kent, he’s only the best engineer in the world, not the best philosopher.