We don’t know enough about human consciousness to know that for sure. Plenty of animals have fewer brain cells than humans, but we don’t know enough about their consciousness to say whether they have an internal experience.

That’s what I mean. It’s hubris to assume it’s fine to culture human brain cells in a petri dish just because there’s a lack of evidence one way or the other about whether they’re aware.

There’s a lack of evidence for anything not being conscious.

Neurons work by generating electrical signals in response to stimuli (either electrical inputs from other neurons, or physical/sensory inputs triggered by light, touch, etc.), and they do this in a purely physical way.

If they’re conscious, then there’s a pretty good chance that power plants are conscious, computers are conscious and pretty much everything else in the world is conscious.

I’m not sure there’s any requirement for consciousness to include “human-like reasoning” or “understanding” for it to have some kind of experience and perspective. Humans make a lot of assumptions about the world to make it fit the patterns we’re used to.

A cluster of neurons trained to play doom might have consciousness but it’s not likely to think like a human, just like a rock or a plant or an ant or an iPhone might have consciousness.

Whether it’s ethical to squash an ant or turn off an iPhone or stimulate a lab-grown neuron depends on your ethical framework and your philosophical worldview.

I think there’s a lower limit of complexity for sentience, based on memory-persistence, self-firing, and self-recognition. I think there’s no need for moral concern for non-sentient things. (But, that’s just my ethical framework and philosophical worldview; the only “evidence” I’m at all aware of is thin and vague.)

But, as far as having a subjective experience, I think that might go quite small and alien, including fungi and plants or even certain sub-cellular structures. Probably anything that maintains a border and internal homeostasis, including parts of the bodies of larger experiencers, could be having an internal perspective – and any human words applied to those experiences would tell you more about human bias than about the experience.

In my view, although I am neither a neurologist nor a philosopher, moral concern should absolutely scale with neuron-blob complexity, and it should do so in a non-linear way. I dislike harming an animal with a complex brain, like mammals, cephalopods, etc., much more than I dislike harming the equivalent nerve mass in insects, for instance.

That’s also the way I feel, but I think that’s probably human bias, closely related to the evolutionary pressure behind my mirror neurons: how strongly they trigger correlates with how much an outside phenotype resembles a sentient being like me.

I think if I knew what it felt like (if anything) to be an ant colony, I might have different views around the casual use of boric acid (and related compounds) to keep them out of human spaces.