Reducing felt experience requires not preemptively dismissing the solutions
Annaka Harris has a new audiobook out which she is promoting. I haven’t listened to it, but based on interviews and spots like the one below, it appears she’s doubling down on the conclusion she reached in her book from a few years ago: that consciousness is fundamental and pervasive.
https://www.youtube.com/watch?v=nP2swgDVl5M
Harris starts off by discussing the profound mystery of consciousness. But she clarifies that she isn’t thinking about higher order thought, like the kind in humans, but something more basic: “felt experience.” She takes this to be something that can exist without thought, and so discusses the possibility of it existing in plants and other organisms that don’t trigger most people’s intuitions of a fellow consciousness.
As I’ve noted in a couple of recent posts, the hard problem of consciousness seems specific to a particular theory of consciousness: fundamental consciousness, the idea that manifest conscious experience is exactly what it seems and nothing else, with no appearance/reality distinction and no hidden complexities. I’m sure Harris, like so many others, will argue that there’s no choice but to accept fundamental consciousness. How else to explain the mystery?
But like David Chalmers and many others, she begins by dismissing a possible solution: “higher order” processing. Without that, felt experience, the feelings of conscious experience, does look simple and irreducible. But that’s only because we’ve chosen to isolate something that didn’t evolve to be isolated, something that has a functional role to play in organisms.
Harris’ discussion of the decisions vines make about where to grow is a good example. In most biological descriptions, this behavior is automatic, done without any volition. She wonders whether it might nonetheless involve felt experience. But she doesn’t seem to wonder whether similar behavior in a Roomba, self-driving car, or thermostat involves similar types of feelings. (Some panpsychists do admit that their view implies experience in these types of systems, but in my experience most resist it.)
Many animal researchers have similar intuitions: that the observable behavioral reactions in relatively simple animals must involve feeling, since similar reactions in us are accompanied by them (at least in healthy, mentally complete humans). Of course, like most panpsychists, they typically resist the implication for machines, often gesturing at some unknown biological ingredient or principle that distinguishes the systems they want to credit with feelings from those they don’t.
My take is that the solution is to reject the theory of fundamental consciousness. What’s the alternative? A reductive theory. But how do we reduce felt experience? Remember, to do a true reduction, the phenomenon must be broken down into components that are not that phenomenon. If anywhere in the description we have to include the overall phenomenon itself, we’ve failed.
Along those lines, I think part of the explanation of what feelings are is that they are composed of automatic reactions that can be either allowed or overridden. So if an animal sees a predator and always automatically reacts by running away, that in and of itself isn’t evidence of fear. On the other hand, if the animal can sometimes override its impulse to run away, maybe because there’s food nearby and it judges the risk to be worth it, then we have an animal capable of feeling fear.
So a feeling is a perception, a prediction, of an impulse which an organism uses in its reasoning to decide whether to inhibit or indulge the impulse. This means the higher order thinking Harris immediately excludes from her consideration is actually part of the answer. That answer, incidentally, also explains why we evolved feelings.
An organism is generally only going to have feelings if they provide a survival advantage, but that advantage only exists if the organism has some reasoning aspect to make use of them. Note that this reasoning aspect doesn’t have to be as sophisticated as what happens in humans, or even mammals or birds necessarily, although the sophistication makes it easier to detect. It just needs to be present in some incipient form, to act as one endpoint in the relationship between reasoning and impulse, the relationship we refer to as a “feeling”.
This requirement for a minimal level of reasoning seems to rule out felt experience in simple animals, plants, robots, and thermostats. It also gives us an idea of what a technological system would need in order to have it: a system of automatic reactions that can optionally be overridden by other parts of the system simulating possible scenarios, even if only a second or two into the future.
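To make the architecture concrete, here’s a toy sketch of the two-layer arrangement described above: a reflex layer that always proposes a reaction, and a crude lookahead layer that can weigh the reaction against a simulated alternative and override it. Every name and number here is a hypothetical illustration of the structure, not a proposed implementation of felt experience.

```python
# Toy sketch: an automatic "flee" impulse plus a simple one-step
# lookahead that can override it. All values are hypothetical.

def automatic_impulse(predator_nearby: bool) -> str:
    """Reflex layer: always proposes fleeing when a predator is seen."""
    return "flee" if predator_nearby else "continue"

def simulate_outcome(action: str, hunger: float, risk: float) -> float:
    """Crude forward simulation: the expected payoff of an action."""
    if action == "flee":
        return -hunger          # safe, but the food opportunity is lost
    return (1.0 - hunger * 0) - 2 * risk + hunger  # eat if lucky, pay if not

def decide(predator_nearby: bool, hunger: float, risk: float) -> str:
    """On this view, the impulse only functions as a 'feeling' because
    this layer can inspect it, simulate alternatives, and override it."""
    impulse = automatic_impulse(predator_nearby)
    if impulse == "flee":
        stay_value = simulate_outcome("continue", hunger, risk)
        flee_value = simulate_outcome("flee", hunger, risk)
        if stay_value > flee_value:
            return "continue"   # override: the food is worth the risk
    return impulse

# A very hungry animal facing a low-risk predator overrides the reflex;
# a well-fed animal facing high risk does not.
print(decide(predator_nearby=True, hunger=0.9, risk=0.1))
print(decide(predator_nearby=True, hunger=0.1, risk=0.9))
```

The point of the sketch is only the relationship between the two layers: a system with just `automatic_impulse` would match the always-fleeing animal from earlier, which on this account wouldn’t count as feeling fear.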
Figuring out how to do this is not trivial. None of the current systems people wonder about are capable of it. But while it’s hard, it doesn’t have the utter intractability of the hard problem of consciousness. Once we dismiss fundamental consciousness, that problem seems to no longer exist.
Unless of course I’m missing something?
#Consciousness #FundamentalConsciousness #panpsychism #Philosophy #PhilosophyOfMind

