A proof against epiphenomenal consciousness

Dec 16, 2016 15:41

For the last couple of years, I have found myself thinking more and more about the nature of consciousness. It's just weird that we don't have any theoretical understanding of the single most evident fact available to us -- that we exist, and that we are experiencing things.

At one point I realized I didn't understand it to such a degree that I couldn't even rule out rocks having some form of awareness! I think that shows pretty clearly just how hard it is to get a solid grip on the subject. (And I'm not even the only one to end up there -- the philosophical term for this is "panpsychism".) I've managed to work my way back from THAT a bit, at least. I still haven't actually convinced myself what consciousness is, or how it even begins to work, but I did make some important progress recently by ruling out one of the common paths people take.

The core problem of consciousness is that it makes absolutely no sense if you look at it from a purely materialist perspective. We have a decent understanding of how cognition works, sure. How neurons and neural nets work, how data can be processed, how an ever-increasing subset of "intelligent" behavior comes about. We have a long way to go, of course, maybe centuries of work remain, but the foundations are pretty solid. In contrast, consciousness? Who the fuck knows. There is absolutely no basis for even beginning to see how a first-person, subjective experience can come out of data processing.

The classic solution is dualism. Consciousness is a metaphysical property, like a soul, which somehow gets stuffed into a body and drives it around. A very popular option, even today, but it has lots of problems. How does the interface work? By definition, it would have to be violating the laws of physics as we understand them, making atoms do things they wouldn't otherwise do. This has never been observed, and obviously there is a lot of resistance to the very idea.

(I'm ignoring subtleties like monism here, but that doesn't change anything at this level.)

One tempting compromise which I found myself coming back to is epiphenomenalism. Basically, what if consciousness existed as some special metaphysical property, but it didn't actually do anything? (AKA, it's causally isolated.) So some quality of neurons or feedback loops or massive computation or whatever causes a little bubble of subjective experience to spin up, conscious of what is going on. We already know that the actual thinking is going on in the brain, and now we have a way to understand how we're experiencing it so vividly. Great!

It really makes sense in a lot of ways. Even ignoring the interface issue, how could consciousness have an output? What would that mean? Imagine you're looking at something that is red. (All philosophy of consciousness papers have to talk about red, it's the law.) So your brain is getting signals of "red" from your eyes. Your consciousness intercepts that, and you experience redness. Does your consciousness then send a new signal saying "I'm experiencing redness" to the rest of your brain? From your cold, unfeeling cognition's point of view, there would be no difference between that and the original signal. Data is data, and it already knows your eyes are receiving red light. Why would it even pay attention to this weird ghost signal? So much simpler to just wall consciousness off and not get into this!

So epiphenomenalism is great, except for one big problem: I'm physically typing sentences that contain ideas about consciousness right now! How can consciousness be epiphenomenal if consciousness is directly affecting the physical world by making me yammer on and on about it? There is only one possible way to get around this that I see: I'm not actually thinking about consciousness. While I am experiencing things, including my thoughts on the subject, those thoughts are not actually about that experience. They're about whatever internal model my cognitive processes keep about themselves, which naturally would look a lot like consciousness to them. (If you're some kind of zombie* lacking consciousness, then the concept of 'consciousness' just means 'data about yourself' to you.) My cognition has just kind of happened to invent an explanation for something it could never know was going on in the first place.

While this is all kinds of twisty, it does nicely explain why consciousness is so hard to think about -- because we're not actually thinking about it! Our cognitive tools have no access to conscious experiences, just raw data, so of course they have trouble explaining them properly.

Cute answer, but no. This is self-contradictory. We started this whole exploration by thinking about the unique spark of consciousness, how weird it is that subjective experiences can emerge from the ruthlessly materialist universe we see around us. Following that line of thought has now led us to the point of deciding that those thoughts were never about consciousness in the first place. Those thoughts had never experienced consciousness, didn't know it actually existed, and could never do so. Which means that, as far as our cognition is concerned, materialism is a perfectly fine answer, since it can provably generate a situation that convincingly feels like experience to a purely materialist cognitive system.

Whew!

Is this the end of my search? No, because it only rules out epiphenomenal explanations. There are many other options available, some of which I think even have testable hypotheses. But it's the first bit of real progress I've made, and I'm pretty happy about that.

* Yes, zombie is the philosophical term of art.