The "hard" problem of consciousness is the least interesting problem of consciousness

by Hazard · 4 min read · 8th Jun 2020 · 46 comments


Cross-posted from my roam-blog. Nothing new to anyone who's read about these ideas, meant to be a reference for how I think about things.

If you come up with what you think is a decent and useful theory (or partial theory) of consciousness, you will be visited by a spooky daemon who will laugh and inform you that you merely dabble in the "easy problem of consciousness" and have gotten nowhere near "The Hard Problem of Consciousness".

(the aforementioned daemon in his penultimate form: Mr. Bean undercover at a rock concert)

Chalmers put the hard problem like this:

What makes the hard problem hard and almost unique is that it goes beyond problems about the performance of functions. To see this, note that even when we have explained the performance of all the cognitive and behavioral functions in the vicinity of experience—perceptual discrimination, categorization, internal access, verbal report—there may still remain a further unanswered question: Why is the performance of these functions accompanied by experience? A simple explanation of the functions leaves this question open.

The idea goes that even if you explained how the brain does everything it does, you haven't explained why all this doing is accompanied by subjective experience. It's rooted in the idea that subjective experience is somehow completely isolated from, and separate from, behavior.

This supposed isolation is already a little suspicious to me. As Scott Aaronson points out in Why Philosophers Should Care About Computational Complexity, people judge each other to be conscious and capable of subjective experience after very short interactions. From a very short interaction with my desk, I've concluded it doesn't have subjective experience. After a few years of interacting with my dog, I'm still on the fence about whether it has subjective experience.

So when I hear a claim that "subjective experience" and "qualia" are divorced from any and all behavior or functionality in the mind, I'm left with a sense that Chalmers is talking about something very different from my subjective experience, and what it seems like to be me. My subjective experience seems deeply integrated with my behavior and functioning.

Colors of Experience

To explore this more, let's briefly look at a classic puzzler:

What if everyone saw different colors? What if, when we looked at the sky and the ocean, we both used the English word "Blue", but you experienced what I experience when I look at what we both agree is called "Green"? How would you even tell if this was the case?

To talk about color, I'm first going to talk about Enums.

Enums for Illumination

Often when you code you assign "values" to "variables", where the "value" is the actual content that the computer works with, and the "variable" is an English word that you use to talk about the value.

health = 12
WHEN health LESS-THAN OR EQUAL-TO 0: PLAYER DIES

In this pseudo-code example, you can see it matters what value the health variable has, because there's code that will do different things based on different values.
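As a concrete sketch of the same idea in Python (the function name and return strings are mine, not from the pseudo-code above):

```python
# The concrete value of `health` matters: the code branches on it.
def apply_damage(health, damage):
    health -= damage
    if health <= 0:
        return "PLAYER DIES"
    return f"health is now {health}"

print(apply_damage(12, 5))   # health is now 7
print(apply_damage(12, 15))  # PLAYER DIES
```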

Most languages also have a thing called "Enums". Enums are like variables where you don't care what the value is. You just want a set of English words to be able to differentiate between different things. Another pseudo-code example:

Colors = Enum{RED, GREEN, BLUE, YELLOW, ORANGE, VIOLET}

WHEN LIGHTWAVE IN RANGE(380nm,450nm): SEE Colors.VIOLET
WHEN LIGHTWAVE IN RANGE(590nm,625nm): SEE Colors.ORANGE

WHEN SEE Colors.ORANGE: FEEL HAPPY
WHEN SEE Colors.VIOLET: FEEL COMPASSION

Under the hood, an actual number will be substituted for each Enum. So WHEN SEE Colors.ORANGE: FEEL HAPPY will become WHEN SEE 15: FEEL HAPPY. The computer still needs some value associated with Colors.ORANGE so that it can check if other values are equal to it. It's just that you don't care what the value is.

I bring up Enums because they are a very concrete example of a system where an entity only has meaning based on its relationship to other things. Under the hood Colors.ORANGE might be assigned 2837, but that number doesn't capture the meaning of Colors.ORANGE. The meaning is encapsulated by its relationship to other colors, what visible lightwaves get associated with it, and what the rest of the code decides to do when it sees Colors.ORANGE as opposed to Colors.VIOLET.
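Python's built-in enum module makes this concrete. Here's a minimal sketch of the pseudo-code above (the color-to-feeling mapping is from the pseudo-code; the Python names and the NEUTRAL fallback are mine):

```python
from enum import Enum, auto

class Colors(Enum):
    RED = auto()
    GREEN = auto()
    BLUE = auto()
    YELLOW = auto()
    ORANGE = auto()
    VIOLET = auto()

def feeling(color):
    # The code only ever compares members to each other; the underlying
    # number that auto() assigned never appears anywhere.
    if color is Colors.ORANGE:
        return "HAPPY"
    if color is Colors.VIOLET:
        return "COMPASSION"
    return "NEUTRAL"

print(feeling(Colors.ORANGE))  # HAPPY
print(Colors.ORANGE.value)     # whatever integer auto() picked; nothing depends on it
```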

Color Swap

Now let's use Enums as a model to think about the color problem. There are some ways that we could "see different colors" that would easily be discovered, and others that might not. Check this image: read the tables as "when I see light in the wavelength of the right column, I have the experience I associate with the color in the left column." "Normal" is how we all see colors, and #1 and #2 are proposed ways that someone else could "see colors differently".

If your vision worked like normal and mine worked like #2, it would be EASY to tell. My vision would have a fundamentally different structure. If you saw a violet in the grass, I'd be insisting that the violet and the grass were the same color. Because the "shape" of my experience is fundamentally different from yours, we wouldn't even be able to agree on a consistent naming schema.

If your vision worked like normal and mine worked like #1, we could totally end up agreeing on names for different experiences. Since to both of us, light with wavelength in the range 380-450 nm feels like a cohesive thing, we could both end up calling that feeling "red", even if the subjective experiences were "different".

Thinking about this, I'm imagining that colors work the same way as my Enums example. The "code" in our heads would treat the same sort of light in the same sorta way. We'd have similar regularities in our experience, because all of the "code" only uses Colors.ORANGE as a way to differentiate from other colors. In this metaphor, the concrete value of Colors.ORANGE (say, 44003) would be your qualia/subjective-experience. If everyone has more or less the same code for their color enums, everyone could have different concrete values associated with the same Enum, but it wouldn't affect anything that actually matters about color.
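This "same code, different concrete values" scenario can be sketched directly. A toy Python example, using the wavelength ranges from the pseudo-code above and made-up concrete values (44003 and 2837 are just illustrative numbers):

```python
from enum import Enum

class YourColors(Enum):
    VIOLET = 1
    ORANGE = 2

class MyColors(Enum):
    # Different concrete values: the stand-in for differing qualia.
    VIOLET = 44003
    ORANGE = 2837

def see(colors, wavelength_nm):
    """Map a wavelength to a member of whichever agent's color enum is passed in."""
    if 380 <= wavelength_nm <= 450:
        return colors.VIOLET
    if 590 <= wavelength_nm <= 625:
        return colors.ORANGE
    return None

# Both agents agree on every name...
assert see(YourColors, 400).name == see(MyColors, 400).name == "VIOLET"
# ...even though the values under the hood differ:
assert see(YourColors, 400).value != see(MyColors, 400).value
print("behavior identical, underlying values different")
```

Everything observable (which wavelengths get grouped together, what the groups are called) is identical; only the hidden number differs.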

It does seem to be a thing that different cultures can have quite different naming schemes for color experiences, but I've never seen evidence that the fundamental "shape" of experience is different.

Well, that isn't quite true. As djmoortex pointed out in a comment, the experience of a color is deeply related to all of the associations you have with a color.

There's a store I pass on my way to work that I dislike because the red and blue LED display in its window fools me into thinking there's a pack of police cars there.

So it seems like there's a part of people's experience of color (the fact that similar wavelengths feel similar, and far away wavelengths feel more distinct) that I expect is fairly uniform, and at the same time there's another important aspect of the experience of color (what associations you have to different colors, what they make you think and feel) which can vary wildly.

Outro

The more I think about qualia, the more I feel like the only meaning I can find in any of my subjective experiences is in how they relate to everything else in my head. It's the patterns of what sorts of things out in the world lead to what sorts of subjective experience (different colors correspond to how hot something will get in the sun). It's the particular associations I have with different experiences (blue makes me think of a cool breeze). It's that certain subjective experiences imply others (I can normally tell if something will feel rough based on whether it looks rough).

I'll admit, when I think about color this way, dwelling on how the meaning of "red" is deeply intertwined with the patterns of things I see as "red", I still feel like there is this "extra" thing I'm experiencing. The "redness" that is separate from all those connections and patterns. So I can see why one might still want to ponder and question this "mysterious redness".

Now, I do agree that there is in fact a "problem" that is the hard problem of consciousness (this comment explicates the ontological issues involved). I don't think this post has "dissolved the question". It's just that I personally can't find any way to relate this isolated "qualia of redness" to anything else I care about. All of the meaning I find in my own experience is stuff that I can relate to behavior and function, the "easy" problems of consciousness. Given that, I think I want to rename Chalmers' categories as the "Boring" (hard) and "Interesting" (easy) problems of consciousness.
