I was reading about the effectiveness of bicycle helmet laws (here) and wondered how worthwhile it is to save your life at the expense of some aspect key to your current identity (Note that the paper linked doesn't say that this is the situation; this was just a tangential thought).

Let's say that I perform some activity that carries a 10% chance that I will die but otherwise carries no risk of injury. There is some piece of safety gear that I can wear that cuts that risk in half, but for some reason adds a 10% chance that I will be permanently brain damaged such that I will not be "me" as I understand it now. Should I rate this as 15% fatal with the safety gear or is there some other way that this should be evaluated?
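For concreteness, here's my rough bookkeeping (a sketch only, not from the linked paper; it lumps identity loss in with death, which is exactly the assumption I'm unsure about):

```python
# Rough bookkeeping for the thought experiment (a sketch, not from the linked paper).
p_death_bare = 0.10       # without the gear: 10% chance of death, no other risk

p_death_gear = 0.05       # the gear halves the fatality risk...
p_damage_gear = 0.10      # ...but adds a 10% chance of identity-erasing brain damage

# If I treat "not me anymore" the same as death and the two outcomes can't co-occur:
p_lose_me_disjoint = p_death_gear + p_damage_gear                 # 0.15

# If instead the two risks are independent events, the overlap is counted only once:
p_lose_me_indep = 1 - (1 - p_death_gear) * (1 - p_damage_gear)    # 0.145

print(p_death_bare, p_lose_me_disjoint, p_lose_me_indep)
```

Either way, the "15% fatal" framing only makes sense if losing my identity really is exactly as bad as dying, which is the part I don't know how to evaluate.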

[anonymous]:

If that were a proposed safety measure, I'd probably want statistics on the types of brain damage before using it. As proposed, the risk isn't actually well defined: I feel like I can change the effective risk of brain damage just by changing my personal definition of "me."

As an example, with a very specific definition of the parts of my identity, I could have a new dislike of fast food be on the list of possible brain damages. I'm different (my approval of cheap and easy food causes me to eat it several times a week, so that would be a large change), but that's just not that bad.

And with a very broad definition of the parts of my identity, the only possible brain damages might be horribly serious ones, such as "Either your relationship with your wife sours horribly, or you go on a killing rampage, or some other vast personality change that alienates everyone around you."

And, with bizarre irony, I might use the safety gear to HELP me: if I identify myself only with my worst traits, say as a horribly depressed mess, then the brain damage has a significant chance of making me no longer identify as that.

Now clearly, this doesn't feel like the intended use of the safety gear analogy. But I don't feel like I HAVE a solid current identity anyway: my identity feels far too malleable based on circumstances. So I suppose the best thing for me to do is to read statistics about what kinds of personality changes are expected.

But if for some reason Omega just comes up and says "You're going to be mining on Alpha Centauri now, whether you want to or not, but I've set up safety gear which protects your life at the cost of your identity as per pinyaka's example above, and you don't have time to gather statistics. Do you want to use the safety gear?"

In that case, I'd pick no. If my sense of identity is that flexible, anything that I didn't identify as me would probably have a (from my perspective) near-monstrous sense of ethics, and I would consider its release about as bad as my dying.

Of course, that in itself is indicative of something about my personality. Biologically me with advanced ethics that are foreign to me now? Still me. Me with monstrous ethics? Definitely not me.

I'd think there's a simpler test: at what odds would I risk myself to save someone else? It's a nice clean demarcation between valuing "me" and valuing "a life". If I'll run into a burning building where I have a 50% chance of dying and a 50% chance of escaping alive with the one trapped person, then clearly I only value "me" because I'm a life. If I wait until I've got a 95% chance of coming out alive, then clearly I value "me" vastly more than I value a life.

By using an actual-other-person, we have a very clear demarcation of what is, and is not, "me" :)

I strongly suspect that most people who risk their lives do so precisely because it preserves their identity. There may also be an EDT aspect here, where I value {me who would rush into a burning building to save another's life} more than {me who would not}. So if you have your identity invested at all in being a good person in that kind of way, I don't think this thought experiment will be isomorphic to the one in which you're under dangerous surgery.

There's also the matter that me-changed-enough-to-be-a-different-person is a new person, at least to the extent of the change, while someone trapped in a burning building already exists. Most people (I think for good reasons even on simple act utilitarianism, but that's another matter) value preserving life over creating new life.

I'd weight by the "percent-me" of the change. Me-but-without-LW-rationality is maybe 90-95% me. Me-without-any-interest-in-critical-thinking-at-all is like 75% me. Some form of silliness and self-expression is about 35% of my personality-makeup. I think I value general inclinations more than specific knowledge/ideas, since those change more over time.

So a 10% chance of me surviving-sans-silliness-and-art-inclination is basically like a 6.5% chance of surviving.
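Spelled out, the weighting I have in mind is just multiplicative (a sketch, assuming "percent-me" scales the value of survival linearly):

```python
# Sketch of the "percent-me" weighting (assumed to be a simple multiplier).
p_survive = 0.10                    # raw chance of coming out alive
fraction_me_lost = 0.35             # silliness/self-expression share of my identity
fraction_me_left = 1 - fraction_me_lost

effective_survival = p_survive * fraction_me_left
print(effective_survival)           # 0.065 -> "basically a 6.5% chance of surviving"
```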

Will it change me enough that I will no longer want to be a professional philanthropist? If so, it only counts as saving one life, and that's negligible in comparison to saving my identity.

Should I rate this as 15% fatal with the safety gear or is there some other way that this should be evaluated?

What value does your utility function place on having no experiences, versus having the experiences a brain-damaged person has? This definitely depends on the type and severity of brain damage.

That's the only real question - utilitarian decision making takes care of the rest.
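Spelled out, the comparison might look something like this (a sketch; the utility numbers are placeholders that pinyaka would have to fill in, which is the hard part):

```python
# Hypothetical utilities on an arbitrary scale; only their relative values matter.
u_fine = 1.0       # survive intact
u_dead = 0.0       # no experiences
u_damaged = 0.3    # placeholder: value of the brain-damaged person's experiences

# Without the gear: 10% death, 90% fine.
eu_no_gear = 0.10 * u_dead + 0.90 * u_fine

# With the gear: 5% death, 10% identity-erasing damage, 85% fine
# (treating the two risks as mutually exclusive, as in the original post).
eu_gear = 0.05 * u_dead + 0.10 * u_damaged + 0.85 * u_fine

print(eu_no_gear, eu_gear)   # wear the gear iff eu_gear > eu_no_gear
```

With these placeholder numbers you'd skip the gear; the break-even point is u_damaged = 0.5, i.e. the gear is worth it exactly when the brain-damaged life is worth more than half of an intact one. That's the question I was pointing at.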

That's the only real question - utilitarian decision making takes care of the rest.

This doesn't seem like a very charitable response, akin to replying "just do what you want". He's trying to figure out what he wants, and is asking for help in figuring this out.

And ThrustVectoring asked a pertinent question to help him figure it out.