Terminally, meaning in and of itself, as opposed to instrumentally, meaning as a means to other ends.


4 Answers

10 days of connected experience vs X days of disconnected experience? Honestly, I can't compound experiences/values very much in 10 days, so the amnesia doesn't cost that much - somewhere between 11 and 20 days seems reasonable to me.

I know people with severe memory problems, and they enjoy life at a significant fraction (at least 10%, perhaps 80%, some days over 100%) of what they might if they remembered yesterday.

This question gets much harder for 2, 10, or 50 years. The amount of joy/satisfaction/impact that can be had in those timeframes by building on previous days is perhaps orders of magnitude higher if an agent has continuity than if not.
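Whether the gap really reaches orders of magnitude depends on how you model "building on previous days". Here is one toy sketch of that intuition (my own illustration, not part of the answer above; the linear build-up model and the rate k are arbitrary assumptions):

```python
# Toy model (illustrative only): a disconnected (amnesiac) day is worth a flat
# baseline of 1.0, while a connected day is worth a little more the more prior
# days it builds on. The build-up rate k is an arbitrary assumption.

def connected_value(days: int, k: float = 0.01) -> float:
    """Total value of `days` consecutive remembered days, where day t is
    worth 1 + k*t because it builds on the days before it."""
    return sum(1 + k * t for t in range(days))

def disconnected_value(days: int) -> float:
    """Total value of `days` amnesiac days, each worth the flat baseline."""
    return float(days)

for label, d in [("10 days", 10), ("2 years", 730), ("10 years", 3650), ("50 years", 18250)]:
    ratio = connected_value(d) / disconnected_value(d)
    print(f"{label:>8}: continuity multiplies total value by roughly {ratio:.0f}x")
```

Under these arbitrary parameters the gap is negligible over 10 days and grows to roughly two orders of magnitude over 50 years, which is the shape of the claim above; different assumptions give different numbers.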

I think 10,000 or so, assuming the days are meaningfully different.

I'm not sure whether X keeps getting more valuable-to-me as it becomes larger than 1.

7 comments

I encourage you to answer the question before reading this comment.

.

.

.

.

.

I think answering this question has an interesting implication:

  • either you say a small number, in which case (at least under some theories of personal identity) it means you would place a lot of value on short-lived copies of yourself
  • or you say a large number, in which case it means you place almost no value on the moments you completely forget, presumably including at least some dreams and episodes of drug intoxication (a small arithmetic sketch follows this list)
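For concreteness, here is the implied trade-off as arithmetic (my own illustrative sketch, not part of the comment; it assumes value scales linearly with days, which the comment does not claim):

```python
# Illustrative arithmetic (assumes value is linear in days): answering X means
# one completely forgotten day is worth 10/X of a remembered day.
for x in [11, 20, 10_000]:  # example answers drawn from this thread
    print(f"X = {x:>6}: one forgotten day is worth about {10 / x:.4f} remembered days")
```

So the answers above of "between 11 and 20" and "10,000 or so" price a forgotten day at roughly 0.5-0.9 and 0.001 remembered days respectively.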

Indeed. I place almost no value on the moments I completely forget (and which leave no other residue in the present).

There are TONS of moments I forget, but they _do_ leave residue. Either in income, effect on other people, or environmental improvements (the lightbulb I changed continues to work). Not sure if this scenario removes or carries forward unconscious changes in habits or mental pathways, but for real memory loss, victims tend to retain some amount of such changes, even if they don't consciously remember doing so.

I also value human joy in the abstract. Whether some other person, or some un-remembered version of me experiences it, there is value.

If you give a very very large value, do you also believe that all mortal lives are very-low-value, as they won't have any memory once they die?

> If you give a very very large value, do you also believe that all mortal lives are very-low-value, as they won't have any memory once they die?

They are of no value to them, because they're dead. They may be of great value to others.

I recognize that time-value-of-utility is unsolved, and generally ignored for this kind of question. But I'm not sure I follow the reasoning that current-you must value future experiences based on what farther-future-you values.

Specifically, why would you require a very large X? Shouldn't you value both possibilities at 0, because you're dead either way?

> Specifically, why would you require a very large X? Shouldn't you value both possibilities at 0, because you're dead either way?

No, because I'm alive now, and will be until I'm dead. Until then, I have the preferences and values that I have.

> Either in income, effect on other people, or environmental improvements

Those are instrumental. They are important to consider, but for the purpose of this post I'm mostly interested in fundamental values.

> Not sure if this scenario removes or carries forward unconscious changes in habits or mental pathways

It does, for the purpose of making the thought experiment clear-cut. But yeah, that's something I wonder about in practice as well.

> If you give a very very large value, do you also believe that all mortal lives are very-low-value, as they won't have any memory once they die?

Two (mutually exclusive) moral theories I find plausible are:

  • (All else equal) someone's life is as valuable as its longest instantiation (all shorter instantiations are irrelevant)
  • Finite lives are valueless