...and I see no particular reason to promote hypotheses involving those people being negligent rather than otherwise without much more additional information.

It seems that our simulators are at the very least indifferent, if not negligent, with regard to our values: roughly 100 billion people have lived before us, and some have lived truly cruel and tortured lives. If one is concerned about Nonperson Predicates, in which an AI models a sentient copy of you trillions of times over only to kill each copy when it is done, wouldn't one also be concerned about simulations that model universes full of sentient people who suffer and die?

I suppose we can't do much about it anyway, but it's still an interesting thought: if one holds values reflecting either ygert's comments or Nonperson Predicates, and wishes to go on wanting those values, then the people running our simulation are indifferent to them.

Interestingly, this line of thought has shifted my credence ever so slightly towards the second of Nick Bostrom's three possibilities in the simulation argument, that is:

... (2) The fraction of posthuman civilizations that are interested in running ancestor-simulations is very close to zero;...

In this video Bostrom cites ethical concerns as a possible reason why a posthuman civilization would not run ancestor-simulations. These are the same kinds of concerns as those raised by Nonperson Predicates and ygert's comments.

there have been 100 billion people that have lived before us

If we are, in fact, running in a simulation, there's little reason to think this is true.

Open thread, August 12-18, 2013

by David_Gerard · 1 min read · 12th Aug 2013 · 125 comments


If it's worth saying, but not worth its own post (even in Discussion), then it goes here.