# Feeling Moral


Suppose that a disease, or a monster, or a war, or something, is killing people. And suppose you only have enough resources to implement one of the following two options:

1. Save 400 lives, with certainty.
2. Save 500 lives, with 90% probability; save no lives, 10% probability.

Most people choose option 1. Which, I think, is foolish; because if you multiply 500 lives by 90% probability, you get an expected value of 450 lives, which exceeds the 400-life value of option 1. (Lives saved don't diminish in marginal utility, so this is an appropriate calculation.)
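The expected-value arithmetic above can be checked in a few lines (a minimal sketch; the probabilities and payoffs are the ones given in the two options):

```python
# Option 1: save 400 lives with certainty.
certain_save = 400

# Option 2: 90% chance to save 500 lives, 10% chance to save none.
p_success = 0.9
gamble_save = 500

# Expected lives saved under the gamble: 0.9 * 500 + 0.1 * 0 = 450.
expected_gamble = p_success * gamble_save

print(expected_gamble)                 # 450.0
print(expected_gamble > certain_save)  # True: the gamble wins on expectation
```

Because lives saved don't diminish in marginal utility, comparing these two numbers directly is the appropriate test.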

"What!" you cry, incensed. "How can you gamble with human lives? How can you think about numbers when so much is at stake? What if that 10% probability strikes, and everyone dies? So much for your damned logic! You're following your rationality off a cliff!"

Ah, but here's the interesting thing. If you present the options this way:

1. 100 people die, with certainty.
2. 90% chance no one dies; 10% chance 500 people die.

Then a majority choose option 2. Even though it's the same gamble. You see, just as a certainty of saving 400 lives seems to feel so much more comfortable than an unsure gain, so too, a certain loss feels worse than an uncertain one.
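That the two framings really are the same gamble can be verified with a little arithmetic (a minimal sketch, using the 500 people at risk from the options above):

```python
total_at_risk = 500

# Framing A (lives saved): 400 certain, vs. 90% chance of saving all 500.
expected_saved_certain = 400
expected_saved_gamble = 0.9 * 500   # 450

# Framing B (deaths): 100 certain deaths, vs. 10% chance of 500 deaths.
expected_deaths_certain = total_at_risk - expected_saved_certain  # 100
expected_deaths_gamble = 0.1 * 500                                # 50

# Same gamble: expected deaths are exactly total_at_risk minus expected saves.
print(expected_deaths_gamble == total_at_risk - expected_saved_gamble)  # True
```

The numbers only get relabeled between the two presentations; nothing about the underlying decision changes.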

You can grandstand on the second description too: "How can you condemn 100 people to certain death when there's such a good chance you can save them? We'll all share the risk! Even if it was only a 75% chance of saving everyone, it would still be worth it---so long as there's a chance---everyone makes it, or no one does!"

You know what? This isn't about your feelings. A human life, with all its joys and all its pains, adding up over the course of decades, is worth far more than your brain's feelings of comfort or discomfort with a plan. Does computing the expected utility feel too cold-blooded for your taste? Well, that feeling isn't even a feather in the scales, when a life is at stake. Just shut up and multiply.

A googol is 10^100---a 1 followed by one hundred zeroes. A googolplex is an even more incomprehensibly large number---it's 10^googol, a 1 followed by a googol zeroes. Now pick some trivial inconvenience, like a hiccup, and some decidedly untrivial misfortune, like getting slowly torn limb from limb by sadistic mutant sharks. If we're forced into a choice between either preventing a googolplex people's hiccups, or preventing a single person's shark attack, which choice should we make? If you assign any negative value to hiccups, then, on pain of decision-theoretic incoherence, there must be some number of hiccups that would add up to rival the negative value of a shark attack. For any particular finite evil, there must...