"What's the worst that can happen?" goes the optimistic saying. It's probably a bad question to ask anyone with a creative imagination. Let's consider the problem on an individual level: it's not really the worst that can happen, but would nonetheless be fairly bad, if you were horribly tortured for a number of years. This is one of the worse things that can realistically happen to one person in today's world.
What's the least bad, bad thing that can happen? Well, suppose a dust speck floated into your eye and irritated it just a little, for a fraction of a second, barely enough to make you notice before you blink and wipe away the dust speck.
For our next ingredient, we need a large number. Let's use 3^^^3, written in Knuth's up-arrow notation:
- 3^3 = 27.
- 3^^3 = (3^(3^3)) = 3^27 = 7625597484987.
- 3^^^3 = (3^^(3^^3)) = 3^^7625597484987 = (3^(3^(3^(... 7625597484987 times ...)))).
3^^^3 is an exponential tower of 3s which is 7,625,597,484,987 layers tall. You start with 1; raise 3 to the power of 1 to get 3; raise 3 to the power of 3 to get 27; raise 3 to the power of 27 to get 7625597484987; raise 3 to the power of 7625597484987 to get a number much larger than the number of atoms in the universe, but which could still be written down in base 10, on 100 square kilometers of paper; then raise 3 to that power; and continue until you've exponentiated 7625597484987 times. That's 3^^^3. It's the smallest simple inconceivably huge number I know.
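The up-arrow recursion above is easy to sketch in code. Here is a minimal Python version (the function name `up_arrow` is my own, not from the post); Python's arbitrary-precision integers handle 3^^3 exactly, while 3^^^3 is of course hopelessly out of reach for any computer:

```python
def up_arrow(a: int, n: int, b: int) -> int:
    """Compute a ↑^n b in Knuth's up-arrow notation.

    One arrow (n=1) is ordinary exponentiation; each additional arrow
    iterates the operation below it b times, starting from 1.
    """
    if n == 1:
        return a ** b
    result = 1
    for _ in range(b):
        result = up_arrow(a, n - 1, result)
    return result

print(up_arrow(3, 1, 3))  # 27
print(up_arrow(3, 2, 3))  # 7625597484987
# up_arrow(3, 3, 3) is 3^^^3: a tower of 3s about 7.6 trillion layers
# tall. Don't call it -- it could never finish or fit in memory.
```

Checking the small cases against the list above: one arrow gives 27, two arrows give 7,625,597,484,987, matching 3^3 and 3^^3.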
Now here's the moral dilemma. If neither event is going to happen to you personally, but you still had to choose one or the other:
Would you prefer that one person be horribly tortured for fifty years without hope or rest, or that 3^^^3 people get dust specks in their eyes?
I think the answer is obvious. How about you?
All else equal, do you disagree with: "A googolplex people dust specked x times during their lifetime without further ill effect is worse than one person dust specked x*2 times during their lifetime without further ill effect" for the range concerned?
I agree with that. My point is that agreeing that "A googolplex people being dust specked every second of their life without further ill effect is worse than one person being horribly tortured for the shortest period experienceable" doesn't oblige me to agree that "A few billion* googolplexes of people being dust specked once without further ill effect is worse than one person being horribly tortured for the shortest period experienceable". (Unless "further ill effect" is meant to exclude not only car accidents but superlinear personal emotional effects, but that would be stupid.)
* 1 billion seconds = 31.7 years
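The footnote's conversion checks out; as a quick sanity check (using the Julian year of 365.25 days):

```python
# Verify the footnote: one billion seconds expressed in years.
seconds_per_year = 365.25 * 24 * 3600  # ~31,557,600 seconds (Julian year)
years = 1_000_000_000 / seconds_per_year
print(round(years, 1))  # 31.7
```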
I think that what we're dealing with here is more like the irrationality of trying to impose and rationalize comfortable moral absolutes in defiance of expected utility.
Since real problems never possess the degree of certainty that this dilemma does, holding certain heuristics as absolutes may be the utility-maximizing thing to do. In a realistic version of this problem, you would have to consider the results of empowering whatever agent is doing this to torture people with supposedly good but nonverifiable results. If it's a human or group of humans, not such a good idea; if it's a Friendly AI, maybe you can trust it, but couldn't it figure out a better way to achieve the result? (There is a Pascal's Mugging problem here.)
One more thing for TORTURErs to think about: if every one of those 3^^^3 people is willing to individually suffer a dust speck in order to prevent someone from suffering torture, is TORTURE still the right answer? I lean towards SPECK on considering this, although I'm less sure about the case of torturing 3^^^3 people for a minute each vs. 1 person for 50 years.