I prefer dust specks because I insist on counting people one at a time. I think it's obvious that any single person presented with the opportunity to save someone else from fifty years of torture by experiencing a dust speck in the eye ought to do so. As for any of those 3^^^3 people who would not voluntarily do so, I don't have enough sympathy for such individuals to step in on their behalf and spare them the dust speck.

I haven't yet worked out a good way to draw the line in the escalation scenario, since I suspect that "whatever level of discomfort I, personally, wouldn't voluntarily experience to save some random person from 50 years of torture" is unlikely to be the right answer.

By the same argument (i.e. refusing to multiply), wouldn't it also be better to torture 100 people for 49 years than to torture one person for 50 years?

cousin_it:

> I think it's obvious that any single person presented with the opportunity to save someone else from fifty years of torture by experiencing a dust speck in the eye ought to do so.

Whoa. How'd they end up responsible for the choice you make? You can't have it both ways; that's not how it works.
Vladimir_Nesov:

This is defection, a suboptimal strategy. Each person in isolation prefers to defect in the Prisoner's dilemma [http://wiki.lesswrong.com/wiki/Prisoner%27s_dilemma]. And this is a preference for fuzzies [http://wiki.lesswrong.com/wiki/Fuzzies] over utility, an inability to shut up and multiply [http://wiki.lesswrong.com/wiki/Shut_up_and_multiply].

Revisiting torture vs. dust specks

by [anonymous] · 1 min read · 8th Jul 2009 · 66 comments


In line with my fine tradition of beating old horses, in this post I'll try to summarize some arguments that people proposed in the ancient puzzle of Torture vs. Dust Specks and add some of my own. Not intended as an endorsement of either side. (I do have a preferred side, but don't know exactly why.)

  • The people saying one dust speck is "zero disutility" or "incommensurable utilities" are being naive. Just pick the smallest amount of suffering that, in your opinion, is non-zero or commensurable with the torture, and restart the argument from there.
  • Escalation argument: go from dust specks to torture in small steps, slightly increasing the suffering and massively decreasing the number of people at each step. If each individual change increases utility, so does the final result.
  • Fluctuation argument: the probability that the universe randomly subjects you to the torture scenario is considerably higher than 1/3^^^3 anyway, so choose torture without worries even if you're in the affected set. (This doesn't assume the least convenient possible world, so it fails.)
  • Proximity argument: don't ask me to value strangers equally to friends and relatives. If each additional person matters 1% less than the previous one, then even an infinite number of people getting dust specks in their eyes adds up to a finite and not especially large amount of suffering. (This assumption negates the escalation argument once you do the math.)
  • Real-world analogy: we don't decide to pay one penny each to collectively save one starving African child, so choose torture. (This is resolved by the proximity argument.)
  • Observer splitting: if you split into 3^^^3 people tomorrow, would you prefer all of you to get dust specks, or one of you to be tortured for 50 years? (This neutralizes the proximity argument, but the escalation argument also becomes non-obvious.)
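The arithmetic behind the proximity argument is a geometric series: with a per-person discount factor of 0.99, the total disutility of giving everyone a dust speck converges to d / (1 − 0.99) = 100d, no matter how many people are affected. A minimal sketch (the discount factor and unit speck disutility are my illustrative assumptions, not numbers from the post):

```python
def total_speck_disutility(d: float = 1.0, discount: float = 0.99,
                           people: int = 10**5) -> float:
    """Partial sum of discounted dust-speck disutility over `people` persons.

    Person n counts for d * discount**n, so each additional person
    matters 1% less than the previous one.
    """
    return sum(d * discount**n for n in range(people))

def limit(d: float = 1.0, discount: float = 0.99) -> float:
    """Closed-form limit as the number of people goes to infinity:
    the geometric series sums to d / (1 - discount)."""
    return d / (1 - discount)
```

Even with 3^^^3 people, the discounted total never exceeds 100 speck-units, which is why the escalation argument stops going through once you do this math: at some step the shrinking population's discounted weight falls faster than the per-person suffering grows.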

Oh what a tangle. I guess Eliezer is too altruistic to give up torture no matter what we throw at him; others will adopt excuses to choose specks; still others will stay gut-convinced but logically puzzled, like me. The right answer, or the right theory to guide you to the answer, no longer seems so inevitable and mathematically certain.

Edit: I submitted this post to LW by mistake, then deleted it which turned out to be the real mistake. Seeing the folks merrily discussing away in the comments long after the deletion, I tried to undelete the post somehow, but nothing worked. All right; let this be a sekrit area. A shame, really, because I just thought of a scenario that might have given even Eliezer cause for self-doubt:

  • Observer splitting with a twist: instead of you, one of your loved ones will be split into 3^^^3 people tomorrow. Torture a single branch for 50 years, or give every branch a dust speck?