"What's the worst that can happen?" goes the optimistic saying.  It's probably a bad question to ask anyone with a creative imagination.  Let's consider the problem on an individual level: it's not really the worst that can happen, but would nonetheless be fairly bad, if you were horribly tortured for a number of years.  This is one of the worse things that can realistically happen to one person in today's world.

What's the least bad, bad thing that can happen?  Well, suppose a dust speck floated into your eye and irritated it just a little, for a fraction of a second, barely enough to make you notice before you blink and wipe away the dust speck.

For our next ingredient, we need a large number.  Let's use 3^^^3, written in Knuth's up-arrow notation:

  • 3^3 = 27.
  • 3^^3 = (3^(3^3)) = 3^27 = 7625597484987.
  • 3^^^3 = (3^^(3^^3)) = 3^^7625597484987 = (3^(3^(3^(... 7625597484987 times ...)))).

3^^^3 is an exponential tower of 3s which is 7,625,597,484,987 layers tall.  You start with 1; raise 3 to the power of 1 to get 3; raise 3 to the power of 3 to get 27; raise 3 to the power of 27 to get 7625597484987; raise 3 to the power of 7625597484987 to get a number much larger than the number of atoms in the universe, but which could still be written down in base 10, on 100 square kilometers of paper; then raise 3 to that power; and continue until you've exponentiated 7625597484987 times.  That's 3^^^3.  It's the smallest simple inconceivably huge number I know.
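As a concrete illustration of the recursion behind the notation, here is a minimal sketch in Python (the function and its name are purely illustrative; only the first two values below are small enough to actually compute):

```python
def up_arrow(a, n, b):
    """Knuth's up-arrow a ^...^ b with n arrows: n=1 is exponentiation (a^b),
    n=2 is tetration (a^^b), n=3 is pentation (a^^^b)."""
    if n == 1:
        return a ** b
    if b == 0:
        return 1
    # Recursive definition: a (n arrows) b = a (n-1 arrows) (a (n arrows) (b-1))
    return up_arrow(a, n - 1, up_arrow(a, n, b - 1))

print(up_arrow(3, 1, 3))   # 3^3  = 27
print(up_arrow(3, 2, 3))   # 3^^3 = 3^27 = 7625597484987
# up_arrow(3, 3, 3) would be 3^^^3: a tower of 7,625,597,484,987 threes.
# Don't call it -- the result cannot be computed or stored in this universe.
```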

Now here's the moral dilemma.  If neither event is going to happen to you personally, but you still had to choose one or the other:

Would you prefer that one person be horribly tortured for fifty years without hope or rest, or that 3^^^3 people get dust specks in their eyes?

I think the answer is obvious.  How about you?

Torture vs. Dust Specks

Does this analysis focus on pure, monotone utility, or does it include the huge ripple effect putting dust specks into so many people's eyes would have? Are these people with normal lives, or created specifically for this one experience?

1lockeandkeynes
I think you can be allowed to imagine that any ripple effect caused by someone getting a barely-noticeable dust speck in their eyes (perhaps it makes someone mad enough to beat his dog) would be about the same as that of the torture (perhaps the torturers go home and beat their dogs because they're so desensitized to torturing).
4VAuroch
The ripple effect is real, but as in Pascal's Wager, for every possible situation where the timing is critical and something bad will happen if you are distracted for a moment, there's a counterbalancing situation where the timing is critical and something bad will happen unless you are distracted for a moment, so those probably balance out into noise.
1DragonGod
I doubt this.
1VAuroch
Why?
[-]g440

The answer that's obvious to me is that my mental moral machinery -- both the bit that says "specks of dust in the eye can't outweigh torture, no matter how many there are" and the bit that says "however small the badness of a thing, enough repetition of it can make it arbitrarily awful" or "maximize expected sum of utilities" -- wasn't designed for questions with numbers like 3^^^3 in. In view of which, I profoundly mistrust any answer I might happen to find "obvious" to the question itself.

Isn't this just appeal to humility? If not, what makes this different?

5MathMage
It is not humility to note that extrapolating models unimaginably far beyond their normal operating ranges is a fraught business. Just because we can apply a certain utility approximation to our monkeysphere, or even a few orders of magnitude above our monkeysphere, doesn't mean the limiting behavior matches our approximation.
0adamisom
In other words, your meta-cogitation is: (1) do I trust my very certain intuition? or (2) do I trust the heuristic from formal/mathematical thinking (which I see as useful partly and specifically to compensate for inaccuracies in our intuition)?

Since there was a post on this blog a few days ago about how what seems obvious to the speaker might not be obvious to the listener, I thought I would point out that it was NOT AT ALL obvious to me which should be preferred: torturing one man for 50 years, or a speck of dust in the eyes of 3^^^3 people. Can you please clarify/update what the point of the post was?

The dust speck is described as "barely enough to make you notice", so however many people it would happen to, it seems better than even something a lot less bad than 50 years of horrible torture. There are so many irritating things that a human barely notices in his/her life; what's an extra dust speck?

I think I'd trade the dust specks for even a kick in the groin.

But hey, maybe I'm missing something here...

-1Insert_Idionym_Here
If 3^^^3 people get dust in their eye, an extraordinary number of people will die. I'm not thinking even 1% of those affected will die, but perhaps 0.000000000000001% might, if that. But when dealing with numbers this huge, I think the death toll would measure greater than 7 billion. Knowing this, I would take the torture.

If 3^^^3 people get dust in their eye, an extraordinary number of people will die.

The premise assumes it's "barely enough to make you notice", which was supposed to rule out any other unpleasant side-effects.

-1Insert_Idionym_Here
No, I'm pretty sure it makes you notice. It's "enough". "barely enough", but still "enough". However, that doesn't seem to be what's really important. If I consider you to be correct in your interpretation of the dilemma, in that there are no other side effects, then yes, the 3^^^3 people getting dust in their eyes is a much better choice.
2dlthomas
The thought experiment is, 3^^^3 bad events, each just so bad that you notice their badness. Considering consequences of the particular bad thing means that in fact there are other things as well that are depending on your choice, and that's a different thought experiment.
-4Insert_Idionym_Here
That is in no way what was said. Also, the idea of an event that somehow manages to have no effect aside from being bad is... insanely contrived. More contrived than the dilemma itself. However, let's say that instead of 3^^^3 people getting dust in their eye, 3^^^3 people experience a single nano-second of despair, which is immediately erased from their memory to prevent any psychological damage. If I had a choice between that and torturing a person for 50 years, then I would probably choose the former.
2dlthomas
The notion of 3^^^3 events of any sort is far more contrived than the elimination of knock-on effects of an event. There isn't enough matter in the universe to make that many dust specks, let alone the eyes to be hit and nervous systems to experience it. Of course it's contrived. It's a thought experiment. I don't assert that the original formulation makes it entirely clear; my point is to keep the focus on the actual relevant bit of the experiment - if you wander, you're answering a less interesting question.
1Insert_Idionym_Here
I don't agree. The existence of 3^^^3 people, or 3^^^3 dust specks, is impossible because there isn't enough matter, as you said. The existence of an event that has only effects that are tailored to fit a particular person's idea of 'bad' does not fit my model of how causality works. That seems like a worse infraction, to me. However, all of that is irrelevant, because I answered the more "interesting question" in the comment you quoted. To be blunt, why are we still talking about this?
0dlthomas
I'm not sure I agree, but "which impossible thing is more impossible" does seem an odd thing to be arguing about, so I'll not go into the reasons unless someone asks for them. I meant a more generalized you, in my last sentence. You in particular did indeed answer the more interesting question.
1dlthomas
Can you explain a bit about your moral or decision theory that would lead you to conclude that?
2Insert_Idionym_Here
Yes. I believe that because any suffering caused by the 3^^^3 dust specks is spread across 3^^^3 people, it is of lesser evil than torturing a man for 50 years. Assuming there to be no side effects to the dust specks.
-1Nornagest
That's not general enough to mean very much: it fits a number of deontological moral theories and a few utilitarian ones (what the right answer within virtue ethics is is far too dependent on assumptions to mean much), and seems to fit a number of others if you don't look too closely. Its validity depends greatly on which you've picked. As best I can tell the most common utilitarian objection to TvDS is to deny that Specks are individually of moral significance, which seems to me to miss the point rather badly. Another is to treat various kinds of disutility as incommensurate with each other, which is at least consistent with the spirit of the argument but leads to some rather weird consequences around the edge cases.
-1Insert_Idionym_Here
No-one asked for a general explanation. The best term I have found, the one that seems to describe the way I evaluate situations the most accurately, is consequentialism. However, that may still be inaccurate. I don't have a fully reliable way to determine what consequentialism entails; all I have is Wikipedia, at the moment. I tend to just use cost-benefit analysis. I also have a mental, and quite arbitrary, scale of what things I do and don't value, and to what degree, to avoid situations where I am presented with multiple, equally beneficial choices. I also have a few heuristics. One of them essentially says that given a choice between a loss that is spread out amongst many, and an equal loss divided amongst the few, the former is the more moral choice. Does that help?
0Nornagest
It helps me understand your reasoning, yes. But if you aren't arguing within a fairly consistent utilitarian framework, there's not much point in trying to convince others that the intuitive option is correct in a dilemma designed to illustrate counterintuitive consequences of utilitarianism. So far it sounds like you're telling us that Specks is intuitively more reasonable than Torture, because the losses are so small and so widely distributed. Well, yes, it is. That's the point.
0Insert_Idionym_Here
At what point is utilitarianism not completely arbitrary?
0Nornagest
I'm not a moral realist. At some point it is completely arbitrary. The meta-ethics here are way outside the scope of this discussion; suffice it to say that I find it attractive as a first approximation of ethical behavior anyway, because it's a simple way of satisfying some basic axioms without going completely off the rails in situations that don't require Knuth up-arrow notation to describe. But that's all a sideline: if the choice of moral theory is arbitrary, then arguing about the consequences of one you don't actually hold makes less sense than it otherwise would, not more.
0Insert_Idionym_Here
I believe I suggested earlier that I don't know what moral theory I hold, because I am not sure of the terminology. So I may, in fact, be a utilitarian, and not know it, because I have not the vocabulary to say so. I asked "At what point is utilitarianism not completely arbitrary?" because I wanted to know more about utilitarianism. That's all.
0Nornagest
Ah. Well, informally, if you're interested in pissing the fewest people off, which as best I can tell is the main point where moral abstractions intersect with physical reality, then it makes sense to evaluate the moral value of actions you're considering according to the degree to which they piss people off. That loosely corresponds to preference utilitarianism: specifically negative preference utilitarianism, but extending it to the general version isn't too tricky. I'm not a perfect preference utilitarian either (people are rather bad at knowing what they want; I think there are situations where what they actually want trumps their stated preference; but correspondence with stated preference is itself a preference and I'm not sure exactly where the inflection points lie), but that ought to suffice as an outline of motivations.
0Insert_Idionym_Here
Thank you.
0dlthomas
That's not quite what I meant by "explain" - I had understood that to be your position, and was trying to get insight into your reasoning. Drawing an analogy to mathematics, would you say that this is an axiom, or a theorem? If an axiom, it clearly must be produced by a schema of some sort (as you clearly don't have 3^^^3 incompressible rules in your head). Can you explore somewhat the nature of that schema? If a theorem, what sort of axioms, and how arranged, produce it?
0TimS
When I participated in this debate, this post convinced me that a utilitarian must believe that dust specks cause more overall suffering (or whatever badness measure you prefer). Since I already wasn't a utilitarian, this didn't bother me.
2dlthomas
As a utilitarian (in broad strokes), I agree, and this doesn't bother me because this example is so far out of the range of what is possible that I don't object to saying, "yes, somewhere out there torture might be a better choice." I don't have to worry about that changing what the answer is around these parts.

Anon, I deliberately didn't say what I thought, because I guessed that other people would think a different answer was "obvious". I didn't want to prejudice the responses.

0[anonymous]
So what do you think?
6dxu
He gives his answer here.
-1[anonymous]
Thank you!
0coldlyrationalogic
Exactly. If Eliezer had gone out and said what he thought, nothing good would have come of it. The point is to make you think.

Even when applying the cold cruel calculus of moral utilitarianism, I think that most people acknowledge that egalitarianism in a society has value in itself, and assign it positive utility. Would you rather be born into a country where 9 out of 10 people are destitute (<$1,000/yr) and the last is very wealthy ($100,000/yr)? Or be born into a country where almost all people subsist on a modest amount ($6,000-8,000/yr)?

Any system that allocates benefits (say, wealth) more fairly might be preferable to one that allocates more wealth in a more unequal fashion. And, the same goes for negative benefits. The dust specks may result in more total misery, but there is utility in distributing that misery equally.

0DanielLC
I don't believe egalitarianism has value in itself. Tell me, would you rather get all your wealth continuously throughout the year, or get a disproportionate amount on Christmas? If wealth is evenly distributed, it will lead to more total happiness, but I don't see any advantage in happiness being evenly distributed. I don't see how your comment relates to this post.
1byrnema
Perhaps it could be framed in terms of the utility of psychological comfort. Suppose that one person is tortured to avoid 3^^^3 people getting dust specks. Won't almost every one of those 3^^^3 people empathize with the tortured person enough to feel a pang of discomfort more uncomfortable than a dust speck?
8jimrandomh
Only if they find out that the tortured person exists, which would be an event that's not in the problem statement.
2Mestroyer
Well, there's valuing money at more utility per dollar when you have less money and less utility per dollar when you have more money, which makes perfect sense. But that's not the same as egalitarianism as part of utility.
-2JasonCoston
Third-to-last sentence sets up a false dichotomy between "more fairly" and "more unequal."
[-]Kat20

The dust specks seem like the "obvious" answer to me, but I couldn't easily say how large the tiny harm must be to cross the line where the unthinkably huge number of them outweighs a single tremendous one, since clearly I don't think simply calculating the total amount of harm caused is the right measure.

It seems obvious to me to choose the dust specks, because that would mean that the human species would have to exist for an awfully long time for the total number of people to equal that number, and that minimum amount of annoyance would be something they were used to anyway.

I too see the dust specks as obvious, but for the simpler reason that I reject utilitarian sorts of comparisons like that. Torture is wicked, period. If one must go further, it seems like the suffering from torture is qualitatively worse than the suffering from any number of dust specks.

-2[anonymous]
I think you have misunderstood the point of the thought experiment. Eliezer could have imagined that the intense and prolonged suffering experienced by the victim was not intentionally caused, but was instead the result of natural causes. The "torture is wicked" reply cannot be used to resist the decision to bring about this scenario. (There may, of course, be other reasons for objecting to that decision.)

Anon prime: dollars are not utility. Economic egalitarianism is instrumentally desirable. We don't normally favor all types of equality, as Robin frequently points out.

Kyle: cute

Eliezer: My impulse is to choose the torture, even when I imagine very bad kinds of torture and very small annoyances (I think that one can go smaller than a dust mote, possibly something like a letter on the spine of a book that your eye sweeps over being in a shade less well selected a font). Then, however, I think of how much longer the torture could last and still not outweigh the trivial annoyances if I am to take the utilitarian perspective and my mind breaks. Condoning 50 years of torture, or even a day's worth, is pretty much the same as condoning universes of agonium lasting for eons in the face of numbers like these, and I don't think that I can condone that for any amount of a trivial benefit.

7Eliezer Yudkowsky
(This was my favorite reply, BTW.)

I admire the restraint involved in waiting nearly five years before selecting a favorite.

1Friendly-HI
Well, too bad he didn't wait a year longer then ;). I think preferring torture is the wrong answer for the same reason that I think universal health-care is a good idea. The financial cost of serious illness and injury is distributed over the taxpaying population so no single individual has to deal with a spike in medical costs ruining their life. And I think it's still the correct moral choice regardless of whether universal health-care happens to be more expensive or not.

Analogously, I think the exact same applies to dust vs torture. I don't think the correct moral choice is about minimizing the total area under the pain-curve at all; it's about avoiding severe pain-spikes for any given individual, even at the cost of having a larger area under the curve. I don't think "shut up and multiply" applies here in its simplistic conception the way it might apply in the scenario where you have to choose whether 400 people live for sure or 500 people live with .9 probability (and die with .1 probability).

Irrespective of the former, however, the thought experiment is a bit problematic because it's more complex than apparent at first, if we really take it seriously. Eliezer said the dust-specks are "barely noticed", but being conscious or aware of something isn't an either-or thing; awareness falls on a continuum, so whatever "pain" the dust-specks cause has to be multiplied by how aware the person really is. If someone is tortured, that person is presumably very aware of the physical and emotional pain. Other possible consequences like lasting damage or social repercussions not counting, I don't really care all that much about any kind of pain that happens to me while I'm not aware of it. I could probably figure out whether or not pain is actually registered in my brain during my upcoming operation under anesthesia, but the fact that I won't bother tells me very clearly that awareness of pain is an important weight we have
1Jiro
If you're going to say that, you'll need some threshold, and pain over the threshold makes the whole society count as worse than pain under the threshold. This will mean that any number of people with pain X is better than one person with pain X + epsilon, where epsilon is very small but happens to push it over the threshold. Alternately, you could say that the disutility of pain gradually changes, but that has other problems. I suggest you read up on the repugnant conclusion ( http://plato.stanford.edu/entries/repugnant-conclusion/ )--depending on exactly what you mean, what you suggest is similar to the proposed solutions, which don't really work.
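To see that discontinuity concretely, here is a minimal sketch of a made-up lexical-threshold aggregator; the threshold value, pain scale, and population size are arbitrary assumptions chosen only for illustration:

```python
THRESHOLD = 0.5  # arbitrary cutoff: pains above it count as a worse "kind" of pain

def lexical_badness(pain, count):
    """Badness of `count` people each suffering `pain`, compared lexically:
    any amount of above-threshold pain dominates any amount of below-threshold pain."""
    if pain > THRESHOLD:
        return (pain * count, 0.0)
    return (0.0, pain * count)  # Python tuple comparison gives the lexical ordering

eps = 1e-6
# One person barely over the threshold vs. a billion people barely under it:
print(lexical_badness(THRESHOLD + eps, 1) > lexical_badness(THRESHOLD, 10**9))  # True
```

Under this rule, one person at pain 0.500001 counts as worse than a billion people at pain 0.5, which is the implausible jump the comment describes.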

Personally, I choose C: torture 3^^^3 people for 3^^^3 years. Why? Because I can.

Ahem. My morality is based on maximizing average welfare, while also avoiding extreme individual suffering, rather than cumulative welfare.

So torturing one man for fifty years is not preferable to annoying any number of people.

This is different when the many are also suffering extremely, though - then it may be worthwhile to torture one even more to save the rest.

Trivial annoyances and torture cannot be compared in this quantifiable manner. Torture is not only suffering, but lost opportunity due to imprisonment, permanent mental hardship, activation of pain and suffering processes in the mind, and a myriad of other unconsidered things.

And even if the torture was 'to have flecks of dust dropped in your eyes', you still can't compare a 'torturous amount' applied to one person to a substantial number dropped in the eyes of many people: we aren't talking about CPU cycles here - we are trying to quantify qualifiables.

If ... (read more)

5DanielLC
Can you compare apples and oranges? You certainly don't seem to have much trouble when you decide how to spend your money at the grocery store.

It was rather clear from the context that the "dust in the eye" was a very, very minor torture. People are not going blind. They are perfectly capable of dealing with it. It's just not 3^^^3 times as minor as the torture.

If you were to torture two people in exactly the same way, they'd suffer about equally. Why do you imply that's some sort of unanswerable question?

If you weren't talking about the ethical side, what were you talking about? He wasn't trying to compare everything about the two choices, just which was more ethical. It would be impossible if he didn't limit it like that.
0snewmark
I'm pretty sure the question itself revolves around ethics; as far as I can tell, the question is: given these 2 choices, which would you consider, ethically speaking, the ideal option?

I think this all revolves around one question: Is "disutility of dust speck for N people" = N*"disutility of dust speck for one person"?

This, of course, depends on the properties of one's utility function.

How about this... Consider one person getting, say, ten dust specks per second for an hour vs. 10×60×60 = 36,000 people getting a single dust speck each.

This is probably a better way to probe the issue at its core. Which of those situations is preferable? I would probably consider the second. However, I suspect one person getting a billion dust specks in their eye per second for an hour would be preferable to 1000 people getting a million per second for an hour.

Suffering isn't linear in dust specks. Well, actually, I'm not sure subjective states in general can be viewed in a linear way. At least, if there is a potentially valid "linear qualia theory", I'd be surprised.
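One way to make "suffering isn't linear in dust specks" concrete is a sketch with an assumed saturating per-person disutility curve. The functional form and constants below are illustrative assumptions only, chosen so that the resulting orderings match the two preferences just described; nothing here is anyone's actual utility function:

```python
def person_disutility(specks):
    """Assumed per-person disutility: roughly quadratic for small counts
    (a single speck is nearly negligible) but saturating at 1.0, since a
    dust-filled eye can only get so bad. The constant is arbitrary."""
    c = 1_000.0
    return specks**2 / (specks**2 + c**2)

def total_disutility(specks_per_person, people):
    return person_disutility(specks_per_person) * people

# Ten specks/second for an hour on one person vs. one speck each for 36,000 people:
print(total_disutility(10 * 60 * 60, 1))    # ~1.0   -> concentrating is worse here
print(total_disutility(1, 36_000))          # ~0.036 -> spreading is far milder

# A billion specks/second on one person vs. a million/second each on 1,000 people:
print(total_disutility(1e9 * 3600, 1))      # ~1.0    -> one saturated sufferer
print(total_disutility(1e6 * 3600, 1000))   # ~1000.0 -> a thousand saturated sufferers
```

With a linear per-person curve, every one of these comparisons would track total speck count instead.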

But as far as the dust specks vs torture thing in the original question? I think I'd go with dust specks for all.

But that's one person vs buncha people with dustspecks.

Oh, just had a thought. A less extreme yet quite related real world situation/question would be this: What is appropriate punishment for spammers?

Yes, I understand there're a few additional issues here, that would make it more analogous to, say, if the potential torturee was planning on deliberately causing all those people a DSE (Dust Speck Event)

But still, the spammer issue gives us a more concrete version, involving quantities that don't make our brains explode, so considering that may help work out the principles by which these sorts of questions can be dealt with.

The problem with spammers isn't that they cause a singular dust speck event: it's that they cause multiple dust speck events repeatedly to individuals in the population in question. It's also a 'tragedy of the commons' question, since there is more than one spammer.

To respond to your question: What is appropriate punishment for spammers? I am sad to conclude that until Aubrey DeGray manages to conquer human mortality, or the singularity occurs, there is no suitable punishment for spammers.

After either of those, however, I would propose unblocking everyone's toilets and/or triple shifts as a Fry's Electronics floor lackey until the universal heat death, unless you have even >less< interesting suggestions.

If you could take all the pain and discomfort you will ever feel in your life, and compress it into a 12-hour interval, so you really feel ALL of it right then, and then after the 12 hours are up you have no ill effects - would you do it? I certainly would. In fact, I would probably make the trade even if it were 2 or 3 times longer-lasting and of the same intensity. But something doesn't make sense now... am I saying I would gladly double or triple the pain I feel over my whole life?

The upshot is that there are some very nonlinear phenomena involved with calculating amounts of suffering, as Psy-Kosh and others have pointed out. You may indeed move along one coordinate in "suffering-space" by 3^^^3 units, but it isn't just absolute magnitude that's relevant. That is, you cannot recapitulate the "effect" of fifty years of torturing with isolated dust specks. As the responses here make clear, we do not simply map magnitudes in suffering space to moral relevance, but instead we consider the actual locations and contours. (Compare: you decide to go for a 10-mile hike. But your enjoyment of the hike depends more on where you go, than the distance traveled.)

8JoeSchmoe
"If you could take all the pain and discomfort you will ever feel in your life, and compress it into a 12-hour interval, so you really feel ALL of it right then, and then after the 12 hours are up you have no ill effects - would you do it? I certainly would."" Hubris. You don't know, can't know, how that pain would/could be instrumental in processing external stimuli in ways that enable you to make better decisions. "The sort of pain that builds character, as they say". The concept of processing 'pain' in all its forms is rooted very deep in humanity -- get rid of it entirely (as opposed to modulating it as we currently do), and you run a strong risk of throwing the baby out with the bathwater, especially if you then have an assurance that your life will have no pain going forward. There's a strong argument to be made for deference to traditional human experience in the face of the unknown.

Yes the answer is obvious. The answer is that this question obviously does not yet have meaning. It's like an ink blot. Any meaning a person might think it has is completely inside his own mind. Is the inkblot a bunny? Is the inkblot a Grateful Dead concert? The right answer is not merely unknown, because there is no possible right answer.

A serious person-- one who take moral dilemmas seriously, anyway-- must learn more before proceeding.

The question is an inkblot because too many crucial variables have been left unspecified. For instance, in order for thi... (read more)

The non-linear nature of 'qualia' and the difficulty of assigning a utility function to such things as 'minor annoyance' has been noted before. It seems to some unsolvable. One solution presented by Dennett in 'Consciousness Explained' is to suggest that there is no such thing as qualia or subjective experience. There are only objective facts. As Searle calls it, 'consciousness denied'. With this approach it would (at least theoretically) be possible to objectively determine the answer to this question based on something like the number of ergs needed to... (read more)

Uh... If there's no such thing as qualia, there's no such thing as actual suffering, unless I misunderstand your description of Dennett's views.

But if my understanding is correct, and those views were correct, then wouldn't the answer be "nobody actually exists to care one way or another?" (Or am I sorely mistaken in interpreting that view?)

Regarding your example of income disparity: I might rather be born into a system with very unequal incomes, if, as in America (in my personal and biased opinion), there is a reasonable chance of upping my income through persistence and pluck. I mean hey, that guy with all that money has to spend it somewhere-- perhaps he'll shop at my superstore!

But wait, what does wealth mean? In the case where everyone has the same income, where are they spending their money? Are they all buying the same things? Is this a totalitarian state? An economy without disparity ... (read more)

If even one in a hundred billion of the people is driving and has an accident because of the dust speck and gets killed, that's a tremendous number of deaths. If one in a hundred quadrillion of them survives the accident but is mangled and spends the next 50 years in pain, that's also a tremendous amount of torture.

If one in a hundred decillion of them is working in a nuclear power plant and the dust speck makes him have a nuclear accident....

We just aren't designed to think in terms of 3^^^3. It's too big. We don't habitually think much about one-in-a-million chances, much less one in a hundred decillion. But a hundred decillion is a very small number compared to 3^^^3.
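To put rough numbers on this line of argument, here is a sketch of the expected-death arithmetic, working in log10 and using only the fourth rung of the tower (3^3^3^3) as a stand-in, since 3^^^3 itself cannot be represented; the probability is the "one in a hundred decillion" figure from the comment:

```python
import math

# Even the fourth rung of the tower, 3^(3^(3^3)) = 3^(3^27), is too big to hold
# as an integer, so work with its base-10 logarithm. 3^^^3 is a tower
# 7,625,597,484,987 threes tall and dwarfs this stand-in beyond description.
log10_tower4 = 3**27 * math.log10(3)        # about 3.64e12, i.e. ~3.6 trillion digits

p_fatal = 1e-35                             # "one in a hundred decillion"
log10_expected_deaths = log10_tower4 + math.log10(p_fatal)

print(log10_tower4)            # ~3.64e12
print(log10_expected_deaths)   # ~3.64e12 - 35: still a number with trillions of digits
```

Subtracting 35 from a trillion-digit exponent changes essentially nothing, which is the point: no physically plausible probability makes the side effects small.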

1marenz
I would say that it is pretty easy to think in terms of 3^^^3. Just assume that everything that could happen due to a dust speck in your eye, will happen.
7ata
That is an interesting argument (I've considered it before) though I think it misses the point of the thought experiment. As I understand it, it's not about any of the possible consequences of the dust specks, but about specks as (very minor) intrinsically bad things themselves. It's about whether you're willing to measure the unpleasantness of getting a dust speck in your eye on the same scale as the unpleasantness of being tortured, as (vastly) different in degree rather than fundamentally different in kind.
0homunq
How do you know that more accidents are caused than avoided by dust specks? (Of course I realize I'm saying "you" to a 5-year-old comment but you get the picture.)
[-]g10

Douglas and Psy-Kosh: Dennett explicitly says that in denying that there are such things as qualia he is not denying the existence of conscious experience. Of course, Douglas may think that Dennett is lying or doesn't understand his own position as well as Douglas does.

James Bach and J Thomas: I think Eliezer is asking us to assume that there are no knock-on effects in either the torture or the dust-speck scenario, and the usual assumption in these "which economy would you rather have?" questions is that the numbers provided represent the situati... (read more)

J Thomas: You're neglecting that there might be some positive side-effects for a small fraction of the people affected by the dust specks; in fact, there is some precedent for this. The resulting average effect is hard to estimate, but (considering that dust specks seem to mostly add entropy to the thought processes of the affected persons) it would likely still be negative.

Copying g's assumption that higher-order effects should be neglected, I'd take the torture. For each of the 3^^^3 persons, the choice looks as follows:

1.) A 1/(3^^^3) chance of being tort... (read more)

Hmm, tricky one.

Do I get to pick the person who has to be tortured?

As I read this I knew my answer would be the dust specks. Since then I have been mentally evaluating various methods for deciding on the ethics of the situation and have chosen the one that makes me feel better about the answer I instinctively chose.

I can tell you this though. I reckon I personally would choose max five minutes of torture to stop the dust specks event happening. So if the person threatened with 50yrs of torture was me, I'd choose the dust specks.

What if it were a repeatable choice?

Suppose you choose dust specks, say, 1,000,000,000 times. That's a considerable amount of torture inflicted on 3^^^3 people. I suspect that you could find the number of times equivalent to torturing each of those 3^^^3 people for 50 years, and that number would be smaller than 3^^^3. In other words, choose the dust speck enough times, and more people would be tortured effectually for longer than if you chose the 50-year torture an equivalent number of times.

If that math is correct, I'd have to go with the torture, not the dust specks.

0themusicgod1
Likewise, if this was iterated 3^^^3+1 times (i.e. 3^^^3 plus the reader), it could easily be 50*3^^^3 (i.e. > 3^^^3+1) people tortured. The odds are that if it's possible for you to make this choice, then, unless you have reason to believe otherwise, they may make it too, making this an implicit prisoner's dilemma of sorts. On the other side, 3^^^3 specks could possibly crush you, and/or your local cluster of galaxies, into a black hole, so there's that to consider if you consider the life within meaningful distance of every one of those 3^^^3 people valuable.
2Benquo
I'm not sure I follow your argument. I'm going to assume that for a single person, 3^^3 dust specks = 50 years of torture. (My earlier figure seems wrong, but 3^^3 dust specks over 50 years is a little under 5,000 dust specks per second.) I'm going to ignore the +1 because these are big numbers already. If this were iterated 3^^^3 times, then we have the choice between: TORTURE: 3^^^3 people are each tortured for 50 years, once. DUST SPECKS: 3^^^3 people are tortured for 50 years, repeated (3^^^3)/(3^^3)=3^(3^^3-3^3) times.
0themusicgod1
The probability that I'm the only person selected out of 3^^^3 for such a decision, p(i), is less than any reasonable estimate of how many people could be selected, imho. Let's say well below 700 dB against. The chances are much greater that some proportion of those about to be dust-specked or tortured also gets this choice, p(k).

p(k)*3^^^3 > p(i) => 3^^^3 > p(i)/p(k) => true for any reasonable p(i)/p(k)

So this means that the effective number of dust particles given to each of us is going to be roughly (1-p(i))*p(k)*3^^^3. I'm going to assume any amount of dust larger in mass than a few orders of magnitude above the Chandrasekhar limit (1e33 kg) is going to result in a black hole. I can even assume a significant error margin in my understanding of how black holes work, and the results do not change. The smallest dust particle is probably a single hydrogen atom (really, everything resolves to hydrogen at small enough quantities, right?). 1 mol of hydrogen weighs about 1 gram. So:

(1-p(i)) * p(k) * 3^^^3 specks * (1 gram/mol) / (6e23 specks/mol) * (1e-3 kg/g) / (1e33 kg/black hole) = roughly (3^^^3) * (~1e-730) = roughly 3^^^3 black holes,

ie 3^(3_1^3_2^3_3^...^3_7e13 - 730) = roughly 3^(3_1^3_2^3_3^...^3_7e13), ie 3_1^3_2^3_3^...^3_7e13 - 730 = roughly 3_1^3_2^3_3^...^3_7e13.

In conclusion, I think at this level I would choose 'cancel' / 'default' / 'roll a die and determine the choice randomly / not choose', BUT would woefully update my concept of the size of the universe to contain enough mass to even support a reasonably infinitesimal probability of some proportion of 3^^^3 specks of dust, and 3^^^3 people, or at least some reasonable proportion thereof.

The question I have now is how our model of the universe is to update given this moral dilemma. What is the new radius of the universe given this situation? It can't be big enough for 3^^^3 dust specks piled on the edge of our universe outside of our light cone somewhere. Either way I think the new radius ought to be termed the "
1Benquo
I don't really care what happens if you take the dust speck literally; the point is to exemplify an extremely small disutility.
0themusicgod1
I suppose you could view the utility as a meaningful object in this frame and abstract away the dust, too, but in the end the dust-utility system is going to encompass both anyway, so solving the problem on either level is going to solve it on both.

Kyle wins.

Absent using this to guarantee the nigh-endless survival of the species, my math suggests that 3^^^3 beats anything. The problem is that the speck rounds down to 0 for me.

There is some minimum threshold below which it just does not count, like saying, "What if we exposed 3^^^3 people to radiation equivalent to standing in front of a microwave for 10 seconds? Would that be worse than nuking a few cities?" I suppose there must be someone in 3^^^3 who is marginally close enough to cancer for that to matter, but no, that rounds down to 0... (read more)

1ThoughtSpeed
Why would that round down to zero? That's a lot more people having cancer than getting nuked! (It would be hilarious if Zubon could actually respond after almost a decade)

Wow. The obvious answer is TORTURE, all else equal, and I'm pretty sure this is obvious to Eliezer too. But even though there are 26 comments here, and many of them probably know in their hearts torture is the right choice, no one but me has said so yet. What does that say about our abilities in moral reasoning?

Given that human brains are known not to be able to intuitively process even moderately large numbers, I'd say the question can't meaningfully be asked - our ethical modules simply can't process it. 3^^^3 is too large - WAY too large.

I'm unconvinced that the number is too large for us to think clearly. Though it takes some machinery, humans reason about infinite quantities all the time and arrive at meaningful conclusions.

My intuitions strongly favor the dust speck scenario. Even if we forget 3^^^3 and just say that an infinite number of people will experience the speck, I'd still favor it over the torture.

[-]cw90

Robin is absolutely wrong, because different instances of human suffering cannot be added together in any meaningful way. The cumulative effect when placed on one person is far greater than the sum of many tiny nuisances experienced by many. Whereas small irritants such as a dust mote do not cause "suffering" in any standard sense of the word, the sum total of those motes concentrated at one time and placed into one person's eye could cause serious injury or even blindness. Dispersing the dust (either over time or across many people) mitigates... (read more)

3Pablo
The problem with this claim is that you can construct a series of overlapping comparisons involving experiences that differ but slightly in how painful they are. Then, provided that the series has sufficiently many elements, you'll reach the conclusion that an experience of pain, no matter how intense, is preferable to arbitrarily many instances of the mildest pain imaginable. (Strictly speaking, you could actually avoid this conclusion by assuming that painful experiences of a given intensity have diminishing marginal value and that this value converges to a finite quantity. Then if the limiting value of a very mild pain is less than the value of a single extremely painful experience, the continuity argument wouldn't work. However, I see no independent motivation for embracing a theory of value of this sort. Moreover, such a theory would have incredible implications, e.g., that to determine how bad someone's pain is one needs to consider whether sentient beings have already experienced pains of that intensity in remote regions of spacetime.)
0shminux
Yeah, this is a common attempt to avoid this particular repugnant conclusion. This approach leads to conclusions such as that 3^^^3 mildly stabbed toes are better than a single moderately stabbed one. (Because if not, we can construct an unbroken chain of comparable pain experiences from specks to torture.) The motivation is there, to make dust specks and torture incomparable. Unfortunately, this approach doesn't work, as it results in infinitely many arbitrarily defined discontinuities.
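A sketch of that chaining argument, with purely illustrative numbers for the step count and the per-step trade ratio (both are assumptions, not anything from the original argument):

```python
# Pain levels 0..STEPS form a ladder from "dust speck" (level 0) up to
# "50 years of torture" (level STEPS). Assume, illustratively, that one person
# suffering level k+1 is always judged preferable to R people suffering level k.
STEPS = 1000          # assumed number of barely-distinguishable steps
R = 1_000_000         # assumed trade ratio per step

# Chaining the trades by transitivity: one torture is preferable to
# R**STEPS dust specks.
specks_equivalent = R ** STEPS
print(len(str(specks_equivalent)))   # 6001 decimal digits, i.e. about 10**6000
```

About 10^6000 is an unimaginably large number of specks, yet it is nothing next to 3^^^3, which is the sense in which the chain argument says the torture causes astronomically less total harm than the specks.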

The obvious answer is TORTURE, all else equal, and I'm pretty sure this is obvious to Eliezer too.

That is the straightforward utilitarian answer, without any question. However, it is not the common intuition, and even if Eliezer agrees with you he is evidently aware that the common intuition disagrees, because otherwise he would not bother blogging it. It's the contradiction between intuition and philosophical conclusion that makes it an interesting topic.

Robin's answer hinges on "all else being equal." That condition can tie up a lot of loose ends, it smooths over plenty of rough patches. But those ends unravel pretty quickly once you start to consider all the ways in which everything else is inherently unequal. I happen to think the dust speck is a 0 on the disutility meter, myself, and 3^^^3*0 disutilities = 0 disutility.

I believe that ideally speaking the best choice is the torture, but pragmatically, I think the dust speck answer can make more sense. Of course it is more intuitive morally, but I would go as far as saying that the utility can be higher for the dust specks situation (and thus our intuition is right). How? The problem is in this sentence: "If neither event is going to happen to you personally." The truth is that in the real world, we can't rely on this statement. Even if it is promised to us or made into a law, this type of statement often won't ... (read more)

0themusicgod1
Your link is 404ing. Is http://spot.colorado.edu/~norcross/Comparingharms.pdf the same one?
0Pablo
Here's the link (both links above are dead).
2ignoranceprior
Here's the latest working link (all three above are dead). Also, here's an archive in case that one ever breaks!

Robin, could you explain your reasoning. I'm curious.

Humans get barely noticeable "dust speck equivalent" events so often in their lives that the number of people in Eliezer's post is irrelevant; it's simply not going to change their lives, even if it's a gazillion lives, even with a number bigger than Eliezer's (even considering the "butterfly effect", you can't say if the dust speck is going to change them for the better or worse -- but with 50 years of torture, you know it's going to be for the worse).

Subjectively for these people, ... (read more)

@Robin,

"But even though there are 26 comments here, and many of them probably know in their hearts torture is the right choice, no one but me has said so yet."

I thought that Sebastian Hagen and I had said it. Or do you think we gave weasel answers? Mine was only contingent on my math being correct, and I thought his was similarly clear.

Perhaps I was unclear in a different way. By asking if the choice was repeatable, I didn't mean to dodge the question; I meant to make it more vivid. Moral questions are asked in a situation where many people a... (read more)

Hmm, thinking some more about this, I can see another angle (not the suffering angle, but the "being prudent about unintended consequences" angle):

If you had the choice between very very slightly changing the life of a huge number of people or changing a lot the life of only one person, the prudent choice might be to change the life of only one person (as horrible as that change might be).

Still, with the dust speck we can't really know if the net final outcome will be negative or positive. It might distract people who are about to have genius ide... (read more)

Would you prefer that one person be horribly tortured for fifty years without hope or rest, or that 3^^^3 people get dust specks in their eyes?

The square of the number of milliseconds in 50 years is about 10^21.

Would you rather one person tortured for a millisecond (then no ill effects), or that 3^^^3/10^21 people get a dust speck per second for 50 centuries?

OK, so the utility/effect doesn't scale when you change the times. But even if each 1% added dust/torture time made things ten times worse, when you reduce the dust-speckled population to reflect that it's still countless universes worth of people.

[-]Bob300

I'm with Tomhs. The question has less value as a moral dilemma than as an opportunity to recognize how we think when we "know" the answer. I intentionally did not read the comments last night so I could examine my own thought process, and tried very hard to hold an open mind (my instinct was dust). It's been a useful and interesting experience. Much better than the brain teasers, which I can generally get because I'm on heightened alert when reading El's posts. Here being on alert simply allowed me to try to avoid immediately giving in to my bias.

Averaging utility works only when the law of large numbers starts to play a role. It's a good general policy, as stuff subject to it happens all the time, enough to give sensible results over the human/civilization lifespan. So, if Eliezer's experiment is a singular event and similar events don't happen frequently enough, the answer is 3^^^3 specks. Otherwise, torture (as in this case, similar frequent-enough choices would lead to a tempest of specks in anyone's eye which is about 3^^^3 times worse than 50 years of torture, for each and every one of them).

Benquo, your first answer seemed equivocal, and so did Sebastian's on a first reading, but now I see that it was not.

Torture.

Consider three possibilities:

(a) A dust speck hits you with probability one;
(b) You face an additional probability 1/(3^^^3) of being tortured for 50 years;
(c) You must blink your eyes for a fraction of a second, just long enough to prevent a dust speck from hitting you in the eye.

Most people would pick (c) over (a). Yet 1/(3^^^3) is such a small number that by blinking your eyes one more time than you normally would, you increase your chances of being captured by a sadist and tortured for 50 years by more than 1/(3^^^3). Thus, (b) must be better than (c). Consequently, most people should prefer (b) to (a).
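A sketch of the per-person comparison behind (a) vs. (b), with an arbitrary stand-in for 3^^^3 (the real number cannot be represented) and an assumed disutility ratio between torture and a speck; both constants are illustrative only:

```python
from fractions import Fraction

# Stand-in: 3^^^3 cannot be represented, so use a comparatively tiny placeholder
# just to show the structure of the comparison. The gap only widens as this grows.
HUGE = 10**100

disutility_speck   = Fraction(1)          # normalize: one speck = 1 unit
disutility_torture = Fraction(10**15)     # assumed: 50 years of torture = 10^15 specks' worth

# From a single individual's point of view:
expected_a = disutility_speck             # (a) a dust speck with certainty
expected_b = disutility_torture / HUGE    # (b) a 1/HUGE chance of the torture

print(expected_b < expected_a)   # True: the torture lottery has the smaller expected harm
```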

8timujin
You know, that actually persuaded me to override my intuitions and pick torture over dust specks.
3Jiro
You don't even have to go that far. Replace "dust specks" with "the inconvenience of not going outside the house" and "tiny chance of torture" with "tiny chance that being outside the house will lead to you getting killed".
1timujin
Yeah, I understood the point.

There isn't any right answer. Answers to what is good or bad are a matter of taste, to borrow from Nietzsche.

To me the example has a messianic quality. One person suffers immensely to save others from suffering. Does the sense that there is a 'right' answer come from a Judeo-Christian sense of what is appropriate? Is this a sort of bias in line with biases towards expecting facts to conform to a story?

Also, this example suggests to me that the value pluralism of Cowen makes much more sense than some reductive approach that seeks to create one objective me... (read more)

Why is this a serious question? Given the physical unreality of the situation - the putative existence of 3^^^3 humans and the ability to actually create the option in the physical universe - why is this question taken seriously, while something like "is it better to kill Santa Claus or the Easter Bunny?" is considered silly?

Fascinating, and scary, the extent to which we adhere to established models of moral reasoning despite the obvious inconsistencies. Someone here pointed out that the problem wasn't sufficiently defined, but then proceeded to offer examples of objective factors that would appear necessary to evaluation of a consequentialist solution. Robin seized upon the "obvious" answer that any significant amount of discomfort, over such a vast population, would easily dominate, with any conceivable scaling factor, the utilitarian value of the torture of a si... (read more)

The hardships experienced by a man tortured for 50 years cannot compare to a trivial experience massively shared by a large number of individuals -- even on the scale that Eli describes. There is no accumulation of experiences, and it cannot be conflated into a larger meta dust-in-the-eye experience; it has to be analyzed as a series of discrete experiences.

As for larger social implications, the negative consequence of so many dust specked eyes would be negligible.

Wow. People sure are coming up with interesting ways of avoiding the question.

Eliezer wrote "Wow. People sure are coming up with interesting ways of avoiding the question."

I posted earlier on what I consider the more interesting question of how to frame the problem in order to best approach a solution.

If I were to simply provide my "answer" to the problem, with the assumption that the dust in the eyes is likewise limited to 50 years, then I would argue that the dust is to be preferred to the torture, not on a utilitarian basis of relative weights of the consequences as specified, but on the bigger-picture view th... (read more)

[-]g10

Eliezer, are you suggesting that declining to make up one's mind in the face of a question that (1) we have excellent reason to mistrust our judgement about and (2) we have no actual need to have an answer to is somehow disreputable?

As for your link to the "motivated stopping" article, I don't quite see why declining to decide on this is any more "stopping" than choosing a definite one of the options. Or are you suggesting that it's an instance of motivated continuation? Perhaps it is, but (as you said in that article) the problem with ... (read more)

What happens if there aren't 3^^^3 instanced people to get dust specks? Do those specks carry over such that person #1 gets a 2nd speck and so on? If so, you would elect to have the person tortured for 50 years, for surely the alternative is to fill our universe with dust and annihilate all cultures and life.

Robin, of course it's not obvious. It's only an obvious conclusion if the global utility function from the dust specks is an additive function of the individual utilities, and since we know that utility functions must be bounded to avoid Dutch books, we know that the global utility function cannot possibly be additive -- otherwise you could break the bound by choosing a large enough number of people (say, 3^^^3).


From a more metamathematical perspective, you can also question whether 3^^^3 is a number at all. It's perfectly straightforward to construct a p... (read more)

2homunq
I once read the following story about a Russian mathematician. I can't find the source right now.

Cast: Russian mathematician RM, other guy OG

RM: "Truly large numbers don't really exist in the same sense that small ones do."
OG: "That's ridiculous. Consider the powers of two. Does 2^1 exist?"
RM: "Yes."
OG: "OK, does 2^2 exist?"
RM: ".Yes."
OG: "So you'd agree that 2^3 exists?"
RM: "...Yes."
OG: "How about 2^4?"
RM: ".......Yes."
OG: "So this is silly. Where would you ever draw the boundary?"
RM: ".............................................................................................................................................."

Eliezer, are you suggesting that declining to make up one's mind in the face of a question that (1) we have excellent reason to mistrust our judgement about and (2) we have no actual need to have an answer to is somehow disreputable?

Yes, I am.

Regarding (1), we pretty much always have excellent reason to mistrust our judgments, and then we have to choose anyway; inaction is also a choice. The null plan is a plan. As Russell and Norvig put it, refusing to act is like refusing to allow time to pass.

Regarding (2), whenever a tester finds a user input that cr... (read more)


Fascinating question. No matter how small the negative utility in the dust speck, multiplying it with a number such as 3^^^3 will make it way worse than torture. Yet I find the obvious answer to be the dust speck one, for reasons similar to what others have pointed out - the negative utility rounds down to zero.

But that doesn't really solve the problem, for what if the harm in question was slightly larger? At what point does it cease rounding down? I have no meaningful criteria to give for that one. Obviously there must be a point where it does cease doing... (read more)

"Regarding (1), we pretty much always have excellent reason to mistrust our judgments, and then we have to choose anyway; inaction is also a choice. The null plan is a plan. As Russell and Norvig put it, refusing to act is like refusing to allow time to pass."

This goes to the crux of the matter: to the extent the future is uncertain, it is better to decide based on principles (representing wisdom encoded via evolutionary processes over time) rather than on the flat basis of expected consequences.

Would you condemn one person to be horribly tortured for fifty years without hope or rest, to save every qualia-experiencing being who will ever exist one blink?

Is the question significantly changed by this rephrasing? It makes SPECKS the default choice, and it changes 3^^^3 to "all." Are we better able to process "all" than 3^^^3, or can we really process "all" at all? Does it change your answer if we switch the default?

Would you force every qualia-experiencing being who will ever exist to blink one additional time to save one person from being horribly tortured for fifty years without hope or rest?

> For those who would pick TORTURE, what about Vassar's universes of agonium? Say a googolplex-persons' worth of agonium for a googolplex years.

If you mean would I condemn all conscious beings to a googolplex of torture to avoid universal annihilation from a big "dust crunch" my answer is still probably yes. The alternative is universal doom. At least the tortured masses might have some small chance of finding a solution to their problem at some point. Or at least a googolplex years might pass leaving some future civilization free to prosper. ... (read more)

> Would you condemn one person to be horribly tortured for fifty years without hope or rest, to save every qualia-experiencing being who will ever exist one blink?

That's assuming you're interpreting the question correctly. That you aren't dealing with an evil genie.

[-]Zeus10

You never said we couldn't choose who specifically gets tortured, so I'm assuming we can make that selection. Given that, the once agonizingly difficult choice is made trivially simple. I would choose 50 years of torture for the person who made me make this decision.

[-]Kat320

Since I chose the specks -- no, I probably wouldn't pay a penny; avoiding the speck is not even worth the effort to decide to pay the penny or not. I would barely notice it; it's too insignificant to be worth paying even a tiny sum to avoid.

I suppose I too am "rounding down to zero"; a more significant harm would result in a different answer.

-1phob
You're avoiding the question. What if a penny was automatically paid for you each time in the future to avoid dust specks floating in your eye? The question is whether the dust speck is worth at least a negative penny of disutility. For me, I would say yes.

"For those who would pick SPECKS, would you pay a single penny to avoid the dust specks?"

To avoid all the dust specks, yeah, I'd pay a penny and more. Not a penny per speck, though ;)

The reason is to avoid having to deal with the "unintended consequences" of being responsible for that very very small change over such a large number of people. It's bound to have some significant indirect consequences, both positive and negative, on the far edges of the bell curve... the net impact could be negative, and a penny is little to pay to avoid responsibility for that possibility.

The first thing I thought when I read this question was that the dust specks were obviously preferable. Then I remembered that my intuition likes to round 3^^^3 down to something around twenty. Obviously, the dust specks are preferable to the torture for any number at all that I have any sort of intuitive grasp over.

But I found an argument that pretty much convinced me that the torture was the correct answer.

Suppose that instead of making this choice once, you will be faced with the same choice 10^17 times for the next fifty years (This number was chosen... (read more)

-3aausch
The reasoning here seems very broken to me (I have no opinion on the conclusion yet): Look at a version of the reverse dial. Say that you start with 3^^^3 people having 1,000,000 dust specks a second rubbed in their eyes, and 0 people tortured. Each time you turn the dial up by 1, one person is moved from the "speck in the eye" list over to the "tortured for 50 years" list, and the frequency is reduced by 1 speck/second. Would you turn the dial up to 1,000,000?
0phob
So because there is a continuum between the right answer (lots of torture) and the wrong answer (3^^^3 horribly blinded people), you would rather blind those people?
4Manfred
Nah, he was pretty clearly challenging the use of induction in the above post. The larger problem is assuming linearity in an obviously nonlinear situation -- this also explains why the induction appears to work either way. Applying 1 pound of force to someone's kneecap is simply not 1/10th as bad as applying 10 pounds of force to someone's kneecap.
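A toy illustration of that linearity complaint (the cubic pain curve below is purely my own stand-in; nothing in the thread specifies the real shape):

```python
# Hypothetical, convex pain-vs-force curve; the cubic exponent is an assumption
# chosen only to make the nonlinearity visible.
def pain(force_lbs: float) -> float:
    return force_lbs ** 3

print(pain(10))        # 1000: one application of 10 pounds of force
print(10 * pain(1))    # 10: ten separate applications of 1 pound of force
# Under any convex curve, summing the small harms linearly badly underestimates
# the single large harm, which is why the step-by-step induction misleads here.
```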
4XiXiDu
This has nothing to do with the original question. You rephrased it so that it now asks if you'd rather torture one person or 3^^^3 people. Of course you'd rather torture one person than 3^^^3. That is not the same as choosing between torturing one person and 3^^^3 people getting dust specks in their eyes for a fraction of a second.

"... whenever a tester finds a user input that crashes your program, it is always bad - it reveals a flaw in the code - even if it's not a user input that would plausibly occur; you're still supposed to fix it. "Would you kill Santa Claus or the Easter Bunny?" is an important question if and only if you have trouble deciding. I'd definitely kill the Easter Bunny, by the way, so I don't think it's an important question."

I write code for a living; I do not claim that it crashes the program. Rather, the answer is irrelevant as I don't thin... (read more)

By "pay a penny to avoid the dust specks" I meant "avoid all dust specks", not just one dust speck. Obviously for one speck I'd rather have the penny.

2phob
So if someone would pay a penny, they should pick torture if it were 3^^^^3 people getting dust specks, which casts doubt on whether they understood 3^^^3 in the first place.

what about Vassar's universes of agonium? Say a googolplex-persons' worth of agonium for a googolplex years.

To reduce suffering in general rather than your own (it would be tough to live with), bring on the coddling grinders. (10^10^100)^2 is a joke next to 3^^^3.
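To put a number on "a joke next to 3^^^3", here is my own back-of-the-envelope check, comparing everything in log10 since none of these numbers fit in memory:

```python
import math

log10_3 = math.log10(3)

# log10 of (10^(10^100))^2, i.e. a googolplex squared
log10_googolplex_sq = 2 * 10**100

layer3 = 3**27                       # 3^^3 = 7,625,597,484,987 (exact)
log10_layer4 = layer3 * log10_3      # log10(3^^4), roughly 3.6e12 -- still far below 2e100
# log10(3^^5) = 3^^4 * log10(3), i.e. about 0.477 * 10^(3.6e12), which already
# dwarfs 2e100.  And 3^^^3 is a tower 7,625,597,484,987 layers tall, not 5.

print(f"log10((googolplex)^2) = {log10_googolplex_sq:.3e}")
print(f"log10(3^^4)           = {log10_layer4:.3e}")
```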

Having said that, it depends on the qualia-experiencing population of all existence compared to the numbers affected, and whether you change existing lives or make new ones. If only a few googolplex-squared people-years exist anyway, I vote dust.

I also vote to kill the bunny.

For those who would pick TORTURE, what about Vassar's universes of agonium? Say a googolplex-persons' worth of agonium for a googolplex years.

Torture, again. From the perspective of each affected individual, the choice becomes:

1.) A (10^(10^100))/(3^^^3) chance of being tortured for (10^(10^100)) years.
2.) A probability-1 chance of a dust speck.
(or very slightly different numbers if the (10^(10^100)) people exist in addition to the 3^^^3 people; the difference is too small to be noticeable)

I'd still take the former. (10^(10^100))/(3^^^3) is still so close to zero that there'... (read more)
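Spelling out the expected value behind option 1, using the comment's own numbers (the "effectively zero" step is just the observation that the denominator dwarfs the numerator):

```latex
\mathbb{E}[\text{torture-years}]
  = \frac{10^{10^{100}}}{3\uparrow\uparrow\uparrow 3}\cdot 10^{10^{100}}
  = \frac{10^{2\cdot 10^{100}}}{3\uparrow\uparrow\uparrow 3}
  \approx 0,
\qquad
\mathbb{E}[\text{dust specks}] = 1.
```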

[-]g50

Eliezer, it's the combination of (1) totally untrustworthy brain machinery and (2) no immediate need to make a choice that I'm suggesting means that withholding judgement is reasonable. I completely agree that you've found a bug; congratulations, you may file a bug report and add it to the many other bug reports already on file; but how do you get from there to the conclusion that the right thing to do is to make a choice between these two options?

When I read the question, I didn't go into a coma or become psychotic. I didn't even join a crazy religion or ... (read more)

Let's suppose we measure pain in pain points (pp). Any event which can cause pain is given a value in [0, 1], with 0 being no pain and 1 being the maximum amount of pain perceivable. To calculate the pp of an event, assign a value to the pain, say p, and then multiply it by the number of people who will experience the pain, n. So for the torture case, assume p = 1, then:

torture: 1*1 = 1 pp

For the speck-in-the-eye case, suppose it causes the least amount of pain greater than no pain possible. Denote this by e. Assume that the dust speck causes e amount of ... (read more)
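Writing that bookkeeping out explicitly (the speck line just applies the same p times n rule set up above; the comment is cut off before it evaluates the comparison):

```latex
\text{torture: } p \cdot n = 1 \cdot 1 = 1~\text{pp},
\qquad
\text{specks: } p \cdot n = e \cdot 3\uparrow\uparrow\uparrow 3~\text{pp}.
```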

"Wow. People sure are coming up with interesting ways of avoiding the question."

My response was a real request for information -- if this is a pure utility test, I would select the dust specks. If this were done to a complex, functioning society, adding dust specks into everyone's eyes would disrupt a great deal of important stuff -- someone would almost certainly get killed in an accident due to the distraction, even on a planet with only 10^15 people and not 3^^^^3.

Eliezer, in your response to g, are you suggesting that we should strive to ensure that our probability distribution over possible beliefs sums to 1? If so, I disagree: I don't think this can be considered a plausible requirement for rationality. When you have no information about the distribution, you ought to assign probabilities uniformly, according to Laplace's principle of indifference. But the principle of indifference only works for distributions over finite sets. So for infinite sets you have to make an arbitrary choice of distribution, which violates indifference.
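One way to see the finite/infinite asymmetry being appealed to (a standard observation, stated here for concreteness):

```latex
\text{Finite set of } n \text{ hypotheses: } P(h_i) = \tfrac{1}{n}, \quad \sum_{i=1}^{n} \tfrac{1}{n} = 1.
\qquad
\text{Countably infinite set: no constant } c \text{ gives } \sum_{i=1}^{\infty} c = 1,
\text{ since the sum is } 0 \text{ for } c = 0 \text{ and diverges for } c > 0.
```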

"For those who would pick SPECKS, would you pay a single penny to avoid the dust specks?"

Yes. Note that, for the obvious next question, I cannot think of an amount of money large enough such that I would rather keep it than use it to save a person from torture. Assuming that this is post-Singularity money which I cannot spend on other life-saving or torture-stopping efforts.

"You probably wouldn't blind everyone on earth to save that one person from being tortured, and yet, there are (3^^^3)/(10^17) >> 7*10^9 people being blinded for ea... (read more)

3phob
People are being tortured, and it wouldn't take too much money to prevent some of it. Obviously, there is already a price on torture.

My algorithm goes like this:
there are two variables, X and Y.
Adding a single additional dust speck to a person's eye over their entire lifetime increases X by 1 for every person this happens to.
A person being tortured for a few minutes increases Y by 1.

I would object to most situations where Y is greater than 1. But I have no preferences at all with regard to X.

See? Dust specks and torture are not the same. I do not lump them together as "disutility". To do so seems to me a preposterous oversimplification. In any case, it has to be argued that... (read more)
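A literal transcription of that two-variable algorithm (a sketch; the event encoding is mine, not the commenter's):

```python
# Count dust-speck events and torture events separately rather than summing
# them into one "disutility" number, as described above.
def evaluate(world_events):
    """world_events: iterable of ("speck", person_id) or ("torture", person_id) tuples."""
    x = 0   # people who get one extra dust speck over their lifetime
    y = 0   # people tortured for a few minutes
    for kind, _person in world_events:
        if kind == "speck":
            x += 1
        elif kind == "torture":
            y += 1
    objectionable = y > 1   # the commenter objects to most situations with Y > 1, and is indifferent to X
    return x, y, objectionable

print(evaluate(("speck", i) for i in range(10**6)))   # (1000000, 0, False): no objection, however large X grows
```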

I am not convinced that this question can be converted into a personal choice where you face the decision of whether to take the speck or a 1/3^^^3 chance of being tortured. I would avoid the speck and take my chances with torture, and I think that is indeed an obvious choice.

I think a more apposite application of that translation might be:
If I knew I was going to live for 3^^^3+50*365 days, and I was faced with that choice every day, I would always choose the speck, because I would never want to endure the inevitable 50 years of torture.

The difference is that framing the question as a one-off individual choice obscures the fact that in the example proffered, the torture is a certainty.

1/3^^^3 chance of being tortured... If I knew I was going to live for 3^^^3+50*365 days, and I was faced with that choice every day, I would always choose the speck, because I would never want to endure the inevitable 50 years of torture.

That wouldn't make it inevitable. You could get away with it, but then you could get multiple tortures. Rolling 6 dice often won't get exactly one "1".
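A quick check on the dice intuition, plus the same arithmetic for n daily choices each carrying a 1/n chance of torture (my numbers; n = 10^6 stands in for 3^^^3 + 50*365, which is far too large to plug in directly):

```python
from math import comb

def p_exactly(k, n, p):
    """Probability of exactly k successes in n independent trials of probability p each."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

print(p_exactly(1, 6, 1/6))   # ~0.402: rolling 6 dice gives exactly one "1" less than half the time

n = 10**6                      # stand-in for the number of daily choices
p0 = p_exactly(0, n, 1/n)      # ~0.368 (about 1/e): no torture at all
p1 = p_exactly(1, n, 1/n)      # ~0.368: exactly one stretch of torture
print(p0, p1, 1 - p0 - p1)     # the remaining ~0.264: two or more
```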

Tom McCabe wrote:
The probability is effectively much greater than that, because of complexity compression. If you have 3^^^^3 people with dust specks, almost all of them will be identical copies of each other, greatly reducing abs(U(specks)). abs(U(torture)) would also get reduced, but by a much smaller factor, because the number is much smaller to begin with.

Is there something wrong with viewing this from the perspective of the affected individuals (unique or not)? For any individual instance of a person, the probability of directly experiencing the tortu... (read more)