Is Evidential Decision Theory presumptuous?

by Tobias_Baumann · 2nd Feb 2017 · 1 min read · 39 comments



I recently had a conversation with a staunch defender of EDT who maintained that EDT gives the right answer in the Smoker's Lesion problem and even in Evidential Blackmail. I came up with the following, even more counterintuitive, thought experiment:


--


By doing research, you've found out that there is either

(A) only one universe or

(B) a multiverse.

You also found out that which cosmological theory is true has a slight influence (via different physics) on how your brain works. If (A) holds, you will likely decide to give away all your money to random strangers on the street; if there is a multiverse, you will most likely not do that. Of course, causality flows in one direction only, i.e. your decision does not determine how many universes there are.

 

Suppose you have a very strong preference for (A) (e.g. because a multiverse would contain infinite suffering) so that it is more important to you than your money.

 

Do you give away all your money or not?

 

--


This is structurally equivalent to the Smoker's Lesion, but what's causing your action is the cosmological theory, not a lesion or a gene. CDT, TDT, and UDT would not give away the money because doing so has no causal (or acausal) influence on the number of universes. EDT would reason that giving the money away is evidence for (A) and would therefore choose to do so.
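To make the divergence concrete, here is a toy expected-utility calculation. All numbers are illustrative assumptions, not from the scenario itself: a uniform prior over (A) and (B), a correlation of P(give | A) = 0.9 and P(give | B) = 0.1 induced by the different physics, a utility of 100 for world (A) (the strong preference), and a utility of 1 for keeping the money.

```python
# Toy comparison of EDT and CDT on the one-universe-vs-multiverse case.
# All probabilities and utilities below are illustrative assumptions.
P_A = 0.5                            # prior probability of (A), one universe
P_GIVE_GIVEN = {"A": 0.9, "B": 0.1}  # assumed correlation via physics
U_WORLD = {"A": 100.0, "B": 0.0}     # strong preference for (A)
U_MONEY = 1.0                        # value of keeping your money

def posterior_A(action):
    """P(A | action) via Bayes' rule, using the assumed correlation."""
    like_A = P_GIVE_GIVEN["A"] if action == "give" else 1 - P_GIVE_GIVEN["A"]
    like_B = P_GIVE_GIVEN["B"] if action == "give" else 1 - P_GIVE_GIVEN["B"]
    return like_A * P_A / (like_A * P_A + like_B * (1 - P_A))

def expected_utility(action, p_A):
    money = 0.0 if action == "give" else U_MONEY
    return p_A * U_WORLD["A"] + (1 - p_A) * U_WORLD["B"] + money

# EDT conditions on its own action; CDT keeps the prior, since the action
# has no causal influence on how many universes there are.
edt = {a: expected_utility(a, posterior_A(a)) for a in ("give", "keep")}
cdt = {a: expected_utility(a, P_A) for a in ("give", "keep")}

print(edt)  # {'give': 90.0, 'keep': 11.0} -> EDT gives the money away
print(cdt)  # {'give': 50.0, 'keep': 51.0} -> CDT keeps it
```

Under these numbers EDT's "news value" of giving (raising its credence in (A) from 0.5 to 0.9) swamps the value of the money, while CDT, holding the prior fixed, keeps it.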


Apart from the usual “managing the news” point, this highlights another flaw in EDT: its presumptuousness. The EDT agent thinks that her decision spawns or destroys an entire multiverse, or at least reasons as if it did. In other words, EDT acts as if it can affect astronomical stakes with a single thought.


I find this highly counterintuitive.

 

What makes it even worse is that this is not even a contrived thought experiment. Our brains are in fact shaped by physics, and it is plausible that different physical theories or constants both make an agent decide differently and make the world better or worse according to one’s values. So, EDT agents might actually reason in this way in the real world.
