Fascinating. A few days after I read this, it struck me that a form of Newcomb's Problem actually occurs in real life--voting in a large election. Here's what I mean.

Say you're sitting at home pondering whether to vote. If you decide to stay home, you benefit by avoiding the minor inconvenience of driving and standing in line. (Like gaining $1000.) If you decide to vote, you incur the inconvenience; meanwhile, you know your individual vote almost certainly won't make a statistical difference in getting your candidate elected. (Which would be like winning $1000000.) So rationally, stay at home and hope your candidate wins, right? Then you'll have avoided the inconvenience too. Take both boxes.

But here's the twist. If you muster the will to vote, it stands to reason that people of a similar mind to you--a potentially statistically significant number--will muster it too, precisely because they are similar to you. So knowing this, why not stay home anyway, avoid the inconvenience, and trust all those others to vote and win the election? They're going to do what they're going to do; your actions can't change that, just as the contents of the boxes can't be changed by your actions. And yet, if you don't vote, perhaps that means neither will the others, and so it goes. Therein lies the similarity to Newcomb's Problem.
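The payoff reasoning above can be made concrete with a little arithmetic. This is only an illustrative sketch, not anything from the original comment: the correlation parameter `rho` (how strongly the similar-minded bloc's choice tracks yours) and the assumption that the bloc's turnout decides the election outright are both mine; the $1000 and $1000000 figures are the ones used above.

```python
# Illustrative payoff sketch of the voting-as-Newcomb analogy.
# COST is the inconvenience of voting ("gaining $1000" by staying home);
# WIN is the value of your candidate winning ("$1000000").
COST = 1_000
WIN = 1_000_000

def evidential_eu(rho):
    """Expected value if your choice is evidence about the similar bloc.

    rho = (assumed) probability that the similar-minded bloc does
    whatever you do. One-boxing reasoning.
    """
    eu_vote = rho * WIN - COST        # you vote -> bloc probably votes -> probable win
    eu_stay = (1 - rho) * WIN         # you stay -> bloc probably stays -> probable loss
    return eu_vote, eu_stay

def causal_eu(p_bloc_votes):
    """Expected value if your choice cannot influence the bloc at all.

    Two-boxing reasoning: the bloc votes with fixed probability
    p_bloc_votes no matter what you do.
    """
    eu_vote = p_bloc_votes * WIN - COST
    eu_stay = p_bloc_votes * WIN      # same win chance, but you keep the $1000
    return eu_vote, eu_stay

print(evidential_eu(0.6))   # voting comes out ahead when rho > 0.5 (plus a sliver for COST)
print(causal_eu(0.6))       # staying home is always exactly COST better
```

Under the evidential reading, voting beats staying home whenever (2*rho - 1) * WIN > COST, i.e. even a slight correlation with a decisive bloc justifies the inconvenience; under the causal reading, staying home dominates no matter what, which is exactly the two-box logic.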