Personal Blog

Newcomb's Altruism
by [anonymous] · 21st Dec 2012 · 1 min read
Hello LessWrong community. I want to play a game. *Jigsaw music plays*

In box B, I have placed either $1,000,000 multiplied by the number of humans on Earth, or $0. In box A, I have placed $1000 multiplied by the number of humans on Earth. Every human on the planet, including you, will now be asked the following question: do you take just box B, or both boxes? If my friend Omega predicted that the majority of humans would one-box, box B contains the aforementioned quantity of money, and it will be split accordingly (everyone in the world receives $1,000,000). If he predicted that the majority would two-box, it contains nothing. Everyone who two-boxes receives $1000, whether or not the $1,000,000 was obtained. In fact, forget the predicting; let's just say I'll tally up the votes and *then* decide whether to put the money in box B. Would it then be rational to two-box or to one-box? If I told you that X is the proportion of humans who one-box in the classical Newcomb's problem, should that affect your strategy? What if I told you that Y is the proportion of humans who have one-boxed among those who have chosen so far? Would it even be morally permissible to two-box? Also, let's assume the number of humans is odd (since I know someone is going to ask what happens in a tie).

I also have a follow-up in both cases. If you chose to two-box, suppose I stopped your decision short to tell you that there are N other instances of yourself in the world, for I have cloned you secretly and without consent (sue me). How big would N have to be for you to one-box? If you chose to one-box, and I stopped your decision short to say that N people in the world have already two-boxed, or have already one-boxed, how big would N have to be for you to decide your effect is inconsequential and two-box?
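
A minimal sketch of the payoff structure, assuming a world population of 7 billion and that box B pays out iff a strict majority one-boxes (the function and figures are illustrative, not part of the post):

```python
# Payoff to a single chooser in the game above, assuming a population of
# 7 billion (my round figure) and an even split of box B's contents.

POPULATION = 7_000_000_000

def payoff(i_two_box: bool, num_one_boxers: int) -> int:
    """Dollars one chooser receives; num_one_boxers counts every
    one-boxer on Earth, the chooser included if they one-box."""
    majority_one_boxed = num_one_boxers > POPULATION // 2
    total = 1_000_000 if majority_one_boxed else 0  # everyone's share of box B
    if i_two_box:
        total += 1_000  # box A is paid to two-boxers regardless of box B
    return total

print(payoff(False, POPULATION))     # 1000000: everyone one-boxes
print(payoff(True, POPULATION - 1))  # 1001000: a lone two-boxer free-rides
print(payoff(True, 0))               # 1000:    everyone two-boxes
```

The lone two-boxer strictly gains, which is exactly the free-rider tension the comments below pick up on.
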
16 comments

Manfred · 13y · 12 points

http://en.wikipedia.org/wiki/Volunteer's_dilemma

http://en.wikipedia.org/wiki/Mixed_strategy#Mixed_strategy

:)
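
A minimal sketch of the connection, using the standard symmetric volunteer's dilemma (the cost/benefit figures echo the post's $1000 and $1,000,000; they are not from the comment):

```python
# Symmetric mixed-strategy equilibrium of an N-player volunteer's dilemma.
# Each player may pay cost c to "volunteer"; everyone receives benefit b
# if at least one player volunteers. At a mixed equilibrium each player is
# indifferent between volunteering and abstaining:
#   b - c = b * (1 - q**(N - 1)),  where q = P(a given other player abstains)
#   =>  q = (c / b) ** (1 / (N - 1))

def volunteer_probability(n: int, cost: float, benefit: float) -> float:
    """Equilibrium probability that any single player volunteers."""
    q = (cost / benefit) ** (1.0 / (n - 1))
    return 1.0 - q

for n in (2, 10, 7_000_000_000):
    p = volunteer_probability(n, cost=1_000, benefit=1_000_000)
    print(n, p, (1 - p) ** n)  # last column: chance that nobody volunteers
```

As N grows, each player volunteers with vanishing probability, yet the chance that no one volunteers approaches c/b rather than zero.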

[anonymous] · 13y · 2 points

Grr...well played. I'm pretty sure that doesn't answer all the posited questions, but you've reduced my problem.

JoshuaZ · 13y · 5 points

When one has a problem one has trouble with, it is often a good idea to try simpler or alternate versions, to see if one can solve those and gain insight from them. This entry seems to be an example of something worryingly common on LW: taking a difficult-to-understand problem and looking at even more general, difficult versions. Sometimes this can also be helpful, but more often it isn't.

[anonymous] · 13y · 0 points

I'm not using this problem to solve the original Newcomb's problem; that one is already solved. "We are doing this, not because it is easy, but because it is difficult" (I forget the source). I understand where you're coming from with that idea, but this post isn't a reduction of a problem; it is a new problem in itself (bearing elements of a similar, but distinct, problem).

tut · 13y · 0 points

"We are doing this, not because it is easy, but because it is difficult" (I forgot the source)

Kennedy. About going to the moon in the sixties.

gwern · 13y · 2 points

It wouldn't be an impressive signal of capitalism's superiority over communism if the signal were cheap.

arundelo · 13y · 0 points

And doing the other things!

kilobug · 13y · 4 points

My own stupid "cheating" answer: giving $1,000,000 to every human will make the dollar collapse. Those $1,000,000 will not have any value, and the whole world economy will collapse with it, given the importance of the dollar in international exchange. I don't belong to the Chicago school, and I do think massive monetary creation can have positive long-term consequences (and I do think it's a significant part of the solution to the "European debt crisis"), but... not a one-shot creation of 7*10^15 dollars.
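
For scale, a back-of-the-envelope check (the rough world-GDP comparison is my addition, not kilobug's):

```python
# kilobug's 7 * 10**15 figure: $1,000,000 for each of ~7 billion people.
payout = 1_000_000 * 7 * 10**9
print(f"${payout:.1e}")  # $7.0e+15 -- roughly 100x the ~$70 trillion 2012 world GDP
```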

TrE · 13y · 3 points

Probably related: Hofstadter's "Superrationality"

[anonymous] · 13y · 3 points

Hmmm.

Assuming all people are rational, but not superrational: two-box. Expect $1000.

Actual humans: two-box; my thought processes would match a few hundred people (LWers) out of the billions. Expect a 50% chance of $1,000,000.

N of me: if I one-box, I guarantee my block one-boxes. If there were around 10 million of me, I would one-box.

Reasoning for N of me: given the 50% above, if I added 1/1000th of the population as known one-boxers-iff-I-one-box, it would make it worth it ($1000 is 1/1000th of $1,000,000). This is totally approximate and I didn't actually do the algebra.
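
The skipped algebra, under the commenter's own linearization (the 7-billion population and the assumption that N guaranteed one-boxers shift the majority probability by about N/population are mine, added to make it concrete):

```python
# Expected-value threshold for one-boxing with N guaranteed copies.
POPULATION = 7_000_000_000
BOX_A, BOX_B_SHARE = 1_000, 1_000_000

def one_boxing_is_worth_it(n_copies: int) -> bool:
    delta_p = n_copies / POPULATION        # linearized swing in P(majority one-boxes)
    return delta_p * BOX_B_SHARE > BOX_A   # EV gained vs. the forgone box A share

print(one_boxing_is_worth_it(7_000_000))    # False (exactly at the threshold)
print(one_boxing_is_worth_it(10_000_000))   # True -- matches the ~10 million guess
```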

ewbrownv · 13y · 1 point

Knowing that philosophers are the only people who two-box on Newcomb's problem, and they constitute a vanishingly small fraction of Earth's population, I confidently one-box. Then I rush out to spend my winnings as quickly as possible, before the inevitable inflation hits.

Telling me what X is will have no effect on my action, because I already have that information. Making copies of me has no effect on my strategy, for the same reason.

Luke_A_Somers · 13y · 0 points

The ratio is so huge that being a 'freeloader' is barely a benefit. I'll one-box just to make it more likely that we get the big prize.

Tenoke · 13y · 0 points

TL;DR: as aforementioned, if you give the same amount of money to everyone in the world, nobody gains anything, and it might even temporarily collapse the economy.

[This comment is no longer endorsed by its author]

Jayson_Virissimo · 13y · 4 points

TL;DR: as aforementioned, if you give the same amount of money to everyone in the world, nobody gains anything, and it might even temporarily collapse the economy.

Wouldn't debtors gain at the expense of creditors?

Tenoke · 13y · 2 points

Yes, I realized the generalization that poorer people as a whole would benefit at the expense of richer people, so I didn't want to post this. I must've done so by accident; I will retract it now.

A1987dM · 13y · 0 points

$1,000,000 multiplied by the number of humans

That's a lot of money -- opening the box would just cause the dollar to devalue to waste paper. [mentally divides every number by 100]

I take box A with probability 49%.
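
A quick normal-approximation check, assuming a population of 7 billion, that this mixed strategy still leaves a one-boxing majority almost surely if everyone plays it:

```python
from math import erf, sqrt

# Everyone one-boxes independently with probability 0.51 (i.e., takes
# box A with probability 49%); how likely is a one-boxing majority?
N, p = 7_000_000_000, 0.51
mean, sd = N * p, sqrt(N * p * (1 - p))
z = (N / 2 - mean) / sd                # majority threshold, in standard deviations
print(0.5 * (1 + erf(z / sqrt(2))))    # P(no one-boxing majority) ~ 0.0
```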

