All of your prompts have the same typo: they say the first number is the payoff for both players. I have to imagine this is part of what is confusing the AI.

I am put off by the repeated implication that 1-boxing in Newcomb's problem is correct. I understand that view is popular here, but it seems unreasonably confident to react to seeing decision theorists 2-box with "why are the experts wrong?" rather than "hmm, maybe they are right," especially when you go on to find that you generally agree with them on many other issues. Of course, as a 2-boxer myself I am biased, but even without actually discussing Newcomb's paradox, I think this data is strong evidence that 2-boxing should be taken more seriously than it is here.