Newcomb's Problem and Regret of Rationality

they would just insist that there is an important difference between deciding to take only box B at 7:00am vs 7:10am, if Omega chooses at 7:05am

But that's exactly what strategic inconsistency is about. Even if you had decided at 7:00am to take only box B, by 7:06am a rational agent would simply change its mind and take both boxes. Omega knows this, hence it will put nothing into box B. The only way out is for the AI to self-commit to taking only box B in a way that's verifiable by Omega.
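The payoff structure behind this point can be sketched in a few lines. This is a hedged illustration, not anything from the original discussion: it assumes the standard amounts of $1,000 in box A and $1,000,000 in box B, and models Omega as filling box B only when it can verify a binding commitment to one-box.

```python
BOX_A = 1_000
BOX_B = 1_000_000

def payoff(verifiably_committed: bool, takes_both_boxes: bool) -> int:
    """Payoff in Newcomb's problem, assuming a perfectly accurate Omega.

    Omega fills box B only if the agent's commitment to one-boxing
    is verifiable; a revocable 7:00am 'decision' does not count.
    """
    box_b_contents = BOX_B if verifiably_committed else 0
    if takes_both_boxes:
        return BOX_A + box_b_contents
    return box_b_contents

# An agent whose decision can be revised by 7:06am is, to Omega,
# not committed: it predicts the switch and leaves box B empty.
print(payoff(verifiably_committed=False, takes_both_boxes=True))   # 1000
print(payoff(verifiably_committed=True, takes_both_boxes=False))   # 1000000
```

The sketch makes the comment's point concrete: the unverifiably "deciding" agent ends up with $1,000, while only verifiable self-commitment secures the million.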

Just Lose Hope Already

Anders, there's no obvious connection between slow discounting and irrational persistence. Slow discounting is relevant when deciding whether to undertake an investment (based on its ex-ante expected return), but once the cost has been sunk it should be ignored, as it has no bearing on further decisions.
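A toy calculation makes the separation explicit. All numbers and the discount factor here are illustrative assumptions, not from the comment: the continue-or-abandon choice compares only discounted future payoffs, so the sunk amount never enters the comparison at any discount rate.

```python
def npv(future_cashflows, discount_factor):
    """Net present value of a stream of future cashflows."""
    return sum(c * discount_factor ** t
               for t, c in enumerate(future_cashflows, start=1))

sunk_cost = 100  # already spent; deliberately absent from the decision below

# Hypothetical payoffs: persist for three periods, or abandon for a one-off.
continue_value = npv([30, 30, 30], discount_factor=0.99)
abandon_value = npv([50], discount_factor=0.99)

# The decision is identical whether sunk_cost was 100 or 1,000,000:
decision = "continue" if continue_value > abandon_value else "abandon"
print(decision)
```

Note that a slower discount rate (a factor closer to 1) raises the value of long-horizon options ex ante, but it never resurrects the sunk cost: `sunk_cost` simply does not appear in the comparison.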