Here is the complete solution to Newcomb's paradox.
Setup:
Omega performs a brainscan on you at 12pm and expects you to choose a box at 6pm. Based on the brainscan, Omega makes a prediction and classifies you into one of two categories: either you will take only one box, or you will take both boxes. Omega is very good at classifying brainscans.
Box A is transparent and contains £1k.
Box B contains £1m if Omega thinks you are a 1-box person, or £0 if Omega thinks you are a 2-box person.
Do you choose to take both boxes, or only box B?
Note: I mention the brainscan to make things more concrete; the exact method by which Omega performs the prediction is not essential.
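As a quick sanity check on the payoff structure, here is a minimal sketch of the evidential expected-value arithmetic, assuming Omega classifies correctly with probability 0.99 (that figure is my own assumption, not part of the setup):

```python
# Expected payoffs in Newcomb's problem, conditioning on your choice,
# assuming Omega's brainscan classifier is correct with probability p.
def expected_payoffs(p=0.99, box_a=1_000, box_b=1_000_000):
    # One-box: you take only box B, which is full iff Omega predicted one-boxing.
    one_box = p * box_b
    # Two-box: you always get box A; box B is full only if Omega mispredicted.
    two_box = p * box_a + (1 - p) * (box_a + box_b)
    return one_box, two_box

one, two = expected_payoffs()
print(f"one-box: £{one:,.0f}  two-box: £{two:,.0f}")
# one-box: £990,000  two-box: £11,000
```

On these numbers one-boxing wins in expectation whenever Omega's accuracy exceeds roughly 50.05%; whether this conditional expectation is the right quantity to maximise is exactly what the paradox disputes.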
Having true beliefs doesn't mean omniscience: it is possible to have only true beliefs and still not know everything. In this case, the agent might not know whether the driver can read minds, while still having accurate beliefs otherwise.