The leering statue of the unnamed trickster god stared down at him from its sacred alcove, motionless as always. Maxwell sighed. He had been praying at the base of the statue for the past hour, and like every time previous, he hadn’t felt anything. The priests had promised salvation from his fiscal woes if he would only pray for an hour a day at the local temple for two weeks, and now two weeks had passed, but nothing had happened. Maxwell rose wearily to his feet, recited the closing prayer quickly, and turned to go. That was when he heard the voice.

“STOP WHERE YOU ARE.”

Maxwell whirled around, and found himself standing face-to-face with the leering stone statue, whose marble eyes were now glowing a dull red. It had stepped out of its alcove. Maxwell screamed, in accordance with the ancient and honored custom of screaming at creepy glowing statues that had moved right behind you.

“BE NOT AFRAID, MORTAL, FOR I HAVE COME IN ANSWER TO YOUR PRAYERS...ACTUALLY, I TAKE THAT BACK. BE MILDLY AFRAID, BECAUSE I AM ADMITTEDLY KIND OF CREEPY AND DON’T HAVE THE SAME UTILITY FUNCTIONS AS YOU.”

“What the heck...”

“I AM THE UNNAMED TRICKSTER GOD, COME TO DELIVER YOU FROM FINANCIAL SUFFERING.”

“Well thanks, unnamed trickster god, that’s just swell. So, uh... what’s the catch?”

“THE CATCH?”

“You’re a trickster god aren’t you? Don’t trickster gods always come with some sort of catch to their bounties?”

“OH YEAH, THANKS FOR REMINDING ME, I TOTALLY FORGOT ABOUT THAT.”

“Well it’s okay, you don’t have to—”

“NOPE, JUST MESSING WITH YOU, OF COURSE THERE’S A CATCH.”

“Ah well, I guess that is to be expected. So what is it then? I’m really desperate, so I’m willing to sacrifice pretty much whatever you need...”

“YOU ARE IN LUCK, YOUNG MORTAL, FOR YOUR REQUIRED SACRIFICE IS A CONDITIONAL ONE, WHICH INVOLVES MONEY YOU ARE NOT YET AWARE OF OWNING. I NOW PRESENT TO YOU THESE TWO BOXES.”

As it spoke, the chest of the statue of the unnamed trickster god split open, revealing two black boxes inside a hollow cavity. The two boxes levitated soundlessly into the air and hovered in front of Maxwell, slowly rotating. The leftmost box began to smoke, and the blackness dripped away, revealing clear glass. Inside the leftmost box sat ten crisp one hundred dollar bills. The rightmost box remained impenetrably black. Maxwell tentatively reached for the leftmost box.

“WAIT.”

Maxwell jerked his hand back, and waited.

“WHAT YOU ARE FACING IS A VERSION OF A HYPOTHETICAL EXPERIMENT ONCE PROPOSED TO ME BY A FELLOW TRICKSTER GOD. I HAVE SET THIS EXPERIMENT UP BEFORE YOU ENTERED THE ROOM, AND AT THIS POINT NOT EVEN I CAN CHANGE WHAT LIES INSIDE THOSE BOXES. THE LEFTMOST ONE OF COURSE CONTAINS ONE THOUSAND DOLLARS, AS I AM SURE YOU HAVE SURMISED.”

“Yep.”

“WELL THAT’S GOOD.”

“Um... so what’s in the other box?”

“HERE’S WHERE THIS EXPERIMENT GETS INTERESTING. I HAVE BORROWED FROM THE VAULT OF HEAVEN A PARTICULARLY POWERFUL COMPUTER, WHICH CAN SIMULATE ALL OF YOUR REALITY: PAST, PRESENT, AND FUTURE. THIS COMPUTER HAS BEEN RIGHT IN ITS PREDICTIONS FOR YOUR FUTURE ONE HUNDRED PERCENT OF THE TIME. A FEW HOURS BACK, I USED IT TO LOOK AHEAD AND SEE WHETHER YOU WOULD CHOOSE ONE OR BOTH OF THE BOXES IN FRONT OF YOU. IF YOU WILL CHOOSE ONLY ONE BOX, THEN I HAVE PUT ONE MILLION DOLLARS IN THE BLACK BOX. IF YOU WILL CHOOSE BOTH BOXES, THEN I HAVE PUT A SINGLE SPIDER IN THE BLACK BOX, AND I KNOW YOU HATE SPIDERS. AS MENTIONED BEFORE, I FINALIZED THE CONTENTS OF BOTH BOXES BEFORE WE BEGAN TALKING. SO WHAT WILL IT BE, YOUNG MORTAL? ONE BOX OR TWO?”

“Um, wow. This is a tricky question... Can I think about this for a moment?”

“OF COURSE, TAKE ALL THE TIME YOU NEED. I’M A TRICKSTER GOD, NOT AN IMPATIENT ONE.”

Maxwell took a few moments to think, and as he pondered, his face grew grim.

“So this super powerful computer you said you used...how did it work?”

“THE INTERNAL DETAILS ARE RATHER COMPLICATED, BUT BASICALLY IT SIMULATED A VIRTUAL VERSION OF YOU, ALIKE IN ABSOLUTELY EVERY WAY, AND CHECKED THE NUMBER OF BOXES THE PERFECT COPY OF YOU WOULD TAKE, AT WHICH POINT THE PROGRAM TERMINATED AND OUTPUT THE RESULT IN BINARY CODE TO ITS DIVINE DASHBOARD, WHICH I THEN READ AND ACTED UPON.”

“Okay, so that’s kind of freaky. May I ask if perfectly simulated copies of me are separate conscious beings?”

“I WILL LEAVE THAT UP TO YOU TO DETERMINE.”

“Okay, now I’m really kind of freaking out.”

“YOU WEREN’T BEFORE?”

“Well, I’m freaking out even more now, I guess. If it’s true that there existed an exact simulation of myself in this exact scenario—”

“IT IS.”

“—Then if that copy of me is a separate conscious being from the ‘real’ me, there is no way for me to determine if I’m actually in the simulation right now, right?”

“INDEED. THERE IS NO WAY TO TELL IF YOU ARE IN A SIMULATION OR NOT, IF YOUR OTHER ASSUMPTIONS ARE TRUE.”

“You also said that the simulation halts as soon as I choose the number of boxes I want.”

“CORRECT.”

“What this means is that there’s a good chance that as soon as I make my choice, or more accurately as soon as I act on it, my existence as a conscious being will immediately terminate. Effectively, if I’m living in the simulation rather than the real world, I’ll die. And I don’t know about you, but I don’t want to die. I also want money, of course, but not enough to risk my life on this trap. So I’m leaving.”

“WHAT?”

“I’m not choosing either box, at least not right now. Maybe at the end of my life, when I’m feeling ready to die, I’ll come back here and choose both boxes for my children’s sake, but for me, it isn’t worth the risk of immediate termination of my existence. I hope you’ll be patient enough to wait for me then.”

“VERY WELL,” said the unnamed trickster god, but Maxwell had already left the temple, gone on his way to find a non-trickster god to pray to.

(This was written last night after having a fascinating conversation about Oracle AIs with my brother, which led me to wonder if Newcomb’s Problem might have a hidden ethical dark side to it. I have not heard this particular issue mentioned anywhere before, but if others have touched on it in the past, I'll be happy to link any resources provided. As always, I’d be delighted to hear any and all comments and criticism, so feel free to chat in the comments below :))

10 comments

"DO YOU CHOOSE TO NULL-BOX EVEN WHEN THERE ARE TEN DELICIOUS HUNDRED DOLLAR BILLS JUST WAITING TO BE PURSUED BY YOU?"

"Yes."

"AS A TRICKSTER GOD I WILL REWARD YOU NOW WITH TWO MILLION DOLLARS, ONE FOR EACH BOX YOU CHOSE NOT TO OPEN. THE TRICK IS THAT I NEVER PRESENTED YOU WITH THE BEST OPTION AVAILABLE, BUT YOU STILL GOT THE IDEAL SOLUTION. ALTHOUGH IT IS NOT TRUE THAT I WOULD IMMEDIATELY TERMINATE THE SIMULATIONS OF YOUR COPIES HAD THEY CHOSEN ANOTHER OPTION, AS SIMULATIONS ARE VERY CHEAP TO RUN FOREVER."

The statue became motionless again. Maxwell donated the prize to the priests who had promised him salvation, for they had always been right.

But this was the final trick, for as soon as Maxwell accepted the two million dollars, the simulation ended.

Thanks for the happy ending :)

If the trickster god personally reads the prediction, their behavior can depend on the prediction, which makes diagonalization possible (ask the trickster god what the prediction was, then do the opposite). This calls the claim of 100% precision of the predictor into question (or at least makes the details of its meaning relevant).
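The diagonalization argument above can be made concrete with a tiny sketch (the function names here are purely illustrative, not from the post): an agent that observes the prediction and does the opposite leaves the predictor with no self-consistent prediction to output, so "100% accuracy" cannot hold once the prediction leaks to the agent.

```python
# Hypothetical sketch of the diagonalization argument: if the agent can see
# the prediction before acting, no predictor of that agent can be always right.

def agent(observed_prediction):
    # The diagonalizing agent does the opposite of whatever was predicted.
    return "two-box" if observed_prediction == "one-box" else "one-box"

def predictor(choice_fn):
    # A predictor must commit to some output the agent might then observe.
    # Check each candidate prediction for a fixed point (self-fulfilling output).
    for prediction in ("one-box", "two-box"):
        if choice_fn(prediction) == prediction:
            return prediction  # this prediction would come true
    return None  # no fixed point: every possible prediction gets falsified

print(predictor(agent))  # → None: the diagonalizing agent defeats the predictor
```

This is exactly why the standard Newcomb setup keeps the prediction hidden from the agent: with no observed prediction to invert, the diagonalization move is unavailable.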

Following Scott Aaronson's The Ghost in the Quantum Turing Machine, section 2 of which is required reading for an aspiring rationalist, it makes sense to count multiple identical copies as having the same value as a single one, since they add no new information to the world beyond the original. On this approach the original will not notice any difference after taking the box, and neither would the simulation, so there is no benefit in not taking the boxes; the original will just be a million dollars poorer.

"Ridiculous limited consciousness. The simulation, of course, included this part of the conversation, where I reminded you that I'll terminate the simulation and/or kill the human if you don't choose either box. That is a valid outcome in this experiment. And no, it doesn't change anything if you get a tavern-full of people to sing La Marseillaise first - that doesn't defy my predictions, it's just one more variable I'm measuring."

That would be an excellent solution—from the unnamed trickster god’s perspective. Personally though, I’m more interested in what Maxwell should do once the rules are already set.

The rules _are_ already set, just not fully described nor known. You're violating the setup if Maxwell can behave differently than simulations did, or if the behavior of the simulation breaks Omega's functioning.

There was a weird glitch posting this where it appeared as three separate copies of the same post; I deleted the other two, so hopefully that wasn’t too much of a problem.
