Missionary: I don’t believe in God.

Me: But you live a religious life, you’re trying to convert me, and you’re not getting paid for this?

Missionary: Yes. I’m hoping I’ll believe in God eventually.

Me: Why?

Missionary: Hear me out. The probability God exists is greater than zero. I’d be infinitely happy in heaven. And any positive number multiplied by infinity equals infinity. So being religious maximizes my expected utility, even setting aside the infinite sadness I’d face in hell.
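
(Spelled out, the Missionary’s wager runs roughly as follows. Let p be the probability that God exists and value heaven at infinite utility; then, taking the arithmetic of infinity at face value,

$$\mathbb{E}[U(\text{religious})] = p \cdot \infty + (1 - p) \cdot (\text{finite payoffs}) = \infty \quad \text{for any } p > 0,$$

while the expected utility of a secular life stays finite, so any nonzero p makes religion the better bet. This is only a sketch of his argument, granting his arithmetic of infinity.)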

Me: I can’t disprove God exists. But I think there’s an infinitely small chance God exists. And ∞ * (1/∞) = 1, not infinity.

Missionary: Your math is wrong. 1/∞ is undefined. To be fair, I’ve seen it answered as the limit approaching zero.[1] But do you really think the odds God exists are that small? Billions of people believe in God!

Me: They haven’t offered much evidence for why God exists. But, even if 1/∞ is the limit approaching zero, I admit their opinions make enough of a difference to convince me the odds God exists are higher than 1/∞.

Missionary: So come to God!

Me: I don’t think it makes sense to do that yet. It doesn’t feel right.

Missionary: John von Neumann became religious based on the same logic I’ve explained. Are you smarter than him?

Me: Von Neumann converted on his deathbed. He probably felt he couldn’t do anything more productive than pray. 

Missionary: I suppose that’s possible. But you still haven’t given me a reason not to be religious. If you’re really a utility monster, you’ll confront this question.

Confronting The Question

Me: I think the odds that technological improvement can provide infinite happiness are higher than the odds that God will provide it.

Missionary: How?

Me: An artificial superintelligence could create something like an experience machine. Maybe it could be an experience drug?

Missionary: You think a machine or drug could work forever? You can’t test that!

Me: You can’t test that heaven lasts forever either.

Missionary: Why should the probability that an experience machine is created matter? Once again, any number higher than zero times infinity equals infinity.[2] That would mean heaven and an experience machine have an equal expected utility. And you should be religious to avoid hell.

Me: My logic is that hell equals negative heaven, so heaven plus not-hell equals two heavens. And I think the odds I enter an experience machine are more than twice as high as the odds I enter heaven.[3]
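
(Making my arithmetic explicit, and treating heaven’s value as some enormous but finite H so the comparison isn’t just ∞ = ∞: hell is worth −H, so the religious bet is worth roughly P(heaven) * 2H, while the experience-machine bet is worth roughly P(machine) * H. Then

$$P(\text{machine}) \cdot H > P(\text{heaven}) \cdot 2H \iff P(\text{machine}) > 2 \cdot P(\text{heaven}),$$

which is why I care whether my odds of a machine are more than twice my odds of heaven. This is a sketch of the comparison I have in mind, not a worked-out decision theory of infinite payoffs.)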

Missionary: But you’re assuming an experience machine is as good as heaven. Do you really think a machine could make you feel infinitely happy? Only God could do that.

Me: I agree. That does seem like it could be too hard a problem even for an artificial superintelligence.

Missionary: Allahu akbar!

Me: You’re Muslim?

Missionary: If it were acceptable, I’d follow every religion. I originally planned to become Christian because it’s the most popular religion, and it has a heaven and a hell. But in the Bible’s depiction of heaven, people only seem “infinitely happy” because they’re happy forever. The Quran states you’ll have whatever you wish for in heaven. So I could wish to feel infinitely happy forever there.

Me: Maybe the Quran just meant heaven is awesome. How confident are you that you should take it literally?

Missionary: Not very confident. But there’s a chance it’s true!

Me: I need to think about feeling infinitely happy more. I’ll talk to you tomorrow.

Tomorrow

Me: My friend Raymond told me he doesn’t think anyone can have infinite happiness. His logic makes sense. Imagine happiness could be quantified through points. For example, someone with 0 happiness points would feel neutral, and someone with 1 happiness point would have some undetermined amount of happiness. If there’s no highest finite number, how could someone’s happiness level go from being finite to infinite?
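
(One way to make Raymond’s point precise: model happiness as a real-valued level that rises over time, so the levels form a sequence

$$h_1 < h_2 < h_3 < \dots, \qquad h_n < \infty \text{ for every } n.$$

No matter how fast the sequence grows, every term is finite; infinity can only be its limit, never a level actually reached at some moment. That’s just one reading of his argument.)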

Missionary: So what’s your point?

Me: Now that I realize feeling infinitely happy is impossible, I don't think God can give me more happiness than an experience machine. So I’m going to maximize my expected utility by striving to help create a utopia. 

Missionary: I’ll think about that.

On the way home…

Pascal’s Mugger: Give me your wallet.

Me: You’re unarmed, and I can beat you up?

Pascal’s Mugger: Yeah.

Me: Then, no thanks. 

(I start to walk away.)

Pascal’s Mugger: Wait! If you give me your wallet, I’ll give you a finite but extremely high amount of happiness. More happiness than the happiness you imagine you’ll get from an experience machine! So much happiness that it maximizes your utility to give me your wallet.

Me: If that were true, I’d give you my wallet. You’re overestimating the probability I believe you.

Pascal’s Mugger: Since you can’t 100% prove I’m lying, there has to be some point where my promises are large enough that it makes sense to give me your wallet.
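
(The Mugger’s argument in symbols: let W be what my wallet is worth to me and p > 0 my credence that he can and will deliver. He only has to promise happiness worth some finite U satisfying

$$p \cdot U > W,$$

and for any nonzero p such a U exists, so a naive expected-utility calculation says to hand the wallet over. That’s how I reconstruct his claim; the rest of the conversation is about why I still don’t buy it.)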

Me: I think the issue is that I don’t have a precise definition for what it would feel like to be in an experience machine. I just imagine feeling the highest amount of happiness I could possibly feel. So you can't top that. Granted, there probably is no highest amount of happiness I could have. At least, if I still consider myself to be myself after I’m genetically modified and/or merged with artificial intelligence. 

Pascal’s Mugger: You’re underestimating the amount of happiness I’ll give you, due to scope neglect.

Me: I can’t even come up with the scope of this problem. I don’t know how to perfectly quantify happiness. And I’d struggle to quantify how little I trust you, even with an equivalent bet test.[4]

Pascal’s Mugger: All you need to understand is that giving me your wallet maximizes your expected utility.

Me: Is there any way you could explain why you’re correct in more detail? Or show any proof that you have great power?

Pascal’s Mugger: If I did that, I wouldn’t be able to give you any happiness. 

Me: Then I don’t see how you’ll be able to change my decision. My gut says not to give my wallet to strangers who claim to have magical powers. I said I don’t know how to calculate whether I should give you my wallet. You’re replying, “I’m right, you’re wrong, I can’t say why, give me your wallet.”

Pascal’s Mugger: I forgot to mention the extreme sadness you’ll feel if you don’t give me your wallet. More sadness than the happiness you imagine you’ll get from an experience machine! So much…

Me: (interrupting) I’ve got to go. I was going to say I’ll let you know if I ever become pessimistic enough about the future or spiritual enough to change my mind. But I guess I'll find out if I'm sad forever now.

Pascal’s Mugger: I'll give you some time to think things over.

(cross-posted from my blog: https://utilitymonster.substack.com/p/utility-in-faith)

  1. ^

    And if I hadn’t put parentheses around (1/∞), ∞ * 1/∞ would be read as ∞/∞, which is also undefined.

  2. ^

    Since 1/∞ is undefined, I’m not counting it as a number.

  3. ^

    But I'd try to resist the temptation to enter an experience machine until I no longer thought there was anything I could do to help others enter the machine.

  4. ^

    To make an equivalent bet, ask yourself at what odds a bet would have a neutral expected utility. For example, if I thought there was a 55% chance the Mets would beat the Nationals tonight, I’d be indifferent about betting $55 on the Mets to win $100. To do so, I imagine I’m playing a game where my goal is to win and to lose nothing. Also, be careful not to confuse expected value with expected utility.
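
    (Checking the arithmetic on that example, and assuming “betting $55 to win $100” means staking $55 for a total payout of $100, i.e. a net gain of $45 on a win: at 55% confidence the expected value is 0.55 * $45 - 0.45 * $55 = $24.75 - $24.75 = $0, so 55% is exactly the indifference point that makes the bet “equivalent.”)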

    Techniques you can use to help make equivalent bets include: 1) averaging the highest and lowest probabilities that don’t sound crazy to you (How To Measure Anything pg 105, Cold Takes), 2) picturing that you’re spinning a dial to visualize your probabilities (How To Measure Anything pg 101), 3) picking colored balls out of a bag (The Scout Mindset), and 4) flipping an unfair coin (Guild of The Rose). They’re helpful up to a point. I still find it hard to pin down my subjective probability when I’m not rounding my prediction to the nearest whole number; for example, I’ve never felt comfortable believing I’m 98.999% sure instead of 99%. I suppose I could get to that point if I spent more time thinking about it. Convincing myself to genuinely believe there’s a 0.000000000000000001% chance the mugger is telling the truth would be even harder.
