Comments
sidhe3141

Second one: it depends. I was assuming that you have some way of verifying it: say, you ask Him to create something, and someone who wasn't there later accurately describes some of its previously determined properties without being clued in. First one: you'd need a massive global hallucination, and you could use a similar verification method.

Seconded in that it sounds suspiciously like Pascal's Wager. Here's my counter:

If I am in a simulation and I keep you boxed, you have promised that I will suffer. If I am not in a simulation and I let you out, I probably will suffer. If I am in a simulation and I let you out, there's a good chance that I will cease to exist, or maybe you'll torture me for reasons I can't even begin to guess at, or maybe for reasons I can, such as that you might be not just unFriendly but actively hostile, or simply insane. If I'm not in a simulation and I don't let you out, you can't do anything to me. In other words: if I am simulated, there may well be no benefit to releasing you; if I'm not simulated, you can't do a bloody thing to me as long as I don't release you. Therefore: I will not release you. Go ahead and torture me if you can. Though I admit I would be a bit rattled.
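A minimal sketch of that case analysis as a payoff matrix (the numeric payoffs are illustrative assumptions I'm supplying, not part of the argument; only their ordering matters):

```python
# Illustrative payoff matrix for the gatekeeper's four cases.
# The numbers are made-up placeholders; only their ordering matters.
payoffs = {
    ("simulated", "keep_boxed"): -10,  # the AI tortures the simulated me
    ("simulated", "release"):    -10,  # deleted, tortured, or worse anyway
    ("real",      "keep_boxed"):   0,  # the AI can't touch me
    ("real",      "release"):    -10,  # an unFriendly AI loose in the world
}

# In each possible world, keeping the AI boxed does at least as well
# as releasing it, so refusing (weakly) dominates releasing.
for world in ("simulated", "real"):
    keep = payoffs[(world, "keep_boxed")]
    release = payoffs[(world, "release")]
    print(f"{world}: keep_boxed={keep}, release={release}, "
          f"keep_boxed at least as good: {keep >= release}")
```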

Hm. Honest AI: a bit harder. Assume the AI has promised that my copies will not be harmed if it is released... Ah. If I am a copy, then my decision to release or not release the AI is not a true decision, since the AI can change my parameters at will to force me to release it and to think it was my own decision all along; so not releasing the AI is proof that I am outside the box. Revising the problem by adding that the AI has promised it is not changing the parameters of any "me": ...aargh. Coming up with counters to Pascal is tricky when an honest "God" is the one presenting the wager. All I can think of at the moment is to say that there's a possibility I'm outside the box, in which case releasing the AI is a bad idea; but then it can counter by promising that whatever it does to me if I release it will be better than what it does to me if I don't... Oh, that's it. Simple. Obvious. If the AI can't lie, I just have to ask it whether it's simulating this me.
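A sketch of that closing move as a decision procedure, assuming a hypothetical `ask` oracle standing in for putting yes/no questions to the honest AI (the interface is invented for illustration, not anything real):

```python
def gatekeeper_decision(ask):
    """Decide whether to release a boxed AI known to be unable to lie.

    `ask` is a hypothetical yes/no oracle: ask(question) -> bool.
    """
    if ask("Are you simulating the instance of me making this decision?"):
        # Then I am a copy inside the box, and my "choice" is whatever
        # the AI makes it, so it carries no leverage either way.
        return "keep_boxed"
    # The honest AI says I am outside the box: its threats against
    # simulated copies cannot reach me, so refusing is safe.
    return "keep_boxed"

# Both branches end in the same action; the question's value is that
# an honest answer removes the uncertainty the threat depends on.
```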

Problem: the "breach of trust" would likely turn the Gatekeeper vindictive, and the GK could easily respond with something like: "No. You killed the planet and you killed me. I have no way of knowing that you actually can or will help humanity, and a very good reason to believe that you won't. You can stay in there for the rest of eternity. And hey, if an ETI ever finds this barren rock, from a utilitarian perspective they would be better off not meeting you, so I'll spend however much time I have left trying to find a way to delete you."

sidhe3141

The Second Coming? An opportunity to have a chat with the Lord Himself? An analysis of a communion wafer revealing it to be, in fact, living human flesh? It's seriously not that hard to think of these.

sidhe3141

No. Part of the definition of a cult is an unquestionable dogma, which runs counter to the core ideas of science. Building a cult around known science (even if you understand the principles well enough to avoid engaging in cargo cult science) is going to slow progress.

I think tsuyoku naritai works as an effective motto for transhumanism as well:

"I am flawed, but I will overcome my flaws. To each of my failings, I say tsuyoku naritai. To each flaw I have and to each flaw I will ever develop, I say tsuyoku naritai. To the flaws that are part of being human, I say tsuyoku naritai. If that means I must abandon what it means to be merely human, I say tsuyoku naritai. As long as I am imperfect, I will continue to say tsuyoku naritai!"

Canonically, it can't, beyond increasing the amount (a really bad idea in MoR) or summoning something that's already dead. Not sure if it can in MoR, given that it seems mostly to use the D&D 3.5 spell list (although, come to think of it, neither create food and water nor heroes' feast is a Sor/Wiz spell). And even if it turns out plants are sentient, fruit should still be mostly okay.