
Comments

"And I certainly see a very great difference between humanity continuing forever, versus humanity continuing to Graham's Number and then halting."

You can't use "humanity" and "Graham's Number" in the same sentence.

Oh! Oh! Oh! Oh! Oh! You shitheads think you are doing something to me with your insect downvotes? Oh! Oh! Oh! Oh! Oh! The only thing you did is prove to me that you all eat SHIT! Not a single one of you can leave a sensible comment here. The next person who downvotes me is gonna get their ass beat by a kangaroo!

The AI tells me that I believe something with 100% certainty, but I can't for the life of me figure out what it is. I ask it to explain, and I get: "ksjdflasj7543897502ijweofjoishjfoiow02u5".

I don't know if I'd believe this, but it would definitely be the strangest and scariest thing to hear.

Rationality is winning that doesn't generate a surprise; randomly winning the lottery generates a surprise. A good measure of rationality is how much complexity the win required and how little surprise it generated. If winning at a certain task requires a method with many complex steps, and you win without surprise, then the method you used was a very rational one.

"You should have spent much more of your time in this debate convincing your tangled friend that, if she were to abandon her religious belief (or belief in belief, or whatever), she would still be able to feel good about herself and good about life; that life would still be a happy meaningful place to be."

I don't think Eliezer cared so much about correcting someone's one wrong belief as about correcting the core that makes many such beliefs persist. Would he really have helped her if all his rational arguments had failed but his emotional one had succeeded? My guess is that it wouldn't have been a win for him or for her.

How to Bind Yourself To Reality is the number one thing people should GET. But my guess is that this one might not be teachable.
