All of PeteG's Comments + Replies

And I certainly see a very great difference between humanity continuing forever, versus humanity continuing to Graham's Number and then halting.

You can't use "humanity" and "Graham's Number" in the same sentence.

Oh! Oh! Oh! Oh! Oh! U shitheads think you are doing something to me with your insect downvotes? Oh! Oh! Oh! Oh! Oh! The only thing U did is prove to me that U all eat SHIT! Not a single one of U can leave a sensible comment here. The next person who downvotes me is gonna get their ass beat by a kangaroo!

I'm in Phoenix, would be great to try and have one here.
I'll surely be in Phoenix eventually.

The AI tells me that I believe something with 100% certainty, but I can't for the life of me figure out what it is. I ask it to explain, and I get: "ksjdflasj7543897502ijweofjoishjfoiow02u5".

I don't know if I'd believe this, but it would definitely be the strangest and scariest thing to hear.

My immediate reaction was "It linked you to a youtube video?"

This is the only one that made the short hairs on the back of my neck stand up.

Rationality is winning that doesn’t generate a surprise; randomly winning the lottery generates a surprise. A good measure of rationality is the amount of complexity involved in winning, and the surprise generated by that win. If winning at a certain task requires that your method have many complex steps, and you win, non-surprisingly, then the method used was a very rational one.

"You should have spent much more of your time in this debate convincing your tangled friend that, if she were to abandon her religious belief (or belief in belief, or whatever), she would still be able to feel good about herself and good about life; that life would still be a happy meaningful place to be."

I don't think Eliezer cared so much about correcting someone's one wrong belief as he cared about correcting the core that makes many such beliefs persist. Would he really have helped her if all his rational arguments had failed, but his emotional one had succeeded? My guess is that it wouldn't be a win for him or for her.

Well that depends on whether your aim is to make people have correct beliefs, or whether you want to make people have correct beliefs by following the ritual of rational argument... and I think that EY would claim to be aiming for the former.

How to Bind Yourself To Reality is the number one thing people should GET. But my guess is that this one might not be teachable.

Most frequent would have to be my avoidance of settling on cached thoughts. I notice, revise, and completely discard conclusions much more regularly and effectively when I recognize that the conclusion was generated the moment the question was asked.

The Wrong Question sequence was amazing. One of the most unintuitive sequences, and it greatly improved my categorization methods. Especially the 'Disguised Queries' post.