And Yudkowsky.net is result #6.
Agreed about the absurdity bias. For most people (even smart ones), their exposure to cryonics is limited to things like Woody Allen's Sleeper and Futurama. I almost can't blame them for only seeing the absurd... I'm still trying to come around to it myself.
Not completely defined at the moment, since I'm a first-year PhD student at NYU and currently doing rotations. It'll be something like comparative genomics/regulatory networks to study the evolution of bacteria, or perhaps communities of bacteria.
You'll get more response from the NY group (we don't all check LW and the discussion board regularly) by posting to the Google group/listserv:
Thanks... this should come in handy in my computational research in systems biology.
A broken clock is right twice a day. If a folk value theory is incidentally correct, that doesn't make folk theories valuable on the margin - unless, of course, people who hold folk theories consistently do better than rationalists, but then I'd question the rationalist label.
I wish I could take that much time to do this.
Is that because, if you treat the probabilities of (God or not God) as maximum entropy without prior information, you'd get 50/50?
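For concreteness, here's a minimal numerical sketch (my own illustration, not anyone's argument above): over two outcomes, the distribution that maximizes Shannon entropy is the uniform one, i.e. 50/50.

```python
import math

def binary_entropy(p):
    """Shannon entropy (in bits) of a two-outcome distribution (p, 1 - p)."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# Scan candidate probabilities; the maximum-entropy (most noncommittal)
# assignment over two outcomes turns out to be p = 0.5.
best_p = max((i / 1000 for i in range(1, 1000)), key=binary_entropy)
print(round(best_p, 2))  # 0.5
```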
Good on them! In my experience, whenever I sneak Bayesian updating into the conversation, it's well received by skeptics. When I try to introduce Bayes more formally, or start supporting anti-mainstream ideas such as cryonics, AI, etc., there's much more resistance.
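For anyone who hasn't seen the mechanics, a minimal sketch of a single Bayesian update (the scenario and numbers here are hypothetical, chosen only to illustrate):

```python
def bayes_update(prior, p_evidence_given_h, p_evidence_given_not_h):
    """Posterior P(H | E) from Bayes' theorem:
    P(H|E) = P(E|H) P(H) / [P(E|H) P(H) + P(E|~H) P(~H)]."""
    numerator = prior * p_evidence_given_h
    denominator = numerator + (1 - prior) * p_evidence_given_not_h
    return numerator / denominator

# Hypothetical example: a test that fires 90% of the time when the
# hypothesis is true and 10% of the time when it's false, applied to
# a 1% prior. The posterior rises, but only to about 8%.
posterior = bayes_update(0.01, 0.9, 0.1)
print(round(posterior, 3))  # 0.083
```

The point skeptics tend to like: a positive result on a decent test still leaves a low-prior hypothesis fairly improbable.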