(one liner - for policy makers)

Within ten years, AI systems could be more dangerous than nuclear weapons.  The research required for this technology is heavily funded and virtually unregulated.

Thanks for this post. I've seen the term inadequacy before (mostly on your Facebook page, I think) but never had such a clear definition in mind.

There was one small thing that bothered me in this post without detracting from the main argument. In section IV, we provisionally accept the premise "grantmakers are driven by prestige, not expected value of research" for the sake of a toy example, which I was happy to do. However, in section V (the omelette example and the related commentary about research after the Second World War), the text begins to read as though this premise is definitely true in real life. This felt inconsistent and potentially misleading.

(It’s not like anyone in our civilization has put as much effort into rationalizing the academic matching process as, say, OkCupid has put into their software for hooking up dates. It’s not like anyone who did produce this public good would get paid more than they could have made as a Google programmer.)

I appreciated this throwaway example of inadequacy. It gave me a little lightbulb and propelled me forward to read the rest of the post with more interest.

Certainly! Here it is:

On that page, it is fine at normal zoom, but the problem occurs when I zoom out to 80%, at which point the text is roughly the same size as here. So I guess it is something to do with how the font renders at that size. Whether the fault lies with my computer or with the font itself, I don't know.

Here is what I am seeing:

I am on Chrome on Windows 10. Experimentation shows that the effect only happens when the page zoom is at 100%... if I zoom in or out, the w goes back to normal.

The comment font has a weird lowercase 'w'. It is larger than the surrounding letters. Now that I have noticed it, I can't stop being distracted by it.

It is done. (The survey. By me.)

I have taken the survey, including the digit ratio question.

Since there was a box to opt into the SSC survey, I was just a little bit disappointed there wasn't a question for favourite SSC post to go with the favourite LessWrong post question.

Making things happen with positive thinking requires magic. But myths about the health effects of microwaves or plastic bottles are dressed up to look like science as usual. The microwave claim is supposedly based on the effect of radiation on the DNA in your food, or something similar -- nonsense, but to someone with little science literacy it is not necessarily distinguishable from talk about the information-theoretic definition of death.

I'm not sure that signing papers to have a team of scientists stand by and freeze your brain when you die is more boring than cooking your food without a microwave oven. I would guess that cryonics being "weird", "gross", and "unnatural" would be more relevant.

Upvoted for providing a clear counterexample to Yvain's assertion that people would find immortality to be "surely an outcome as desirable as any lottery jackpot".

This suggests that a partial explanation for the data is that "experienced rationalists" (high karma, long time in community) are more likely to find immortality desirable, and so more likely to sign up for cryonics despite having slightly lower faith in the technology itself.

Your conclusion is possible. But I'll admit I find it hard to believe that non-rationalists really lack the ability to take ideas seriously. The 1 = 2 example is a little silly, but I've known lots of not-very-rational people who take ideas seriously. For example, people who stopped using a microwave when they heard about an experiment supposedly showing that microwaved water kills plants. People who threw out all their plastic dishes after the media picked up a study about health dangers caused by plastics. People who spent a lot of time thinking positive thoughts because they have heard it will make them successful.

Could it be that proto-rationalists are just bad at quantifying their level of belief? Normally, I'd trust somebody's claim to believe something more if they're willing to bet on it; and if they aren't willing to bet on it, then I'd think their real level of belief is lower.
