Rationalists Are Less Credulous But Better At Taking Ideas Seriously

Consider the following commonly-made argument: cryonics is unlikely to work. Trained rationalists are signed up for cryonics at rates much greater than the general population. Therefore, rationalists must be pretty gullible people, and their claims to be good at evaluating evidence must be exaggerations at best.

This argument is wrong, and we can prove it using data from the last two Less Wrong surveys.

The question at hand is whether rationalist training - represented here by extensive familiarity with Less Wrong material - makes people more likely to believe in cryonics.

We investigate with a cross-sectional study, looking at proto-rationalists versus experienced rationalists. Define proto-rationalists as those respondents to the Less Wrong survey who indicate they have been in the community for less than six months and have zero karma (usually indicative of never having posted a comment). And define experienced rationalists as those respondents to the Less Wrong survey who indicate they have been in the community for over two years and have >1000 karma (usually indicative of having written many well-received posts).

By these definitions, there are 93 proto-rationalists, who have been in the community an average of 1.3 months, and 134 experienced rationalists, who have been in the community an average of 4.5 years. Proto-rationalists generally have not read any rationality training material - only 20/93 had read even one-quarter of the Less Wrong Sequences. Experienced rationalists are, well, more experienced: two-thirds of them have read pretty much all the Sequence material.

Proto-rationalists thought that, on average, there was a 21% chance of an average cryonically frozen person being revived in the future. Experienced rationalists thought that, on average, there was a 15% chance of same. The difference was marginally significant (p < 0.1).

Marginal significance is a copout, but this isn't our only data source. Last year, using the same definitions, proto-rationalists assigned a 15% probability to cryonics working, and experienced rationalists assigned a 12% chance. We see the same pattern.
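(The surveys don't say which test produced that p-value; purely as an illustrative sketch, here is how one might check the difference between the two groups' mean estimates with Welch's two-sample t-test. The arrays below are placeholders, not the actual survey responses.)

```python
from scipy import stats

# Placeholder per-respondent probability estimates (in percent); the real
# analysis would use the survey columns for each group
# (N = 93 proto-rationalists, N = 134 experienced rationalists).
proto = [10, 30, 25, 5, 50, 20]
experienced = [5, 20, 10, 15, 2, 25]

# Welch's t-test: compares the two means without assuming equal variances.
t_stat, p_value = stats.ttest_ind(proto, experienced, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")  # the post reports p < 0.1
```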

So experienced rationalists are consistently less likely to believe in cryonics than proto-rationalists, and rationalist training probably makes you less likely to believe cryonics will work.

On the other hand, 0% of proto-rationalists had signed up for cryonics compared to 13% of experienced rationalists. 48% of proto-rationalists rejected the idea of signing up for cryonics entirely, compared to only 25% of experienced rationalists. So although rationalists are less likely to believe cryonics will work, they are much more likely to sign up for it. Last year's survey shows the same pattern.

This is not necessarily surprising. It only indicates that experienced rationalists and proto-rationalists treat their beliefs in different ways. Proto-rationalists form a belief, play with it in their heads, and then do whatever they were going to do anyway - usually some variant on what everyone else does. Experienced rationalists form a belief, examine the consequences, and then act strategically to get what they want.

Imagine a lottery run by an incompetent official who accidentally sets it up so that the average payoff is far more than the average ticket price. For example, maybe the lottery sells only ten $1 tickets, but the jackpot is $1 million, so that each $1 ticket gives you a 10% chance of winning $1 million.

Goofus hears about the lottery and realizes that his expected gain from playing the lottery is $99,999. "Huh," he says, "the numbers say I could actually win money by playing this lottery. What an interesting mathematical curiosity!" Then he goes off and does something else, since everyone knows playing the lottery is what stupid people do.

Gallant hears about the lottery, performs the same calculation, and buys up all ten tickets.
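(For concreteness, a minimal sketch of the arithmetic both of them perform, using the numbers given above:)

```python
# Expected value of one ticket in the broken lottery described above.
tickets = 10
ticket_price = 1         # dollars
jackpot = 1_000_000      # dollars

p_win = 1 / tickets                            # 10% chance per ticket
ev_per_ticket = p_win * jackpot - ticket_price
print(ev_per_ticket)     # 99999.0, Goofus's "interesting curiosity"

# Gallant's move: buying all ten tickets guarantees the jackpot.
net_if_buying_all = jackpot - tickets * ticket_price
print(net_if_buying_all)  # 999990, a sure win
```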

The relevant difference between Goofus and Gallant is not skill at estimating the chances of winning the lottery. We can even change the problem so that Gallant is more aware of the unlikelihood of winning than Goofus - perhaps Goofus mistakenly believes there are only five tickets, and so Gallant's superior knowledge tells him that winning the lottery is even more unlikely than Goofus thinks. Gallant will still play, and Goofus will still pass.

The relevant difference is that Gallant knows how to take ideas seriously.

Taking ideas seriously isn't always smart. If you're the sort of person who falls for proofs that 1 = 2, then refusing to take ideas seriously is a good way to avoid ending up actually believing that 1 = 2, and a generally excellent life choice.

On the other hand, progress depends on someone somewhere taking a new idea seriously, so it's nice to have people who can do that too. Helping people learn this skill and when to apply it is one goal of the rationalist movement.

In this case it seems to have been successful. Proto-rationalists think there is a 21% chance of a new technology making them immortal - surely an outcome as desirable as any lottery jackpot - consider it an interesting curiosity, and go do something else because only weirdos sign up for cryonics.

Experienced rationalists think there is a lower chance of cryonics working, but some of them decide that even a pretty low chance of immortality sounds pretty good, and act strategically on this belief.

This is not to either attack or defend the policy of assigning a non-negligible probability to cryonics working. This is meant to show only that the difference in cryonics status between proto-rationalists and experienced rationalists is based on meta-level cognitive skills in the latter whose desirability is orthogonal to the object-level question about cryonics.

(an earlier version of this article was posted on my blog last year; I have moved it here now that I have replicated the results with a second survey)

285 comments

It only indicates that experienced rationalists and proto-rationalists treat their beliefs in different ways. Proto-rationalists form a belief, play with it in their heads, and then do whatever they were going to do anyway - usually some variant on what everyone else does. Experienced rationalists form a belief, examine the consequences, and then act strategically to get what they want.

Alternate hypothesis: the experienced rationalists are also doing what everyone else (in their community) is doing; they just consider a different group of people their community.

My immediate thought was that there is a third variable controlling both experience in rationality and willingness to pay for cryonics, such as 'living or hanging out in the Bay Area'.

Well, only 13% of “experienced rationalists” are signed up for cryonics, which hardly counts as “everyone else” -- unless the thing they do because everyone else is doing it is “I'll sign up for cryonics iff I think it's worth it”, which kind of dilutes the meaning.

(Anecdata: in each of my social circles before I entered university, to a very good zeroth approximation either everyone smoked or nobody did, but nowadays it's not uncommon for me to be among smokers and non-smokers at the same time. Sure, you could say that in some circles people smoke because everybody does, in some circles people don't smoke because nobody does, and in some circles people smoke iff they like because everybody does that, but...)

As army1987 said, only a small percentage of experienced rationalists sign up for cryonics, so I wouldn't expect there to be social pressure. I think a more likely explanation is that experienced rationalists feel less social pressure against signing up for cryonics.

and rationalist training probably makes you less likely to believe cryonics will work.

I like this post, but this conclusion seems too strong. There could e.g. be a selection effect, in that people with certain personality traits were less likely to believe in cryonics, more likely to take ideas seriously, and more likely to stick around on LW instead of forgetting the site after the first few months. In that case, "rationalist training" wouldn't be the cause anymore.

Proto-rationalists thought that, on average, there was a 21% chance of an average cryonically frozen person being revived in the future.

....

Last year, using the same definitions, proto-rationalists assigned a 15% probability to cryonics working

...

I think we have a general trend of decrease in the skepticism of newcomers.

As for signing up for cryonics, LW used to advocate signing up for cryonics far more loudly back in the day. edit: also, did those 13% sign up for cryonics after their "rationalist training", or before?

Yes, and in particular, I would expect age to possibly be a common cause behind being on LessWrong longer, and having signed up for cryonics after being convinced of its plausibility.

Age, and economic status, at least in my case, and I am one of the survey takers.

Yeah, it's probably 50% rationalist training (reading), 25% rationalist culture and 25% being a futurist before LW existed...

If we distinguish between "experienced rationalists" who are signed up for cryonics and "experienced rationalists" who are not signed up for cryonics... what is the average value of P(Cryonics) for each of these subpopulations?

Going by only the data Yvain made public, and defining "experienced rationalists" as those people who have 1000 karma or more (this might be slightly different from Yvain's sample, but it looked as if most who had that much karma were in the community for at least 2 years), and looking only at those experienced rationalists who recorded both a cryonics probability and their cryonics status, we get the following data. Note that all data is given in percentages: 50 means 50% confidence (1 in 2), while 0.5 means 0.5% confidence (1 in 200).

For those who said "No - and do not want to sign up for cryonics", we have for the cryonics success probability estimate (and this is conditioning on no global catastrophe) (0.03,1,1) (this is (Q1,median,Q3)), with mean 0.849 and standard deviation 0.728. This group was size N = 32.

For those who said "No - still considering it", we have (5,5,10), with mean 7.023 and standard deviation 2.633. This group was size N = 44.

For those who wanted to but for some reason hadn't signed up yet (either not available in the area (maybe worth moving for?) or otherwise procrastinating), we have (15,25,37), with mean 32.069 and standard deviation 23.471. This group was size N = 29.

Finally, for the people who have signed up, we have (7,21.5,33), with mean 26.556 and standard deviation 22.389. This group was size N = 18.

If we put all of the "no" people together (those procrastinating, those still thinking, and those who just don't want to), we get (2,5,15), with mean 12.059 and standard deviation 17.741. This group is size N = 105.

I'll leave the interpretation of this data to Mitchell_Porter, since he's the one who made the original comment. I presume he had some point to make.

(I used Excel's population standard deviation computation to get the standard deviations. Sorry if I should have used a different computation. The sample standard deviation yielded very similar numbers.)
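(For anyone who wants to reproduce these summaries without Excel, a minimal sketch; the array is a placeholder, not the published survey column.)

```python
import numpy as np

# Placeholder probability estimates (in percent) for one subgroup.
probs = np.array([5.0, 5.0, 10.0, 7.0, 8.0])

q1, median, q3 = np.percentile(probs, [25, 50, 75])
mean = probs.mean()
pop_sd = probs.std(ddof=0)     # population SD, as in Excel's STDEVP
sample_sd = probs.std(ddof=1)  # sample SD, as in Excel's STDEV

print((q1, median, q3), mean, pop_sd, sample_sd)
```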

Thanks for the calculations... and for causing me to learn about quartiles.

Part of Yvain's argument is that "proto-rationalists" have an average confidence in cryonics of 21%, but "experienced rationalists", only 15%. The latter group is thereby described as "less credulous", because the average confidence is lower, but "better at taking ideas seriously", because more of them are actually signed up for cryonics.

Meanwhile, your analysis - if I am parsing the figures correctly! - suggests that "experienced rationalists" who don't sign up for cryonics have an average confidence in cryonics of 12%, and "experienced rationalists" who do sign up for cryonics, an average confidence of 26%.

This breaks apart the combination of contrary traits that forms the headline of this article. We don't see a single group of people who are simultaneously more cryo-skeptical than the LW newbies, and yet more willing to sign up for cryonics. Instead, we see two groups: one that is more cryo-skeptical and which doesn't sign up for cryonics; and another which is less cryo-skeptical, and which does sign up for cryonics.

This breaks apart the combination of contrary traits that forms the headline of this article. We don't see a single group of people who are simultaneously more cryo-skeptical than the LW newbies, and yet more willing to sign up for cryonics. Instead, we see two groups: one that is more cryo-skeptical and which doesn't sign up for cryonics; and another which is less cryo-skeptical, and which does sign up for cryonics.

It seems like you should do the same quartile breakdown for the newbies, because I read Yvain's core point as being that high-probability newbies who aren't signed up represent a failure to act on their beliefs.

I haven't separated out the newbie cryocrastinators from the newbie considerers, though, and it seems that among the experienced the cryocrastinators give higher numbers than those who have signed up, which also seems relevant to a comparison.

Maybe procrastinators are trying to over-estimate it to get themselves to do it...

The probabilities are nuts though. For the whole thing to be of use:

1: you must die in the right way to get frozen soon enough and well enough. (Rather unlikely for a young person, by the way.)

2: cryonics must preserve enough data.

3: no event causes you to lose cooling

4: the revival technology must arise and become cheap enough (before you are unfrozen)

5: someone should dispose of the frozen head by revival rather than by garbage disposal or something even nastier (e.g. using frozen heads as expired-copyright data).

Note that it's the whole combined probability that matters for the decision to sign up. edit: and not just that, but compared to the alternatives - i.e. you can improve your chances by trying harder not to die, and you can use money/time for that instead of cryonics.

edit2: also, just 3 independent-ish components (freezing works, company doesn't bust, revival available), each at 50% out of high ignorance, get you down to 12.5%
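(A minimal sketch of that edit2 arithmetic, with 50% assigned to each component out of ignorance:)

```python
# Three roughly independent requirements, each at 50% out of ignorance.
p_freezing_preserves_data = 0.5
p_company_stays_solvent = 0.5
p_revival_becomes_available = 0.5

p_cryonics_works = (p_freezing_preserves_data
                    * p_company_stays_solvent
                    * p_revival_becomes_available)
print(p_cryonics_works)  # 0.125, i.e. 12.5%
```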

You might be interested in reading some other breakdowns of the conditions required for cryonics to work (and their estimates of the relevant probabilities):

Break Cryonics Down (March 2009, at Overcoming Bias)

How Likely Is Cryonics To Work? (September 2011, here at LW)

More Cryonics Probability Estimates (December 2012, also at LW)

Having trouble reading this data. Are the numbers percentages? (i.e. is the mean for No - don't want 0.85%?)

Yes. I should have made that clearer. I'll edit my comment.

Yvain, could you give a real-life example analogous to your Goofus & Gallant story?

That is, could you provide an example (or several, even better) of a situation wherein:

  1. There is some opportunity for clear, unambiguous victory;
  2. Taking advantage of it depends primarily on taking a strange/unconventional/etc. idea seriously (as distinct from e.g. not having the necessary resources/connections, being risk-averse, having a different utility function, etc.);
  3. Most people / normal people / non-rationalists do not take the idea seriously, and as a consequence have not taken advantage of said opportunity;
  4. Some people / smart people / rationalists take the idea seriously, and have gone for the opportunity;
  5. And, most importantly, doing so has (not "will"! already has!) caused them to win, in a clear, unambiguous, significant way.

Note that cryonics does not fit that bill (it fails point 5), which is why I'm asking for one or more actual examples.

Slightly different but still-important questions -- what about when you remove the requirement that the idea be strange or unconventional? How much of taking ideas seriously here is just about acting strategically, and how much is non-compartmentalization? To what extent can you train the skill of going from thinking "I should do X" to actually doing X?

Other opportunities for victory, not necessarily weird, possibly worth investigating: wearing a bike helmet when biking, using spaced repetition to study, making physical backups of data, staying in touch with friends and family, flossing.

making physical backups of data

Oh boy, is this ever a good example.

I used to work retail, selling and repairing Macs and Mac accessories. When I'd sell someone a computer, I'd tell them — no, beg them — to invest in a backup solution. "I'm not trying to sell you anything!", I'd say. "You don't have to buy your backup device from us — though we'd be glad to sell you one for a decent price — but please, get one somewhere! Set it up — heck, we'll set it up for you — and please... back up! When you come to us after your hard drive has inevitably failed — as all hard drives do eventually, sure as death or taxes — with your life's work on it, you'll be glad you backed up."

And they'd smile, and nod, and come back some time later with a failed hard drive, no backup, and full of outrage that we couldn't magic their data back into existence. And they'd pay absurd amounts of money for data recovery.

Back up your data, people. It's so easy (if you've got a Mac, anyway). Losing months or years of work is really, really, really painful.

This post convinced me to make a physical backup of a bunch of short stories I've been working on. At first I was going to read through the rest of the comments thread and then go do the backup, but further consideration made me realize how silly that was - burning them to a DVD and writing "Short Story Drafts" on it with a sharpie didn't take more than five minutes and made the odds of me forever losing that part of my personal history tremendously smaller. Go go gadget Taking Ideas Seriously!

Back up your data, people. It's so easy (if you've got a Mac, anyway).

Thanks for the encouragement. I decided to do this after reading this and other comments here, and yes it was easy. I used a portable hard drive many times larger than the Mac's internal drive, dedicated just to this, and was guided through the process when I plugged it in. I did read up a bit on what it was doing but was pretty satisfied that I didn't need to change anything.

I can verify this -- as an acknowledged "computer person" and "rational person", I still didn't back up my data, even while advising my friends that they should and that they'd be sorry when they didn't. Fortunately, my hard drive started making interesting new noises rather than failing without warning, so I didn't embarrass myself too badly. It is fairly common for someone to acknowledge the importance of backups and advise others to make them, but fail to do so themselves.

I think it's a combination of procrastination, laziness, being super-cheap, optimism/arrogance, and not having especially valuable data. Though people with valuable data do it too.

(This is a stream of consciousness where I explore why I haven't backed up my data. This proceeds in stages, with evolution to the next stage only because the writing of this comment forced me to keep going. Thus, it's a data point in response to this comment.)

Back up your data, people. It's so easy

Interesting. I have a very dense 'ugh field' around backing up my data, come to think of it. Based on this population of one, it has nothing to do with not trusting the salesperson, or not being aware that my hard drive is going to fail.

... in fact, I know my hard drive is about to fail (upon re