This site often speaks of rationality and intelligence as though they were the same thing, as though someone who becomes more rational thereby becomes more intelligent for practical purposes.


Certainly it seems to me that this must be true to some extent, but what is the exchange rate? Suppose a person has an IQ of 100 and then spends a year on LessWrong, reading all the Sequences and taking the advice to heart, training their skills, identifying their biases, and so on. At the end of it, presumably their raw IQ score is still 100, but if we measure correlated indicators regarding their lifestyle or something, should we expect to see them, in some way, living the life of a smarter person? How much smarter?


How many points of IQ would you be willing to give up to retain what you have learned from this site?


Personally I would answer "less than one". It seems like it SHOULD be useful, but it doesn't really feel like it is.


What Intelligence Tests Miss (LW discussion) is very relevant here. I do not agree that this site conflates intelligence and rationality or that they are the same.

Fair enough. How many IQ points would you say make a fair exchange for LessWrong's teachings?

That I don't have a good handle on. I have gone back and forth on how practical they are.

Edit: This comment was written in response to the question of why, if rationality is supposedly so helpful, it isn't commonplace by now.

A lot of the information that makes LW's brand of rationality distinctive (cognitive biases, various systematic Bayesian statistical techniques, decision theory stuff, etc.) is really new.

Like, Heuristics and Biases came out in 1982 (though the research was done a decade or so beforehand), Jaynes didn't describe and justify the maximum entropy distribution until 1957, the Solomonoff prior wasn't introduced until 1960, and the AI research that gave people a concrete way to think about thinking didn't really start until the 1950s. Causality was published in 2000.

So it hasn't really had the time to become widely adopted, or fleshed out yet.

Even before then, though, less strict forms of rationality helped a lot of people. Europe took over the world because of the Renaissance, for instance. The Romans took over because of superior military organization (a particularly brutal form of instrumental rationality).

The Persians built and maintained (for a while, anyway) the Achaemenid Empire largely because of their superior logistical and diplomatic skill, backed up by the army that they built.

The Byzantines maintained the remnants of the Roman Empire for a few hundred years based mostly on their diplomacy.

Oops, edits crossed in midstream. This reply made a lot more sense in conjunction with the original post as it was originally written.

Edit: Haha, yes.

Yup, so much more sense. :P

Actually, it seems like almost none of this relates any more. I guess I'll leave it up for posterity's sake, but the current topic is harder. I guess I should wait longer before responding to things.


I would give up less than one point, and my IQ is above 100.

This is a good reality check. Good post.

The key insight here is the applicability of the weak efficient markets hypothesis: if some useful information is publicly known, you can be pretty sure that other people are already using it to their advantage. If you have found some insight that will enable you to get ahead in practice, it's always a good idea to ask yourself what exactly makes you special to be privy to this information. It may be that you are smarter than others, or that you are lucky to have privileged access to this information, but it may also be that others are already familiar with it and using it, only that you've been oblivious about it so far -- or that the insight is in fact less useful than you think.

This is why the laboratory insight about biases from psychology, behavioral economics, etc. is typically not useful in practice. If this insight really is applicable to what people do even when they have strong incentives to avoid bias, then one would expect that there already is a huge industry targeted at making profits from these biases, and that avoiding falling prey to such exploitation is already a part of well-known good common sense. (This is indeed the case with e.g. gambling.) Otherwise, it may be that the bias is reproducible in the lab but disappears with enough incentive, just as lots of people would flunk a test of basic arithmetic, but that doesn't mean you could get away with shortchanging them with real money.

In contrast, when it comes to issues that don't have instrumental implications in terms of money, status, etc., it's not that hard to learn about biases and make one's beliefs more accurate than average. Trouble is, it's easy precisely because people normally don't bother correcting their beliefs in such matters, lacking the incentive to do so. (Or even having contrary incentives if the biased beliefs have beneficial signaling and other implications.)


I tend to think that rationality and intelligence are two very separate things.

I personally know some absolutely brilliant people who have no grasp of rationality.

In keeping with a comment I recently made here, it may be useful to distinguish three things:

  • Intelligence = ability to learn from experience without conscious effort.
  • Individual Rationality = ability to learn by experimentation and conscious consideration of evidence.
  • Social Rationality = ability to learn (and teach) by interaction with other agents.

So, to answer my version of the cash-out question: I haven't picked up enough about Individual Rationality here to balance any tangible loss of intelligence. But what I have picked up about Social Rationality here, together with the continued presence of this social resource for learning more, is worth at least 5 IQ points to me.

I think it would be interesting to look at the benefits of this site's rationality training by means of a simple, easily measured metric. Income in dollars, number of friends who show up to help you move, mass in kilograms, etc. would be good choices. Such an approach misses subtleties, but it allows for formal study and can be compared against the effects of IQ. For instance, we could then say "Study of the LessWrong Sequences improves salary by an amount equal to +5 IQ points (or -5, to leave open the possibility that this site's rationality training is detrimental to performance)."

Without an externally verifiable metric, we will have a strong bias toward overemphasizing the benefits of studying this site: psychological studies show that spending time and effort on a task increases the belief that the task is worthwhile.
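A minimal sketch (in Python, with simulated numbers standing in for real survey data; the column names and effect sizes are all hypothetical) of how that comparison might be run: regress the chosen metric on both IQ and Sequence exposure, then express the exposure coefficient in IQ-point equivalents.

    # Sketch only: "iq", "read_sequences", and "income" stand in for
    # hypothetical survey columns; the simulated effect sizes are made up.
    import numpy as np

    rng = np.random.default_rng(0)
    n = 1000
    iq = rng.normal(100, 15, n)             # measured IQ
    read_sequences = rng.integers(0, 2, n)  # 1 = studied the Sequences
    income = (30_000 + 1_000 * (iq - 100)   # fake outcome with a known
              + 2_500 * read_sequences      # built-in exposure effect
              + rng.normal(0, 10_000, n))

    # Ordinary least squares fit: income ~ intercept + IQ + exposure.
    X = np.column_stack([np.ones(n), iq, read_sequences])
    (b0, b_iq, b_lw), *_ = np.linalg.lstsq(X, income, rcond=None)

    # Express the exposure effect in IQ-point equivalents.
    print(f"Sequence study is worth about {b_lw / b_iq:.1f} IQ points of salary")

Real data would of course need controls and error bars; the point is only that both effects land in the same units and can be compared directly.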

I think the benefit I've reaped by reading OB and LW over the years considerably outweighs any benefit I could have achieved by a measly 1-point increase in IQ without encountering LW/OB. And there are plenty of people who describe OB/LW as life-changing in ways that go far beyond the benefits I feel I've experienced. I'm surprised that nobody else seems to have argued that 1 IQ point is far too low.

I have a hard time saying I would sacrifice more, but I think on rational grounds, in terms of the probability of various practical benefits, the true value of something like LW/OB has to be significantly greater than 1 IQ point. I'd guess at least half a standard deviation, assuming one has sufficient intelligence to benefit from the community in the first place. I think my reluctance to say that I would sacrifice that much or more, rather than never having found LW/OB at all, is mostly irrational, self-serving bias among other things. I overestimate how much I would have learned anyway from different sources, both because some of it I already knew and because other things seemed like especially lucid presentations of things I had always believed but never been able to fully articulate. But when I try to think about it critically, I have to admit that OB/LW -- in particular, the writings of Eliezer -- have probably been the largest intellectual influence of my life, a life which has been filled with intellectual influences.

How many points of IQ would you be willing to give up to retain what you have learned from this site?

IQ is precious, LW content can be learned again when forgotten. I agree with your "less than one" answer. It would be different if both changes were permanent, or if the IQ could be regained after some time.


Not a stupid question per se, but it's beside the point of the original poster.

They aren't suggesting that this is a choice that would actually come up for some well-formed reason; rather, they are asking "How important is rationality relative to intelligence?" and couching that question as "Would you exchange one unit of rationality (expressed as the contents of the Sequences) for N units of intelligence (expressed as IQ points)?"

Any other units of rationality and intelligence could be swapped in instead without losing the main point of the question.


That sounds accurate, but I imagine the largest region of uncertainty under discussion at the moment has to do with the practical relationship between LW-style rationality and harmonization of perceptions with the outside world.


Back to the Basics of Rationality, along with the stuff it links to, seems like it might be the closest to what you're looking for. The more general subject of rationality outreach has come to be fairly popular, though; Effective Rationality Outreach and Tweetable Rationality are recent high-karma posts, for example. I don't think much of a consensus on methods could be said to exist yet, although there seems to be a consensus that outreach is a good idea.

Raising the Sanity Waterline is a popular Eliezer post on a related subject, and you'll probably see its name getting thrown around when the topic is broached.


Yes, I think that's an excellent rephrase. Perhaps with a "To what degree..." tacked onto the front of it.

It is probably not an uncontroversial rephrase, though, since equating intelligence with the ability to juggle large numbers of mental objects is itself probably controversial.

That said, I endorse it.

(Though Nornagest is also correct that there's a "are the Sequences actually good for conveying rationality?" interpretation, which I personally find a less-interesting question.)


This blog is all about illustrating cognitive biases with concrete examples.

I have indeed, and am fond of it. During my days as a technical writer, I had that list tacked up on my wall for a time.

And yeah, invoking concrete examples when things get too abstract to follow is a fine, fine thing. Worst case, it makes very clear to others where my understanding is flawed.

There is a fair bit of this sort of concrete work on LW posts -- both Sequence and non -- but there's always room for more.


I think as you go higher up in the IQ spectrum, you should be more willing to give up IQ points.

At high levels (let's say three standard deviations out) you can understand the vast majority of things given a reasonable time frame, and if you're content to live off other people's discoveries, it's much more useful to know what you should be doing and how to allocate your focus than to have that focus be marginally sharper.

I, personally, would be willing to drop about 7 IQ points. I suspect that I'll be able to translate rationality into later-life success to the point where I could have given up more, but I don't have much data to support this yet, so I'll be more conservative.

I'd be willing to drop down to 85th percentile or so if I'd get to have known LW concepts my entire life.
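(For reference, my arithmetic, assuming the usual normal scaling with mean 100 and SD 15: the 85th percentile corresponds to $100 + \Phi^{-1}(0.85) \cdot 15 \approx 100 + 1.04 \cdot 15 \approx 115.5$, i.e. roughly one standard deviation above the mean.)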

I'd give as many as it'd take, and I have no idea what my IQ is. Without LW I might as well be a vegetable by now. I guess it depends on how you define "what you have learned from this site": if it includes the minimum infrastructure needed for "understanding" it to be possible, then probably ALL the rest of my brain.


Rationality, the art of explicitly reasoning about things, is heralded by this community as being immensely fruitful, indeed, the very definition of winning, or something like that.

Absolutely not. Explicit reasoning is a strategy, and it is a falsifiable claim that it leads to winning. You don't win by definition when you explicitly reason about things.


First of all, I want to point out that this question isn't only a very interesting one but also a question which everyone should reconsider. I don't think there is a clear answer. It depends on how you define rationality and intelligence. But they are definitely not the same. I think if I asked as many people as possible to reflect on what intelligence is, most people would think about intelligence tests (this is not meant as an accusation).

In my opinion, intelligence consists of many parts, and rationality is one of them. I would compare rationality to intrapersonal intelligence in the theory of multiple intelligences.

I have thought a lot about this, and I'm not good at explaining my thoughts, but what I wanted to say is: unfortunately, the value of rationality is increasingly eclipsed. For me, rationality is the most important part of intelligence. You need "rational thinking" to fulfill your dreams, aims, whatever. To be happy. A high IQ can be useful for being successful (if your definition of success means having a lot of money), but it DOESN'T make you happy. I think self-realization is the most important need, not esteem. Therefore rationality is more important than the other parts of intelligence. "Our achievement-oriented society" often forgets that.