What Intelligence Tests Miss: The psychology of rational thought

by Kaj_Sotala · 9 min read · 11th Jul 2010 · 54 comments


IQ and g-factor · General Intelligence · Rationality

This is the fourth and final part in a mini-sequence presenting Keith E. Stanovich's excellent book What Intelligence Tests Miss: The psychology of rational thought.

If you want to give people a single book to introduce them to the themes and ideas discussed on Less Wrong, What Intelligence Tests Miss is probably the best currently existing book for doing so. It does take a somewhat different view on the study of bias than we do on LW: while Eliezer concentrated on the idea of the map and the territory and on aspiring to the ideal of a perfect decision-maker, Stanovich's perspective treats bias more as a thing that prevents people from taking full advantage of their intelligence. Regardless, for someone less easily persuaded by LW's somewhat abstract ideals, reading Stanovich's concrete examples first and then proceeding to the Sequences is likely to make the content presented in the Sequences much more interesting. Even some of our terminology, such as "carving reality at the joints" and the instrumental/epistemic rationality distinction, will be more familiar to somebody who has first read What Intelligence Tests Miss.

Below is a chapter-by-chapter summary of the book.

Inside George W. Bush's Mind: Hints at What IQ Tests Miss is a brief introductory chapter. It starts with the example of president George W. Bush, mentioning that the president's opponents frequently argued against his intelligence, and even his supporters implicitly conceded the point by arguing that even though he didn't have "school smarts" he did have "street smarts". Both groups were purportedly surprised when it was revealed that the president's IQ was around 120, roughly the same as that of his 2004 presidential opponent, John Kerry. Stanovich then goes on to say that this should not be surprising, for IQ tests do not tap into the tendency to actually think in an analytical manner, and that IQ has been overvalued as a concept. For instance, university admissions frequently depend on tests such as the SAT, which are pretty much pure IQ tests. The chapter ends with a disclaimer that the book is not an attempt to say that IQ tests measure nothing important, or that there are many kinds of intelligence. IQ does measure something real and important, but that doesn't change the fact that people overvalue it and are generally confused about what it actually does measure.

Dysrationalia: Separating Rationality and Intelligence talks about the phenomenon informally described as "smart but acting stupid". Stanovich notes that if we used a broad definition of intelligence, where intelligence simply meant acting in an optimal manner, then this expression wouldn't make any sense. Rather, it's a sign that people are intuitively aware of IQ and rationality as measuring two separate qualities. Stanovich then brings up the concept of dyslexia, which the DSM-IV defines as "reading achievement that falls substantially below that expected given the individual's chronological age, measured intelligence, and age-appropriate education". Similarly, the diagnostic criterion for mathematics disorder (dyscalculia) is "mathematical ability that falls substantially below that expected for the individual's chronological age, measured intelligence, and age-appropriate education". He argues that since we have a precedent for creating new disability categories when someone's ability in an important skill domain falls below what would be expected for their intelligence, it would make sense to also have a category for "dysrationalia":

Dysrationalia is the inability to think and behave rationally despite adequate intelligence. It is a general term that refers to a heterogenous group of disorders manifested by significant difficulties in belief formation, in the assessment of belief consistency, and/or in the determination of action to achieve one's goals. Although dysrationalia may occur concomitantly with other handicapping conditions (e.g. sensory impairment), dysrationalia is not the result of those conditions. The key diagnostic criterion for dysrationalia is a level of rationality, as demonstrated in thinking and behavior, that is significantly below the level of the individual's intellectual capacity (as determined by an individually administered IQ test).

The Reflective Mind, the Algorithmic Mind, and the Autonomous Mind presents a three-level model of the mind, which I mostly covered in A Taxonomy of Bias: The Cognitive Miser. At the end, we return to the example of George W. Bush, and are shown a number of quotes from the president's supporters describing him. His speechwriter called him "sometimes glib, even dogmatic; often uncurious and as a result ill-informed"; John McCain said Bush never asked for his opinion and that the president "wasn't intellectually curious". The same sentiment was echoed by a senior official in Iraq who had observed Bush in various videoconferences and said that the president's "obvious lack of interest in long, detailed discussions, had a chilling effect". On the other hand, other people were quoted as saying that Bush was "extraordinarily intelligent, but was not interested in learning unless it had practical value". Tony Blair repeatedly told his associates that Bush was "very bright". This is taken as evidence that while Bush is indeed intelligent, he does not have the thinking dispositions that would have made him make use of his intelligence: he has dysrationalia.

Cutting Intelligence Down to Size further criticizes the trend of treating the word "intelligence" in a manner that is too broad. Stanovich points out that even critics of the IQ concept who introduce terms such as "social intelligence" and "bodily-kinesthetic intelligence" are probably shooting themselves in the foot. By giving everything valuable the label of intelligence, these critics are actually increasing the esteem of IQ tests, and therefore making people think that IQ measures more than it does.

Consider a thought experiment. Imagine that someone objected to the emphasis given to horsepower (engine power) when evaluating automobiles. They feel that horsepower looms too large in people's thinking. In an attempt to deemphasize horsepower, they then begin to term the other features of the car things like "braking horsepower" and "cornering horsepower" and "comfort horsepower". Would such a strategy make people less likely to look to engine power as an indicator of the "goodness" of a car? I think not. [...] Just as calling "all good car things" horsepower would emphasize horsepower, I would argue that calling "all good cognitive things" intelligence will contribute to the deification of MAMBIT [Mental Abilities Measured By Intelligence Tests].

Stanovich then continues to argue in favor of separating rationality and intelligence, citing surveys that suggest that folk psychology already distinguishes between the two. He also brings up the chilling effect that deifying intelligence seems to be having on society. Reviews of a book discussing the maltreatment of boys labeled feebleminded seemed to concentrate on the stories of the boys who were later found to have normal IQs, implying that abusive treatment of boys who actually did have a low IQ was okay. Various parents take a diagnosis of low mental ability as much more shocking than a diagnosis such as ADHD or learning disability that stresses the presence of normal IQ, even though the life problems associated with some emotional and behavior disorders are much more severe than those associated with many forms of moderate or mild intellectual disability.

Why Intelligent People Doing Foolish Things Is No Surprise briefly introduces the concept of the cognitive miser, explaining that conserving energy and not thinking about things too much is a perfectly understandable tendency given our evolutionary past.

The Cognitive Miser: Ways to Avoid Thinking discusses the cognitive miser further, starting with the "Jack is looking at Anne but Anne is looking at George" problem, noting that one could arrive at the correct answer via disjunctive reasoning ("either Anne is married, in which case the answer is yes, or Anne is unmarried, in which case the answer is also yes") but most people won't bother. It then discusses attribute substitution (instead of directly evaluating X, consider the correlated and easier to evaluate quality Y), vividness/salience/accessibility effects, anchoring effects and the recognition heuristic. Stanovich emphasizes that he does not say that heuristics are always bad, but rather that one shouldn't always rely on them.
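The disjunctive reasoning Stanovich describes can be made explicit by simply enumerating the possible states of the world. A minimal sketch in Python, assuming the standard setup of the puzzle (Jack is married, George is unmarried, Anne's status is unknown):

```python
# "Jack is looking at Anne, but Anne is looking at George. Jack is married,
# George is not. Is a married person looking at an unmarried person?"
# Disjunctive reasoning: check every possible state of Anne's marital status.

def married_looks_at_unmarried(anne_married):
    married = {"Jack": True, "Anne": anne_married, "George": False}
    looking_at = [("Jack", "Anne"), ("Anne", "George")]
    return any(married[a] and not married[b] for a, b in looking_at)

# The answer comes out "yes" in both possible worlds, so it is "yes" outright,
# even though we never learn Anne's actual status.
print(all(married_looks_at_unmarried(case) for case in (True, False)))  # True
```

The point of the exercise is that the correct answer requires no missing information, only the willingness to consider each case in turn, which is exactly the effort the cognitive miser declines to spend.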

Framing and the Cognitive Miser extensively discusses various framing effects, and at the end notes that high-IQ people are not usually any more likely to avoid producing inconsistent answers to various framings unless they are specifically instructed to try to be consistent. This is mentioned to be a general phenomenon: if intelligent people have to notice themselves that an issue of rationality is involved, they do little better than their counterparts of lower intelligence.

Myside Processing: Heads I Win - Tails I Win Too! discusses "myside bias", people evaluating situations in terms of their own perspective. Americans will provide much stronger support for the USA banning an unsafe German car than for Germany banning an unsafe American car. People will much more easily pick up on inconsistencies in the actions of their political opponents than in those of the politicians they support. They will also be generally overconfident, be appalled at others exhibiting the same unsafe behaviors they themselves exhibit, underestimate the degree to which biases influence their own thinking, and assume people understand their messages better than they actually do. The end of the chapter surveys research on the linkage between intelligence and the tendency to fall prey to these biases. It notes that intelligent people again do moderately better, but only when specifically instructed to avoid bias.

A Different Pitfall of the Cognitive Miser: Thinking a Lot, but Losing takes up the problem of failing to override your autonomous processing even when it would be necessary. Most of this chapter is covered by my previous discussion of override failures in the Cognitive Miser post.

Mindware Gaps introduces in more detail a different failure mode: that of mindware gaps. It also introduces and explains the concepts of Bayes' theorem, falsifiability, base rates and the conjunction error as crucial mindware for avoiding many failures of rationality. It notes that thinking dispositions for actively analyzing things could be called "strategic mindware". The chapter concludes by noting that the useful mindware discussed in the chapter is not widely and systematically taught, leaving even intelligent people with gaps in their mindware that make them subject to failures of rationality.
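To see why base rates are crucial mindware, consider a worked Bayes' theorem sketch for the classic medical-test case. The specific figures below (1% prevalence, 99% sensitivity, 5% false-positive rate) are hypothetical illustrations, not numbers from the book:

```python
# Base-rate neglect: intuition says a positive result from a 99%-sensitive
# test means you almost certainly have the disease. Bayes' theorem disagrees.
prevalence = 0.01        # P(disease) - the base rate
sensitivity = 0.99       # P(positive | disease)
false_positive = 0.05    # P(positive | no disease)

# Total probability of testing positive, over both hypotheses.
p_positive = sensitivity * prevalence + false_positive * (1 - prevalence)

# Bayes' theorem: P(disease | positive).
p_disease_given_positive = sensitivity * prevalence / p_positive

print(round(p_disease_given_positive, 3))  # 0.167
```

Because healthy people vastly outnumber sick ones, false positives swamp true positives, and the posterior probability is about 17% rather than the intuitive 99%. Neglecting the base rate in the first line is exactly the kind of mindware gap the chapter describes.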

I mostly covered the contents of Contaminated Mindware in my post about mindware problems.

How Many Ways Can Thinking Go Wrong? A Taxonomy of Irrational Thinking Tendencies and Their Relation to Intelligence summarizes the content of the previous chapters and organizes the various biases into a taxonomy of biases that has the main categories of the Cognitive Miser, Mindware Problems, and Mr. Spock Syndrome. I did not previously cover Mr. Spock Syndrome because as Stanovich says, it is not a fully cognitive category. People with the syndrome have a reduced ability to feel emotions, which messes up their ability to behave appropriately in various situations even though their intelligence remains intact. Stanovich notes that the syndrome is most obvious with people who have suffered severe brain damage, but difficulties of emotional regulation and awareness do seem to also correlate negatively with some tests of rationality, as well as positive life outcomes, even when intelligence is controlled for.

The Social Benefits of Increasing Human Rationality - and Meliorating Irrationality concludes the book by arguing that while increasing the average intelligence of people would have only small effects, if any, on general well-being, we could reap vast social benefits if we actually tried to make people more rational. There's evidence that rationality is much more malleable than intelligence. Disjunctive reasoning, the tendency to consider all possible states of the world when deciding among options, is noted to be a rational thinking skill of high generality that can be taught. There also don't seem to be strong intelligence-related limitations on the ability to think disjunctively. Much other useful mindware, like that of scientific and probabilistic reasoning, can likewise be taught. While these might be challenging for people with a lower IQ, techniques such as implementation intentions may be easier to learn.

An implementation intention is formed when the individual marks the cue-action sequence with the conscious, verbal declaration of "when X occurs, I will do Y." Often with the aid of the context-fixing properties of language, the triggering of this cue-action sequence on just a few occasions is enough to establish it in the autonomous mind. Finally, research has shown that an even more minimalist cognitive strategy of forming mental goals (whether or not they have implementation intentions) can be efficacious. For example, people perform better at a task when they are told to form a mental goal ("set a specific, challenging goal for yourself") for their performance as opposed to being given the generic motivational instructions ("do your best").

Stanovich also argues in favor of libertarian paternalism: shaping the environment so that people are still free to choose what they want, but so that the default choice is generally the best one. For instance, countries with an opt-out policy for organ donation have far more donors than countries with an opt-in policy. This is not because the people in one country would be any more or less selfish than those in another, but because people in general tend to go with the default option. He also argues that it would be perfectly possible, though expensive, to develop general rationality tests akin to intelligence tests, and that using such RQ (rationality quotient) measures alongside IQ for things such as college admission would have great social benefits.

In studies cited in this book, it has been shown that:
  • Psychologists have found ways of presenting statistical information so that we can make more rational decisions related to medical matters and in any situation where statistics are involved.
  • Cognitive psychologists have shown that a few simple changes in presenting information in accord with default biases could vastly increase the frequency of organ donations, thus saving thousands of lives.
  • Americans annually pay millions of dollars for advice on how to invest their money in the stock market, when following a few simple principles from decision theory would lead to returns on their investments superior to any of this advice. These principles would help people avoid the cognitive biases that lead them to reduce their returns - overreacting to chance events, overconfidence, wishful thinking, hindsight bias, misunderstanding of probability.
  • Decision scientists have found that people are extremely poor at assessing environmental risks. This is mainly because vividness biases dominate people's judgment to an inordinate extent. People could improve, and this would make a huge difference because these poor assessments come to affect public policy (causing policy makers to implement policy A, which saves one life for each $3.2 million spent, instead of policy B, which would have saved one life for every $220,000 spent, for example).
  • Psychologists from various specialty areas are beginning to pinpoint the cognitive illusions that sustain pathological gambling behavior - pseudodiagnosticity, belief perseverance, over-reacting to chance events, cognitive impulsivity, misunderstanding probability - behavior that destroys thousands of lives each year.
  • Cognitive psychologists have studied the overconfidence effect in human judgment - that people miscalibrate their future performance, usually by making overoptimistic predictions. Psychologists have studied ways to help people avoid these problems in self-monitoring, making it easier for people to plan for the future (overconfident people get more unpleasant surprises).
  • Social psychological research has found that controlling the explosion of choices in our lives is one of the keys to happiness - that constraining choice often makes people happier.
  • Simple changes in the way that pension plans are organized and administered could make retirement more comfortable for millions of people.
  • Probabilistic reasoning is perhaps the most studied topic in the decision-making field, and many of the cognitive reforms that have been examined - for example, eliminating base-rate neglect - could improve practices in courtrooms, where poor thinking about probabilities has been shown to impede justice.

These are just a small sampling of the teachable reasoning strategies and environmental fixes that could make a difference in people's lives, and they are more related to rationality than intelligence. They are examples of the types of outcomes that would result if we all became more rational thinkers and decision makers. They are the types of outcomes that would be multiplied if schools, businesses, and government focused on the parts of cognition that intelligence tests miss. Instead, we continue to pay far more attention to intelligence than to rational thinking. It is as if intelligence has become totemic in our culture, and we choose to pursue it rather than the reasoning strategies that could transform our world.