Rationality 101

Around the 1970s, psychologists discovered something incredible:

Humans suck at making decisions.

Well, not exactly surprising.  We all know humans aren’t perfect.  But in the past few decades, scientists have found literally hundreds of errors in our thinking, from making incorrect judgments to looking at the wrong information.  Here are just a few examples:

1. We’re terrible at planning:  

Students were asked to estimate when they were 99% certain they’d finish an academic project.  When the time came, only 45% of them had finished by their 99% estimate; a deadline they considered near-certain was missed more often than not.  Similarly, students in another study were asked to predict when they’d finish their senior thesis if “everything went as poorly as it possibly could.”  Only 30% of students finished by that time [Buehler, Griffin, & Peetz, 2010].


It’s far from just the students.  Our overconfidence in planning has been replicated across many fields, from financial decisions to software projects to major government ventures.

2. We’re screwed over by herd mentality:

Participants were placed either alone or in a group intercom discussion (but everyone was in a separate room, so they couldn’t see one another).  One of the “participants” then had a seizure (in reality, they were working with the experimenter, and the seizure was faked).

When participants believed they were alone with the victim, they went to help 85% of the time.  But when they knew four other people in different rooms had also heard the seizure, only 31% reported the incident [Darley & Latané, 1968].


Alas, there are also many real-life examples of our inability to handle responsibility in a group, often with disastrous results.  Being in a group can make it harder to make good decisions.

3. We’re really inconsistent:

People were asked how much they’d pay to save one child.  Then, they were asked how much they’d pay to save one child out of two, where it was uncertain which one would be saved.  On the second question, people were willing to pay less, even though they’d be saving one person in both scenarios [Västfjäll, Slovic, & Mayorga, 2014].

In another study, people were willing to pay $80 to save 2,000 drowning birds, but a similar group of people came up with basically the same number, $78, when asked the same question about 20,000 birds; that works out to about four cents per bird in the first case and less than half a cent in the second [Desvousges et al., 1992].


Welcome to the field of heuristics and biases, where the more we learn, the stupider we seem to be.

For all that we’ve advanced in medicine, technology, and morality, we’re still stuck with pretty much the same squishy mammal brain as that of our distant ancestors.  Mental adaptations that once might have proved helpful on the savannah have become poorly suited for our modern world.

It does get better, though.

Soon after, psychologists began looking into debiasing: researching ways to reduce the errors in our thinking.  The results were mixed.  Unsurprisingly, it turns out that no single solution exists to fix all the bugs in our thinking (save for just removing the brain and calling it a day).  But assorted results from different areas have uncovered some strategies that have been shown to improve our decision-making skills.

Armed with concrete strategies for shaping our thinking, along with knowledge of the different pitfalls, we can do better than our clunky primate-brain defaults.  We can try to be less wrong in our thinking.  We can hopefully suck less.

And that’s rather exciting.

From this focus on debiasing, a group has sprung up: the rationality community.  They’re people from all over the globe, trying to better understand their own thinking.  Rationalists focus on discovering truth, improving themselves, and finding good ways to help others.

For the past few years, discussion websites such as LessWrong (heh) have been the home of online conversations about how the research on biases and debiasing can improve our lives.  Eliezer Yudkowsky (LessWrong’s cofounder) and others, for example, have written a huge amount of content that summarizes this research.  Some of this effort can be found in a free book called Rationality: From AI to Zombies.

Rationality: From AI to Zombies goes far beyond just heuristics and biases.  Part researcher, part philosopher, Yudkowsky explores epistemology, cognitive science, and probability.  There are essays on topics from resolving disagreements to artificial intelligence.  Thematically, it feels a lot like Douglas Hofstadter’s legendary Gödel, Escher, Bach, although its short-essay compilation format makes it a very different type of read.

Reading just a few essays in the collection can give you the gist of what rationality is all about.  (I’d recommend this one or this one for strong, usable ideas.)  Yudkowsky’s writing can be complex at times, though, and his texts may not be the easiest to parse.


For a much more direct introduction to heuristics and biases, Nobel Prize winner Daniel Kahneman’s bestselling book Thinking, Fast and Slow provides an engaging account of the field, and he writes in a familiar, easy-to-read way.

Kahneman is considered a co-founder of the entire field of heuristics and biases; he and Amos Tversky were the psychologists who kicked it off in the 1970s.  Thinking, Fast and Slow is probably the best introduction to the years of research on this topic.

It even includes the original survey questions from the studies, so you can see your own responses.  Thinking, Fast and Slow is easily one of the top three books I’ve ever read.  Learning about our mental errors completely changed the way I view my own mind.  It’s highly worth checking out.


But what about those debiasing strategies I mentioned earlier?  One of the best places to learn more is the nonprofit Center for Applied Rationality (CFAR).  CFAR combines concepts from economics and cognitive psychology to create research-backed techniques for combating biases.  It’s like self-improvement, but with solid backing.

CFAR hosts workshops where they teach these skills to improve thinking and problem-solving.  Their website has a wealth of materials on debiasing, with book recommendations, checklists, videos, and more.  With a mission of “actually trying to figure things out”, they consistently focus on generating strong “object-level” actions.


If any of this sounds interesting, I’d highly recommend poking around the above resources.  (There’s also this extended reading list I put together here.)  Of course, I’ll be the first to admit that debiasing doesn’t solve everything.

Still, it’s great to know we can strive to do better.  

Once you start reading up on these errors, it becomes easier to catch yourself making them, and easier to see your past mistakes in hindsight.  By recognizing the cues, revamping our planning, and changing our thoughts, debiasing can help us do more of what we want, instead of running on our buggy “defaults”.

As for this blog here, mindlevelup?  I write weekly essays on productivity and motivation, sprinkled with some rambling discussions on thinking about our thinking.  If you’re at all interested in “leveling up” your mind, I’d encourage you to follow along as I try to sort out all the quirks of my own brain and hopefully impart some helpful tips.  (The Selected Posts page is a great place to start.)


Works Cited:

Buehler, Roger, Dale Griffin, and Johanna Peetz. “The Planning Fallacy: Cognitive, Motivational, and Social Origins.” Advances in Experimental Social Psychology 43 (2010): 1–62. https://www.researchgate.net/publication/251449615_The_Planning_Fallacy

Darley, John M., and Bibb Latané. “Bystander Intervention in Emergencies: Diffusion of Responsibility.” Journal of Personality and Social Psychology 8.4, Pt. 1 (1968): 377–383. http://www.wadsworth.com/psychology_d/templates/student_resources/0155060678_rathus/ps/ps19.html

Desvousges, William H., et al. “Measuring Nonuse Damages Using Contingent Valuation: An Experimental Evaluation of Accuracy.” (1992). http://www.rti.org/sites/default/files/resources/bk-0001-1009_web.pdf

Västfjäll, Daniel, Paul Slovic, and Marcus Mayorga. “Whoever Saves One Life Saves the World: Confronting the Challenge of Pseudoinefficacy.” Manuscript submitted for publication (2014). http://globaljustice.uoregon.edu/files/2014/07/Whoever-Saves-One-Life-Saves-the-World-1wda5u6.pdf