Desire is the direction, rationality is the magnitude

by So8res · 5th Apr 2015 · 2 min read · 13 comments



What follows is a series of four short essays that say explicitly some things that I would tell an intrigued proto-rationalist before pointing them towards Rationality: AI to Zombies (and, by extension, most of LessWrong). For most people here, these essays will be very old news, as they talk about the insights that come even before the sequences. However, I've noticed recently that a number of fledgling rationalists haven't actually been exposed to all of these ideas, and there is power in saying the obvious.

This essay is cross-posted on MindingOurWay.


A brief note on "rationality":

It's a common trope that thinking can be divided up into "hot, emotional thinking" and "cold, rational thinking" (with Kirk and Spock being the stereotypical offenders, respectively). The tropes say that the hot decisions are often stupid (and inconsiderate of consequences), while the cold decisions are often smart (but made by the sort of disconnected nerd that wears a lab coat and makes wacky technology). Of course (the trope goes) there are Deep Human Truths available to the hot reasoners that the cold reasoners know not.

Many people, upon encountering one who says they study the art of human rationality, jump to the conclusion that these "rationalists" are people who reject hot reasoning entirely, attempting to disconnect themselves from their emotions in order to avoid the rash mistakes of "hot reasoning." Many think that these aspiring rationalists are attempting some sort of dark ritual to sacrifice emotion once and for all, while failing to notice that the emotions they wish to sacrifice are the very things which give them their humanity. "Love is hot and rash and irrational," they say, "but you sure wouldn't want to sacrifice it." Understandably, many people find the prospect of "becoming more rational" rather uncomfortable.

So heads up: this sort of emotional sacrifice has little to do with the word "rationality" as it is used in Rationality: AI to Zombies.

When Rationality: AI to Zombies talks about "rationality," it's not talking about the "cold" part of hot vs. cold reasoning; it's talking about the reasoning part.

One way or another, we humans are reasoning creatures. Sometimes, when time pressure is bearing down on us, we make quick decisions and follow our split-second intuitions. Sometimes, when the stakes are incredibly high and we have time available, we deploy the machinery of logic, in places where we trust it more than our impulses. But in both cases, we are reasoning. Whether our reasoning be hot or cold or otherwise, there are better and worse ways to reason.

(And, trust me, brains have found a whole lot of the bad ones. What do you expect, when you run programs that screwed themselves into existence on computers made of meat?)

The rationality of Rationality: AI to Zombies isn't about using cold logic to choose what to care about. Reasoning well has little to do with what you're reasoning towards. If your goal is to enjoy life to the fullest and love without restraint, then better reasoning (while hot or cold, while rushed or relaxed) will help you do so. But if your goal is to annihilate as many puppies as possible, then this-kind-of-rationality will also help you annihilate more puppies.

(Unfortunately, this usage of the word "rationality" does not match the colloquial usage. I wish we had a better word for the study of how to improve one's reasoning in all its forms that didn't also evoke images of people sacrificing their emotions on the altar of cold logic. But alas, that ship has sailed.)

If you are considering walking the path towards rationality-as-better-reasoning, then please, do not sacrifice your warmth. Your deepest desires are not a burden, but a compass. Rationality of this kind is not about changing where you're going, it's about changing how far you can go.

People often label their deepest desires "irrational." They say things like "I know it's irrational, but I love my partner, and if they were taken from me, I'd move heaven and earth to get them back." To which I say: when I point towards "rationality," I point not towards that which would rob you of your desires, but rather towards that which would make you better able to achieve them.

That is the sort of rationality that I suggest studying, when I recommend reading Rationality: AI to Zombies.
