A new arrival, Kouran, recently challenged our conventional use of the label "rational" to describe various systems. The full thread is here, and it doesn't summarize neatly, but he observes that we often use "rational" in the context of non-intellectual, non-cognitive, etc. systems, and that this is an unconventional use of the word.

Unsurprisingly, this led to Standard Conversation Number 12 about how we don't really use "rational" to mean what the rest of the world means by it, about instrumental rationality, and so on. In the course of that discussion I observed a couple of times (here and here) that we could probably substitute some form of "optimal" for "rational" wherever it appears without losing any information.

Of course, status quo bias being what it is, I promptly added that we wouldn't actually want to do that, because, y'know, it would be work and involve changing stuff.

But the more I think about it, the more it seems like I ought to endorse that lexical shift. We do spend a not-inconsiderable amount of time and attention on alleviating undesirable side-effects of the word 'rational,' such as the Spock effect, and our occasional annoying tendency to talk about the 'rational' choice of shoe-polish when we really mean the optimal choice, and our occasional tendency to tie ourselves in knots around "rationalists should win". (That optimized systems do better than non-optimized systems is pretty much the definition of "optimized," after all. If we say that rational systems generally do better than irrational systems, we're saying that rational systems are generally optimal, which is a non-empty statement. But if we define "rational" to mean the thing that wins, which we sometimes do, it seems simpler to talk about optimized systems in the first place.)

There's precedent for this... a while ago I started getting out of the habit of talking about "artificial intelligences" when I really wanted to talk about superhuman optimizing systems instead, and I continue to endorse that change. So, I'm going to stop using "rational" when I actually mean optimal. I encourage others to do so as well. (Or, conversely, to tell me why I shouldn't.)

This should go without saying, but in case it doesn't: I'm not proposing recoding anything or rewriting anything or doing any work here beyond changing my use of language as it's convenient for me to do so.

I like the word "optimal", and it frequently overlaps with "rational", but they aren't interchangeable. I think the key difference is that the word "rationality" carries along an implicit human actor with beliefs, knowledge and goals. On the other hand, the word "optimal" frequently ends up pointed at narrow, abstract goals, omniscient viewpoints and unavailable options.

"Investing in ABCD turned out to be suboptimal, but it was rational because he couldn't have known their factory would be hit by an earthquake." (Optimality is with respect to a state of perfect information, but past-tense rationality is with respect to the information that was actually available.)
"Sending a letter was suboptimal, but it was rational because I didn't have an email address." (Optimization is over an implied set of contact methods, rationality counts only those that are available.) "That algorithm is optimal, but using it would be irrational because it's complicated. (Implied narrow goal: computational efficiency. Actual goal system includes simple implementation.)

Since talking about rationality frequently means talking about how to deal with limited information and clarifying confusions about goals, the word "optimal" doesn't always work. There's another important difference, too: "rationality" is more meta, and this protects it somewhat. I'd expect a book about "optimal decision-making" to tell me what to do, and a book about "rational decision-making" to tell me how to decide. The latter is more trustworthy, since the layer of indirection makes it harder to slip in agendas.

Upvoted for the clear examples. But could you explain your final sentence?

If someone wants to convince you of an incorrect, object-level thing, and they're talking to you about what's optimal, then they can argue for it directly. If they're talking about what's rational, they'll have to fool you twice - first in getting you to accept a decision procedure that produces it, and again in getting you to accept the thing itself.

Hmm... could you give an example? I think I disagree, because it might be easy to get people to swallow an abstract argument about decisions, and the object-level thing might be just a hop, skip and a jump from that. Getting people to swallow the object-level thing first could be harder, because the objects can't be disguised under abstract labels and hidden inferences. But again, an example of what you have in mind would illuminate things.

The jump is easy only if you happen to take ideas seriously. People compartmentalize by default, so they shouldn't have much trouble "trusting" a decision procedure while at the same time finding excuses for why it wouldn't work for them in a particular case.

If you do take ideas seriously, it will be harder to make you accept a shaky decision procedure at all: you would find too many examples in your own life where following it wouldn't have worked.

This all sounds plausible, but I'd like an example.

It's funny, though: here we are disputing (abstractly) whether abstract or object-level discourse is more pliable to the pens of deceivers, and I'm insisting on a more object-level discussion. Ha.

When I first heard that Lesswrong was full of rationalists, I assumed that Lesswrong was anti-empiricist.

I nearly asked you what this dichotomy is about, but a quick search is easier. I can't help but quote the first paragraph:

The dispute between rationalism and empiricism concerns the extent to which we are dependent upon sense experience in our effort to gain knowledge. Rationalists claim that there are significant ways in which our concepts and knowledge are gained independently of sense experience. Empiricists claim that sense experience is the ultimate source of all our concepts and knowledge.

My first reaction was "But but but, it doesn't work like that!"

What's the short description-word alternative, then? "Rationality" and "aspiring rationalist" sound better than "optimized behavior" or anything else I see along these lines; surely one's behavior is never already "optimal". What would you call a practitioner? "Student of optimization"? "Optimized one"? This is way weirder than "rationalist".

Why not "optimality" and "optimalist"?

(I agree with jimrandomh's criticisms of replacing "rational" with "optimal": the replacement should not be done. But I have to confess to an initial, strongly positive reaction to the prospect of junking "rationalist," since for me, as a philosopher, that word picks out Descartes, Spinoza, and Leibniz.)

The closest equivalent to "rationalist" I can think of is "optimizer," which sounds pretty good to me in most of the contexts I find myself using the word on this site.

"Aspiring rationalist" is not a phrase I can readily imagine myself using, but I guess if I wanted to keep the modesty signalling I'd probably say something like "I'm a mediocre optimizer" or "I'm getting better at optimizing" or some such thing.

Becoming a better optimizer is not at all clearly the best marginal improvement, and clearly not an exclusive terminal goal. You are a human being. You want yourself optimized in some ways, but not necessarily with a focus on making yourself a better optimizer. So far, rationality is not that great for most purposes: you get a much clearer "big picture" understanding of the world, you correct some grievous mistakes, you see more freedom for finding ways of making life more fun. Perhaps you get a chance of producing a useful idea for the project of FAI. But this is not something best characterized as "being a better optimizer".

If I'm understanding you correctly, I would agree with you that "rationality" as you're using it in this comment doesn't map particularly well to any form of optimization, and I endorse using "rationality" to refer to what I think you're talking about.

I would also say that "rationality" as it is frequently used on this site doesn't map particularly well to "rationality" as you're using it in this comment.

Huh. If the folks downvoting this can explain to me how to map the uses of "rationality" in "rationality is not that great for most purposes" and "rationalists should win" (for example) to one another, I'd appreciate it... they sure do seem like different things to me. They are both valuable, but they are different, and seem mostly incommensurable.


Winning doesn't necessarily involve being really good at winning. While winning is a good thing, it's not a given that you should personally implement it. For example, humans are good at lifting heavy things, but by using heavy-lifting machinery, not through the extraordinary power of their own improved muscles.

[This comment is no longer endorsed by its author]

I didn't downvote, but here are my two cents:

I think you're misunderstanding Nesov's comment. Becoming a better optimizer loses a useful distinction - to see this you need to take an outside view of optimization in general. Rationalists want to optimize their behaviors relative to their values/goals - which includes only a very narrow slice of things to optimize for (generally not the number of paperclips in the universe, or any other process that might be encoded in a utility function) and a specific set of heuristics to master. Hence the claim that rationality isn't that great for many purposes - there are relatively few purposes we actually wish to pursue currently.

Even though becoming a sufficiently strong optimizer-in-general will help you achieve your narrow range of goals, unless you specifically work towards optimizing for your value set, it's not optimal to do so relative to your actual utility function. An optimizer-in-general, strictly speaking, will on average be just as good at optimizing for the number of paperclips in the universe as you will be at managing your relationships. The useful distinction is lost here.

I agree that a sufficiently general optimizer can optimize its environment for a wide range of values, the vast majority of which aren't mine, and a significant number of which are opposed to mine. As you say, an optimizer-in-general is as good at paperclips as it is at anything else (though of course a human optimizer is not, because humans are the result of a lot of evolutionary fine-tuning for specific functions).

I would say that a sufficiently general rationalist can do exactly the same thing. That is, a rationalist-in-general (at least, as the term is frequently used here) is as good at paperclips as it is at anything else (though of course a human rationalist is not, as above).

I would also say that the symmetry is not a coincidence.

I agree that if this is what Nesov meant, then I completely misunderstood his comment. I'm somewhat skeptical that this is what Nesov meant.

I was thinking about whether telling someone I'm an aspiring optimizer is going to result in less confusion than telling them that I'm an aspiring rationalist. I think the term "optimizer" needs a little more specification to work; how about "decision optimization"? If I tell someone I'm working on decision optimization, I pretty effectively convey what I'm doing - learning and practicing heuristics in order to make better decisions.

I probably agree that "I'm working on decision optimization" conveys more information in that case than "I'm working on rationality" but I suspect that neither is really what I'd want to say in a similar situation... I'd probably say instead that "I'm working on making more consistent probability estimates," or "I'm working on updating my beliefs based on contradictory evidence rather than rejecting it," or whatever it was. (Conversely, if I didn't know what I was working on more specifically, I would question how I knew I was working on it.)

I think you're misunderstanding Nessov's comment. Becoming a better optimizer looses a useful distinction

One s. One o.

One s. One o.

Oops. Fixed.

I'm going to continue to say optimized when I mean optimized and continue to say rational when I mean rational. I'm also going to continue to refrain from making titles along the lines of "rational shoe buying" - except when I've accidentally drunk something poisonous and don't have a stomach pump available.

Of course, status quo bias being what it is, I promptly added that we wouldn't actually want to do that, because, y'know, it would be work and involve changing stuff.

(Taking the difficulty of changing things into account is not necessarily status quo bias. Sometimes it's really better to live with imperfections of status quo than to pay the cost of the change. This isn't necessarily relevant here though.)

Agreed on all three counts.

I think "rational" should be reserved for some level of conscious choice, probably conceptually mediated.

I don't want to say you're more rational than I am if your enzymes are more efficient, or if you react more effectively than I do to avoid the rush of oncoming defensive linemen. "Optimal" is much broader than "rational": by "rational" we mean consciously choosing the optimal, not just being optimal.

I think superlatives like "optimal" have a lot of problems. In particular, I think saying "optimal philanthropy" says to people "I'm perfect, unlike you," while comparatives like "better philanthropy" communicate something more like "Let us strive together to be better." (I think "efficient" is a better choice than "better" for related connotative reasons.)

That is a very good point. A stupid intelligent system (one that explores too little of the solution space) can do things that are grossly suboptimal, but nonetheless rational in the narrow sense that the system did make the effort to explore the solution space and pick the best solution it found. Humans, though, tend to choose which parts of the solution space to explore based on all sorts of biases, many of which are outright irrational, in the sense of not being backed by correct reasoning, or being backed by deliberately faulty reasoning as a cover story.

As a bonus, "optimalism" isn't already taken (unlike rationalism). We certainly don't want to be confused with optimists though.

There's that. "Optimal"/"optimist" has the same kind of failure mode as "rational"/"rationalize".

“Language is a cracked kettle on which we beat out tunes for bears to dance to, while all the time we long to move the stars to pity.”

Interestingly, the only cult I've been in (at about age 14) is called the "optimalist club" (the name is both a literal translation and a phonetically identical phrase in Russian); the cult is built around alternative medicine, with higher-status members being the people who are good at generating health advice.

Also, The Rational Optimist was recently taken as a book title, by Matt Ridley...

Reread What Do We Mean By "Rationality".

"X is rational!" is usually just a more strident way of saying "I think X is true" or "I think X is good". So why have an additional word for "rational" as well as "true" and "good"? Because we want to talk about systematic methods for obtaining truth and winning.

(emphasis in original)

I think the same goes for "optimal". When we say we're studying the art and science of human rationality, we don't mean we're studying what's optimal for humans; we mean we're studying how humans can go about achieving optimality, using systematic, repeatable methods.

Taking that into account along with the principle of least astonishment, it seems to me that we should be talking about rational processes and motivations used in order to arrive at optimal goals (according to some set of selection criteria) or choose the optimal option out of some set of possibilities. So we'd still be talking about the art of human rationality, but we'd discuss optimal Christmas gift selection, say, or optimal childcare.

That carves space mostly out of instrumental rationality, but I still think there's room for an epistemic/instrumental distinction within the rationality space; the latter would describe techniques and heuristics for practical optimization, but wouldn't necessarily describe their goal.

I agree that we generally mean to refer to systematic, or at least reliable, methods of optimization, and that both the terms "rational" and "optimal" leave that implicit.

I was not saying that 'rational' and 'optimal' leave that implicit in the same way. Rather, I think the distinction occurs naturally between "studying rationality" and "studying optimality" or between "behaving rationally" and "behaving optimally" - subtle, but enough to motivate using 'rational' rather than 'optimal' in our discussions.

Huh. Sadly, the distinction is subtle enough that I don't follow you at all.

But by all means, I endorse you using the language that best achieves your goals.

And if you can come up with a way of rephrasing this point that I find easier to follow (or if someone else can), I'd be appreciative.

Similarly to the use of 'right' and 'good'. For a consequentialist, x is right because y is good.

At the margin, I think 'rational' best describes actions and 'optimal' best describes outcomes. Thus, if action x causes outcome y, we might say that x is rational because y is optimal.

While 'behaving optimally' doesn't seem very wrong to me, "Studying the art and science of human optimality" absolutely does. To study what is optimal partly implies we're finding out about values; to study what is rational implies that we're finding out how to optimize for values, whatever they are.

If the distinction I'm observing exists, it's rather weak and there's plenty of slippage.

OK... I think I followed that. Thanks.

And I think I agree with you as far as it goes, though it doesn't outweigh my other considerations.

But I would probably say "Studying the art and science of optimization" rather than "Studying the art and science of human optimality."

A new arrival, Kouran, recently challenged our conventional use of the label "rational" to describe various systems. The full thread is here, and it doesn't summarize neatly, but he observes that we often use "rational" in the context of non-intellectual, non-cognitive, etc. systems, and that this is an unconventional use of the word.

I read the comment by Kouran, and while he or she had some valid points, they are for the most part already assumed by our actual conventions here, and so do not particularly apply as criticism. Mind you, this is from the perspective of someone already inclined to consider some usages of "rational" a violation of convention.

I agree that the criticisms fall flat on the LW meaning of "rationality". However, I've so far not been able to help Kouran see the slight difference in our meanings.

I've been using the label "LessWrongian" on occasion with friends. I've also been wondering recently if just plain "wisdom" would be a more catch-all label for the type of rationality we're talking about.

Zed:

I think "strategy" is better than "wisdom". I think "wisdom" is associated with cached Truths and signals superiority. This is bad because this will make our audience too hostile. Strategy, on the other hand, is about process, about working towards a goal, and it's already used in literature in the context of improving one's decision making process.

You can get away with saying things like "I want to be strategic about life", meaning that I want to make choices in such a way that I'm unlikely to regret them at a later stage. Or I can say "I want to become a more strategic thinker" and it's immediately obvious that I care about reaching goals and that I'm not talking about strategy for the sake of strategy (I happen to care about strategy because of the virtue of curiosity, but this too is fine). The list goes on: "we need to reconsider our strategy for education", "we're not being strategic enough about health care -- too many people die unnecessarily". None of these statements put our audience on guard or make us look like unnatural weirdos. [1]

The most important thing is that "irrational" is perceived as an insult, and is way too close to the sexist "emotional/hormonal" used to dismiss women. Aside from the sexism, saying "whatever, you're just being irrational" is just as bad as saying "whatever, you're just being hormonal". It's the worst possible thing to say, and when you have a habit of using the word "rational" a lot it's way too easy to slip up.

[1] Fun exercise - substitute "rationality" for "strategy" and see how much more Spock-like it all sounds.

From a purely definitional perspective that's fairly close; but if we're concerned with signaling, "wisdom" has even more appallingly bad associations than "rational" does. If a friend told me he'd joined a group dedicated to seeking wisdom, I'd at best assume an ashram or something similar, and likewise I'd expect most of the people attracted by the term to lie somewhere along the New Age/NRM spectrum.

I suppose you're right. It turns us from apparent Rand-loving Spocks into hippie New Agers in the eyes of the public.

Wait, so what would happen if we said "rationality and wisdom" or even "rationality slash wisdom"?

Either they even each other out, or people think we're crazy and self-contradictory?

Sounds worth trying!

I assume you mean formulations (especially post titles) like "rational home buying", "rational childcare" or "rational dating".

While I do not like those formulations, I like formulations that replace "rational" with "optimal" even less. To me the word "optimal" is mostly a mathematical term (Wikipedia redirects "optimal" to "mathematical optimization") and does not acknowledge that humans have complex values and limited knowledge and resources, and that these activities have costs. Also, I understand "optimal" to mean maximally good: if something could be better, it's not yet optimal. That means something like optimal childcare might require superhuman capabilities.
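
To make that relativity explicit, here is the minimal mathematical statement (standard notation; a sketch, not anything specific to this discussion):

    x^{*} \in \arg\max_{x \in X} f(x), \quad \text{i.e.} \quad f(x^{*}) \ge f(x) \ \text{for all } x \in X

"Optimal" is always relative to a feasible set X and an objective f; a phrase like "optimal childcare" leaves both implicit, which is exactly where the superhuman-capabilities reading sneaks in.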

Formulations like "rational home buying" sound silly and even cultish, because "rational" is completely redundant here (it makes you wonder why they won't shut up about their favorite subject, "rationality"). "Rational home buying" is just about home buying. No one is deliberately irrational. I'd suggest just saying "home buying", unless you really want to contrast it with irrational home buying.


No one is deliberately irrational.

Wanna bet?

I was. For a couple of years I would do things with no other justification than that it was a silly thing to do or that no one would ever expect it.* I recall I had some sort of weird meta-rationalization for it at one point, but that was quickly lost in the chaos.

Deliberate irrationality is not a hobby I'd suggest taking up any time soon. I still have a habit of fixating on the most ludicrous, worst possible thing to do in any given situation, and it takes significant mental effort to make myself not do it.

*NOBODY EXPECTS THE SPANISH INQUISITION!!!

How about 'strategic home buying'? See comment by Zed above. I think this makes it much more obvious what you actually mean.

I usually use "rational" in a somewhat jargonish sense to refer to things that have goals and work to achieve them in a broad-context way. This is pretty dang distinct from "optimal," mainly because calling something optimal implies something to optimize, while goals can be anything they want.

Do you mean the use of "rational" as in the "rational shoe buying" threads? In that case, sure.

I would quite happily call a system that has a goal and works to achieve that goal in a broad-context way an optimizer.

Right. So if we replace "rational" with "optimized," our "rational agent" becomes an "optimized agent."

RRRrrnt. :P

(shrug) Sure.

If we try to do text substitutions without a semantic understanding of what's going on, we get nonsense or worse. This should not be surprising. I'm not actually proposing a regexp search-and-replace, I'm proposing a lexical shift.
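
A minimal Python sketch of that failure mode (the example sentence is invented for illustration):

    import re

    # Blind substitution, with no semantic understanding of which sense
    # of "rational" each occurrence carries:
    text = "A rational agent assigns rational numbers as probabilities."
    print(re.sub(r"\brational\b", "optimal", text))
    # -> "A optimal agent assigns optimal numbers as probabilities."
    # "Optimal numbers" is nonsense, and even the article "A" is now wrong.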

What we frequently refer to here as a "rational agent" isn't an optimized agent, it's an optimizing agent -- one that makes the decisions that most effectively implement its goals.

What we frequently refer to here as a "rational choice" is both an optimizing choice (that is, one which when implemented effects the chooser's goals) and an optimized choice (that is, of the set of available choices, the one which has the highest chance of effecting the chooser's goals). It might also be an optimal choice (that is, the one that actually best effects the chooser's goals).

A chooser might pick an option at random which turns out to be (by sheer dumb luck) the optimal choice. Their choice would still be optimized, though the process they used to select it was not a reliable optimizing process.
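
A toy model of that three-way distinction (a sketch; all names and numbers are hypothetical):

    import random

    # Each available choice has some chance of effecting the chooser's goal.
    chance_of_success = {"ask": 0.9, "guess": 0.3, "bribe": 0.1}

    # Optimized choice: of the available choices, the one with the highest
    # chance of effecting the goal.
    optimized = max(chance_of_success, key=chance_of_success.get)  # "ask"

    # One way the world might actually turn out:
    outcome = {c: random.random() < p for c, p in chance_of_success.items()}

    # Optimal choice: a choice that actually effected the goal in this world.
    # A random pick can coincide with it by sheer dumb luck, but random
    # picking is not a reliable optimizing process.
    lucky_pick = random.choice(list(chance_of_success))
    lucky_pick_was_optimal = outcome[lucky_pick]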

This seems pretty straightforward and useful to me, which is why I'm adopting this language.

I endorse other people similarly adopting language that seems straightforward and useful to them.

What we frequently refer to here as a "rational agent" isn't an optimized agent, it's an optimizing agent -- one that makes the decisions that most effectively implement its goals.

I am reminded of one of the early videos in Norvig and Thrun's recent online AI class, where "optimal" was used in two different senses in rapid succession — to mean "the algorithm yields the shortest route" and "the algorithm executes in the best time". This yielded some confusion for a friend of mine, who assumed that the speaker meant that these were both aspects of some deeper definition of "optimal" which would then be explained. No such explanation was forthcoming.

Eliezer brilliantly wrote this in Twelve Virtues of Rationality:
"Do not be blinded by words. When words are subtracted, anticipation remains."

I think “rational” and “optimal” share similar anticipatory elements, but “optimal” is simpler and more abstract, whereas “rational” almost necessarily applies “optimal” to some bounded agent.

When I think of a “rational” decision versus an “optimal” decision, or a “rational” person versus an “optimal” person, the overlap I see is the degree of effectiveness of something.
What I anticipate with “rational” is the effectiveness of something as a result of the procedural decision-making of an agent with scarce knowledge and capability. Context reveals who this agent is; it’s often humans.
What I anticipate with “optimal” is the highest effectiveness of something, either inclusive or exclusive of an agent and scarcity. If the context reveals no agent, scarcity can be physical but not distributive; if the context reveals an agent, it will imply which agent and what level of scarcity.

I would imagine that using proper descriptors or clear context would alleviate a lot of the ambiguity.

I've had the opposite experience. It has seemed to me that, instead of overusing the word "rational", most people are in a bizarre hurry to taboo "rational" in favor of what they really mean. And people also get annoyed at e.g. lukeprog for using "Rational" too much in article titles.

I don't know, but I think this might have the unfortunate side effect of confusing people. "Why aren't we allowed to talk about rationality!?"

Those are my two pennies' worth of anecdotal evidence.

"Why aren't we allowed to talk about rationality!?"

We are allowed to talk about rationality, but talking about rationality should focus on the substance, which is not usually correlated with frequent repetition of the label "rational". (Compare how frequently you encounter "rationality" on LW to the frequency of "mathematics" in a calculus textbook.)

There is a particular danger of losing sight of the real goal when labels are overused. If I want to buy a car, I want to find one that is reasonably cheap, with low fuel consumption, good looking, fast, big enough or small enough, or whatever other criteria it may fulfill. Formulating the question as "what's the rational way to buy a car" risks obscuring the real problem and replacing the original goal with some abstract ideal of rationality, which in practice can be identified with partly arbitrary norms accepted by the community of self-identified rationalists. Don't underestimate the power of labels. When an abstract idea acquires a name, it starts its own life, becomes reified, and easily substitutes in thought for the original concrete concepts it was supposed to represent. You can see this effect with political labels all the time; if you think "rational" is immune, recall Randian Objectivism.

With a bit of exaggeration: "rationality" is too vague and broad a term to be used in discussions of rationality.

Prior to LessWrong, I always remembered rational as meaning "free of logical fallacy". A perfectly rational person was someone who, when given a group of premises, would always derive every logically valid conclusion possible from that set, and never derive a logically invalid one.

I try to use "Bayesian" or "lesswrongian".

I will start using "Optimal".

Actually, I'm not sure. Anyway, anecdotally, I got into a disagreement with somebody yesterday over what a rational agent would do to find out what somebody had for breakfast. I said that in most cases they would just ask, but the person with whom I was speaking said that they would cut the person open to see what they ate. When I said that this would lead to the agent getting arrested, the person fell back on the idea that, when asked, the rational agent would give no answer about what the person had for breakfast, since they didn't look inside and see. I rejected this because it would make me, as someone who would just ask, better at updating towards the truth (of what the person had for breakfast) than the rational agent.

This rambling summary of our disagreement could have been avoided if I had just been using the word "optimal" or some other term.

P.S. I think many of the points about the word "optimal" being sub-optimal are good ones, but I think it might be worth looking for a better term than "rational". On the other hand, while "optimal" may not be the optimal word to choose, it could still be the rational one.

Perhaps the optimal way to find out is cutting them open, too. At least if you optimise for certainty of the answer and ignore such side effects as arrest, ethics and disgust. People can have silly preconceptions of rationality, and they can have silly preconceptions of optimality as well (or perhaps a propensity to argue about silly things in general).

It might be the best way to find out the answer, but would you really argue that it would be a rational or optimal decision in real life?

I wouldn't, of course.

I granted him that it might make sense if the world somehow depended on it or if there were other crazy circumstances, but he was arguing that a more-rational-me would cut the person open, and I was saying that a more-rational-me would probably just ask.

Did he give any explanation of why such a silly thing would be rational? Does he exhibit similarly weird interpretations of "rationality" or other words on other occasions, or was his insistence only a result of an inability to change his mind, even if that means insisting on clearly unreasonable random guesses?

His definition of rational was something like "getting to the truth with the highest possible accuracy", but may even have been "getting to the truth 100% of the time".