Does anyone know of a good article that illustrates how society is generally irrational, and how making society more rational would have huge benefits, because it'd be a very high level action?

I'm writing an essay about how to improve education, and one of my proposals is that a core part of the curriculum should be rationality. I believe that doing this would have huge benefits to society, and want to explain why I think this, but I'm having trouble. Any thoughts?

Edit: Part of Raising the Sanity Waterline talks about common ways in which people are irrational. However, they're all links to longer Less Wrong articles. Preferably, I'd like to illustrate it in a few sentences/paragraphs.


Society is made up of individuals. If you can demonstrate that individuals are irrational, then you have a better chance at claiming that the society is too. Yudkowsky wrote about the sanity waterline rather late when he had already covered a lot of other topics and I think this was intentional.

You can't just start from the assumption that society would be more rational if rationality was taught at school. You'd also need evidence that rationality can be taught to a lot of average people. I don't think such evidence exists. Whatever gets taken out of the curriculum might be replaced by something completely ineffective.

Of course, if changing the curriculum would make some of the smarter individuals more rational and leave the average student with nothing, the result might still be a net positive. This argument wouldn't convince anyone professing egalitarianism, however.

Individual benefits are far easier to sell than societal benefits. They're easier to imagine, examples are available, they're near rather than far, self interest is inherently motivating, and your reader won't be mindkilled by politics. If you can get the reader to accept the individual benefits, then you might be able to extrapolate a bit from there.

The title of this post is misleading, since you're not illustrating anything but asking for advice.

You can't just start from the assumption that society would be more rational if rationality was taught at school.

It could actually make things worse. It could put the whole society into a huge valley of bad rationality.

On the other hand, maybe that's exactly where the society is now.

You can't just start from the assumption that society would be more rational if rationality was taught at school. You'd also need evidence that rationality can be taught to a lot of average people. I don't think such evidence exists. Whatever gets taken out of the curriculum might be replaced by something completely ineffective.

Can't specific rationality techniques be effectively taught to a large number of average people, though? I vaguely recall studies where the researchers taught participants a trick or two before giving them a test of some sort, but my ability to recall specific examples is almost the geometric inverse of gwern's, so that certainly detracts from my point.

Can't specific rationality techniques be effectively taught to a large number of average people, though?

I know of no evidence that this is possible -- where "effectively" means "after several years still actively using these techniques in their lives".

[anonymous]:

I could swear there was research on delaying gratification that matched these criteria. It's not clear from the Wikipedia article whether the hot-cold strategy was prescriptive or descriptive, but I thought I remembered a prescriptive study that correlated with positive outcomes later in life.

The title of this post is misleading, since you're not illustrating anything but asking for advice.

You're right, sorry about that. I just changed it.

I took a semester long Critical Thinking course in college as part of a Philosophy major prerequisite (which I later dropped.) I was already familiar with the material, which was pretty sparse and underdeveloped compared to the Sequences, but I think every other student left that class having learned a lot about avoiding some major pitfalls of reasoning which affected their lives on a regular basis, and every student including myself agreed that something like it ought to have been a mandatory class in high school.

Individual benefits might be easier to sell than societal benefits, but honestly, I think that the societal benefits of general rationality education would utterly dwarf the benefits of intensive rationality training in a small sector of the populace. Biases hampering our ability to productively debate policy, to evaluate initiatives empirically, to design programs to attain their actual intended purposes, etc. all harm the social structures around which all people, from the most rational to the least, are forced to build their lives.

You might start by tabooing rationality.

The proxy we used in the last LW survey for rationality was whether people are well calibrated in their uncertainty. It might be a lot easier to write an article about how calibrating your uncertainty is beneficial than to write about an abstract phrase like "rationality".
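Calibration, as mentioned above, has the advantage of being concretely measurable. One common measure is the Brier score; the sketch below uses made-up illustrative numbers, not data from the LW survey.

```python
# A minimal sketch of how calibration might be scored, using the Brier score.
# All probabilities and outcomes below are invented for illustration.

def brier_score(predictions, outcomes):
    """Mean squared error between stated probabilities and what happened (lower is better)."""
    return sum((p - o) ** 2 for p, o in zip(predictions, outcomes)) / len(predictions)

# A forecaster who says "90% confident" and is right 9 times out of 10
# is well calibrated; one who says "90%" but is right only half the time is not.
well_calibrated = brier_score([0.9] * 10, [1] * 9 + [0])      # ~0.09
overconfident   = brier_score([0.9] * 10, [1] * 5 + [0] * 5)  # ~0.41

print(well_calibrated, overconfident)
```

The gap between the two scores is what "poorly calibrated" cashes out to in practice.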

I'm writing an essay about how to improve education, and one of my proposals is that a core part of the curriculum should be rationality.

Again, what do you mean by rationality? Medical doctors are already taught Bayes' rule somewhere in their statistics course. They just don't learn it in a way that allows them to actually use it later when they practice medicine.

The problem is that we don't have any evidence that we can effectively teach Bayes' rule to those kinds of people even if we make space in the curriculum.

That makes your proposal for putting rationality into the school curriculum an advocacy of a non-evidence-based intervention. That might or might not bother you, but it's something to keep in mind when writing an article, and maybe even worth explicitly addressing.
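The kind of problem doctors famously fail at is the diagnostic base-rate question, which a one-line application of Bayes' rule resolves. The numbers below (prevalence, sensitivity, false-positive rate) are invented for the example.

```python
# A hedged illustration of the classic diagnostic-test question:
# given a positive test, how likely is the disease? All rates are made up.

def posterior(prior, sensitivity, false_positive_rate):
    """P(disease | positive test) via Bayes' rule."""
    p_positive = prior * sensitivity + (1 - prior) * false_positive_rate
    return prior * sensitivity / p_positive

# 1% prevalence, 90% sensitivity, 5% false-positive rate:
p = posterior(0.01, 0.9, 0.05)
print(f"{p:.0%}")  # roughly 15% -- far lower than most untrained intuitions suggest
```

Knowing the formula in a statistics course and reaching for it at the bedside are, as the comment notes, two different skills.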

gjm:

Your question rings alarm bells for me.

ntroPi:

I think you have a good point, but it would be easier to see if you had posted a short sentence explaining what your point is. Please don't assume that every reader has read all the sequences or has the time to do so (edit: read this one) just to understand your comment.

The idea is that you shouldn't start your reasoning process from the conclusion, if you want to be rational. For a rational person, the conclusion is what they get at the end, after weighing all available evidence, not a starting point.

Specifically, you don't know whether "rationality would be beneficial for the society". So you shouldn't start at this point (the conclusion). What if you are wrong (but there is selective evidence you could use to support your conclusion anyway)?

gjm:

I certainly don't assume that any particular reader has read all the sequences (nor that they should). I don't think it's so unreasonable to suggest reading one particular not-so-long post -- whose title might give the game away to a sufficiently quick-witted reader without even needing to follow the link.

ntroPi:

This decreases your work in commenting by increasing the work for some readers. It would be globally more useful to spend one minute on a better comment, like the one Viliam_Bur has posted, than to have an unknown number of people read the linked article just to understand your point.

Your utility function and opinion may differ, though; perhaps your intention was not primarily to get a point across but to make people read the article?

gjm:

I'm sorry that you didn't like my comment.

My intention was to get a point across. I thought that anyone who read my comment, didn't find its meaning clear, and was interested enough that they'd have bothered to read a longer and more explicit one would probably also be willing to read the thing I linked to, and that they might find it interesting if they did.

(Being terse plainly hasn't, in fact, decreased the amount of effort I've had to expend.)

ntroPi:

I actually read the article because of your post, and it was interesting. I agree with your point; I just didn't like the style, and I could have been more diplomatic about that.

Keep posting. :-)

Please don't assume that every reader has read all the sequences or has the time to do so just to understand your comment.

A particular post was linked. The implied requirement of having to "read all the sequences" is an extreme distortion of the issue that makes your remark seem more relevant.

ntroPi:

You're right. "Has read a majority of the sequences so that there is a high probability that this specific sequence is among them" would have been more precise.

While it was an exaggeration, "extreme distortion" seems like a harsh judgement.

Edit: oh sorry - I didn't mean to imply all the sequences are necessary for understanding. I'll fix the sentence.

[This comment is no longer endorsed by its author]

Having to read the "majority of the sequences" is still an extreme distortion. It's enough to have a look at the (single) linked post.

Yes, I replied too fast to your comment. Already fixed.

A commonly circulated article in the business world is The case for behavioral strategy.

On the relevance to GCRs, there's Cognitive biases potentially affecting judgment of global risks.

Assuming that rationality can be taught at school to everyone, is there even a connection between more rational individuals and a more rational society?

The problem I see here is that rationality is already very weakly defined for individuals, and I know of no definitions in the context of society. A society can't even think (or can it?); how can it be rational?

Many of society's decision processes are not based on rationality at all, and I see no reason why the tried ways of winning (e.g. corruption) should be replaced by others if the only change is slightly more rational agents. Elections produce an average of opinions, and this average may not change at all given more rational voters.

You will have to cover a lot of inference steps just to show that society as a whole will become more rational. Rationality isn't the only attribute of a "good" society and there might be ugly trade-offs. Whether a more rational society will have any "huge benefits" is just the last question in a chain that will surely be too much for a single article or a few sentences.

The problem I see here is that rationality is already very weakly defined for individuals, and I know of no definitions in the context of society. A society can't even think (or can it?); how can it be rational?

The first answer that comes to mind is "A society can be considered rational if the institutions that society creates collectively would be considered rationally and intelligently designed had they been designed by an individual."

This is clearly not usually the case, but some societies have it much worse than others. Yvain's writings on Haiti reveal a society which is, by this metric, much less rational than America. I see no reason to suppose that existing first world countries have hit some theoretical ceiling on societal rationality.

So a society is rational if the institutions are rational ... and an institution is rational if its outputs seem rationally designed ... which is judged by a rational individual ... which is still hard to define.

I see your point and agree that there is room for improvement. Instead of "more rational" I would propose "less insane", which seems to fit the evidence as well as the other description.

Will one of these more insane societies become less insane by making sure everybody on the streets is less insane? The connection doesn't seem obvious, except in extreme cases.

The connection between rational individuals and a rational society is implied by the use of the same word, and is only obvious in extreme cases.

What do you mean by "only obvious in extreme cases?"

I would definitely not agree that the connection between rational individuals and rational society is merely implied by the use of the same word; I would absolutely say that they're inextricably linked. Having attempted cooperative projects with other people over a wide range of rationality levels, I've found that working with groups of more rational individuals really does eliminate a huge cohort of problems which attend the work of less rational individuals.

One member on this site, years ago, discussed how his boss had once remarked on how well a project he (the commenter) had handled had gone, and spoken of it as if it were simply a fortuitous chance. And the commenter explained to the boss that the project had gone well because he'd designed it to go well by addressing the possible points of failure. This was a possibility that had simply never occurred to his boss before.

Operating within rational versus irrational groups can spell the difference between everyone understanding concepts like this versus nobody understanding them.

What do you mean by "only obvious in extreme cases?"

Just that there is no obvious mechanism that produces a more rational society from more rational people.

Again I agree on the positive effects of rationality and do believe that more rationality will improve society. But there are many people that say the same about religion, obedience or other things that I don't view as positive.

I don't think it's true at all that there's no obvious mechanism that produces a more rational society from more rational people.

If I'm working on a project with a group of irrational people, the other members will tend to make mistakes of judgment which I'm simply too many steps of inference removed from them to realistically explain. So I give up, and the project suffers.

If I'm working on a project with a group of highly rational people, those problems can be avoided without even needing to be discussed, saving energy for higher level problems.

Groups are made up of individuals. If every individual in a group recognizes the problems which will attend a course of action, that group is much more likely to avoid those problems than a group where nobody recognizes them.

A group project is far away from society as a whole, where discussion and explanation between all members is impossible due to scale.

Your project could benefit from increased obedience as you could just lead rationally and the others would follow. Disagreements between rational people can take a longer time to resolve, etc.

I still agree with all your examples. More anecdotes will not be helpful, as I already agree that increased rationality will improve society (and group projects and institutions, for that matter).

What I'm missing is a clear mechanism that actually produces a more rational society just from increasing the rationality of people. Please explain the mechanism.

"Society" doesn't make decisions, groups of people make decisions. If every individual in the group understands how to avoid natural pitfalls, how to coordinate decisionmaking processes, how to take on board information from viewpoints which conflict with their own and incorporate what's useful rather than throwing it out whole cloth, etc., then the collective decisionmaking ability of the group is improved.

Your project could benefit from increased obedience as you could just lead rationally and the others would follow. Disagreements between rational people can take a longer time to resolve, etc.

The projects I participated in could have benefited from increased group obedience, if everyone simply followed my lead, but if the members lacked the reasoning ability to distinguish between competent leaders, how would they know who to trust to lead them?

In my experience, disagreements between genuinely rational people overwhelmingly do not take a longer time to resolve. One of the basic components of rationality is knowing how to take new information on board and actually change your mind. Disagreements between irrational people tend to be far more intractable.

"Society" doesn't make decisions, groups of people make decisions.

The way society forms mass opinions and decides (e.g. by voting) on important issues is not easily split into groups of people making decisions.

Still I accept your mechanism because group decisions are a large part of society and improving that will improve society.

About the group project: If we can get everyone to be "genuinely rational" instead of just a bit more rational we will certainly live in a very different world. I don't expect that anytime soon though.

Your project could benefit from increased obedience as you could just lead rationally and the others would follow.

This is a good point even for the society. To get a rational society, it is not necessary that literally everyone becomes rational. Just that the rational people make the most important decisions, and the others follow them.

Although there are dangers with this solution in the long term; specifically, that some day the irrational people may decide to stop following the rational ones. In a democracy this means someone else uses some simple tricks to get their attention, and wins the election. On the other hand, non-democratic societies have another long-term risk, which is the leading group becoming irrational from the inside: either they lose their sanity gradually, or just a small subset goes insane and succeeds in removing the others from the inner circle.

Have a look at Kathryn Schulz's writing. She focuses on a single key piece of rationality - the necessity and desirability of changing one's mind - and writes about it in a fashion that I assume would suit your project very well.

Dan Ariely's books offer some good examples of organizations leaving metaphorical hundred-dollar bills lying on the ground, and refusing to pick them up even when they're pointed out, apparently out of sheer inertia and habit. I'd be more specific, but it's been more than a year since I read any of them, so I've forgotten most of the details. I'd suggest checking out Predictably Irrational and The Honest Truth about Dishonesty for some good examples.

society is generally irrational

Do you mean that people are generally irrational, or that the people could change their relations to each other so as to better achieve their ends?

What you wrote seems to me an anthropomorphism of society.

That people are generally irrational.

Society is irrational enough that degrees with a large rationality training component (MBA, Finance, maybe CS somewhat?) make the individual who acquires them very valuable.

Something about the War On Drugs, maybe? Or even better, something which happened in the past and is now clearly absurd, such as the Dutch Tulip Bubble, or the threat to racial purity in America posed by the Irish.

Part of the problem is that rationality is a threat to many groups, and they have a tendency to consider rationality to be an ideology, equivocate between "ideology" and "religion", and then pretend that teaching rationality in public school constitutes an establishment of religion. People who define their identity in terms of irrational belief systems tend not to like the idea of the next generation being raised to be rational.

And of course, things like the Prisoner's Dilemma show that individual rationality doesn't necessarily translate to group rationality, while the Efficient Market Hypothesis posits that the aggregate can be treated as being rational in the absence of individual rationality.
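The Prisoner's Dilemma point above can be made concrete with the standard payoff table (these are the textbook illustrative payoffs, not numbers from the comment): defecting is individually rational against either opponent move, yet mutual defection leaves both players worse off than mutual cooperation.

```python
# Standard Prisoner's Dilemma payoffs, showing individual rationality
# failing to produce a good group outcome.
# payoffs[(my_move, their_move)] = (my_payoff, their_payoff)
payoffs = {
    ("C", "C"): (3, 3),  # both cooperate
    ("C", "D"): (0, 5),  # I cooperate, they defect
    ("D", "C"): (5, 0),  # I defect, they cooperate
    ("D", "D"): (1, 1),  # both defect
}

def best_reply(their_move):
    """The individually rational move against a fixed opponent move."""
    return max("CD", key=lambda mine: payoffs[(mine, their_move)][0])

# Defection dominates: it is the best reply to either opponent move...
assert best_reply("C") == "D" and best_reply("D") == "D"
# ...yet mutual defection (1, 1) is worse for everyone than mutual cooperation (3, 3).
print(payoffs[("D", "D")], "<", payoffs[("C", "C")])
```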

I think that one idea that should be discussed in schools is making intuitive reasoning explicit. For instance, I think that everyone engages in a version of Bayesian reasoning, but most people just have a vague, intuitive sense of it that is very buggy and susceptible to manipulation, like affirming the consequent (if (A -> B) and B, then that can be evidence for A, but it is not proof). Unfortunately, in the cases where students do get lessons on rhetoric, the focus is often on constructing arguments that are persuasive, rather than on constructing arguments that are valid.
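The parenthetical claim above can be checked numerically: if A implies B, then observing B raises the probability of A without proving it. The prior and the probability of B without A below are arbitrary numbers chosen for illustration.

```python
# A sketch of "B is evidence for A, but not proof" when A -> B.
# The prior P(A) and P(B | not-A) are invented illustrative values.

p_a = 0.3              # prior P(A)
p_b_given_a = 1.0      # A -> B, so B always follows from A
p_b_given_not_a = 0.6  # B can also happen without A

p_b = p_a * p_b_given_a + (1 - p_a) * p_b_given_not_a
p_a_given_b = p_a * p_b_given_a / p_b  # Bayes' rule

print(f"P(A) = {p_a}, P(A|B) = {p_a_given_b:.2f}")
# The posterior rises above the prior (evidence for A) yet stays below 1 (not proof).
assert p_a < p_a_given_b < 1
```

This is exactly the gap between the deductive fallacy and the valid probabilistic update.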

If you're arguing on the basis of the benefit to society, one point to bring up is that people should be able to debate issues constructively. That means articulating their thinking (and this in turn means being conscious of what that thinking is, which is often not a trivial issue), understanding what the thinking behind the opposing point of view is, how to clarify issues and identify the main issue of contention, etc.

Part of the problem is that rationality is a threat to many groups, and they have a tendency to consider rationality to be an ideology, equivocate between "ideology" and "religion", and then pretend that teaching rationality in public school constitutes an establishment of religion. People define their identity in terms of irrational belief systems tend to not like the idea of the next generation being raised to be rational.

Actually those people are probably right when they charge that pushing rationality often goes hand in hand with pushing an ideology.

It's no accident that one of the examples Eliezer uses in the link is Robert Aumann's "scary" Jewish beliefs.

In the same way one can catch racists via implicit reasoning tests, you can probably catch new atheists who push rationality as a means to push an ideology.

Actually those people are probably right when they charge that pushing rationality often goes hand in hand with pushing an ideology.

And even worse, people pushing "rationality" often aren't actually pushing rationality (defined as: making maps that better correspond to the territory).

If a random person came and tried to teach (their definition of) rationality at schools, in absence of more specific information, I would probably be afraid.

Yes, a lot of those people don't think in terms of maps.

As far the definition goes I'm even more radical and would say: "Making maps that allow you to better navigate the territory."

From a slightly different angle-- there are a lot of established groups (families, schools, religions, countries) which don't want their members thinking about whether loyalty is worth the costs and whether obedience serves the purposes of the group. For example, one major reason to enlist in the military is to serve one's country. People are not exactly encouraged to think about whether joining the military (whether for them personally or in general) is an effective means of doing so.

Teaching rationality in a way which implies that people should make a serious effort to make rational decisions could be up against a hard fight.

Now that I think about it, CFAR and the like haven't run into that sort of resistance that I know of, and I'm assuming it's because CFAR is still too small to be noticed.