John Cook draws on the movie Redbelt to highlight the difference between staged contests and real-world fights. The main character of the movie is a Jiu Jitsu instructor who is willing to fight if necessary, but will not compete under arbitrary rules. Cook analogizes this to the distinction between academic and real-world problem solving. Academics and students are often bound by restrictions that are useful in their own contexts, but are detrimental to someone who is more concerned with having a solution than with where the solution came from.

Robin pointed out arbitrary restrictions in academia to us before, but his question then concerned topics neglected for being silly. Following Cook's line of reasoning, are there any arbitrary restrictions we have picked up in school or other contexts that are holding us back? Are there rationalist "cheats" that are being underused?


Rather than just gambling with money, people could gamble with their lives. A global warming denier, for example, could announce that he is so sure that the earth will not be significantly warmer in ten years than it is today that if he is wrong about this he will kill himself. A legal system that enforced such a promise would, clearly, make it possible for someone to very credibly communicate the sincerity of his beliefs.

Scary, voted up.

This can be pushed further: law, morals, and ethics are often "holding us back". Dissection of the human body has been forbidden and allowed many times in history, and this affected our knowledge of anatomy and medicine. Many physical and psychological experiments that were done in the past cannot be reproduced today, for they were "unethical".

It doesn't have to be Nazi experiments. Informed consent requires that a person know that he is under study, which might skew the results.

Some famous experiments were even against the legislation of their time: Louis Pasteur tested his rabies vaccine illegally.

This vaccine was first used on 9-year old Joseph Meister, on July 6, 1885, after the boy was badly mauled by a rabid dog. This was done at some personal risk for Pasteur, since he was not a licensed physician and could have faced prosecution for treating the boy. However, left without treatment, the boy faced almost certain death from rabies.

Related to the Nazi experiments, there are people in the scientific community who argue that they should not be cited, even in cases where they provided valuable information:

Although it is difficult morally, one might concede that within the mass of pseudoscientific Nazi data some shreds can be valuable to researchers, as a small portion of the hypothermia data has proven to be. Of course, such data should be used only in the most exceptional circumstances and only in the absence of ethically derived data.

This seems absurd. The experiments were horrid and reprehensible, no question - but if they provided useful data, shouldn't we try to salvage at least something good from their deeds?

A policy against it may provide some marginal disincentive to future scientists under vile regimes.

Edit: of course the real cause of the objection is just 'moral contamination,' the same trigger-happy associational neural machinery used to avoid poisonous foods attaches negative affect to anything associated with the Nazis. But the heuristic can sometimes be useful, just as our cooperative emotions can be hacks to implement binding commitments.

If we assume those scientists actually care about their future number of citations, then yes.

How likely is it to be a result of genuine reasoning leading to this conclusion, and how likely is it to be just a rationalization of the yuck factor? It seems pretty straightforward.

A "global warming denier" doesn't necessarily believe the world is not getting warmer, or that it will certainly get colder. Just FYI.

One obvious "cheat" - ask someone else for an answer.

Much time is wasted working out something that should be looked up.

The fear and hatred of gambling. Contra Tyler Cowen, betting your beliefs is one of the best paths to both individual and group rationality. You should be doing it twice a day, like brushing your teeth. The beliefs that don't get bet get cavities and rot; the beliefs that are unbettable create unbreakable deadlocks that later require ophthalmological intervention. Bet!

One warning though: Gambler's ruin is very possible with betting systems, even if your strategy has a positive expected value.
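The point about ruin despite positive expected value can be illustrated with a quick Monte Carlo sketch (the parameters here are hypothetical, chosen for illustration: a 60%-win, even-odds bet, staking the entire bankroll every round):

```python
import random

def ruin_frequency(rounds=50, p_win=0.6, trials=2000, seed=0):
    """Bet the whole bankroll each round on a 60%-win, even-odds gamble.
    Each individual bet has positive expected value (1.2x the stake),
    but a single loss wipes out everything, so the chance of surviving
    n all-in rounds is only p_win**n."""
    rng = random.Random(seed)
    ruined = 0
    for _ in range(trials):
        money = 100.0
        for _ in range(rounds):
            if rng.random() < p_win:
                money *= 2.0   # win: double the stake
            else:
                money = 0.0    # loss: the whole bankroll is gone
                break
        if money == 0.0:
            ruined += 1
    return ruined / trials
```

Since the probability of surviving 50 consecutive all-in bets is 0.6**50 (around 8e-12), essentially every simulated bettor goes broke, even though every single bet was a "good" one. Betting a smaller fraction of the bankroll (as in Kelly-style sizing) is the standard way to avoid this.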

I warn those of you with a Netflix account that Redbelt is one awful mess of a movie. Yes, the Brazilian jiu-jitsu instructor refuses to compete against the evil Brazilians, but it's not because prize fights are inherently different from grappling matches. It's for some sort of dedication to an ideal of honoring a Japanese-looking master that even he can't coherently articulate. Brazilian jiu-jitsu is built upon wrestling others - in class, in tournaments, in MMA fights and in real life.

It's one thing to have an arbitrary restriction; it's another to have one that's simultaneously a contradiction of the core principles of your profession and so unclear even the person following it can't tell you what it is. And it's yet another thing to reward the instructor with a red belt - the supposed highest honor in jiu-jitsu - solely for getting angry, beating up security guards and brawling in the hallways of an arena to protest the stealing of his stupid idea.

Mamet tried to do about eleven different things with his script, and only Ejiofor's performance gave that movie its sole redeeming feature.

That being said, this "do not hit a girl" thing has always annoyed me. There are biological differences between the sexes, but if someone deserves to be punched, they deserve to be punched.


Given the difficulty of determining whether someone's pregnant, and the large possibility of violence causing a miscarriage, and the long time it takes for a successful pregnancy, it seems like a pretty good heuristic. Almost as good as "do not hit anyone".


The first thing that springs to mind is the arcane company policies in large corporations, where responsibilities and duties are so finely assigned that they often become barriers to getting actual work done. This becomes especially true if a situation comes along that there aren't codified rules for (or, as I like to think of it, a selection pressure is introduced that the system isn't adapted for).

I also think it's interesting to contrast this to the rational technique of the least convenient possible world, where instead of removing artificial restrictions you keep adding them.

I'm curious about which "large corporations" you're writing about. I worked for Walmart, the largest corporation in the world, for 2.5 years, and my perception of the "responsibilities and duties" was diametrically opposite to what you are claiming. Of course, I know it's the leftist/academic thing to dump on business, but could those of you who do actually provide some references showing that you aren't just making these claims up?

I think a common situation is the manager/idea-generator relationship. An idea generator is a person who spends a lot of their working time simply "thinking", and there is no apparent output to their task until the very end, when they output an idea. A programmer trying to design the right algorithm to solve a given problem is one example of an idea generator.

Often, the manager will want to have some sort of feedback on the progress, and have an estimate of the time remaining to completion. The idea-generator, however, has no idea how long their task will take. They might find the solution this afternoon, or they may spend months brainstorming on it.

And so the manager may "assign" responsibilities like writing daily reports on what was found so far, filling in time sheets, etc. to alleviate their nervousness from seeing nothing produced. Bureaucracy like this is just taking the idea-generator's mind off of the real problem at hand, and can slow things down.

We might say there are two kinds of "responsibility." School teaches people to be responsible to authority; the other kind is being responsible for eventual outcomes (such as truthfulness) by asking questions and challenging authority.

An example would be something I read recently about the institutional mindset held by journalists at newspapers: older editors and managers are practically begging young reporters for new ideas... the problem is that the type of people who go to work for newspapers now tend to want responsibilities (and security) given to them.

Meanwhile a lot of people who never finished their homework or followed their assignment guidelines were distracted from school by new technologies -- sites like this -- and learning from the proliferating information available online.

Ad hominems. We are so well schooled in 'traditional' deductive rationality that we instinctively shy away from using this strategy, even though it's quite powerful and often we're using it in practice anyway.

...we instinctively shy away from using this strategy... often we're using it in practice anyway.

Is this not contradictory?

You understand the hypocrisy, then. We rely on this very general & valid strategy in all sorts of real-life, real-money situations, but when it comes to discussions of complex important topics? All of a sudden it is 100% verboten.

This, it seems to me, is exactly what an underused rationalist cheat would look like.

Can you give an example of something that this change would sanction?

This, it seems to me, is exactly what an underused rationalist cheat would look like.

I agree that it seems to match my impression of what the form should be. However, it's not just an arbitrary rule to not use ad hominem arguments. Ad hominem is an informal fallacy - non-fallacious ad hominems are really not all that unheard-of in academia.

I don't think that post disagrees with me.

Well, are you talking about higher academia, or earlier schooling too? ie, there's always the classic "your paper must be at least X pages long", which in college professors will explicitly say that you have to unlearn.

your paper must be at least X pages long ... college professors will explicitly say that you have to unlearn

Funny - I find paper lengths to be a good guide to just how much I need to unpack my arguments. And these requirements never go away - try sending a 202 word abstract to a journal that asked for 200.

What do you mean?

Do you mean something like "This is really short. I'm probably skipping a bunch of steps in my reasoning and need to spell it out more explicitly" or do you mean something else?

And the abstract thing is about upper limits vs what I was talking about, which was lower limits.

Do you mean something like "This is really short. I'm probably skipping a bunch of steps in my reasoning and need to spell it out more explicitly" or do you mean something else?

Kind of something else, though I think you get the idea.

For any paper, you can always explain more steps in your reasoning, define your terms better, or bring out more of your assumptions. One of the good heuristics for determining how much you need to do this is to consider your audience. However, when writing for a class for your professor, you don't really have this luxury (if you're writing on something noncontroversial, your argument could be written as a conclusion followed by 'you already know the rest'). Since part of the purpose of writing the paper for the class is to demonstrate whether you understand the material, you need to explain to the professor things he already knows. Paper length is a good rough heuristic to let you know just how much of that you need to do in the course of your argument.

Both, and other contexts as well. Large corporate environments often work under the constraint of "use only standard, safe software tools like Microsoft Word or Java". It's just that those sort of constraints aren't internalized as frequently.

Is Science Doesn't Trust Your Rationality along the lines of what you're thinking?

Yes, thanks for reminding me of that post. Science (as an institution) is a very good example of a constraint that has its purposes, but sometimes has to be overridden.


Interesting article. One part confused me:

Libertarianism secretly relies on most individuals being prosocial enough to tip at a restaurant they won't ever visit again.

Why is this the case? In Australia tipping isn't particularly a norm that we adhere to. What is the point that it is trying to make? How prosocial do I have to be to tip at restaurants?

I think it's intended as a sufficient, but not necessary condition. If a culture can maintain a voluntary, prosocial norm like tipping, it will do well under libertarianism.

Problems in the real world have much more detailed context, which can be used to "guess" an answer that can then be tested rationally. Even if the answer turns out to be inadequate as it stands, and it often is, it usually provides an "advanced base" from which to find a better one.

Also, real-world problems rarely have an accessible "best" answer; you need to find one that is good enough, and you usually need to define "good enough" on the fly as well.