To teach people about a topic you've labeled "rationality", it helps for them to be interested in "rationality". (There are less direct ways to teach people how to attain the map that reflects the territory, or optimize reality according to their values; but the explicit method is the course I tend to take.)
And when people explain why they're not interested in rationality, one of the most commonly proffered reasons is something like: "Oh, I've known a couple of rational people and they didn't seem any happier."
Who are they thinking of? Probably an Objectivist or some such. Maybe someone they know who's an ordinary scientist. Or an ordinary atheist.
That's really not a whole lot of rationality, as I have previously said.
Even if you limit yourself to people who can derive Bayes's Theorem—which is going to eliminate, what, 98% of the above personnel?—that's still not a whole lot of rationality. I mean, it's a pretty basic theorem.
Since the beginning I've had a sense that there ought to be some discipline of cognition, some art of thinking, the studying of which would make its students visibly more competent, more formidable: the equivalent of Taking a Level in Awesome.
But when I look around me in the real world, I don't see that. Sometimes I see a hint, an echo, of what I think should be possible, when I read the writings of folks like Robyn Dawes, Daniel Gilbert, Tooby & Cosmides. A few very rare and very senior researchers in psychological sciences, who visibly care a lot about rationality—to the point, I suspect, of making their colleagues feel uncomfortable, because it's not cool to care that much. I can see that they've found a rhythm, a unity that begins to pervade their arguments—
Yet even that... isn't really a whole lot of rationality either.
Even among those few who impress me with a hint of dawning formidability—I don't think that their mastery of rationality could compare to, say, John Conway's mastery of math. The base knowledge that we drew upon to build our understanding—if you extracted only the parts we used, and not everything we had to study to find it—it's probably not comparable to what a professional nuclear engineer knows about nuclear engineering. It may not even be comparable to what a construction engineer knows about bridges. We practice our skills, we do, in the ad-hoc ways we taught ourselves; but that practice probably doesn't compare to the training regimen an Olympic runner goes through, or maybe even an ordinary professional tennis player.
And the root of this problem, I do suspect, is that we haven't really gotten together and systematized our skills. We've had to create all of this for ourselves, ad-hoc, and there's a limit to how much one mind can do, even if it can manage to draw upon work done in outside fields.
The chief obstacle to doing this the way it really should be done, is the difficulty of testing the results of rationality training programs, so you can have evidence-based training methods. I will write more about this, because I think that recognizing successful training and distinguishing it from failure is the essential, blocking obstacle.
There are experiments done now and again on debiasing interventions for particular biases, but it tends to be something like, "Make the students practice this for an hour, then test them two weeks later." Not, "Run half the signups through version A of the three-month summer training program, and half through version B, and survey them five years later." You can see, here, the implied amount of effort that I think would go into a training program for people who were Really Serious about rationality, as opposed to the attitude of taking Casual Potshots That Require Like An Hour Of Effort Or Something.
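The serious experiment described above—randomize signups between two versions of a program, then follow up years later—is just a standard randomized controlled trial applied to rationality training. As a purely illustrative sketch (the program names, participant labels, and outcome measure are all hypothetical), the assignment step might look like:

```python
import random

def assign_programs(signups, seed=0):
    """Randomly split signups between version A and version B of a
    hypothetical three-month training program. Seeded so the
    assignment is reproducible for the follow-up survey."""
    rng = random.Random(seed)
    shuffled = list(signups)
    rng.shuffle(shuffled)
    half = len(shuffled) // 2
    return {"A": shuffled[:half], "B": shuffled[half:]}

def mean_outcome(scores):
    """Average of whatever follow-up measure the survey collects
    (calibration scores, life outcomes, etc.)."""
    return sum(scores) / len(scores)

# Ten hypothetical signups, split into two cohorts.
groups = assign_programs(["p%d" % i for i in range(10)])
# Five years later: compare mean_outcome() of the two cohorts' surveys.
```

The hard part, of course, is not the randomization but choosing an outcome measure that actually tracks rationality; the code only shows how cheap the experimental scaffolding itself is.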
Daniel Burfoot brilliantly suggests that this is why intelligence seems to be such a big factor in rationality—that when you're improvising everything ad-hoc with very little training or systematic practice, intelligence ends up being the most important factor in what's left.
Why aren't "rationalists" surrounded by a visible aura of formidability? Why aren't they found at the top level of every elite selected on any basis that has anything to do with thought? Why do most "rationalists" just seem like ordinary people, perhaps of moderately above-average intelligence, with one more hobbyhorse to ride?
Of this there are several answers; but one of them, surely, is that they have received less systematic training in rationality, in a less formal context, than a first-dan black belt gets in hitting people.
I do not except myself from this criticism. I am no beisutsukai, because there are limits to how much Art you can create on your own, and how well you can guess without evidence-based statistics on the results. I know about a single use of rationality, which might be termed "reduction of confusing cognitions". This I asked of my brain, this it has given me. There are other arts, I think, that a mature rationality training program would not neglect to teach, which would make me stronger and happier and more effective—if I could just go through a standardized training program using the cream of teaching methods experimentally demonstrated to be effective. But the kind of tremendous, focused effort that I put into creating my single sub-art of rationality from scratch—my life doesn't have room for more than one of those.
I consider myself something more than a first-dan black belt, and less. I can punch through brick and I'm working on steel on my way to adamantine, but I have a mere casual street-fighter's grasp of how to kick or throw or block.
Why are there schools of martial arts, but not rationality dojos? (This was the first question I asked in my first blog post.) Is it more important to hit people than to think?
No, but it's easier to verify when you have hit someone. That's part of it, a highly central part.
But maybe even more importantly—there are people out there who want to hit, and who have the idea that there ought to be a systematic art of hitting that makes you into a visibly more formidable fighter, with a speed and grace and strength beyond the struggles of the unpracticed. So they go to a school that promises to teach that. And that school exists because, long ago, some people had the sense that more was possible. And they got together and shared their techniques and practiced and formalized and practiced and developed the Systematic Art of Hitting. They pushed themselves that far because they thought they should be awesome and they were willing to put their backs into it.
Now—they got somewhere with that aspiration, unlike a thousand other aspirations of awesomeness that failed, because they could tell when they had hit someone; and the schools competed against each other regularly in realistic contests with clearly-defined winners.
But before even that—there was first the aspiration, the wish to become stronger, a sense that more was possible. A vision of a speed and grace and strength that they did not already possess, but could possess, if they were willing to put in a lot of work, that drove them to systematize and train and test.
Why don't we have an Art of Rationality?
Third, because current "rationalists" have trouble working in groups: of this I shall speak more.
Second, because it is hard to verify success in training, or which of two schools is the stronger.
But first, because people lack the sense that rationality is something that should be systematized and trained and tested like a martial art, that should have as much knowledge behind it as nuclear engineering, whose superstars should practice as hard as chess grandmasters, whose successful practitioners should be surrounded by an evident aura of awesome.
And conversely they don't look at the lack of visibly greater formidability, and say, "We must be doing something wrong."
"Rationality" just seems like one more hobby or hobbyhorse, that people talk about at parties; an adopted mode of conversational attire with few or no real consequences; and it doesn't seem like there's anything wrong about that, either.
Eliezer, I have recommended to you before that you read The Darkness That Comes Before and the associated trilogy. I repeat that recommendation now. The monastery of Ishual is your rationalist dojo, and Anasurimbor Kellhus is your beisutsukai surrounded by a visible aura of formidability. The book might even give you an idea or two.
My only worry with the idea of these dojos is that I doubt the difference between us and Anasurimbor Kellhus is primarily a difference in rationality levels. I think it is more likely to be akrasia. Even an irrational, downright stupid person can probably think of fifty ways to improve his life, most of which will work very well if he only does them (quit smoking, quit drinking, study harder in school, go on a diet). And a lot of people I know with pretty well-developed senses of rationality don't use them for anything more interesting than winning debates about abortion or something. Maybe the reason rationalists rarely do that much better than anyone else is that they're not actually using all that extra brainpower they develop. The solution to that isn't more brainpower.
Kellhus was able to sit down, enter the probability trance, decide on the be... (read more)
I think the akrasia you describe and methods of combating it would come under the heading of "kicking", as opposed to the "punching" I've been talking about. It's an art I haven't created or learned, but it's an art that should exist.
This "art of kicking" is what pjeby has been working toward, AFAICT. I haven't read much of his writing, though. But an "art of kicking" would be a great thing to mix in with the OB/LW corpus, if pjeby has something that works, which I think he has at least some of -- and if we and he can figure out how to hybridize kicking research and training with punching research and training.
I'd also love to bring in more people from the entrepreneurship/sales/marketing communities. I've been looking at some of their better literature, and it has rationality techniques (techniques for not shooting yourself in the foot by wishful thinking, overconfidence, etc.) and get-things-done techniques mixed together. I love the sit-and-think math nerd types too, and we need sitting and thinking; the world is full of people taking action toward the wrong goals. But I'd expect better results from our rationalist community if we mixed in more people whose natural impulses were toward active experiments and short-term visible results.
Pjeby's working on akrasia? I'll have to check out his site.
That brings up a related question that I think Eliezer hinted at: what pre-existing bodies of knowledge can we search through for powerful techniques so that we don't have to re-invent the wheel? Entrepreneurship stuff is one. Lots of people have brought up pick-up artists and poker, so those might be others.
I nominate a fourth that may be controversial: mysticism. Not the "summon demons" style of mysticism, but yoga and Zen and related practices. These people have been learning how to examine/quiet/rearrange their minds and sort out the useful processes from the useless processes for the past three thousand years. Even if they've been working off crazy metaphysics, it'd be surprising if they didn't come up with something. Eliezer talks in mystical language sometimes, but I don't know whether that's because he's studied and approves of mysticism or just likes the feel of it.
What all of these things need is a testing process combined with people who are already high-level enough that they can sort through all the dross and determine which techniques are useful without going native, or opening themselves up to the accusation that they're doing so; i.e., people who can sort through the mystical/pick-up artist/whatever literature and separate out the things that are useful to rationalists from the things specific to a certain worldview hostile to our own. I've seen a few good people try this, but it's a mental minefield and they tend to end up "going native".
In the case of pickup literature, there is a lot to attract rationalists, but also a lot to inspire their ire.
The first thing rationalists should notice about pickup is that it wins. There are no other resources in mainstream culture or psychology that are anywhere near as effective. Yet even after witnessing the striking ability of pickup theories to win, I am hesitant to say that they are actually true. For example, I acknowledge the fantastic success of notions like "women are attracted to Alpha Males," even though I don't believe that they are literally true, and I know that they are oversimplifications of evolutionary psychology. Consequently, I am an instrumentalist, not a realist, about pickup theories.
If we started a project from scratch where we applied rationality to the domain of sex and relationships, and developed heuristics to improve ourselves in those areas, this project would have a considerable overlap with the teachings of the seduction community. At its best, pickup is "applied evolutionary psychology." Many of the common criticisms of pickup demonstrate an anger against the use of rationality and scientific thinking in the supposedly sacred... (read more)
Also, since this particular community leans altruistic, I'd hope that such a project would emphasize the future happiness of potential partners more than does (correct me if I'm wrong) the current pickup community.
I tune out wherever I hear the term 'alpha male' in that sort of context. The original scientific concept has been butchered and abused beyond all recognition. Even more so the 'beta' concept. Beta males are the ones standing right behind the alpha ready to overthrow him and take control themselves. 'Omega' should be the synonym for 'pussy'.
But I must admit the theory is at least vaguely in the right direction and works. Reasonably good as popular science for the general public. Better than what people believe about diet, showering, and dental hygiene.
Actually, the best (and most common) criticisms I see are more due to the use of lies and manipulation in the area of sex and romance.
The evo-psych stuff (and thereby any science and rationality) is perfectly fine by me.
You've hit on something that I have long felt should be more directly addressed here/at OB. Full disclosure is that I have already written a lot about this myself and am cleaning up some "posts" and chipping away here to get the karma to post them.
It's tough to talk about meditation-based rationality because (a) the long history of truly disciplined mental practice comes out of a religious context that is, as you note, comically bogged down in superstitious metaphysics, (b) it is a more-or-less strictly internal process that is very hard to articulate, and (c) it has become a kind of catch-all category for sloppy new-age thinking about a great number of things (wrongheaded pop quantum theory, anyone?)
Nevertheless, as Yvain notes, there is indeed a HUGE body of practice and tried-and-true advice, complete with levels of mastery and, if you have been lucky enough to know some of the masters, that palpable awesomeness Eliezer speaks of. I'm sure all of this sounds pretty slippery and poppish, but it doesn't have to be. One thing I would like to help get going here is a rigorous discussion, for my benefit and everyone's, about how we can apply the science of cognition to the practice of meditation and vice versa.
Couldn't agree more. Execution is crucial.
I can come out of a probability trance with a perfect plan, an ideal path of least resistance through the space of possible worlds, but now I have to trick, bribe or force my messy, kludgy, evolved brain into actually executing the plan.
A recent story from my experience. I had (and still have) a plan involving a relatively large chunk of work, around a full-time month. Nothing challenging, just a 'sit down and do it' sort of thing. But for some reason my brain is unable to see how this chunk of work will benefit my genes, so it switches into procrastination mode when exposed to this work. I tried to force myself to do it, but now I get an absolutely real feeling of 'mental nausea' every time I approach this task – yes, I literally want to hurl when I think about it.
For a non-evolved being, say an intelligently-designed robot, the execution part would be a non-issue – it gets a plan, it executes it as perfectly as it can, give or take some engineering inefficiencies. But for an evolved being trying to be rational, it's an entirely different story.
If one had public metrics of success at rationality, the usual status seeking and embarrassment avoidance could encourage people to actually apply their skills.
An idea on how to make the execution part trivial – a rational planner should treat his own execution module as a part of the external environment, not as a part of 'himself'. This approach will produce plans that take into account the inefficiencies of one's execution module and plan around them.
Because they don't win? Because they don't reliably steer reality into narrow regions other people consider desirable?
I've met and worked with several irrationalists whose models of reality were, to put it mildly, not correlated with said reality, including one explicit, outspoken anti-rationalist with a totally weird, alien epistemology. All these people had a couple of interesting things in common.
On one hand, they were often dismal at planning – they were unable to see obvious things, and they couldn't be convinced otherwise by any arguments appealing to 'facts' and 'reality' (they universally hated these words).
On the other hand, they were surprisingly good at execution. All of them were very energetic people who didn't fear any work or situation at all, and I almost never saw any of them procrastinating. Could this be because... (read more)
I'll describe the three most interesting cases.
Number One is a Russian guy, now in his late 40s, with a spectacular youth. Among his trades were smuggling (during the Soviet era he smuggled brandy from Kazakhstan to Russia in the water system of a railway car), teaching in a ghetto college (where he inadvertently tamed a class of delinquents by hurling a wrench at their leader), leading a programming lab in an industrial institute, starting the first 3D visualization company in our city, reselling TV advertising time at a great margin (which he obtained by undercover deals involving key TV people and some outright gangsters), and saving the world by trying to find venture funding for a savant inventor who supposedly had a technology enabling geothermal energy extraction (I also worked together with them on this project). He was capable of totally crazy things, such as harpooning a wall portrait of a notorious Caucasus clanlord in a room full of his followers. He had lots of money during his successful periods, but was unable to convert this into a longer-term success.
Number Two is a deaf-mute woman, now in her 40s, who owns and runs a web development company. Her speech is distorted, ... (read more)
Thanks, Vladimir. You have interesting friends!
Yes, the guy is smart, swift-thinking and quick to act when it comes to getting projects up from the ground, connecting the right people and getting funding from nowhere (much less so when it comes to technical details and fine-grained planning). His actual decisions are effective, regardless of the stuff he has in the conscious part of his head.
(Actually quite a lot of people whose 'spoken' belief systems are suboptimal or plain weird are perfectly able to drive cars, run companies, avoid tigers and otherwise deal with the reality effectively.)
But can we call such 'hardware-accelerated' decisions rational? I don't know.
Regarding your question. We had obvious disagreements with this guy, and I spent some time thinking about how we could resolve them. As a result, I decided that trying to resolve them (on a conscious level, of course) is futile unless we have an agreement about fundamental things -- what we define as truth, and which methods we can use to derive truths from other truths.
I didn't think much about this issue before I met him (a scientific, or more specifically, Popperian worldview was enough for me), and this was the first time I had to consciously think about the issue. I even doubt I knew the meaning of the term 'epistemology' back then :)
Unfortunately, this hasn't aged very impressively.
Despite the attempts to build the promised dojo (CFAR, Leverage/Paradigm, the EA Hotel, Dragon Army, probably several more that I'm missing), rationalists aren't winning in this way. The most impressive result so far is that a lot of mid-tier powerful people read Slate Star Codex, but I think most of that isn't about carrying on the values Eliezer is trying to construct in this sequence - Scott is a good writer on many topics, most of which are at best rationality-adjacent. The second most impressive result is the power of the effective altruism movement, but that's also not the same thing Eliezer was pointing at here.
The remaining positive results of the 2009 rationality community are a batch of happy group houses, and MIRI chugging along its climb (thanks to hard-to-replicate personalities like Eliezer and Nate).
I think the "all you need is to try harder" stance is inferior to the "try to make a general postmortem of 'rationalist dojo' projects in general" stance, and I'd like to see a systematic attempt at the latter, assembling public information and interviewing people in all of these groups, and integrating all the data on why they failed to live up to their promises.
I'm relatively new to rationality, but I've been a nihilist for nearly a decade. Since I started taking the development of my own morality seriously, I've put about 3500 hours of work into developing and strengthening my ethical framework. Looking back at myself when nihilism was just a hobbyhorse, I wasn't noticeably moral, and I certainly wasn't happy. I was a guy who knew things, but the things I knew never got put into practice. 5 years later, I'm a completely different person than I was when I started. I've made a few discoveries, but not nearly enough to account for the radical shifts in my behavior. My behavior is different because I practice.
I know a few other nihilists. They post pictures of Nietzsche on Facebook, come up with clever arguments against religion, and have read "the Anti-Christ." They aren't more moral... (read more)
While developing a rationality metric is obviously crucial, I have this nagging suspicion that what it may take is simply a bunch of committed wanna-be rationalists to just get together and, well, experiment and teach and argue, etc with each other in person regularly, try to foster explicit social rules that support rather than inhibit rationality, and so on.
From there, at least use a fuzzy "this seems to work / this doesn't" type metric, even if it's rather subjective and imprecise, as a STARTING POINT, until one gets a better sense of exactly what to look for and can measure it more precisely, explicitly.
But, my main point is my suspicion that "do it, even if you're not entirely sure yet what you're doing, just do it anyways and try to figure it out on the fly" may actually be what it takes to get started. If nothing else, it'll produce some nice case study in failure that at least one can look at and say "okay, let's actually try to work out what we did wrong here"
EDIT: hrm... maybe I ought to reconsider my position. Will leave this up, at least for now, but with the added note that now I'm starting to suspect myself of basically just trying to "solve the problem without having to, well, actually solve the problem"
Every dojo has its sensei. There is a need for curriculum, but also skilled teachers to guide the earnest student. LessWrong and Overcoming Bias have, to some extent, been the dojo in which the students train. I think that you may find a lot of value in just jumping into a project like this: starting a small school that meets two times a week to practice a particular skill of rationality. A key goal to the budding school is to train the future's teachers.
One of my barriers to improving my rationality is little awareness of what the good reading and study m... (read more)
On a side note, we have religious schools where a religion, such as Christianity, is part of the curriculum. This indoctrinates young minds very early in their lives, and in most cases leaves them scarred and biased for the rest of their existence.
If we had, on the other hand, schools where even just basics of rationality and related topics, such as game theory, economics, scientific method, probabilities, biases, etc. were taught, what a difference it would make.
The sooner you kickstart rationality in a person, the longer they have to learn and practice it, obv... (read more)
For a nice literary description of what it means to have an "aura of awesome" try "The String Theory" by David Foster Wallace. Wallace writes of a mid-level pro tennis player: "The restrictions on his life have been, in my opinion, grotesque... But the radical compression of his attention and sense of himself have allowed him to become a transcendent practitioner of an art."
Perhaps in the future humans will achieve the same level of excellence at the Art of Rationality as some currently do at the Art of Tennis.
I see what you're saying about rationality being trained in a pure fashion (where engineering, the sciences in general, etc. are - hopefully - "applied rationality"). One thing I don't see you mention here, but which was a theme in your 3 worlds story, and which is also a factor in martial arts training, is emotional management. That's crucial for rationality, since it will most likely be our feelings that lead us astray. Look at how the feeling of "trust" did in Madoff's investors. Muay Thai and Aikido deal with emotions differently, but e... (read more)
Just an observation: Few modern American karate schools ever let you hit someone, except when a lot of padding is involved. Fighting is not usually an element in exams below the black-belt level. Competition is usually optional and not directly linked to advancement. I've seen students attain advanced belts without having any real-life fighting ability.
(The term "dojo" is Japanese, and I think most Japanese martial artists study Judo or Aikido, which are not subject to these criticisms.)
Isn't this a description of what a liberal arts education is supposed to provide? The skills of 'how to think' not 'what to think'? I'm not too familiar with the curriculum since I did not attend a liberal arts college, instead I was conned into an overpriced private university, but if anyone has more info please chime in.
What kinds of tests or contests might we have? One that I can think of would be to have students try to create some visible, small scale effect in a society, with points for efficiency.
Eliezer raises the issue of testing a rationality school. I can think of a simple way to at least approach this: test the students for well-understood cognitive biases. We have tests for plenty of biases; some of the tests don't work if you know about them, which surely these students will, but some do, and we can devise new tests.
For example, you can do the classic test of confirmation bias where you give someone solid evidence both for and against a political position and see if they become more or less certain. Even people who know about this experiment often still fall prey to it—if they don't, they have demonstrated their ability to escape confirmation bias.
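The classic result being tested here is belief polarization: after seeing balanced evidence, biased subjects become *more* extreme rather than moderating. If we represent a student's confidence as a probability, scoring that test reduces to a one-line check (the function name and thresholds are mine, not from any standard bias battery):

```python
def polarization_score(prior, posterior):
    """Positive if the subject became MORE extreme after seeing mixed
    evidence (the confirmation-bias signature), negative if they
    moderated toward 0.5, zero if unchanged in extremity.
    Both arguments are probabilities in [0, 1]."""
    return abs(posterior - 0.5) - abs(prior - 0.5)

# A subject at 0.8 who moves to 0.9 after balanced evidence has
# polarized; one who moves to 0.7 has moderated.
```

A school could administer many such items and track whether students' average polarization score drifts toward zero or below over the course of training.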
As a thought, could it be that one of the major obstacles standing in the way of the creation of a "rationality dojo" is the public perception (however inaccurate) that such already exists in not just one but multiple forms? Take the average high school debate club as one example: participants are expected to learn to give a reasoned argument, and to avoid fallacious reasoning while recognizing it in their opponents. Another example would be maths classes, wherein people are expected to learn how to construct a sound mathematical proof. I very much doubt that most people would understand the distinction between these and the proposed "rationality dojo", which would make it very hard to establish one.
It's easy to define success in martial arts. Defining 'rationality' is harder. Have you done so yet, Eliezer?
Even in martial arts, many of the schools of thought are essentially religions or cults, completely unconcerned with fighting proficiency and deeply concerned with mastering the arcane details of a sacred style passed on from teacher to student.
Such styles often come with an unrealistic conviction that the style is devastatingly effective, but there is little concern with testing that.
See also: http://www.toxicjunction.com/get.asp?i=V2741
I've ... (read more)
Why aren't rationalists more formidable? Because it takes more than rationality to be formidable. There's also intelligence, dedication, charisma, and other factors, which rationality can do little to improve. Also, formidability is subjective, and I suspect that more intelligent people are less likely to find others formidable. As for why there isn't an art of rationality, I think it's because people can be divided into two groups: those who don't think rationality is particularly important and don't see the benefits of becoming more rational, and those who see rationality as important but are already rational for the most part, and for them, additional rationality training isn't going to result in a significant improvement.