They would study logic, probability theory, argument, scientific method, and other core tools of truth-seeking. They would inquire into epistemology, the study of knowing. They would study artificial intelligence to learn the algorithms, the math, the laws of how an ideal agent would acquire true beliefs. They would study modern psychology and neuroscience to learn how their brain acquires beliefs, and how those processes depart from ideal truth-seeking processes. And they would study how to minimize their thinking errors.
Not necessarily. Hindsight bias is likely at work here. You know that studying these fields helped you to acquire better beliefs, and so you conclude that this consequence should be obvious. But unless a curious but untrained reasoner somehow finds out that studying these fields will help them, we shouldn't expect them to study them. Why on earth would someone try to read The Logic of Science if they didn't already know that it would improve their reasoning skills?
There are a lot more genuinely curious people out there than there are rationalists. But unless those curious people happen to meet a LWer, or stumble across a link to this site, their chances of learning these skills on their own are slim.
I agree in part, though this excuse was stronger before Google. Now people can Google "how to think better" or "how to figure out what's true" and start looking around. One thing leads to another. Almost all the stuff I mention above is discussed in many of the textbooks on thinking and deciding — like, say, Thinking and Deciding.
I tried typing those queries (and related ones) into Google, to see if someone could easily find some sort of starting point for rationality. "How to think better" yields many lists of tips that are mediocre at best (things like: exercise, become more curious, etc.). About halfway down the page, interestingly, is a post on CSA, but it's not a great one. It seems to mostly say that to get better at thinking you first have to realize that you are not naturally a fantastic thinker. This is true, but it's not something that points the way forward towards Bayesian rationality. (By the way, "how to figure out what's true" provides essentially nothing of value, at least on the first page.)
In order for someone to go down the path you've identified on their own, as a curious individual, they would need a substantial amount of luck just to get started. Either they would have to have stumbled upon enough of an explanation of heuristics and biases to realize their importance (a combination of two fairly unlikely events), or they would have to be studying those subjects for some reason other than their instrumental value. For this reason, someone who started off curiously studying AI would have a much better chance of finding this path. AI researchers, in this instance, have a tremendous advantage when it comes to rationality over researchers in the hard sciences, engineers, etc.
I'm not an expert, but with this in mind it should be a rather simple matter to apply a few strategies so that LW shows up near the top of relevant search results. At the very least we could create wiki pages with titles like "How to Think Better" and "How to Figure Out What's True" with links to relevant articles or sequences. The fact that rationality has little obvious commercial value should work in our favor by keeping competing content rather sparse.
Is "rationality" a common enough word that people would naturally jump to it when trying to figure out how to think better? I'm not sure how often I used it before Less Wrong, but I know it has become substantially more commonplace for me since reading the Sequences.
Yes, sign out of Google or use a different browser where you're not signed in, and you'll see that Eliezer successfully took over the word 'rationality'. Let this be a lesson about what is possible.
Thanks, MinibearRex.
I've added ads on Google AdWords that will start coming up for this in a couple of days, once the new ads get approved, so that anyone searching for something even vaguely like "How to think better" or "How to figure out what's true" will get pointed at Less Wrong. It's not as good as owning the top 3 spots in the organic results, but some folks click on ads, especially when they're in the top spot. And we do need to make landing on the path towards rationality less a stroke of luck and more a matter of certainty for those who are looking.
I'd handle shame-flavored incentives with tongs. It's plausible that I have an unusual degree of sensitivity on the subject, but I'm making progress on a very bad case of self-hatred and akrasia, and "is my curiosity good enough?" strikes me as a sort of self-alienation which takes focus away from paying attention to whatever you might be curious about.
"What might I be missing about this?", "How can I increase my enthusiasm for learning?", "How can I spend less time on errors while still taking on difficult projects?" seem much safer. "What am I doing to improve my life? Is it having the desired effect?" should probably be on the list.
Also, they would seek to personally become an immortal superintelligence, since many truths simply can't be learned by an unenhanced human, and certainly not within a human lifetime.
(Which is why the Yudkowsky-Armstrong Fun-Theoretic Utopia leaves me cold. Would any curious person not choose to become superintelligent and "have direct philosophical conversations with the Machines" if the only alternative were to essentially play the post-Singularity equivalent of World of Warcraft?)
Something I learned viscerally while I was recovering from brain damage is that intelligence is fun. I suspect I'd want to enhance my intelligence in much the same way that I'd want to spend more time around puppies.
I'm having difficulty knowing what level of rationalist this is aimed at. Are the people you talk to every week students of rationality, or 'normal' people?
This post applies to both, I imagine. But because you talk about "people" instead of explicitly talking about people like me, it's easy to see this post as not being aimed at me. (Maybe it's not.)
What I mean is: It's easy to praise oneself and one's peers by talking about people of a lower class. When I was young, it was 'dumb people'; when I was a bit more sophisticated, it was 'theists'; when I was an Objectivist, it was 'non-Objectivists'; and now that I'm a rationalist, the temptation is to criticize those who "know almost nothing of logic, probability theory, argument, scientific method, epistemology, artificial intelligence, human cognitive science, or debiasing techniques." So this post, because it isn't clearly directed at people who have worked hard to do better in the ways prescribed by the Sequences, causes my semiconscious mind to ask: "Is this a beginning-level post, or something I should actually pay attention to?" Are you telling me to do better, or criticizing outsiders in order to promote the in-group?
I predict you're selling yourself short. Maybe my weaknesses and shortcomings are largely filtered out if you know me only through my writings, but the people I work with every week could list them for you. There is clearly a level (or 5) above my own.
Moreover, I've been studying rationality for years, and since April have had the benefit of working on rationality or x-risk full time.
It's very hard to tell "what it is about me" that gives me the rationalist powers I do possess, but if I had to guess, the single biggest thing would be my deep desire to say oops whenever appropriate, which I suspect I got from having wasted 21 years of my life by failing to say oops about the supernatural. I don't want to waste my time like that again.
See also: Twelve Virtues of Rationality, The Meditation on Curiosity, Use Curiosity
What would it look like if someone was truly curious — if they actually wanted true beliefs? Not someone who wanted to feel like they sought the truth, or to feel their beliefs were justified. Not someone who wanted to signal a desire for true beliefs. No: someone who really wanted true beliefs. What would that look like?
A truly curious person would seek to understand the world as broadly and deeply as possible. They would study the humanities but especially math and the sciences. They would study logic, probability theory, argument, scientific method, and other core tools of truth-seeking. They would inquire into epistemology, the study of knowing. They would study artificial intelligence to learn the algorithms, the math, the laws of how an ideal agent would acquire true beliefs. They would study modern psychology and neuroscience to learn how their brain acquires beliefs, and how those processes depart from ideal truth-seeking processes. And they would study how to minimize their thinking errors.
They would practice truth-seeking skills as a musician practices playing her instrument. They would practice "debiasing" techniques for reducing common thinking errors. They would seek out contexts known to make truth-seeking more successful. They would ask others to help them on their journey. They would ask to be held accountable.
They would cultivate that burning itch to know. They would admit their ignorance but seek to destroy it.
They would be precise, not vague. They would be clear, not obscurantist.
They would not flinch away from experiences that might destroy their beliefs. They would train their emotions to fit the facts.
They would update their beliefs quickly. They would resist the human impulse to rationalize.
But even all this could merely be a signaling game to increase their status in a group that rewards the appearance of curiosity. Thus, the final test for genuine curiosity is behavioral change. You would find a genuinely curious person studying and learning. You would find them practicing the skills of truth-seeking. You wouldn't merely find them saying, "Okay, I'm updating my belief about that" — you would also find them making decisions consistent with their new belief and inconsistent with their former belief.
Every week I talk to people who say they are trying to figure out the truth about something. When I ask them a few questions about it, I often learn that they know almost nothing of logic, probability theory, argument, scientific method, epistemology, artificial intelligence, human cognitive science, or debiasing techniques. They do not regularly practice the skills of truth-seeking. They don't seem to say "oops" very often, and they change their behavior even less often. I conclude that they probably want to feel they are truth-seeking, or they want to signal a desire for truth-seeking, or they might even self-deceivingly "believe" that they place a high value on knowing the truth. But their actions show that they aren't trying very hard to have true beliefs.
Dare I say it? Few people look like they really want true beliefs.