What Curiosity Looks Like

See also: Twelve Virtues of Rationality, The Meditation on Curiosity, Use Curiosity

What would it look like if someone were truly curious — if they actually wanted true beliefs? Not someone who wanted to feel like they sought the truth, or to feel their beliefs were justified. Not someone who wanted to signal a desire for true beliefs. No: someone who really wanted true beliefs. What would that look like?

A truly curious person would seek to understand the world as broadly and deeply as possible. They would study the humanities, but especially math and the sciences. They would study logic, probability theory, argument, scientific method, and other core tools of truth-seeking. They would inquire into epistemology, the study of knowing. They would study artificial intelligence to learn the algorithms, the math, the laws of how an ideal agent would acquire true beliefs. They would study modern psychology and neuroscience to learn how their brain acquires beliefs, and how those processes depart from ideal truth-seeking processes. And they would study how to minimize their thinking errors.
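The "laws of how an ideal agent would acquire true beliefs" mentioned above reduce, at their core, to Bayes' theorem. As a minimal sketch of what a single Bayesian update looks like (the prior and likelihoods here are invented for illustration):

```python
# Minimal sketch of a Bayesian update; the numbers are illustrative,
# not from any real data.

def bayes_update(prior: float, p_evidence_if_true: float,
                 p_evidence_if_false: float) -> float:
    """Return P(hypothesis | evidence) via Bayes' theorem."""
    numerator = p_evidence_if_true * prior
    denominator = numerator + p_evidence_if_false * (1 - prior)
    return numerator / denominator

# Example: start 20% confident, then observe evidence that is three
# times as likely if the hypothesis is true as if it is false.
belief = 0.20
belief = bayes_update(belief, p_evidence_if_true=0.6, p_evidence_if_false=0.2)
print(f"posterior: {belief:.2f}")  # posterior: 0.43
```

The point is not the arithmetic but the discipline: evidence moves belief by exactly as much as the likelihoods warrant, no more and no less.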

They would practice truth-seeking skills as a musician practices playing her instrument. They would practice "debiasing" techniques for reducing common thinking errors. They would seek out contexts known to make truth-seeking more successful. They would ask others to help them on their journey. They would ask to be held accountable.
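What might such practice look like concretely? One possible sketch (the predictions below are invented): record your probability estimates, then score them with a Brier score, a standard calibration measure, so that being held accountable comes with a number attached.

```python
# Illustrative sketch: track predictions and score calibration with the
# Brier score. Lower is better; always guessing 50% earns 0.25, so a
# score above that signals systematic miscalibration.

predictions = [
    # (stated probability the claim is true, actual outcome)
    (0.9, True),   # "I'll finish the report by Friday"
    (0.7, False),  # "It will rain tomorrow"
    (0.6, True),   # "This study will replicate"
]

brier = sum((p - outcome) ** 2 for p, outcome in predictions) / len(predictions)
print(f"Brier score: {brier:.3f}")  # Brier score: 0.220
```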

They would cultivate that burning itch to know. They would admit their ignorance but seek to destroy it.

They would be precise, not vague. They would be clear, not obscurantist.

They would not flinch away from experiences that might destroy their beliefs. They would train their emotions to fit the facts.

They would update their beliefs quickly. They would resist the human impulse to rationalize.

But even all this could merely be a signaling game to increase their status in a group that rewards the appearance of curiosity. Thus, the final test for genuine curiosity is behavioral change. You would find a genuinely curious person studying and learning. You would find them practicing the skills of truth-seeking. You wouldn't merely find them saying, "Okay, I'm updating my belief about that" — you would also find them making decisions consistent with their new belief and inconsistent with their former belief.

Every week I talk to people who say they are trying to figure out the truth about something. When I ask them a few questions about it, I often learn that they know almost nothing of logic, probability theory, argument, scientific method, epistemology, artificial intelligence, human cognitive science, or debiasing techniques. They do not regularly practice the skills of truth-seeking. They don't seem to say "oops" very often, and they change their behavior even less often. I conclude that they probably want to feel they are truth-seeking, or they want to signal a desire for truth-seeking, or they might even self-deceivingly "believe" that they place a high value on knowing the truth. But their actions show that they aren't trying very hard to have true beliefs.

Dare I say it? Few people look like they really want true beliefs.

Comments

"They would study logic, probability theory, argument, scientific method, and other core tools of truth-seeking. They would inquire into epistemology, the study of knowing. They would study artificial intelligence to learn the algorithms, the math, the laws of how an ideal agent would acquire true beliefs. They would study modern psychology and neuroscience to learn how their brain acquires beliefs, and how those processes depart from ideal truth-seeking processes. And they would study how to minimize their thinking errors."

Not necessarily. Hindsight bias is likely at work here. You know that studying these fields helped you to acquire better beliefs, and so you conclude that this consequence should be obvious. But unless a curious but untrained reasoner somehow finds out that studying these fields will help them, we shouldn't expect them to study them. Why on earth would someone try to read The Logic of Science if they didn't already know that it would improve their reasoning skills?

There are a lot more genuinely curious people out there than there are rationalists. But unless those curious people happen to meet a LWer, or stumble across a link to this site, their chances of learning the benefits of studying these subjects are not great. There are a few books that might get them started (Robyn Dawes comes to mind), but how likely is it that they'll stumble across one of those books, especially if they aren't already explicitly interested in the field of human thought (like EY)?

I would bet that there are a lot of genuinely curious people out there who have realized that thinking is a skill. But if you were to ask them for the best way they knew of to improve that skill, they would say something along the lines of "sudoku puzzles". And that's pretty sad.

I agree in part, though this excuse was stronger before Google. Now people can Google "how to think better" or "how to figure out what's true" and start looking around. One thing leads to another. Almost all the stuff I mention above is discussed in many of the textbooks on thinking and deciding — like, say, Thinking and Deciding.

I tried typing those queries (and related ones) into Google, to see if someone could easily find some sort of starting point for rationality. "How to think better" yields many lists of tips that are mediocre at best (things like: exercise, become more curious, etc.). About halfway down the page, interestingly, is a post on CSA, but it's not a great one. It seems to mostly say that to get better at thinking you first have to realize that you are not naturally a fantastic thinker. This is true, but it doesn't point the way forward towards Bayesian rationality. (By the way, "how to figure out what's true" provides essentially nothing of value, at least on the first page.)

For someone to go down the path you've identified on their own, as a curious individual, they would need a substantial amount of luck to get started. Either they would have to have stumbled upon enough of an explanation of heuristics and biases to realize their importance (a combination of two fairly unlikely events), or they would have to be studying those subjects for some reason other than their instrumental value. Someone who started off curiously studying AI would have a much better chance of finding this path, for this reason. AI researchers, in this instance, have a tremendous advantage when it comes to rationality over researchers in the hard sciences, engineers, etc.

I'm not an expert, but with this in mind it should be a rather simple matter to apply a few strategies so that LW shows up near the top of relevant search results. At the very least we could create wiki pages with titles like "How to Think Better" and "How to Figure Out What's True" with links to relevant articles or sequences. The fact that rationality has little obvious commercial value should work in our favor by keeping competing content rather sparse.

When I search for the keyword "rationality", I get HPMoR at #2, yudkowsky.net at #5, and What Do We Mean By "Rationality"? at #7. Not sure how much my search history is affecting this.

Is "rationality" a common enough word that people would naturally jump to it when trying to figure out how to think better? I'm not sure how often I used it before Less Wrong, but I know I use it substantially more after reading the sequences.

You probably get this result because Google has figured out those are better search results for you... because you've already visited those pages before.

Not sure how many people outside of the web world realise this, but Google does personalise search results based on your own search habits.

People who have not yet visited any of these pages are much less likely to get the same set of search results.

Edit: lukeprog's response (about two comments below) explains how to see Google as it actually looks to a newbie.

Yes, sign out of Google or use a different browser where you're not signed in, and you'll see that Eliezer successfully took over the word 'rationality'. Let this be a lesson about what is possible.

Thanks, MinibearRex.

I've added ads on Google AdWords that will start coming up in a couple of days, once the new ads get approved, so that anyone searching for something even vaguely like "how to think better" or "how to figure out what's true" will get pointed at Less Wrong. Not as good as owning the top three spots in the organic results, but some folks do click on ads, especially when an ad is in the top spot. And we do need to make landing on the path towards rationality less a stroke of luck and more a matter of certainty for those who are looking.

It's been almost three months. How is the data on this campaign looking?