They would study logic, probability theory, argument, scientific method, and other core tools of truth-seeking. They would inquire into epistemology, the study of knowing. They would study artificial intelligence to learn the algorithms, the math, the laws of how an ideal agent would acquire true beliefs. They would study modern psychology and neuroscience to learn how their brain acquires beliefs, and how those processes depart from ideal truth-seeking processes. And they would study how to minimize their thinking errors.
Not necessarily. Hindsight bias is likely at work here. You know that studying these fields helped you to acquire better beliefs, and so you conclude that this consequence should be obvious. But unless a curious but untrained reasoner somehow finds out that studying these fields will help them, we shouldn't expect them to study them. Why on earth would someone try to read The Logic of Science if they didn't already know that it would improve their reasoning skills?
There are a lot more genuinely curious people out there than there are rationalists. But unless those curious people happen to meet a LWer, or stumble across a link to this site, their chances of lear...
I agree in part, though this excuse was stronger before Google. Now people can Google "how to think better" or "how to figure out what's true" and start looking around. One thing leads to another. Almost all the stuff I mention above is discussed in many of the textbooks on thinking and deciding — like, say, Thinking and Deciding.
I tried typing those queries (and related ones) into Google, to see if someone could easily find some sort of starting point for rationality. "How to think better" yields many lists of tips that are mediocre at best (things like: exercise, become more curious, etc.). About halfway down the page, interestingly, is a post on CSA, but it's not a great one. It seems to mostly say that to get better at thinking you first have to realize that you are not naturally a fantastic thinker. This is true, but it's not something that points the way forward towards Bayesian rationality. (By the way, "how to figure out what's true" provides essentially nothing of value, at least on the first page.)
In order for someone to go down the path you've identified on their own, as a curious individual, they would need a substantial amount of luck to get started. Either they would have to have somehow stumbled upon enough of an explanation of heuristics and biases to realize their importance (a combination of two fairly unlikely events), or they would have to be studying those subjects for some reason other than their instrumental value. Someone who started off curiously studying AI would have a much better chance of finding this path, for this reason. AI researchers, in this instance, have a tremendous advantage when it comes to rationality over researchers in the hard sciences, engineers, etc.
I'm not an expert, but with this in mind it should be a rather simple matter to apply a few strategies so that LW shows up near the top of relevant search results. At the very least we could create wiki pages with titles like "How to Think Better" and "How to Figure Out What's True" with links to relevant articles or sequences. The fact that rationality has little obvious commercial value should work in our favor by keeping competing content rather sparse.
Is rationality a common enough word that people would naturally jump to it when trying to figure out how to think better? I'm not sure how often I used it before Less Wrong, but I know that it is substantially more commonplace after reading the sequences.
Yes, sign out of Google or use a different browser where you're not signed in, and you'll see that Eliezer successfully took over the word 'rationality'. Let this be a lesson about what is possible.
Thanks, MinibearRex.
I've added ads on Google AdWords that will start coming up for this in a couple of days, once the new ads get approved, so that anyone searching for something even vaguely like "how to think better" or "how to figure out what's true" will get pointed at Less Wrong. Not as good as owning the top 3 spots in the organic results, but some folks click on ads, especially when the ad is in the top spot. And we do need to make landing on the path towards rationality less of a stroke of luck and more a matter of certainty for those who are looking.
I'd handle shame-flavored incentives with tongs. It's plausible that I have an unusual degree of sensitivity on the subject, but I'm making progress on a very bad case of self-hatred and akrasia, and "is my curiosity good enough?" strikes me as a sort of self-alienation which takes focus away from paying attention to whatever you might be curious about.
"What might I be missing about this?", "How can I increase my enthusiasm for learning?", "How can I spend less time on errors while still taking on difficult projects?" seem much safer. "What am I doing to improve my life? Is it having the desired effect?" should probably be on the list.
Also, they would seek to personally become an immortal super-intelligence, since many truths simply can't be learned by an unenhanced human, and certainly not within a human lifetime.
(Which is why the Yudkowsky-Armstrong Fun-Theoretic Utopia leaves me cold. Would any curious person not choose to become superintelligent and "have direct philosophical conversations with the Machines" if the only alternative is essentially playing the post-Singularity equivalent of World of Warcraft?)
Something I learned viscerally while I was recovering from brain damage is that intelligence is fun. I suspect I'd want to enhance my intelligence in much the same way that I'd want to spend more time around puppies.
I'm having difficulty knowing what level of rationalist this is aimed at. Are the people you talk to every week students of rationality, or 'normal' people?
This post applies to both, I imagine. But because you talk about "people" instead of explicitly talking about people like me, it's easy to see this post as not being aimed at me. (Maybe it's not).
What I mean is: it's easy to praise oneself and one's peers by talking about people of a lower class. When I was young, it was 'dumb people'; when I was a bit more sophisticated, it was 'theists'; when I was an Objectivist, it was 'non-Objectivists'; and now that I'm a rationalist, the temptation is to criticize those who "know almost nothing of logic, probability theory, argument, scientific method, epistemology, artificial intelligence, human cognitive science, or debiasing techniques." So this post, because it isn't clearly directed at people who have worked hard to do better in the ways prescribed by the Sequences, causes my semiconscious mind to ask: "Is this a beginning-level post, or something I should actually pay attention to?" Are you telling me to do better, or criticizing outsiders in order to pro...
I predict you're selling yourself short. Maybe my weaknesses and shortcomings are largely filtered out if you know me only through my writings, but the people I work with every week could list them for you. There is clearly a level (or 5) above my own.
Moreover, I've been studying rationality for years, and since April have had the benefit of working on rationality or x-risk full time.
It's very hard to tell "what it is about me" that gives me the rationalist powers I do possess, but if I had to guess, the single biggest thing would be my deep desire to say oops whenever appropriate, which I suspect I got from having wasted 21 years of my life by failing to say oops about the supernatural. I don't want to waste my time like that again.
And yet the conclusion is so...Hansonesque.
Every week I talk to people who say they are trying to figure out the truth about something. When I ask them a few questions about it, I often learn that they know almost nothing of logic, probability theory, argument, scientific method, epistemology, artificial intelligence, human cognitive science, or debiasing techniques...I conclude that they probably want to feel they are truth-seeking, or they want to signal a desire for truth-seeking, or they might even self-deceivingly "believe" that they place a high value on knowing the truth. But their actions show that they aren't trying very hard to have true beliefs.
Really? What percent of people are aware of the existence of cognitive biases? One percent? At least I wouldn't expect more than that to realize that probability theory or artificial intelligence bear upon questions in seemingly unrelated fields like philosophy or medicine.
And of people who know of the existence of cognitive biases, how many are even capable of genuinely entertaining the thought that they themselves might be biased, as opposed to Rush Limbaugh or unethical pharmaceutical researchers or all those sill...
This particular argument breaks Hanlon's Razor, a.k.a. the Generalized Anti-Hanson Principle.
Cute, but I'm not sure I would call a Hansonian interpretation "malicious". Maybe "differently optimized".
I'd reserve malice for active manipulation, not status-seeking.
Dare I say it? Few people look like they really want true beliefs.
I think otherwise: most people want to have true beliefs. However, they have rather limited trust in the power of their own logic, as the experience of school has taught them that they are often wrong. They don't have the numerical skills to embark on anything more numerically ambitious than what handling money requires. They expect to be wrong often, and rarely use formal reasoning as such. But they still want to have true beliefs, and rely mostly on intuition and experience to decide what is true.
For most people, most beliefs are socially acquired - people acquire their beliefs from the people around them, and they tend to acquire large blocks of belief together. One shouldn't underestimate the sheer amount of work needed to do anything different.
Most people never create a new idea (in the sense you're talking about) in their entire lives - they have experiences, yes, and they change beliefs based on experience. But they do not regard themselves as having the basic equipment to generate ideas, or to be sophisticated in judging between them.
In the end I've come to the view that none of us can change this (well, not anytime soon...
FWIW, I linked to this through my Twitter feed and got a very negative reaction from my friends, though the reasons they gave for disliking it are pretty varied; they said things like
One person has promised to write up what they felt in a blog post, which I look forward to reading.
They would study artificial intelligence to learn the algorithms, the math, the laws of how an ideal agent would acquire true beliefs.
Really? The others make sense, but it's not clear this one will be useful to a human trying to learn things on their own. If I want to notice patterns, "plug all of your information into a matrix and perform eigenvector decompositions" is probably not going to get me very far.
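For what it's worth, here is a minimal sketch of what that kind of matrix-and-eigenvectors pattern finding literally looks like: principal component analysis via eigendecomposition of a covariance matrix. The data, the numpy calls, and the variable names below are purely illustrative, not anything proposed in the post.

```python
# Illustrative only: the "put your information in a matrix and take
# eigenvectors" style of pattern finding, i.e. PCA by eigendecomposition.
import numpy as np

rng = np.random.default_rng(0)
observations = rng.normal(size=(200, 5))         # 200 observations, 5 features

centered = observations - observations.mean(axis=0)
cov = np.cov(centered, rowvar=False)             # 5x5 covariance matrix

eigenvalues, eigenvectors = np.linalg.eigh(cov)  # eigh handles symmetric matrices
order = np.argsort(eigenvalues)[::-1]            # strongest "patterns" first

print("Variance along each component:", eigenvalues[order])
print("First principal direction:", eigenvectors[:, order[0]])
```

Which rather supports the point: this is a procedure for a machine that already has its data in a matrix, not something a curious human can run in their head.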
So are those who claim to seek truth but really don't merely akratic, or are they hypocritical?
Do "curious" people want to learn the (already discovered) truth or to discover heretofore unknown truths? You seem to confound the two. Data on the statistical correlation between these distinct motives would be interesting, but I doubt most scientists are primarily concerned with personally accumulating true beliefs. Preparing to make contribution to human knowledge probably looks a lot different from preparing to absorb the greatest mass of truths. It probably also looks different from preparing to function rationally as far as go quotidian beliefs.
I think you are confusing wanting to know with being good at it.
Imagine someone in the Stone Age: would you say no one back then was genuinely curious, because they didn't know about all those fields, which hadn't been invented yet?
Then what about someone living in our world but not knowing about Bayesian reasoning, AI, ...? How can they know that those fields are fundamental to learn, in order to satisfy their curiosity about another field, before at least learning the basics of them? When you don't know about Bayes' theorem, but you are curious (you really want to know the ...
I think you overestimate how easy it is to "jump to the meta-level" (i.e., you want to learn about something, so you jump to learning how to learn) for people who haven't been pointed toward it - by reading Gödel, Escher, Bach, some of LW, or anything like that. Someone genuinely curious about "what actually happened in ancient Rome" will read lots of books about it, will go visit the ruins, go to museums, ... but won't spontaneously start asking about "decision theory" or about "what is the general process for resolving disputes between scholars?" if not given strong hints that they should do it.
I'm not sure there's an overarching "curiosity" that people have or don't have: I'm very curious about whether a specific kind of database will perform adequately in certain circumstances (long story), but I'm only mildly curious about how to identify which 19th-century French painter painted which picture. Some art experts, I'm sure, have cultivated the skill to guess within seconds which painter it is for every picture. I wouldn't mind having that skill -- it sounds like a fun skill to have -- but it seems like it would be more resour...
Who can quarrel with this, except perhaps on emphasis? All that can really be asked is whether the list of subjects prime for pursuit by rationalists is complete. The one major omission: acquiring writing skill. This is vital not only for articulating your ideas so that you get worthwhile feedback; the quality of the ideas themselves depends on how well they're expressed. That Darwin was a superb writer isn't an incidental fact. (See "Can bad writers be good thinkers?"; "Are good thinkers good writers?"; "Some writing skills undermine thou...
Truth seekers should deliberately impose costs on themselves for holding false beliefs. That is, they should increase the cost of being wrong. One way to do this is to bet on their beliefs. Another way is to bond their beliefs: post a bond that they will forfeit if their prediction is wrong. Yes, imposing such costs is bothersome, but for truth seekers the tradeoff is easily worth it.
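To make the "bond" idea concrete, here is a hypothetical sketch using a quadratic (Brier-style) forfeit; the function names, the penalty shape, and the $100 bond are my own illustration rather than a standard mechanism, but the key property is real: your expected loss is minimized only by stating the probability you actually believe.

```python
# Hypothetical sketch: "bonding" a belief with a quadratic (Brier) penalty.
# You forfeit bond * (outcome - stated_probability)**2 when the prediction
# resolves, so overstating or understating your confidence costs you money
# in expectation.

def forfeit(stated_probability: float, outcome: bool, bond: float = 100.0) -> float:
    """Amount forfeited once the prediction resolves."""
    return bond * (float(outcome) - stated_probability) ** 2

def expected_forfeit(stated_probability: float, true_probability: float,
                     bond: float = 100.0) -> float:
    """Expected loss if the event really happens with true_probability."""
    return (true_probability * forfeit(stated_probability, True, bond)
            + (1 - true_probability) * forfeit(stated_probability, False, bond))

# Suppose you privately believe an event is 70% likely.
for stated in (0.5, 0.7, 0.9):
    print(stated, round(expected_forfeit(stated, true_probability=0.7), 2))
# Prints 25.0, 21.0, 25.0: the expected forfeit is cheapest when you report 0.7,
# i.e. the cost structure rewards honest, calibrated beliefs.
```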
More or less accurate, though of course there's a ton of stuff that was left unsaid. "Study fields like these" is easy to say; "learn skills like these" is really difficult. There's no easy way to communicate skills, and sanity is a skill-set. You kinda just have to hope people have enough lucidity to make connections between fields and reliably see single-step implications, and enough ambition to seek out and learn the skills from better thinkers than themselves. E.g. thanks to having brilliant friends I know a lot of cognitive tricks ...
I would love to hear more about how playing chess helps whatever skills you think it helps.
I expect it helps with your dubstep moves.
…What would that look like?
The immediate answer, which I cannot shake off, is Stanislaw Lem :) or, perhaps, the hero of his novel "Runny Nose" (published in English as "The Chain of Chance").
Something is missing here. Curiosity about what? Are we only supposed to care about having true beliefs, or are we supposed to care about what those true beliefs tell us about the world? I'm betting on the latter. In fact, I would go further and say that it is better to have lots of literally false but approximately true beliefs about the world than to have only a few completely true beliefs about the world, or even to have the same number of completely true beliefs but not have them cover as much interesting territory.
I don't have a theory of what make...
See also: Twelve Virtues of Rationality, The Meditation on Curiosity, Use Curiosity
What would it look like if someone was truly curious — if they actually wanted true beliefs? Not someone who wanted to feel like they sought the truth, or to feel their beliefs were justified. Not someone who wanted to signal a desire for true beliefs. No: someone who really wanted true beliefs. What would that look like?
A truly curious person would seek to understand the world as broadly and deeply as possible. They would study the humanities but especially math and the sciences. They would study logic, probability theory, argument, scientific method, and other core tools of truth-seeking. They would inquire into epistemology, the study of knowing. They would study artificial intelligence to learn the algorithms, the math, the laws of how an ideal agent would acquire true beliefs. They would study modern psychology and neuroscience to learn how their brain acquires beliefs, and how those processes depart from ideal truth-seeking processes. And they would study how to minimize their thinking errors.
They would practice truth-seeking skills as a musician practices playing her instrument. They would practice "debiasing" techniques for reducing common thinking errors. They would seek out contexts known to make truth-seeking more successful. They would ask others to help them on their journey. They would ask to be held accountable.
They would cultivate that burning itch to know. They would admit their ignorance but seek to destroy it.
They would be precise, not vague. They would be clear, not obscurantist.
They would not flinch away from experiences that might destroy their beliefs. They would train their emotions to fit the facts.
They would update their beliefs quickly. They would resist the human impulse to rationalize.
But even all this could merely be a signaling game to increase their status in a group that rewards the appearance of curiosity. Thus, the final test for genuine curiosity is behavioral change. You would find a genuinely curious person studying and learning. You would find them practicing the skills of truth-seeking. You wouldn't merely find them saying, "Okay, I'm updating my belief about that" — you would also find them making decisions consistent with their new belief and inconsistent with their former belief.
Every week I talk to people who say they are trying to figure out the truth about something. When I ask them a few questions about it, I often learn that they know almost nothing of logic, probability theory, argument, scientific method, epistemology, artificial intelligence, human cognitive science, or debiasing techniques. They do not regularly practice the skills of truth-seeking. They don't seem to say "oops" very often, and they change their behavior even less often. I conclude that they probably want to feel they are truth-seeking, or they want to signal a desire for truth-seeking, or they might even self-deceivingly "believe" that they place a high value on knowing the truth. But their actions show that they aren't trying very hard to have true beliefs.
Dare I say it? Few people look like they really want true beliefs.