Here are my thoughts on the "Why don't rationalists win?" thing.
Epistemic
I think it's pretty clear that rationality helps people do a better job of being... less wrong :D
But seriously, I think that rationality does lead to very notable improvements in your ability to have correct beliefs about how the world works. And it helps you to calibrate your confidence. These abilities are useful. And I think rationality deserves credit for being useful in this area.
I'm not really elaborating here because I assume that this is something that we agree on.
However, I should note that rationalists (the non-superstar ones, anyway) aren't really making new and innovative discoveries, and that this may feed the "why don't rationalists win?" thing. I think that a big reason for this lack of progress is that a) we think about really, really difficult things! And b) we beat around the bush a lot. Big topics are often brought up, but I rarely see people say, "Ok, this is a huge topic, so in order to make progress we're going to have to sit down for many hours and be deliberate about this. But I think we could do it!". Instead, these conversations seem to be just people having fun, procrastinating, and never investing enough time to make real progress.
Altruistic
I also think that rationality is doing a great job in helping people be more effectively altruistic. Another thing that:
- I'm going to assume we mostly agree on, and thus won't really elaborate on.
- I think deserves to be noted and given credit.
- Is useful.
For people with altruistic goals, rationality is helping them to achieve those goals, and I think it's doing a really good job of it. But it doesn't quite feel like the gains being made here are that big. I think that a major reason for this is that the gains are:
1. High level.
2. Likely to be realized far in the future.
3. The sort of thing you don't personally experience (think: buying a poor person lunch vs. donating money to people in Africa).
But we all know that (1), (2), and (3) don't actually make the gains smaller; they just make them feel smaller. I get the impression that the fact that the gains feel smaller results in an unjustified increase in the "rationalists don't win" feeling.
Success
I get the impression that lack of success plays a big role in the "why don't rationalists win?" thing.
I guess an operational definition of success for this section could be "professional, financial, personal goals, being awesome...".
I don't know much about this, but I would think and hope that rationality helps people to be notably more successful than they otherwise would be. I don't think rationality is at the point yet where it could make everyone millionaires (metaphorically and/or literally). But I think that a) it could get there, and b) we shouldn't trivialize the fact that it does (I'm assuming) make people notably more successful than they otherwise would be.
But still, I think that there are a lot of other factors that determine success, and given their difficulty/rarity, even with rationality in your toolbox you won't achieve that much success without the following things.
1. Plain old hard work. I'm a huge believer in working smart, but I also think that given a pretty low and relatively sufficient level of smartness in your work, it's mostly a matter of how hard you work. You may ask yourself, "Take someone who studies really hard, but is lacking big time when it comes to rationality - wouldn't they fail to be successful?". I think an important (and sad) point to make is that at this point in history, you can be very successful with domain-specific knowledge but no rationality. And so people who work really hard but don't have an ounce of rationality often end up being very good at what they do, and very successful. I think we'll reach a point where things progress enough that rationality does in fact become necessary (the people with domain-specific knowledge but no rationality will fail).
2. Aptitude/starting early. I'm not sure to what extent aptitude is actually a thing; I sense that a big part of it is simply how early on you started, back when your brain was at that "sponge stage". Regardless, aptitude/starting early seems to be pretty important. Someone who works hard but started too late will certainly be at a disadvantage.
3. Opportunity. In one sense, not much will help you if you have to work three jobs to survive (you won't have much time for self-improvement or other necessary investments of time). In another sense, there's the idea that "you are who you surround yourself with": people who are fortunate enough to grow up around other smart and hard-working people will have had the opportunity to be socially pressured into doing the same. I think this is very underrated, but also very much surmountable. In yet another sense, some people are extremely fortunate and are born into a situation where they have a lot of money and connections.
4. Ambition/confidence. Example: imagine a web developer who has rationality + (1) + (2) + (3) but doesn't have (4). He'll probably end up being a good web developer, but he might not end up being a great web developer. The reason is that he might not have the ambition or confidence to think to pursue certain skills. He may think, "that stuff is for truly smart people; I'm just not one of those people". He may not have the confidence to pursue the more general and wide-ranging goal of being a great software engineer, or to learn C and other stuff. Note that there's a difference between not having the confidence to try, and not having the confidence to even think to try. I think that the latter is a lot more common, and blends into "ambition territory". On that note, this hypothetical person may not think to pursue innovative ideas, or get into UX, or start a startup and do something bigger.
Happiness
I get the impression that lack of happiness plays a big role in the "why don't rationalists win?" thing.
Luke talked about the correlates of happiness in How to Be Happy:
Factors that don't correlate much with happiness include: age, gender, parenthood, intelligence, physical attractiveness, and money (as long as you're above the poverty line). Factors that correlate moderately with happiness include: health, social activity, and religiosity. Factors that correlate strongly with happiness include: genetics, love and relationship satisfaction, and work satisfaction.
One thing I want to note is that genetics seems to play a huge role, and that, plus the HORRIBLE hedonic adaptation thing, makes me think that we don't actually have that much control over our happiness.
Moving forward... and this is what motivated me to write this article... the big determinants of happiness seem like things that are sort of outside rationality's sphere of influence. I don't actually believe that, and it kills me to say it, but I thought it'd make more sense to say it first and then amend it (a writing technique I'm playing around with and am optimistic about). What I really believe is:
- Things like social and romantic relationships are tremendously important factors in one's happiness. So is work satisfaction (in brief: autonomy, mastery and purpose).
- These are things that you could certainly get without rationality. Non-rationalists have set a somewhat high bar for us to beat.
- Rationality certainly COULD do wonders in this area.
- But the art hasn't progressed to this point yet, and getting there will be difficult. People have been trying to figure out the secrets of happiness for thousands of years, and though I think we've made some progress, we still have a long way to go.
- Currently, I'm afraid that rationality might be acting as a memetic immune disorder. There's a huge focus on our flaws and how to mitigate them, and this leads to a lot of mental energy being spent thinking about "bad" things. I think (though I can't place the sources) that a positive/optimistic outlook plays a huge role in happiness: "focusing on the good". Rationality seems to focus a lot on "the bad". Rationality also seems to make people feel unproductive and wrong for not spending enough time focusing on and fixing this "bad", and I fear that this is overblown and leads to unnecessary unhappiness. At the same time, focusing on "the bad" is important: if you want to fix something, you have to spend a lot of time thinking about it. Personally, I struggle with this, and I'm not sure where the equilibrium point really is.
Social
Socially, LessWrong seems to be a rather large success to me. My understanding is that it started off with Eliezer and Robin just blogging... and now there are thousands of people having meet-ups across the globe. That amazes me. I can't think of any examples of something similar.
Furthermore, the social connections LW has helped create seem pretty valuable to me. There seem to be a lot of us who are incredibly unsatisfied with normal social interaction, or sometimes just plain old don't fit in. But LW has brought us together, and that seems incredible and very valuable to me. So it's not just "it helps you meet some cool people"; it's "it's taken people who previously felt empty and made them feel fulfilled".
Still though, I think there's a lot more that could be done. Rationalist dating website?* Rationalist pen pals (something that encourages the development of deeper 1-on-1 relationships)? A more general place that "encourages people to let their guard down and confide in each other"? Personal mentorship? This is venturing into a different area, but perhaps there could be some sort of professional networking?
*As someone who constantly thinks about startups, I'm liking the idea of "dating website for social group X that has a hard time relating to the rest of society". It could start off with one value of X and expand, with the parent business running all of it.
Failure?
So, are we a failure? Is everything moot because "rationalists don't win"?
I don't think so. I think that rationality has had a lot of impressive successes so far. And I think that it has
A LOT
of potential (did I forget any other indicators of visual weight there? it wouldn't let me add color). But it certainly hasn't made us superhumans. I get the impression that because rationality has so much promise, we hold it to a crazy high standard and sometimes lose sight of the great things it provides. And then there's also the fact that it's only, what, a few decades old?
(Sorry for the bits of straw manning throughout the post. I do think that it led to more effective communication at times, but I also don't think it was optimal by any means.)
I've never really understood the "rationalists don't win" sentiment. The people I've met who have LW accounts have all seemed much more competent, fun, and agenty than all of my "normal" friends (most of whom are STEM students at a private university).
There have been plenty of Gwern-style research posts on LW, especially given that writing research posts of that nature is quite time-consuming.
I went to an LW meetup once or twice. With one exception the people there seemed less competent and fun than my university friends, work colleagues, or extended family, though possibly more competent than my non-university friends.
Huh, lemme do it.
Schelling fence → bright-line rule
Semantic stopsign → thought-terminating cliché
Anti-inductiveness → reverse Tinkerbell effect
"0 and 1 are not probabilities" → Cromwell's rule
Tapping out → agreeing to disagree (which sometimes confuses LWers when they take the latter literally (see last paragraph of linked comment))
ETA (edited to add) → PS (post scriptum)
That's off the top of my head, but I think I've seen more.
Their approach reduces to an anti-epistemic affect-heuristic, using the ugh-field they self-generate in a reverse affective death spiral (loosely based on our memeplex) as a semantic stopsign, when in fact the Kolmogorov distance to bridge the terminological inferential gap is but an epsilon.
The big reason? Construal theory, or as I like to call it, action is not an abstraction. Abstract construal doesn't prime action; concrete construal does.
Second big reason: the affect (yes, I do mean affect) of being precise is very much negative. Focusing your attention on flaws and potential problems leads to pessimism, not optimism. But optimism is correlated with success; pessimism is not.
Sure, pessimism has some benefits in a technical career, in terms of being good at what you do. But it's in conflict with other things you need for a successf... (read more)
Or the definition of rationalism. Maybe epistemic rationalism never had much to do with winning.
What is the observed and self-reported opinion on LW about "rationalists don't win"? Let's poll! Please consider the following statements (use your definition of 'win'):
I don't win: [pollid:1023]
Rationality (LW-style) doesn't help me win (by my definition of 'win'): [pollid:1024]
Rationality (LW-style) doesn't help people win (by my definition of 'win'): [pollid:1025]
I think rationalists on average don't win more than average people (by my definition of 'win'): [pollid:1026]
I think the public (as far as they are aware of the concept) thinks that ratio... (read more)
Side-stepping the issue of whether rationalists actually "win" or "do not win" in the real world, I think a priori there are some reasons to suspect that people who exhibit a high degree of rationality will not be among the most successful.
For example: people respond positively to confidence. When you make a sales pitch for your company/research project/whatever, people like to see you that you really believe in the idea. Often, you will win brownie points if you believe in whatever you are trying to sell with nearly evangelical fervor... (read more)
Let's say I wanted to solve my dating issues. I present the following approaches:
1. I endeavor to solve the general problem of human sexual attraction, plug myself into the parameters to figure out what I'd be most attracted to, determine the probabilities that individuals I'd be attracted to would also be attracted to me, then devise a strategy for finding someone with maximal compatibility.
2. I take an iterative approach: I devise a model this afternoon, test it this evening, then analyze the results tomorrow morning and make the necessary adjustments.
W... (read more)
Human beings are not very interested in truth in itself. They are mostly interested in it to the extent that it can accomplish other things.
Less Wrongers tend to be more interested in truth in itself, and to rationalize this as "useful" because being wrong about reality should lead you to fail to attain your goals.
But normal human beings are extremely good at compartmentalization. In other words they are extremely good at knowing when knowing the truth is going to be useful for their goals, and when it is not. This means that they are better than... (read more)
"Local rationalist learns to beat akrasia using this one weird trick!"
Depends on what 'win' means. If (epistemic) rationality helps with a realistic view of the world, then it also means looking behind the socially constructed expectations of 'life success'. I think these are memes that our brains pattern-match to something more suitable in an ancestral environment. The hedonic treadmill and the Peter principle ensue. I think that a realistic view of the world has helped me evade these expectations and live an unusually fulfilled, interesting life. I'd call that a private success. Not a public one.... (read more)
My life got worse after I found LessWrong, but I can't really attribute that to a causal relationship. I just don't belong in this world, I think.
I can imagine LW-style rationality being helpful if you're already far enough above baseline in enough areas that you would have been fairly close to winning regardless. (I am now imagining "baseline" as the surface of liquids in Sonic the Hedgehog 1-3. If I start having nightmares including the drowning music, ... I'll... ... have a more colorful way to describe despair to the internet, I guess.)
I agree. First off, I think it has a lot to do with a person's overall definition of 'win'. In the eyes of 'outside society', rationalists don't win. I believe that is because, as you said, if you look at things overall, you don't see an influx of success for the people who are rationalists. That isn't to say that they don't win, or that rationalism is pointless and doesn't offer up anything worthwhile. That would be a lie. I think that rationalists have a better grip on the workings of the world, and therefore know what to do and how to achieve success, ... (read more)
We are the people who knew too much...
I am not so sure that rationalists don't win. Rather, "winning" (i.e. starting a company, being a celebrity, etc.) is rare enough, and rationalists are rare enough, that very few of the people who win are rationalists, even if each individual rationalist has a better chance of winning.
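To make that base-rate argument concrete, here's a toy calculation (the prevalence and win probabilities below are invented for illustration, not data):

```python
# Toy base-rate arithmetic with made-up numbers: even if each
# rationalist is 5x more likely to "win", rationalists are rare
# enough that almost no winners are rationalists.
p_rationalist = 0.001      # assumed prevalence of rationalists
p_win_given_r = 0.05       # assumed P(win | rationalist)
p_win_given_not_r = 0.01   # assumed P(win | non-rationalist)

# Total probability of winning, then Bayes' rule.
p_win = (p_win_given_r * p_rationalist
         + p_win_given_not_r * (1 - p_rationalist))
p_r_given_win = p_win_given_r * p_rationalist / p_win

print(f"P(rationalist | winner) = {p_r_given_win:.2%}")  # ~0.50%
```

So even with a 5x edge, roughly 99.5% of winners would be non-rationalists under these assumptions.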
So you say altruism is something to "assume that we mostly agree on, and thus not really elaborate" and I know the sentiment is sometimes that it's like jazz and pornography, but fwiw I'd be curious about an elaboration. I don't think that particular prejudice is a big part of rationalist failures, but raising the possibility of it being a part is interesting to me.
What you Less Wrong folks call "rationality" is not what everyone else calls "rationality" - you can't say "I also think that rationality is doing a great job in helping people"; that either doesn't make sense or is a tautology, depending on your interpretation. Please stop saying "rationality" and meaning your own in-group thing; it's ridiculously offputting.
Also, my experience has been that CFAR-trained folks do sit down and do hard things, and that people who are only familiar with LW just don't. It has also been ... (read more)