Epistemic status: Very Uncertain. Originally written with the belief that rationality had harmed my everyday life; revised to reflect my current belief that it's been somewhat positive. The tone of this article, and some of the content, may constitute self-punishment behavior, signaling, or frustration rather than genuine belief.


I'm currently a male second-year undergrad in computer science. I am not clinically depressed. My first exposure to the rationalist community was largely limited to reading HPMOR and ~40% of the original Sequences around 2013-2014; I've had rationalist/EA friends continuously since then; in mid-2019 I started following LW; in March 2020 I read almost every scrap of COVID-19 content. I'm not sure how to evaluate my strength as a rationalist, but I feel epistemically slightly weaker than the average LW commenter and guess that my applied skills are average for non-rationalists of my demographic.

Others described the phenomenon of the "rationalist uncanny valley" or "valley of bad rationality" as early as 2009:

It has been observed that when someone is just starting to learn rationality, they sometimes appear to be worse off than they were before.

I've known about the rationalist uncanny valley since 2013, and was willing to accept some temporary loss in effectiveness. Indeed, before March 2020, the damage to my everyday life was small and masked by self-improvement. However, with isolation, my life has ground to a halt, in part due to "rationalist uncanny valley" failure modes, and failures I'm predisposed to which rationality training has exacerbated. Looking back, one of these also happened during my exposure to the community in 2014. This post is an attempt to characterize the negative effects exposure to rationality has had on my life, and is not representative of the overall effect.


1. Bad form when reading LW material

I'm very competitive and my self-worth is mostly derived from social comparison, a trait which at worst can cause me to value winning over maintaining relationships, or cause me to avoid people who have higher status than me to avoid upward comparison. In reading LW and rationalist blogs, I think I've turned away from useful material that takes longer for me to grasp because it makes me feel inferior. I sometimes binge on low-quality material, sometimes even seeking out highly downvoted posts; I suspect I do this because it allows me to mentally jeer at people or ideas I know are incorrect. More commonly, I'll seek out material that is already familiar to me. Worse, it's possible that all this reading has confirmed beliefs I was already predisposed to, and therefore been net-negative.

As a concrete example, Nate Soares has a post on the "dubious virtue" of desperation. It's dubious because it must be applied carefully: one must be desperate enough to go all-out in pursuit of a goal, but not burn out or signal visible desperation towards people.

I am already strong at the positive aspects of desperation, but the idea of "dubious virtues" is appealing to me (maybe it's the idea that I can outdo others by mastering a powerful technique often considered too dangerous). I read the article several times, largely disregarding the warnings because they made me feel uncomfortable, with the result that I burned out and signaled desperation towards people.

Something similar but more severe happened in 2013-14, when I fell into the following pattern (not quite literally true): A friend links me a LW article. Then my defense mechanisms of epistemic learned helplessness activate and I stop reading. (didn't they make the basilisk thing? I should read all about that so I can identify suspect arguments) Then I decide I should prove my defense mechanisms wrong by reading a quarter of HPMOR in one night and memorizing the Rationalist Virtues! Then I completely stop reading out of fear that rationality is a cult/mind-hacking attempt. I decide to wait several years to dampen the cycle before becoming a rationalist. It's possible I spent six years in the rationalist uncanny valley and I'm not sure there was a simple way out before approximately last year.

2a. Predictions and being a Straw Vulcan

Others have gone through a phase of making all decisions with System 2 because they no longer trust System 1. My experience is somewhat related. Over the last few months, I've worked on making calibrated predictions, including predicting my own future to inform career-planning decisions. Perhaps due to the way I approach this exercise, I feel much less in touch with my emotions, and all predictions feel fuzzier. (It's also possible that my emotions are just unstable or suppressed due to the circumstances.) My feelings about the world vary with my mood, but now I try to correct for this, and I feel uncertain enough that I defer to a reference class or to other people. Since I never actually check my own feelings, I get no practice with them.

2b. Predictions reify pessimism

Calibrating myself might be a good thing to do in ordinary times, but isolation has made me mildly depressed, reducing my willpower. Consider a commitment I made recently to study with a peer over Zoom. In ordinary times, there's a 90% chance I keep this commitment. Taking into account my reduced agency, I predict an 80% chance of doing something I would normally do at 90%. But there's only a 65% chance that I actually do something I predict at 80%, so I continue adjusting until I reach the fixed point, which is about 25%-- and that turns out to be accurate. Sometimes this fixed point is 0%.
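The spiral above can be pictured as iterating an adjustment function until the estimate stops moving. The specific numbers in this section don't pin down a single function, so the `floor` and `shrink` parameters below are made-up illustrative values, not a model of my actual psychology-- this is a minimal sketch of the fixed-point idea, nothing more:

```python
def adjust(p, floor=0.25, shrink=0.85):
    """One hypothetical round of second-guessing: each pass pulls
    the estimate part of the way toward a depressed baseline."""
    return floor + shrink * (p - floor)

def fixed_point(p, tol=1e-6):
    # Iterate until the prediction stops changing. Because adjust()
    # is a contraction (shrink < 1), this always converges to floor.
    while True:
        q = adjust(p)
        if abs(q - p) < tol:
            return q
        p = q
```

Starting from the 90% "ordinary times" estimate, `fixed_point(0.9)` settles near the 25% floor; with a floor of zero, the same process bottoms out at 0%.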

In the past, I would mentally commit to actions I think I want to take (e.g. meditate regularly) then not actually follow through. Since I have realized how often this happens, I now make very few commitments. This has technically made me much more trustworthy, but the number of commitments I keep (to myself and others) has decreased in absolute terms.

3. Not Actually Trying

Reading feels much better than trying, especially when it requires willpower and time and the outcome is uncertain.

I think this was out of reach in 2014 even if I had developed enough trust in LW to self-modify based on LW1 principles-- EY notes that the biggest mistake with the original Sequences was neglecting applications and practice opportunities.

4. Disorientation and miscellaneous disruptions

Anna Salamon says that a particular type of disorientation can result when a new rationalist discards common sense, and manifests as an inability to do the "ordinary" thing when it is correct. After LW discovered that the efficient market hypothesis is sometimes false relative to strong predictors, I updated strongly towards rationalist exceptionalism in general, which may be correct, but this also increased my disorientation. Some examples I can identify:

  • I tried to convince my friend who's a good fit for climate policy to shift to AI policy.
  • I noticed that I sometimes need to rationalize my curiosity about the world as something useful.
  • Rationalists beginning to see non-rationalists as normies, NPCs, or otherwise sub-human: I now find talking to non-rationalists much less interesting.
  • I often sink into a debate mindset that any proposition can be true if I make the right argument for it, which I previously only liked to enter while playing devil's-advocate. When arguing for a point, it's slightly more common for me to be unsure whether I actually believe it than before. I have no idea what's going on here since I'm not much better at rhetoric than before. Is my unconscious rebelling against efforts to stop motivated reasoning? Am I trying to play status games? Should I have resolved my unwillingness to apologize?
  • Several counterproductive, intrusive thoughts that haven't gone away for several months of discussions with friends and occasional therapy:
    • My self-worth is derived from my absolute impact on the world-- sometimes causes a vicious cycle where I feel worthless, make plans that take that into account, and feel more worthless.
    • I'm a bad person if I'm a dedicated EA but don't viscerally feel the impact I have.
    • I'm a bad person if I eat meat (even though vegetarianism is infeasible for me in the short term due to circumstances, and is a long-term goal)
    • After thinking about morality for a while, I'm 35% nihilist. This is supposed to not have an effect on my actions-- nihilism can just be subtracted out-- but everything feels approximately 35% hollow.


While I derived benefits from the content, I think it's plausible that COVID-19 was otherwise a bad time to dive headfirst into rationality. If I am to make guidelines for people exactly like me, they would be:

  • Engage with material that interests you, but recognize discomfort and unhealthy reading patterns.
  • Consume material when you can actually practice it (e.g. mentally stable, some minimum amount of slack).
  • Practice it (still have to figure out how).

Part of my uncanny valley was failing to realize that being able to identify a pattern was not sufficient for being able to step outside of it. I got to the point where I developed enough awareness to _notice_ that I was currently trapped inside a bad pattern, but I didn't have the tools to be able to step outside the pattern.


Not being able to go from "I notice my current patterns of work are unsustainable" to making them more sustainable.

Knowing I'm overconfident, but not making myself less confident.

Knowing about the planning fallacy, but pleading my own exceptionalism by placing myself in a tiny reference class.

Knowing that doing things like journaling, exercise, eating better, drinking more water, sleeping more, etc. are good for me, but not being able to actually do them.

As I've made progress on many of these things, my sense is that trying to solve your problems _until they are actually solved_ is the cornerstone of applied rationality. Techniques let you think about the problems differently and offer new angles of attack, but there is no substitute for _actually practicing_. I think getting people to actually do the thing is a relatively unsolved problem (for rationality, but also for all of society, so not a _particular_ failing on our part).

I got to the point where I developed enough awareness to _notice_ that I was currently trapped inside a bad pattern, but I didn't have the tools to be able to step outside the pattern.

Oof, I have also run into this.

I've been aware for a while now that having enough awareness to notice being trapped is not enough to step outside the pattern, but I can't step outside this pattern. I also believe that admitting that there is no substitute for practice isn't going to be causally linked to me actually practicing (due to a special case of the same trap), so I'll just go on staying trapped for now I guess.

Oof, yeah, this is a tough thing to work through anytime, so having more restricted support options makes it harder.

For what it's worth, and because it points to a larger literature than exists inside the rationalist community, the uncanny valley is the rationalist-specific manifestation of a common phenomenon I call "confusion" in one model of personal growth and some spiritual traditions talk about as a "dark night". It's especially powerful the first time you really "get" nihilism (which it sounds like you're still dealing with), so working with that may help lead you out.

The good news is that, no matter what it feels like, you're not permanently in the uncanny valley, just passing through it, even if you have to make the passage on a moonless, foggy night where you can barely see to the next step ahead. Just keep going and sometime, maybe after days, months, or years, you'll find your way out.

Thanks for the link on the "dark night". This passage seems to be the clearest definition:

While the manifestations of spiritual crisis are highly individual, and no two spiritual crises are exactly the same, there are some common features that appear for most people. These include a loss of sense of identity, radically changing personal values, and the occurrence of mystical and spiritual experiences.

It's plausible to me that this represents the philosophical/spiritual side of the rationalist uncanny valley for many people but is less related to practical/epistemic issues. This gives me a little hope that the "dark night" aspects can be tackled using established techniques from spiritual traditions (Anna Salamon's advice is quite different). Though that also seems dangerous, and I wouldn't want to try it myself unless desperate-- I believe in myself enough to try to get out using standard techniques.

At risk of digressing, I think I used the word "nihilism" as combining the flavor of moral anti-realism best explained as options 4-5 of this post with the intuition that any values meaningful enough to optimize towards must be reasonably robust to changes in the initial conditions of reflection in an individual, and possibly across different humans and non-humans using the same meta-level reflection process. My conclusion is that I should use my moral intuition, but then dramatically pare down the list of things I think I value for the sake of robustness. Given the counterarguments, it doesn't get all the weight, though I take it seriously. While Meaningness claims to roundly refute nihilism, it seems to use the word slightly differently and focus on other ways people turn to nihilism.

If I understand you right, you value some things (finding them meaningful) because you robustly value them regardless of circumstances (like I value human life regardless of whether I had coffee this morning). Is this correct?

But you also mentioned that this only accounts for some values, and other things you value and find meaningful aren’t robust?

It might help if you try to think less in terms of making rationality and EA part of your identity and instead just look at them as some things you're interested in. You could pursue the things you're interested in and become a more capable person even if you never read anything else from the rationality community again. Maybe reading stuff from people who have achieved great things and had great ideas and who have not been influenced by the rationality community (which, by the way, describes most people who have achieved great things and had great ideas) would help? E.g. Paul Graham's essays are good (he's kind of LW-adjacent, but was writing essays long before the rationality community was a thing): http://paulgraham.com/articles.html

I think the rationality community is great, it has hugely influenced me, and I'm glad I found it, but I'm pretty sure I'd be doing great stuff even if I never found it.

This is a beautiful post in a way, because it signals the ascent out of that valley. Any ‘uncanny valley’ involves being uncomfortable with the prospect of the bottom, but comfortable with either extreme end of it. Discomfort is essential to growth: pushing the limits of yourself, pushing your understanding of your flaws, facing those flaws with an understanding that you will not be the same person after as you were before. I have been reading LW for a while and this is my first post here, because I empathize dearly, and I hope you will understand my criticisms of your takeaways from this experience. ~In retrospect, I see my post got a bit longer than I expected, and I welcome replies. ~Okay, looking back again, I got really out of hand. Maybe a bit of an 'oops', but I too often write something and then shy from posting it because I feel it's too much, but this time I will.

Do you want to change?

Are you ready to change?

Do you know what you will change into?

It is not about practicing rationalism as a rule, such that if you fall from it, you fall from grace. Rationalism is not a religion. It should not even be a way of life, in my opinion. It is just uncommon sense: a sensible way to make decisions on problems whose concepts you struggle to understand, by deconstructing the problem, examining it from various perspectives, and then coming to the most reasonable answer, if not always the correct one, if such an answer exists absolutely. While I would be pleased if it became common sense, it is not, as the balance of Type 1 to Type 2 decision making is very hard to keep, once you are aware of being able to influence it, without falling heavily into one side. There are very few people who do not make decisions based on their understanding of the world, ‘rationally’. But the context of that rationale is very much insufficient in most of those cases. Hence ‘rationalism’ is not considered the default, even though, when asked if their religion is rational, I suspect nearly every practicing member of a religion would say ‘yes’. This is semantical, however.

Someone who makes decisions not based on the rationale of their world has severe mental disorders-- worse than most disorders, as even true insanity is based on an internal rationale, though in reverse. We can easily look at any person we would jeer at and ignore the differences in our base knowledge. You can either laugh and walk away, or try to educate them on that base knowledge. However, the time investment in doing so can make it not worthwhile, when the benefit gained is just not worth the cost.

That is why we must not denigrate those who have insufficient information for their rationale. What is frustrating are people who do not willingly wish to expand that context of rationale. This isn’t for people who “aren’t actually trying”, but who don’t even want to try at all. This extends not only to the obvious people we may jeer over for entertainment, but also those closest to our own lines of thinking. The danger is not in not knowing; the danger is in not wanting to know. There can be a cost/benefit analysis to your mental health in this, as well, since not everything you learn is valuable, and much will be a waste of time and non-applicable to your daily life. Whether it may be valuable in the future is uncertain.

I'm very competitive and my self-worth is mostly derived from social comparison, a trait which at worst can cause me to value winning over maintaining relationships, or cause me to avoid people who have higher status than me to avoid upward comparison.

This is not being competitive. This is being avoidant, and, as you describe later, insecure in what knowledge you do not know. Essentially, insecurity in ignorance. I was very concerned with this for a great portion of my life as well. It did not help that I was a quick learner, and so eventually figured I did not need to keep learning, since it would come quickly anyway. As topics got more advanced, especially ones outside my scope of knowledge, which I quickly learned I had hugely underestimated, my ability to learn them slowed considerably, and in some instances I had to give up. Rather than become depressed over my lack of mastery in a particularly difficult subject, I treat it as something I can revisit later-- perhaps with more alternative experience, or in a different lifestyle or headspace more conducive to learning. I can do N activity with my friends all day long and win; I could also go do N activity with more skilled acquaintances all day long and mostly lose, if not always lose. One is good for my ego; the other is good for improvement. Requiring your ego to be sated is not being ‘competitive’; it is a superiority complex. These are separate things, since one accepts failure and grows from it and the other rejects failure as out of its control. To be direct, and I apologize if it is too direct: do not define yourself as competitive if you only willingly compete with someone inferior in the context of the situation.

To be competitive is to, keyword, strive to be better-- not to strive to win or be right. An “I Will Never Lose” approach to life is an avoidant one, since as you take hits from those inevitable losses, it leads you into taking fewer risks. You express that well, but it is more dangerous than it seems. It not only reaffirms your predispositions, which may or may not be acceptable in the context, but also limits your ability to learn and adapt. It reduces the number of new doors to walk through, since you now consider every door with a brass handle a ‘loss’. To become better, mistakes will be made. When mistakes are made, you correct them, and eventually make fewer mistakes, as the more mistakes you encounter the more answers you know. It feels very obvious in text, but in practice it's painful. Very painful. It does not end; there are always new mistakes. Every decision is made with some degree of uncertainty. Anxiety grows out of knowing how many mistakes you can make in a given activity; practice and experience alleviate that anxiety, by showing how well you can avoid those mistakes. The same holds in both physical and mental exercises. This raises further questions: can I even avoid making X or Y mistake? How do I get around it if I cannot avoid it through ordinary means? Do I lack alternative knowledge to make that judgment?

In this, you think. You learn. That is what’s important. To develop yourself so that when faced with any problem or challenge, you aren’t concerned with winning or losing, but with growing and understanding both it and yourself.

There is nothing wrong with being competitive. There is a great deal wrong with avoiding losses, and especially with being forced into upwards comparison. If you avoid those who are more ‘advanced’-- I use that term loosely, as the concept may well be arbitrary depending on the context-- how do you know what to improve in yourself? Are you even trying to improve yourself? Or is it merely the feeling that you have already improved yourself in some manner but failed to put it into practice, so that knowing you have not done so creates insecurity? It is much more secure not to know how poor one is at any activity, but that also contributes to arrogance and ignorance, in ourselves and in the world generally. Having the strength to accept that weakness-- an oxymoron-- is the first step to true improvement.

I question your guidelines because I find them to be misguided advice. Don't do things you know you can do, if you are trying to improve yourself. Routine is for comfort, when desired. Don't be content with doing things you find mentally safe. Don't shy from discomfort. “Unhealthy” is different-- do not obsess; I can agree with this, as burnout both reduces what you take from the readings and curbs further reading, wasting a valuable source of varied conjecture and opinion. However, discomfort is good. Discomfort means you must think.

Why am I uncomfortable with this?

Why does it bother me?

Is there logic to why it bothers me?

Is this discomfort rooted in a predisposition, prejudice, ignorance?

Do I simply not have the time today to click every link in this post to understand the full context?

The answer to every one of these questions leads to growth in some way, even the last one, since it means you should probably go finish whatever work needs to be done that is distracting you from learning. I must specify I am not encouraging dangerous things. Inevitably there is a darker side that must eventually be understood in ourselves, and that is a difficult beast to cope with. That, however, is outside the scope of this post. To even begin to try to understand that, finding your mentality towards your self-improvement and your perception of yourself among your peers is more important.

Do not simply challenge yourself in comparison to yourself and worry about lacking commitment. It is perfectly normal, even more so with ADHD, medicated or not. Alternative perspectives are incredibly valuable, and not looking to understand them when they do not line up with what you are predisposed to (or even simply disregarding them because they are of a certain group or subgroup) is willingly choosing ignorance. Note: you do not have to agree with what you learn. You do not even need to find it acceptable. Without context, all other perspectives are meaningless at the surface.

In learning why you disagree, you now understand more. It is rarely a calm conversation, but afterwards, reflect not on their reasoning with distaste but with your newly acquired knowledge. You now know why their position is incorrect, more so than before, because of this or that flawed point, this or that lack of context, this or that lack of knowledge. You can then apply that in the future.

I always found the idea of concerted applied rationalism ridiculous. If you can comprehend and realize how rationalism and the subcategories relate to your thinking and your approach to perspectives and discourse, you’re automatically applying this rationalism in your everyday life. It’s a result of being aware of it and being able to pull yourself back when you find you’ve started to become nonsensical, and, as you say, resolve the unwillingness to apologize.

When you have understood the basic precepts, they will naturally absorb into your day to day actions and personality. You don’t need to think, as you’re looking through your phone, “Hmm, I don’t like this article, it’s got a title that is clearly written by someone without any context, so rationally I will move on to a different article to read.” You will just swipe through it without thinking. You often don’t need to explain your decision-making, because once the heuristics are there, certain things that may qualify as Type 2 for people who haven’t delved very far into the idea become Type 1 thinking for someone experienced in even the basics of rationalism. The problem of status does become an issue here. Again, note you are not superior because of this. What you want is for the consequences of your decision making to be superior to someone who doesn’t use those autopilot heuristics. Anyone can philosophize all day; if nothing good comes of it, no one will care, or find any purpose to it. When good comes of it, you can then have impact. You can improve systems already in place or convince others of your methods.

I think you have applied those training regimes to your life. You just aren't actually as aware of it as you think. Just making this post makes it clear to me that you are, in some measure, affected by it-- though perhaps not those regimes specifically, but the concepts in general. There's a lot still to improve, and there always will be.

I have to stop writing here or I’ll just end up having to write my own post instead since I got a little out of hand with what I expected to be a couple paragraphs. A lot of it is just repeated rationalist rhetoric anyway, though in a way I think is more realistic than what I believe some end up doing: making rationalism a mystical Type 3 mode of thinking, beyond anything else and exclusive to non-NPCs, something difficult to apply and even more difficult to explain to someone who doesn’t use it.


The goal of life and reason is not to win. It’s to grow. In that growth you will help others, and they will help you. You will fail others, and they will fail you. Humility is the greatest teacher, but there is no test to check if you’ve passed or not. Challenge yourself, but within the limits of your mental health.

Now may not be the time for that. We're all feeling the effects of the quarantine. Even believing all I've said above, I've acted irrationally by my own terms over the past two months, torn apart relationships, made mistakes. It is a great mental stress that at any moment more could happen, things could get worse, and that if we are too quick to return to normalcy as things start to improve, it would make them even worse. There are not many ways out of this situation without being patient, following the guidelines, staying safe, and helping who you can.

If nothing else, it puts a lot into perspective that some may not have understood before this, though something I tried to always express. Our circumstances determine our tendency to irrationality. Those who have felt safe and secure suddenly, even when they may still be safe and secure, are posited with a situation that may throw that into a great imbalance. Some of it may not be obvious, but if you can see it in your friends and they deny it, they probably see it in you too. I know I can see it in me.


I would start with the ideas that natural behavior is typically adaptive, and that society is built for people who behave naturally. Rationality is highly effective at the margins, but tends to cause issues if used without restraint. Don't go overboard. If you do one thing at professional quality, it's okay to be normal everywhere else.

In a related note, we all have meta-preferences that are different than our preferences; there are many things that we want to have done but don't actually want to do. And often our preferences are wiser than our meta-preferences; spending your youth studying and self-improving is not always a better use of your time than getting drunk with attractive people. You don't have to be superhuman, which is good since it isn't an option. Have some compassion for yourself; it's okay to be merely okay.

Note that this is the opposite of the advice that I would give to an underachieving slacker. Some people need to learn to act with intent, and some people need to learn to chill. You seem to be the second type.

Rationalism is like climbing stairs: it's easier to say than to do. One looks up to know the direction, not to despair at one's position; one feels humble that everyone is on some stair somewhere or somewhen, takes pride in being higher than others and advancing, and knows when to take breaks so one doesn't fall down.

Climbing stairs is dangerous, so running up depends on one's condition.

Naivete is looking from a certain stair; it's justified on its own, or in looking down on lower stairs, but not when one thinks one is at the top.


I need to start off by saying that I strongly encourage those who can to achieve fluency with the techniques of rationality. They're often very useful, and not knowing them is often crippling.

Having said that, if reason is the only tool in your toolkit you're not likely to get far. Empathy, charisma, confidence, psychology, and physical attractiveness are often even more useful. You are surrounded by seven billion apes who are smart enough to invent nuclear weapons and stupid enough to use them; they are by far the most important part of your environment, and Donald Trump is better at manipulating them than Eliezer Yudkowsky.

Beyond that there are the insights of meta-rationality. If you think of rationality in terms of optimization, meta-rationality is the art of choosing what to optimize. If rationalism is like climbing stairs, meta-rationality is deciding which staircases are worth climbing (there's a lot more to it than that).

What I'm trying to say is- don't be so proud of your rationalism. It's only a part of what you need.

Thanks for this post!

One reason I found it interesting is in spurring me to think about my own journey thru 'the rationalist uncanny valley'.

Would you mind sharing your thoughts?


I'll consider writing and publishing a longer post, but here's a quick summary off the top of my head:

There's a real tension – in my own mind anyways – between epistemological and instrumental rationality, particularly in areas dominated by 'psychology' and 'sociology', i.e. when interacting with other people, either alone or in groups. Epistemological rationality is, or at least feels, easier. This tension is, I think, either the sole cause or the main cause of the uncanny valley. The first item of the "frontpage comment guidelines" hints at this:

Aim to explain, not persuade

Knowing when to avoid persuading, or recognizing when one is doing that, is hard!

And even explaining is difficult! At some point, I find myself trying to persuade others to accept my attempted explanations, or at least to understand them to my satisfaction. This is a big reason why I empathize with this statement in your post: "I now find talking to non-rationalists much less interesting". Rationalists at least have norms that depend on differentiating the two. I find that a lot of non-rationalists almost inevitably pattern-match what I intend as an explanation as attempted persuasion.

Because of uncertainty, chaos, and path dependence, even just picking targets against which to judge one's own effectiveness seems an inevitably and permanently nebulous project. I try to maintain the idea that my own effectiveness is bounded by constraints, including my own psychology, and that I don't know all of the constraints. But another idea that accompanies that is that I might be limiting my effectiveness by handicapping myself in my own thoughts as a means of preserving (some amount of) my self-esteem.

I also struggle with integrating my own preferences into my judgements about my effectiveness.

I don't think I've climbed out of the rationalist uncanny valley. I think I have been descending into and then climbing out of several uncanny local minima in the landscape of my personal effectiveness. I also think that I descended into a valley (at least once) before I found Overcoming Bias, and then Less Wrong – and before either existed. (I'm 38 years old.) I also feel like my 'effectiveness record' is very mixed. I don't think I've ever been, overall, ineffective, and I think I've definitely scored some clear victories, so, in a sense, there are many different dimensions and it's only in some that I consider myself, at any one time, to be in a valley (or not).

I was glad to read a post like this!

The following is as much a comment about EA as it is about rationality:

"My self-worth is derived from my absolute impact on the world-- sometimes causes a vicious cycle where I feel worthless, make plans that take that into account, and feel more worthless."

If you are a 2nd year undergraduate student, this is a very high bar to set.

First, impact happens downstream, so we can't know our impact for sure until later. Depending on what we do, possibly not until after we are dead.

Second, on the assumption that impact is uncertain, it is possible to live an exemplary life and yet have near-zero impact due to factors beyond our control. (You cure cancer moments before the asteroid hits.)

Third, if we pull down the veil of ignorance, it is easy to imagine people with the motivation but not the opportunity to have impact. We generally don't think such people have no worth - otherwise, what is it all for? By symmetry, we should not judge ourselves more harshly than others.

I find that intrusive thoughts take hold when I suspect they may be true. I hope this is one that can be exorcised on the basis that it is a bad idea, not an uncomfortable truth.

I'm very competitive and my self-worth is mostly derived from social comparison, a trait which at worst can cause me to value winning over maintaining relationships, or cause me to avoid people who have higher status than me to avoid upward comparison. In reading LW and rationalist blogs, I think I've turned away from useful material that takes longer for me to grasp because it makes me feel inferior. I sometimes binge on low-quality material, sometimes even seeking out highly downvoted posts; I suspect I do this because it allows me to mentally jeer at people or ideas I know are incorrect.

I want to share that I have done this as well. In my case, I would be slightly more charitable and claim that the motivation was not to jeer at people who say incorrect things, but to derive a feeling that I myself am doing okay. LessWrong has very high standards and there are a lot of impressive people here, which can make it terrifying for those of us who have a deeply rooted instinct to compare ourselves to whatever people we see around us. So if I see something downvoted, it gives me reassurance that I at least must be above some vaguely defined bar.


One thing I really don't like about this post is that it presumes the reader knows what 'rationalist uncanny valley' means. Without that context, the post is quite chaotic.

Edited to introduce the concept.