[ Question ]

Why do humans want to be less wrong?

by Samuel Shadrach · 2 min read · 28th Oct 2021 · 5 comments



What drives humans to value being more rational than they currently are? Has this been studied?

Humans value a lot of things. Some are likely selected for by evolution. Some could even just be random. Here are some that may be relevant to explaining why humans value being rational.

  1. The capacity for rationality comes with it built in - The invention of language and recursive structures gives us the capacity to be rational. Possibly this module also comes with a built-in desire to be rational, i.e., to use this capacity. Or possibly not. Note however that the two questions are distinct - "why are we capable of being rational" and "why do we want to use this capacity". This post is about the latter.
  2. Curiosity - Perhaps curiosity has been fundamentally selected for evolutionarily. And in the case of beings like humans, this also takes the form of wanting to invent ever better reasoning structures via recursion and going meta. Or perhaps not.
  3. Survival is now recursive - All animals are at least somewhat evolved for survival, whether or not they are aware of it. A bacterium, for instance, is likely not aware that it is optimised for survival, i.e., it doesn't have a brain that internally represents concepts of survival or value. Humans, however, are aware that we value survival; we can represent this, and we can throw the cognitive module at the problem. Or perhaps this is unimportant.
  4. Random - Perhaps it is just random. Perhaps the invention of language was sheer luck of the draw, and so was our desire to use it.

I think this will be useful because it'll enable us to find the bounds of valuing rationality - that is, how strongly we are capable of valuing rationality, at the cost of all the other things we also value. And more importantly, why do we value rationality? Is it due to survival, is it random, or is it a mix of both?

Just because an ideal rational agent exists in theory doesn't mean humans are capable of getting ever closer to it - or, more importantly for this post, that they want to. Of course, people on this website have a stronger desire to be rational than other humans do. So it might be useful to know the bound.

It might also be useful because I'm curious :p Having deeper insight is never a bad thing.

P.S. I think the notion of "ideal rational agent" also deserves a lot more scrutiny, but that's for another post.



2 Answers

As best I can tell, most humans don't care about being rational. Am I misunderstanding?

Rationality makes me stronger and thus more able to achieve my goals. You would be better served by asking why humans have the true desires they do.

I ask that question too - but that question gets asked a lot. If anything, I am trying to expand its scope. All desires have some source in the physical world. But people typically don't classify the "desire to be rational" as a desire, even though at first glance, imo, it should count as one.

If I may ask, "being more able to achieve goals" applies at what level? Like is it - you deliberately think that "Okay if I keep pursuing more rational thought processes I'll get to my desires"? Or is it that animals evolved rationality even before they could think about rationality? Or do you have another explanation?

Raven (1mo): People don't see desire to be rational as a desire? You mean, it's instrumental rather than terminal? Definitely the first for me, I remember staring at the cover of the sequences and clearly seeing that they would change me and make me stronger. It was an offer of power, and I made a conscious decision to accept it. I suppose that insofar as humans are already rational, we probably did evolve it for survival purposes. But... survival purposes often required irrationality.
Samuel Shadrach (1mo): I got the feeling some people didn't, even if they're not explicit about it. Interesting. That's the view from the inside [https://www.lesswrong.com/posts/yA4gF5KrboK2m2Xu7/how-an-algorithm-feels-from-inside] though. From the outside view, you are likely a deterministic machine that was already predisposed to absorbing the Sequences once in contact with them. That desire to absorb existed before you were exposed. Although of course, on exposure your desires could change. True, I wonder what's the maximum we can deviate from this scripted irrationality.