I'm finally managing to finish my "basic" training in rationality, by which I mean finishing "Rationality: A-Z" (I had studied the first half years ago, but I foolishly stopped when I got to the part about reductionism, which was unbelievably stupid of me even with all the reasons that led me to do so). I plan to continue studying even more material once I'm done with it, to train myself in instrumental rationality and everything else I can find, to make myself as smart as I possibly can. I'm very satisfied with my progress: the first half of the Sequences helped me improve tremendously years ago, and now I can see myself improving again.

 

But, even while I am still at what I think is just the beginning of my improvement, I'm noticing more and more a rather serious problem.

To put it politely, I hate how people think, now.

 

I know it's really unfair, because I didn't know any better mere weeks ago, and years ago I was a textbook example of an intelligent person who mainly used his intelligence to rationalise whatever questionable decisions he made, but I just can't help it.

 

I notice leaps of logic, cognitive missteps and dumb conclusions from people who are considered smart, deep experts while they talk on the radio or in other media, and I get angry.

I notice idiotic ideas, as well as habits of thought that are the cognitive equivalent of shooting yourself in both knees, spreading inside ideologies I deeply care about. The evils they fight are very real and demonstrated by science, but now I can see how all the truth is hopelessly getting mixed up with stuff that's just stupid or wrong, and how the intelligent people who once introduced me to these ideologies are absolutely incapable of judging and criticising any bad idea that comes from their own side, and I get livid.

Half the time I hear someone talking, I have to choose between politely tearing apart the majority of what they said, growing more and more annoyed, or just shutting off my attention and thinking about something else while pretending to listen.

And all this is just when I have to deal with intelligent people. 

I can't comprehend how a stupid person thinks unless I just stop thinking of him as an actual human being, switch off my empathy completely, and model him as a badly designed computer program with a bunch of floating beliefs in its memory and no analytical or critical skill whatsoever. If I try doing it the intuitive way, using empathy and telling my brain to think like him, my brain just keeps running out of suspension of disbelief: I can't avoid thinking that, no matter how much I might believe that political party/religion/philosophy X is right, I'd still recognise that blatantly idiotic part of it as a very, very stupid idea the first time I heard it. Even before rationality, I was never actually stupid enough to believe something that was plain dumb even at surface level. So I can't even understand why he's doing what he's doing, forget predicting it.

 

And all this is really starting to weigh on me. I think my mood has changed for the worse in the last few weeks.

If you have read HPMOR, I think I'm starting to feel like Professor Quirrell, and my brain has started to actually think the words "avada kedavra" when I hear something particularly stupid and hateful. I wouldn't do that even if I could get away with it, but, emotion-wise, I have to consciously remind myself of reasons why killing someone that stupid wouldn't just be a net positive for mankind and wouldn't just spare us a waste of oxygen. The me of several years ago would have just smirked and nodded at these kinds of thoughts, but I want to be smarter than the old me, and smarter than Professor Quirrell as well.

 

I'm sorry if that was longer and more emotional than strictly necessary; I wanted to communicate exactly how I feel, and I really needed to say these things to someone. I'll try to get straight to the point now.

I think that rationality is completely worth it, I don't regret at all studying it, I don't want anyone to think that I regret studying it or suggest not studying it, and I will continue to move forward and improve myself. But I also think that the smart thing to do is look for ways to cheat and avoid paying this "price" as well.

 

So, what I want to know is: 

  1. Did other people who have already learned rationality go through this as well?
  2. If so, does it continue, or do you eventually just get used to other people being insane and stop minding it emotionally so much? I can't remember being this annoyed at people when I had read the first half of the Sequences.
  3. Do you know of, or have you tried, any particular strategy for not being annoyed by or feeling... disinterested in other people? If so, did it work? Could you suggest any material that explains it in more detail?
  4. What do you currently do when you have to deal with the kind of problem I have described? (If your answer to this is similar to 3. you can just skip this)
  5. Can you suggest any material or strategy for effectively modelling and predicting stupid people's behaviour?

And, on a side note:

6. Can you recommend any reading material or training that you think made you smarter or better at predicting the world or other people? I have checked some of the posts about this on this site but still thought it was worth asking. If you know of posts or lists about this, linking them would also be a huge help.

 

Thanks to everyone who chooses to answer this; I'll really appreciate any help and information I can get.

 

Edit 04/11/2020: I stress-tested some of the advice I could apply right away by watching a 45-minute video of interviews conducted at a Covid-19 deniers' mass protest.

I got angry about twice, and got a really odd look from the person who was with me because I said out loud something about the most annoying kitten I'd ever seen, but I have to say my mood was a lot better than when I usually just tried not to get angry at people.

What seemed to work the most was:

  • Thinking about people with very bad epistemology and beliefs as victims of an infective process of bad epistemology.
  • Trying to understand why they believed what they believed and why they thought the way they thought. I finally managed to form predictions and make models with moving, detailed parts. Every time I noticed I was confused about why someone believed something, I just kept trying until I had a model I could really understand, not just "non-sentient entities that resemble real people have been observed to exhibit stupidity number x". It's the first time in my life I've managed to reach that level of empathy with this type of mental process: to understand why they didn't feel their world-view was weird, rather than just reminding myself that people believe weird things.

This question has been really useful to me already, and I expect its usefulness will shoot up a lot further as I read the materials people suggested.

I really want to thank everyone for the excellent advice, and please do feel free to keep posting advice on 6. if you wish!



lsusr

Nov 03, 2020


Just as intelligence is orthogonal to morality, the intrinsic value of a human being is orthogonal to that human being's intelligence. I don't judge other people for being stupid any more than I would judge a dog for being stupid. We are all just animals. I love dogs and people for being exactly what we are.

I went through a cynicism phase similar to what you seem to be going through. I realize, looking back, that my disdain was connected to having low status myself. These days, now that I have high status, I think of dumb people more like kittens and less like bad guys.

If you think you are smarter than other people then either you are wrong or you are right. If you are wrong then you should change your mind. If you are right then you live in an extremely inefficient world and can make a killing. The antidote to stupid words is intelligent action. If you're not winning then you're doing rationality wrong.

In the land of the blind, the one-eyed person is dictator. It's good to be the dictator. If you're not dictator then either you are blind or you do not live in the land of the blind.

Can you recommend any reading material or training that you think made you smarter or better at predicting the world or other people?

  • Abstain from stupid media like news, Facebook and videogames.
  • Learn to use Anki spaced repetition software.
  • Teach yourself to read and write Chinese. (This is my favorite antidote for thinking you're smarter than other people.) Then read The Art of War in its original language.
  • Complete a college degree in physics.
  • Complete a college degree in mathematics.
  • Learn economics, especially microeconomics.
  • Read all of Paul Graham's articles.
  • Teach yourself computer science and machine learning.
  • Start a tech company.
  • Start a non-tech enterprise.
  • Get in shape by lifting weights.
  • Learn history. Make sure you cover at least three major civilizations (China, the Islamic World and Europe is a good place to start). This helps with perspective.
  • Read ethnographies on pastoralism and hunter-gatherers. Two excellent books are Arabian Sands by Wilfred Thesiger and Nisa by Marjorie Shostak. This helps you understand what people were designed for.
  • Learn the basics of evolutionary biology.
  • Acquaint yourself with the research on IQ and the Big 5 personality traits.
  • Take a long-distance trip with $100 in your pocket, earning the money you need to survive en route.
  • Teach classes.

This is... an impressive list. I really mean it.

Some items are pretty much exactly what I need for my goals, and if I had a lot of time I could try a lot more. 

Sadly, I need to get as smart as I can really fast. A lot of these are going into my "first century of life" list, though.

 

I don't judge other people for being stupid any more than I would judge a dog for being stupid.

It's funny, I got to a similar moral conclusion about an hour before reading it in your answer. 

 

I think of dumb people more like kittens and less like ...
lsusr:
When translated into English, The Art of War loses almost as much as Romeo and Juliet loses when translated into Japanese. If you can't read it in Chinese then this is the best translation I know of.

Which gives the person who is asking nothing. "Just do what is fun for you" would be better advice.

Daniel Kokotajlo

Nov 02, 2020


A wise friend once said to me something like this:

"You could look at all the stuff that's happening in the world, and all the things people are saying and doing, and be like 'They're all monkeys! Monkeys in suits! AAaaaagh!' However, you could also look and say: 'Wow, look at what the monkeys built! It's so cool that they got even this far!"

When you think about it, because of the way evolution works, humans are probably hovering right around the bare-minimal level of rationality and intelligence needed to build and sustain civilization. Otherwise, civilization would have happened earlier, to our hominid ancestors. We're just monkeys that have learned some cool tricks.

The next thing to remember, of course, is that you're a monkey too. You may be teaching yourself some cool rationality stuff, but you are still a monkey, and if you aren't careful you'll get arrogant/overconfident or some other such problem.

...habits of thought that are the cognitive equivalent of shooting yourself in both knees, spreading inside ideologies I deeply care about, because the evils they fight are very real and demonstrated by science, but now I can see how all the truth is hopelessly getting mixed up with stuff that's just stupid or wrong, and that the intelligent people that once introduced me to these ideologies are absolutely incapable of judging and criticising any bad idea that's coming from their own side ...

I sympathize with this bit especially. My reaction tends to be more cosmic horror than anger/frustration though. I tried to express it here.

[anonymous]:

"When you think about it, because of the way evolution works, humans are probably hovering right around the bare-minimal level of rationality and intelligence needed to build and sustain civilization. Otherwise, civilization would have happened earlier,"

I actually profoundly disagree with this both empirically and theoretically.

Civilizations are not some kind of natural inevitable 'next step' that must happen when you have a smart animal.  They are a thing that CAN happen in the context of a smart animal that is capable of inventing agriculture.  But there are other prerequisites.

I find the argument that complex culture is a thing that can happen in dense enough human populations, running away as it further densifies the population, persuasive. The idea is that in a low-density human population ideas sometimes fail to percolate down the generations, while in a dense enough social network innovations stick down the generations more frequently, because losses are less likely. It is possible that you can reach a 'tipping point' in a dense enough population, at which point the ability to pass on new innovations allows a denser population still, and further accumulati...

Daniel Kokotajlo:
Good points. I think I agree with everything you said, so I'm confused as to why we disagree. I guess your model is that we got intelligence + rationality first, and then civilization came later when we had population density, and therefore we might have more intelligence + rationality than we need to sustain civilization. The fact that brain size has been shrinking supports this; maybe we were more rational 15,000 years ago, or at least more intelligent. I think my claim is still true though -- it does seem like civilization would collapse if we got significantly dumber or less rational. I guess I had been meaning "hovering around bare minimum level" more loosely than you.
Daniel Kokotajlo:
I think I concede that my argument was shaky and that we probably aren't at the bare minimum level for reasons you mention. But I still think we are close, for a loose definition of close.

When you think about it, because of the way evolution works, humans are probably hovering right around the bare-minimal level of rationality and intelligence needed to build and sustain civilization.

I think this will be a really helpful thought to keep in mind, thank you.

 

The next thing to remember, of course, is that you're a monkey too.

Also helpful; I think I was starting to think of myself as being done with all the basic biases.

 

My reaction tends to be more cosmic horror than anger/frustration though. I tried to express it here.

I guess I cou...

aa.oswald

Nov 02, 2020


When a person changes their way of thinking radically, it is normal for them to want to tell everybody about it. This happens even if the change is one people here might consider irrational (think becoming religious). There's even a Wiktionary phrase for it, "passion of a convert".

So, the first thing I would say to your anger phase is, "Don't worry, you'll get over it."

If you want to speed up getting over it, it might be useful to practice two things. The first is to really focus on personal improvement and realize you're still a newb. The second is to deeply empathize with why other people do and believe the things they do, and realize that you were that way even a few weeks, months, years ago.

A sophomore in engineering can't feel angry that an undecided freshman doesn't know calculus. A senior in aerospace engineering can't feel angry that a senior in mechanical engineering doesn't know anything about wing design. Who are you to get angry that a person hasn't memorized yourbias.is when you can't even differentiate the Many Worlds interpretation from the Copenhagen interpretation?

Everybody is still building out their map, and just because you've luckily found yourself on a piece of elevated territory and are able to make a better map doesn't mean those with a lesser view are worse.

Secondly, it would help to read about how people come to their world views, and specifically about how people came to the rat-community. Basically, read people's personal "testimonies" and you'll find that a lot of it is driven by a mixture of personal and cultural factors. Also read the testimonies of people who converted into different religions, or even of people who didn't convert at all.

For example, I have a Jehovah's Witness friend. She got very close to deconversion 10 years ago, to the point of listing out reasons that the JWs were wrong. Yet, last I saw on Instagram, she was going to the JW headquarters and performing missionary work. Her family, her extended family, and most of her friends were all religious. Can I really be angry that her brain said, "Am I going to believe what gives me massive amounts of comfort, or am I going to believe something that could literally cause my death?"

As far as books, I would encourage reading Jonathan Haidt's The Righteous Mind. The book attempts to look at the evolutionary background for humans' moral systems, and is very good at injecting a large dose of empathy into its readers.  

So, the first thing I would say to your anger phase is, "Don't worry, you'll get over it."

That's a relief.

A sophomore in engineering can't feel angry that an undecided freshman doesn't know calculus.

Yeah, I usually try to think like that. What I felt lately was more like... finding out that your calculus professor doesn't actually know how to do calculus, or that a freshman in a science faculty can't actually manage to understand simple Aristotelian logic...

Usually I get most of my annoyance from listening to supposed expert...

Viliam

Nov 03, 2020


Reading Less Wrong made me unable to enjoy debating politics. Now the average online debate seems like a competition over who is most stupid. When Facebook shows me a news article with more than 100 comments and I read a few of them, I feel dirty.

My recommended first help would be: think less about stupidity of other people, and more about your own. (Applying my lesson on myself: why am I clicking the "comments" link when I see there are more than 100 comments? And why am I even browsing Facebook in the first place?) If you are so rational, why aren't you winning more? Yeah, some things in life depend on cooperation of others. But some other things don't -- have you already maximized those? Why not? Did you already clean up your room?

And my point here is not that if you focus on improving yourself, miracles are going to happen just because you read the Sequences. It's just that focusing on improving yourself has a chance to lead to something useful, unlike complaining about the stupidity of others.

Most people simply don't care about their sanity. It is a fact about your environment; deal with it. To a certain degree, this is about "near" vs "far" thinking (Robin Hanson writes a lot about it); people usually behave quite reasonably in their everyday lives, and say utterly crazy bullshit about anything abstract or remote. They survive because they do not try to connect these two parts; it is as if they live in two completely different universes at the same time.

When you think about incentives, here is the reason: in the "near" mode you are rewarded or punished by the natural consequences of your actions; in the "far" mode you are rewarded or punished by the social consequences of your statements. Thus it makes sense to act reasonably in your everyday life, and spout exactly the type of crazy bullshit that gets rewarded in a given social situation. On average. Sometimes following the socially approved action (using homeopathy for an actual illness, or not wearing a face mask during COVID-19) gets you killed. But historically, way more people got killed because they pissed off their neighbors by openly disagreeing with them about something; and it didn't matter who was actually right.

I kinda see people on a scale, roughly separated into three groups. On one extreme, wannabe rationalists: those are my tribe. On the other extreme, the actively irrational: the kind that not only believes something crazy, but won't shut up about it. Those I consider hopeless. But between them, and I think it might be the majority of the population, are people who kinda try to do their best, sometimes impressively, sometimes not very well; who have some bullshit in their heads because their environment put it there, but who are not actively promoting it, merely unable to clean it up; and who are able to see and listen. With those, I need to find the safe set of conversation topics, remain there most of the time, and sometimes gently probe the boundaries. There is this "agree to disagree" bullshit, which would be intellectually lazy and kinda offensive between fellow rationalists, but is a great peace-keeping tool between different tribes.

I never try to convert people. I explain, sometimes I nudge. If there is no reaction, I stop.

I am bad at predicting stupid people. I mean, I can vaguely predict that they will most likely "do something stupid", but it is hard to make specific predictions. People are usually driven by emotions: they defend what they like, and attack what they dislike. They like things that make them feel good, and dislike things that make them feel bad (e.g. being told they are wrong about something). But in real-life situations, multiple forces act upon them at the same time, and I can't predict which effect will prevail.

My recommended first help would be: think less about stupidity of other people, and more about your own.

This is generally good advice, and I do need to be more mindful of my own stupidity, but my problem isn't that I go searching for other people's stupidity so I can get angry at them; it's more that I'm getting more and more annoyed every time I accidentally bump into it, and I'm trying to avoid reacting by shutting off everything and everyone. Though some of the advice I'm receiving looks helpful for not doing that.

 

... people usually behave quite reaso...
Viliam:
Yep. Let's not fight about it. I would say that even among rationalists, it may be sometimes useful to settle for: "logically, at least one of us must be wrong... but finding out which one would probably be too costly, and this topic is not that important".
Emiya:
Ironically, I understood the "too costly" logic between rationalists pretty fast, since I've witnessed arguments being dissolved, or hitting an objectively hard barrier to overcome, really fast. When I'm dealing with non-rationalists, instead, I kinda have the impression agreement is just around the corner. "I understood your point of view, and I'd have changed mine if I was making a mistake. If we are still talking, it means I figured out what mistake you are making; why can't you just understand what I'm saying, or tell me the part you aren't understanding? I'm doing my best to explain and I've been honest with you..." That's the sensation I usually feel when I care enough to argue about something and don't write off the effort as hopeless from the start. But it's just that: what I feel. It's clearly not easy at all for others to do all of a sudden what I specifically trained myself to do.

To a certain degree, this is about "near" vs "far" thinking (Robin Hanson writes a lot about it); people usually behave quite reasonably in their everyday lives, and say utterly crazy bullshit about anything abstract or remote.

When you think about incentives, here is the reason: in the "near" mode you are rewarded or punished by the natural consequences of your actions; in the "far" mode you are rewarded or punished by the social consequences of your statements.

This is very good.

ChristianKl

Nov 04, 2020


I notice leaps of logic, cognitive missteps and dumb conclusions from people who are considered smart, deep experts while they talk on the radio or in other media, and I get angry.

If you are listening to an expert on the radio or similar mainstream media, the expert is giving you a dumbed-down argument for the position he's holding.

In an interview you have cycles of the expert making a complex claim and then the interviewer telling them: "Can you say this in a more concise way?"

If the expert doesn't really get it, they might also be told: "Part of our audience is housewives who never went to college and who listen to our program while doing the dishes; can you make your point in a way that the housewife also understands while she does the dishes?" (This example is recounted from memory from https://media.ccc.de/v/24c3-2334-de-die_wahrheit_und_was_wirklich_passierte/related )

The argument that the same expert would make when sitting down with colleagues, where the expert can have an off-the-record conversation, will be more nuanced and complex than the argument the expert gives on the radio.

If you hear an obviously flawed argument on the radio, you shouldn't jump to the conclusion that the expert making it is stupid, but rather that they are just not in a position to give you the nuanced argument.

"Can you say this in a more concise way?"

"No."

(When talking to non-experts, most points should become less concise than when talking to other experts, because to meaningfully communicate anything to a non-expert, you also have to communicate the necessary prerequisites that other experts already know.)

ChristianKl:
It's a valid stance to take, but it's the stance that gets the journalist to ask some other expert who's willing to be concise. The people you hear interviewed are generally willing to play the journalists' game. When being a news consumer, it's useful not to have misconceptions about what kind of information you are exposed to.
Vladimir_Nesov:
Exactly, that's what makes the question as you formulated it funny. It's not a question, or even a request. It's a non-negotiable demand. If you don't concede, the whole deal is off. Yet not conceding is often the only reasonable thing to do, so it's a demand to be unreasonable masquerading as a question, because don't be rude.

I hadn't thought about this possibility. 

I remember having noticed people badly explaining things I knew were actually right, and which had better proofs than what was being explained. If I hadn't already known about the evidence, I wouldn't have noticed they were misrepresenting the position; but the subjects I get angry about are rarely the kind where the background knowledge is so complex you can't explain it properly to a layman.

Some of what I got angry about were just plainly stupid ideas; it doesn't look plausible that the people talking had better ...

ChristianKl:
I remember a TV interview I did with a friend on Quantified Self. One of the elements was my friend measuring stress with an emWave2. In the process of dumbing down the complexity of what we were doing to make it TV-compatible, my friend ended up saying that he was measuring heart rate with the emWave2 to measure stress. What the emWave2 actually measures is heart rate variability, but there was no time to explain what heart rate variability is. A viewer who actually understood the subject matter would rightfully find it strange that my friend said he measures heart rate for stress, but for the average viewer that inaccuracy wasn't a big deal. Complexity reduction like that happens when focusing on expressing oneself in a way that works on TV and the radio.
Emiya:
I see, thank you for this example. I'll remember to prepare dumbed-down explanations in advance; in my plans, I'll have to communicate a lot in the future.

Vladimir_Nesov

Nov 02, 2020


I experienced something similar with spelling mistakes for a while. The solution was to explicitly conceptualize text-on-the-page as separate from idealized-text, so that the mistakes could be imagined to be blissfully absent in the idealized text.

The issue is that when you notice a bug, there is an urge to fix it that demands satisfaction. Sometimes, there is an actual plan that fixes the bug, but intuition won't come up with it, so deliberative thought needs to help. When fixing the bug is not on the table, it might suffice to just carefully formulate what's known about it, perhaps writing up some notes.

For people, productive activities include charity and steelmanning: figuring out why a behavior actually happens and how to channel its purpose better.

Thanks for the link; the chewing example does feel similar to my experience, and I will try to think about that.

waveman

Nov 03, 2020


It is kind of a meme that people learn about rationality and then observe how irrational everyone else is. It is a lot easier to observe others' irrationality than your own. But probably one's own irrationality is more important.

1. So, work on your own irrationality first before focusing on others' limitations. 

2. As for dealing with other people's irrationality, see (1). 

3. Finally, people are going to do what they want to do. With some very rare people you can introduce them to rationality things and they might change. With most, they cannot or don't want to be rational. This is the reality that you need to rationally deal with. 

4. Also be aware that full rationality is not possible, in the sense that you cannot do all the calculations needed to behave totally rationally. You need to employ all sorts of heuristics and shortcuts. My computational capacity is limited, as is my memory; gathering data is costly; time is short. So tolerate other people who deal with this in ways you might not prefer.

It is kind of a meme that people learn about rationality and then observe how irrational everyone else is. It is a lot easier to observe others' irrationality than your own. But probably one's own irrationality is more important.

I've just felt how much this is true by thinking about some of the answers I got. 

There really is a huge difference between just "knowing" something (I'd have known this even before being told in these replies) and actually realising that I was making stupid mistakes in how I thought about this very subject.

I would...

I like how much your answer bears resemblance to advice on other subjects unrelated to rationality.

Gunnar_Zarncke

Nov 03, 2020


Related: The treacherous path to rationality

I think that people don’t want to use explicit reason. And if they want to, they fail. And if they start succeeding, they’re punished. And if they push on, they get scared. And if they gather their courage, they hurt themselves. And if they make it to the other side, their lives enriched and empowered by reason, they will forget the hard path they walked and will wonder incredulously why everyone else doesn’t try using reason for themselves.

Maybe your question is addressed by this part:

People just like their friends. It simply feels right. It’s what everyone does. The way out of the valley [of disintegration] is to not to reject this impulse [...] but to integrate your deep and sophisticated friend-liking mental machinery with your explicit rationality and everything else.

The way to progress in rationality is not to use explicit reason to brute-force every problem but to use it to integrate all of your mental faculties: intuition, social cognition, language sense, embodied cognition, trusted authorities, visual processing… The place to start is with the ways of thinking that served you well before you stumbled onto a rationalist blog or some other gateway into a method and community of explicit reasoners.

I was really puzzled reading that post; to me, learning rationality always felt wonderful. My first round with it was like suddenly noticing I had been living in a really small cage inside my head, and that I could now open the door, get out, walk outside on my own legs for the first time, and then run. Now that I'm finally managing to continue, I feel like the rest of the world just gets clearer and clearer to understand, even if I got these negative emotions as side effects.

I can only assume I was the ideal subject to learn it, when I stumbled in... (read more)

2Gunnar_Zarncke3y
I didn't intend to imply that learning rationality must feel difficult or hard. It sure didn't for me, as my path started early and I had a lot of support. But I guess it can be challenging in some circumstances.
3Emiya3y
I understand; what I meant was that I initially felt confused reading the post you linked, since that one did imply that a lot of people do. But having thought about it, it seems likely that a lot of people would find themselves in those challenging circumstances.
2Vladimir_Nesov3y
Intuition is distilled deliberation. Deliberation is a sequence of intuitive steps, amplified intuition. A given intuition is formed by (and stands for) the dataset that trains it, the habits of deliberative thought on its specific topic.

The reasonable man adapts himself to the world: the unreasonable one persists in trying to adapt the world to himself. Therefore all progress depends on the unreasonable man.

—George Bernard Shaw

I'm all for having an accurate map, and that does mean updating that map. But don't let that stop you from trying to alter the territory—and actually fixing problems.

If the world fails to meet your expectations, sometimes the problem is with the world.

2Stuart Anderson3y
-
4Vladimir_Nesov3y
I strongly disagree (about "by definition"; it's of course a popular sense of the word). Operationalization of caring is value, preference. It's channeled by decision making, and deliberative thought is capable of taking over decision making. As such, it may pursue an arbitrary purpose that a person can imagine. A purpose not derived from emotion in any way might be thought to be an incorrect idealization of preference, but even a preference ultimately grounded in emotion will be expressed by decisions that emotions are occasionally incapable of keeping up with.
1Stuart Anderson3y
-

When the world fails to meet your expectations, the problem isn't with the world.

This situation is an issue of emotional regulation (at the very least). I can recommend DBT as efficacious there.

Yes, that's exactly how I was looking at it, though I guess I didn't do a very good job of explaining that in my question.

I mean, I still think the current lack of rationality in the world is a big problem, but it's not like I expect people to do better any time soon; I was just looking for ways to avoid feeling the way I do when I'm reminded of that.

I'll look into DBT and try your advice, thanks.

1Stuart Anderson3y
-

remizidae

Nov 02, 2020

40

Maybe it would help if you realized that most people most of the time are not interested in being explicitly rational. They’re focused on something else: often they’re focused on building relationships, or getting a task done, or enjoying themselves. Maybe you could try focusing on those things too, especially the relationship-building bit, instead of choosing between “tearing apart” or ignoring what they say.

Also, I don’t know how old you are, but I’ve noticed that the people I interact with have gotten more congenial over time. As a child/teen/college student, many of my interactions were with nonchosen family or classmates. Now most of my interactions are with chosen family, friends, or workmates filtered to be more like me.

Oh, and since you mention being annoyed by “experts” on the radio, maybe...don’t listen to the radio or other media. You probably don’t need to do that, you’re not getting any relationship-building benefits out of it, and it’s annoying you.

Maybe it would help if you realized that most people most of the time are not interested in being explicitly rational.

I'm afraid that's the main reason I'm getting angry at them: the utter lack of trying to be intelligent when they have to choose what to do or believe.

I never get angry at people for enjoying something stupid, nor do I feel they should treat each other as robots, nor do I mind when they just follow (non-evil) instructions; that I can understand.

I get angry only when it involves something where they really, really should try to get it right... (read more)

cousin_it

Nov 03, 2020

30

One aspect of intelligence/rationality is estimating the productiveness of a conversation before it happens. Another is expressing your views in a way that sounds palatable even to those who disagree. Another is recognizing that on any given topic there are more knowledgeable people than you, and seeking them out. Another is directing most of your effort and emotion toward things you can influence. You can't learn these things from a book though, you have to practice them.

I think I'm doing more or less okay in most of these (still with room for a lot of improvement, of course); my problem seems to be focused on:

1)

Another is expressing your views in a way that sounds palatable even to those who disagree.

I can only do this if I understand how someone thinks, and I have to get a better model of how the people I usually wouldn't really want to talk to think. (For my goals, I need to be able to influence those kinds of people as well.)

2)

Another is directing most of your effort and emotion toward things you can influence.

I'm pretty good a... (read more)

TheFishBowl

Nov 05, 2020

20

"I know it's really unfair because I didn't know any better mere weeks ago, and years ago I was a good textbook example of an intelligent person who'd keep mainly using his intelligence to rationalize whatever questionable decisions he made, but I just can't help it."

If we approach this from an economist's lens the situation seems to change slightly. To an economist, a rational actor is someone or something that acts in her own self-interest. Acting in one's own self-interest means making decisions where the foreseen benefits outweigh the foreseen costs. This means that even someone who is addicted to a hard drug and continues to use it is acting rationally, as long as the benefits of continued use outweigh the costs for that particular person.

Societally, the values may not align, but that doesn't mean they are irrational. It just means that our foreseen benefits and costs are different from theirs. If you want to go down that rabbit hole, look into behavioral economics. They like to claim that people can act irrationally.

I understand what you mean, but under this lens I'd be using "irrational" to describe thought processes that negatively affect attempts to estimate the foreseen benefits and costs of a decision, or that cause people to connect their foreseen benefits with actions that have no real reason to lead to them.

Also, the way the word is used on this site, "rationality" is also the art of not letting your short-term benefits get in the way of the real long-term benefits you'd rather choose, and of choosing which foreseen benefits and costs should matte... (read more)

BladeDoc

Nov 02, 2020

20

People are not only not rational, most do not WANT to be rational, and value many other things more highly than rationality. Remember that the story arcs of characters like Spock, Data, and Sheldon do not celebrate their becoming more rational.

That's an interesting thought. I was aware that most fiction keeps telling people that Kirk beats Spock, but I hadn't noticed that even the character arcs of rational or smart characters are almost never about them getting smarter or improving their minds...

I think I saw that in a very few manga; even those with a genius main character don't show him getting more rational or smarter at all, he's just a genius from start to finish, and any progress he makes is usually on other sides of himself...

It seems this is really something that's lacking from our culture.

Productimothy

Jan 30, 2023

10

You may be confused by some of my response. I'm well aware it deviates substantially from your inquiry; there is just substantial back-end stuff that I think would help your autonomy to improve more efficiently at anything.

In Eliezer's "12 Virtues of Rationality", read the last virtue--the nameless virtue of the void. Take what follows as a guide to approach what he writes.

You appear to be approaching these problems with a vague mainframe, possibly even rationality as a whole with a vague superframe. When you ask for advice and sources to help, you think you want the subframes, which will fit on your vague mainframe. While that will correlate with better decisions and will eventually lead to a clear mainframe, it will not nail them as efficiently or as expansively as could be accomplished if you were to deliberate it the other way around (recall the effects of skimming a book before reading it, or defining a purpose before acting, versus reading before skimming, or acting without purpose).

To devise a mainframe, though, you do need some knowledge, both about how best to make a schema and general knowledge about your area of improvement. Very quickly, you will find yourself scaffolding a formalization of the outer boundaries of what you and rationality currently know.

This principle can be applied to learning efficiency, rationality, or anything cognitive. This is how the mind works most naturally. This is what top thinkers are actually doing; it is how some people see the world clearer than others. This is how you prevent yourself from creating sub-optimal circumstances from within your own confusion and ignorance. This is not clearly widespread, and much less so brought to application. There are tools and decisions that arise from it. 

If you do not have a clear and accurate model on which to assess yourself, you cannot expect to understand the beat of a situation, you will not respond in the best way pragmatically possible, and your improvement will be drastically slower. You may be guessing about what exactly constitutes your insufficiency and thus fail to target your limiting attributes as well.

This is to aid you in constructing a proper mainframe for your specific inquiry:

When you feel emotional tension, there are two options: you can change yourself or you can change others. Pragmatically, you cannot often change others. It is the job of your short-term advocate to choose, and it is the job of your long-term advocate to build the prior knowledge required to assess whether it can (or should) be done.

With tension, there is some underlying value you are predisposed to assume. You can change this emotional tension from within the experience by changing your lens from which you are viewing it. Or, you can train the predisposition, which is to internalize general features of the desirable type of lens-changes.

Both are indispensable for a bounded rationalist. Training the predisposition means you can make better decisions across more instances, quicker, and with less cognitive effort. And being able to change your lens real-time is a good patch where your predisposition is insufficient. This autonomy can be defined as a controller of predispositions. 

You do not want to eradicate emotional tension, you merely want to get rid of the unhelpful tension. Tension within can be extremely useful because it necessitates thought and behaviors to occur. We just want those thoughts and behaviors to be aligned to wider knowledge and purpose. My wider purpose through my bottle-necked knowledge, in short, is to minimize human suffering while maximizing sustainability.

Don't let these simple words fool you; there is great complexity to what they actually mean and how they may be applied. Abstract thinking applied seems to be the foundation of all decision-making; this is what rationality is in thought and action. Abstraction omits details, and thus inherently comes out more correct. Only after practice and targeted training can one refine his abstractions down to subsets of abstractions, and further still.

I recommend these two as the strongest sources that have brought me to the above propositions. 
ICanStudy ("chunkmapping" is what they call the efficient frame-making. I cannot think of a more efficient and pragmatic way to organize a schema. Principles: Video 1, Video 2.)
and Jordan Peterson's lecture series 2017 Personality and its Transformations.

5 comments

Have you seen Street Epistemology yet? It's an effective way of leading irrational people to notice their own contradictions, but it does take some patience.

I'll give that a look, maybe I could try that on the people I care about.

Let us know how it goes.

I have the same problem, but I kind of focus on my goals and don't care so much about what other people say, do, or recommend. I also doubt that learning about rationality changed you; it was caring about rationality. Because I cared about it quite deeply most of the time, I was a bit like that all along. Find people like yourself, and if there are no people like yourself, just do what you enjoy. And to a certain extent you can enjoy irrational people; they often have some semblance of humor. Also, we are probably not totally rational.

I know I'm not totally rational; most of my anger was coming from my own cognitive missteps, the very ones I was explicitly warned against while studying rationality. I also knew I wasn't perfect before. I think my anger came out when I witnessed rationality dropping below what I thought were unjustifiable levels (because I was failing to understand what could be messing up other people's cognitive skills, or how they could have had so few to start with).

I can hear my thoughts, and I can see they have changed. I'm performing all kinds of mental operations that I wasn't doing before, so the way my brain produces beliefs has changed.

Also, I cared about rationality a lot before going into the second phase of my training. I don't think I care more now, just that I know it better and can see irrationality more clearly.

 

You are right that my caring about rationality so much was the fuel for my change; I wouldn't have applied myself this hard if I didn't care so much about finally having a way to be a lot smarter.