Improving in mathematics forced me to unlearn many of the cognitive tools I learned from the Sequences.[1]
One of them is the narrative of noticing confusion. “When you feel confused…” it goes, “…it is because your beliefs contradict reality. Instead of quashing the feeling, expand it, and open your eyes to reality to replace the erroneous beliefs.”[2][3]
In reality, confusion arises more often from contradictory beliefs than from input from reality. More often than “I believed X, but observed not-X,” one encounters “I believed X and Y, but just inferred a contradiction between them.” The second case generalizes the first: an observation is not pure sense-data, but an interpretation (a belief) of that sense-data.
Therefore resolving confusion doesn’t necessarily make your beliefs more accurate. To see examples, read Hasidic philosophy. “The Torah says X, which contradicts common sense W. The Torah says X, which contradicts what it says elsewhere, Y. The Torah says X, but why didn’t it say not-not-X, which is synonymous but differently connoted? To understand this, let’s add seventy epicycles of absurd metaphysics, which we justify by wordplay.”
Habitually avoiding truth-y reasoning and orienting towards fast, clear feedback from reality might suffice to solve this problem. The latter is very hard in all interesting domains.[4]
It doesn’t matter. The real problem is much worse.
–
There’s an underlying assumption that confusion arises in response to a particular situation in reality. It doesn’t.
We know this because you can decide to feel confused at will. Here’s how:
(1) Pick two things.
(2) Find how they are like each other.
(3) Find how they are unlike each other.
(4) Notice the dissonance.
This algorithm is essential to doing mathematics.
(1) Rewriting something in an equivalent form can yield a different intuition. Examining how your intuitions are altered can yield insight. Thurston uses the example of derivatives in the second section of “On Proof and Progress in Mathematics.”
(2) “Advancement often comes from new proofs of old theorems.” I heard this from a (very successful) mathematician who would write a classical theorem on the blackboard to teach it to undergraduates, look at it, and seem surprised by the result.
(3) Outside the realm of easy mathematics, I don’t fully understand theorems even after I can fluently prove them. Definitions and methods become so detailed that one usually finds strangeness upon zooming in.
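To make (1) concrete, here is a sketch (my paraphrase, not Thurston’s exact wording) of three logically equivalent framings of the derivative that cue different intuitions:

```latex
% Symbolic/limit framing: a number computed from a limit.
f'(x) = \lim_{\Delta x \to 0} \frac{f(x + \Delta x) - f(x)}{\Delta x}

% Geometric framing: the slope of the tangent line to the
% graph of f at the point (x, f(x)).

% Approximation framing: the unique number d such that
f(x + \Delta x) = f(x) + d\,\Delta x + o(\Delta x)
\quad \text{as } \Delta x \to 0, \qquad \text{so that } f'(x) = d.
```

All three pick out the same number, yet each evokes different mental imagery; moving between them is exactly the kind of equivalent-form rewriting that alters intuition.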
To do mathematics effectively one manually activates one’s feelings of knowledge and ignorance. Unprincipled switching of cognitive modes is an essential element of intellectual activity.
Emotions are not signals, but tools.
–
Once I understood this error I saw it everywhere.
“Actually, I still don’t think this is an error. Your emotions are tools rather than signals because you restructured yourself to make them that way to improve at mathematics, possibly to your general detriment! Other people’s emotions are signals.”
Thank you for your disagreement, interlocutor. You are wrong.
Emotions are the product of biology and culture.
Human biology was made by a causally-naive evolutionary algorithm on a data set drawn mostly from the environment of evolutionary adaptedness. We are far out-of-distribution, so we should expect the products of that algorithm to behave strangely.
To the extent emotions are biological, many of them are already instrumental rather than epistemic. An evolved anger response in an organism is a credible threat of retaliation against violence in the same way that bright colors are a credible warning to predators.
Culture installed emotional software in you. Some of it is to your benefit. Tradition is usually smarter than you are. Some of it is to your detriment. Tradition does not optimize for your well-being!
Both of these narratives predict that emotions are neither “signals” nor “tools,” but rather deranged self-contradictory bundles of heuristics.
“That contradicts my theory, but it also contradicts yours.”
True. Emotions are more like materials, out of which to build strategies of behavior. My point is that, in terms of quality, we’re working with jagged rusty pieces of shit. Emotions are not well-formed, and they are not precise. Cognition requires precision. Better to use emotions for things that either don’t require precision or actively reward anti-precision. Persuasion rewards anti-precision because people like moral simplicity. Trial-and-error rewards anti-precision because to avoid wastefully repeating effort one needs a source of random bits.
The arationality[5] of emotion is easy to notice if you have a mood disorder.
One can also observe that changing superficial features of a situation changes its emotional tone. The canonical example of this is Scott Alexander’s “Wirehead Gods on Lotus Thrones.”
Another example is emotional contagion. Emotional contagion could be rational if other people’s emotions were themselves rational. They aren’t, so it isn’t. Even worse, emotional contagion is often used adversarially. “I’m panicking so you panic as well and now you’re gullible and aggressive with short time horizons give me money!” Fuck Twitter.
–
One context in which I often encounter this error is in internal conflict resolution mechanisms, which tend to treat emotions as “functional” or “rational.”
“I used IFS and talked to my subagent which–”
No you fucking didn’t. “Agent” means something specific, even if it’s not clear what that thing is. “Agent” also connotes coherence, and extreme incoherence is exactly the condition that motivated you to use IFS in the first place. What you actually did was perform an animistic ritual as self-therapy. This is not an insult. If it works, it works. Just don’t apply an insufficiently justified ontology. If you have to do so for the technique to function, do it privately to not poison the world-models of other people.
“I resolved my pathological fear when I realized it really just wanted to protect me from–”
Fear doesn’t necessarily want anything. Try again.
“I had an unhealthy fear response. I observed that if I acted in the way it led me to feel inclined to, I would relax. So I interpreted it as serving a functional purpose of avoiding something dangerous. I considered various possibilities and achieved insight into what dangers I want to avoid. I committed to avoiding those dangers even when not in the throes of the fear response, and the fear mostly disappeared.”
Better! Go with my blessing, so-called post-rationalist.
–
Another common context in which one notices this error is in deeply immature people and the media designed for them, i.e. most public-facing media. These people are unwilling to distinguish, or unpracticed at distinguishing, what they feel to be real from what is real. As a consequence, adversarial emotional contagion can be used to steal from them. This is why your inbox of political emails looks the way it does.
–
Emotions are a pile of evolutionary and cultural detritus. What does this mean for us?
Dunno.
Some people think many emotions can be radically altered without changing behavior. If this is true then it’s obviously desirable to be happy all the time.
To the extent this isn’t true there is a problem. Imagine I want to subject my emotions to instrumental rationality. To do this I must ask “which emotions produce effective behaviors?” which queries my model of the world. My emotions influence my model of the world. Emotional states in which you frequently find yourself tend to be self-endorsing. Otherwise they would be unstable and you wouldn’t frequently find yourself in them.
There is no obviously privileged emotional vantage point from which to make predictions about human behavior. While depressed I overestimate the degree to which other people are hostile towards me. While happy I intuitively underestimate it, and must use metacognition to notice how my current intuitions differ from my stated beliefs.
While in a neutral mood I am uninterested in reality and do things mostly out of inertia.
It seems increasingly probable that emotions are actually net-negative for cognition, mostly due to their widespread adversarial use. Spock Was Right?
Whenever I think about my fundamental flawed nature as a human being, I feel bad. This too is a fundamental flaw.
[1] Then I had to relearn them, because in the absence of a guiding normativity I fell towards the wishful-thinking attractor. There’s a lesson in that.
[2] The Sequences are a rich text, so I don’t want to put words in their mouth. ¾ this is an oversimplification of the Sequences which results from my own unsophisticated, identitarian, or secondary-source-contaminated reading.
[3] Because the rationality community was founded by Eliezer Yudkowsky and assigns status to independent thought, disagreeing with Eliezer has gone from a status move to a cliché to the default. This is unfortunate because by default signaling is orthogonal to truth. My current strategy to gain status is to agree with Eliezer Yudkowsky whenever this option is permitted by Truth and Honesty.
[4] For example, “Does the afterlife exist?” is not a question that can be directly answered by data close to the issue, unless you’re willing to pay a heavy price for the data.
[5] Note the word “arational” and not “irrational.” The whole point is that emotions don’t have goals or beliefs, so it doesn’t make sense to describe them as irrational. They’re just processes, of no clear origin or privileged purpose.