That's actually one I wanted to link but I just could not remember the title for the life of me. Thanks!
My mental model of this is that in an adversarial world the mind must resist most changes, so each little part of the map is like a rubber band that snaps back until ripped or worn out. Or like a puzzle piece that cannot be arbitrarily replaced without changing a whole bunch of pieces around it.
There is an aspect you did not mention, which seems important to me: it is easier to change minds by peer pressure than by one person's arguments. The evolutionary story is that in case of a conflict, we "want" to join the stronger side; that's how we are more likely to survive the fight.
Therefore, the greatest problem with one person's arguments, no matter how smart or convincing, is that ultimately they are one person's arguments. Even worse if they are merely your own. Your brain keeps waiting for greater social approval.
Cults, on the other hand, create the feeling of almost universal social approval. They often achieve it by cheating, for example by discouraging talking to outsiders and reading or watching mainstream sources, so that everyone you interact with supports the same ideas. And this can change your behavior dramatically.
People often change their minds dramatically when they change their environment, for example when they move from their family to a campus.
Therefore, if you want to change your behavior, I think it would help to have a group that would hold you accountable for your actions.
Good catch, I didn't think of that. Peer pressure definitely seems like a better way to change minds than one-on-one argument. This is still parasitism, though - I don't know if I'd trust most people to form a group to hold me accountable for changes in my behavior. It seems too easy for them to, intentionally or not, shape my request in ways that benefit them.
For example, I might form a group to help me lose weight. I care very much about my physical wellbeing and reducing discomfort, but they might care more about my ugly appearance and assume that's what I'm going for, too. Worse yet, my discomfort is invisible to them, and my ugliness in their eyes is invisible to me!
Certainly not an insurmountable obstacle, of course, but one to be aware of.
EDIT: I read your paragraph on cults and then completely ignored it when writing my response. Of course you know that peer pressure can be bad, you said it yourself. My mistake.
Seems too easy for them to, intentionally or not, shape my request in ways that benefit them.
Yes. (Alignment problems everywhere.) It is better if your goal is measurable somehow, so you could provide a report with numbers, and the audience would... clap if the numbers increase, or something.
"Losing weight to feel comfortable" is like the opposite of this, and it takes a lot of time. Probably would need to replace it with an instrumental goal such as "get weight from X to Y" (to make it obvious it is not your goal to keep going below Y; getting to Y counts as success full stop). And there may be other things that could make you comfortable, for example buying softer shoes. Or, exercise could improve your muscles and make you feel better, without actually losing weight.
Another possible approach is to reward work, not outcomes. For example, you could make a plan like "exercise twice a week, stop drinking soda" and then simply report every week whether you did it or not. The group would reward the effort.
All approaches have their disadvantages (e.g. the work you reward may not actually lead to the desired goal), but if it's up to you to define and change your goals, you can try different things and see what works.
This is good stuff, thank you. I think these are all good ways to avoid the trap of letting others decide your goals for you, and I like the idea of continuously changing your goals if you find they aren't working/have been Goodharted/etc.
Stubbornness of beliefs is like the dial of progress: the appropriate response is to pull sideways and focus on asymmetric weapons that break upon misuse, instead of applying symmetric ones in either direction. Then there's charity, opening walled embassies in one's own mind for others' worldviews, to make more informed decisions about what to do about them.
Sounds about right! Thanks for these links, I look forward to reading them. Pulling sideways is an underappreciated life skill - sometimes you have to question the playing field, not just the game.
Epistemic status: Tying together the great works of others into something less great
I think the mind's defenses against change in its beliefs are a form of anti-parasitism.
Society commonly bemoans the difficulty in changing people's minds. We wish we could change the minds of our friends and family about all sorts of issues: vaccines, policy issues, religious beliefs or lack thereof, and on and on.
We struggle to convince ourselves of things, too. Diet, exercise, sleep, laziness or workaholism. We make the same New Year's Resolutions, year in and year out, only to drop them after a week or two, just like every year past.
When we try to change someone's mind, even our own, we do so in a remarkably useless way. If we're not flinging angry insults on Facebook, at best we're running down the same old list of reasons to believe X that they've already heard, processed, and built counterexamples to years ago. When we try to change our own minds, we try pumping ourselves up, insulting ourselves, or just declaring Today to be The Day that Things Change™.
I've been in more arguments and debates than I care to count. They usually don't lead anywhere: I say my side, they say theirs, and we both walk away even more convinced of our respective correctness. I've been the recipient of much well-meaning advice over the years, too, but my mind is excellent at making up counterexamples or not being swayed by the words (most of which are unchanged since I first heard them). And, of course, I've told myself many things about how to live and what to do. Sometimes I even believe my mind has finally integrated them.
In some sick meta joke, our belief that mere arguments and debates and advice can change minds is itself resistant to change. We get caught in the same old arguments that go exactly the same way. We use the same old talking points, believing subconsciously that this time, they'll surely work. Our debates become mere reflex, yet we never question our methods when they invariably fail again.
The mind's defenses are powerful. If the part of the brain that is able to integrate new information is damaged, we see a module of the brain completely take over and spin up arbitrary stories defending pre-existing beliefs.
Even in healthy people, our beliefs color our perception. Any ambiguity is resolved, automatically, subconsciously, in favor of what we already know. The misanthrope believes that the brusque person was angry at them specifically, the anxious person believes that the laughter of the crowd is at the way they look... everything neatly resolves to the map we already carry.
Why is it so hard to change a person's mind?
Well... what would the world be like if it was easy?
Stubbornness of Beliefs as a Form of Anti-Parasitism
Imagine that a well-reasoned, seemingly airtight argument in favor of some position was all that was necessary to change a person's mind. Or that a strong appeal to emotion worked just as well.
We'd all be happy slaves of whatever cult first rose up. We'd meet a charismatic leader who knew all the right emotional buttons to push and backed up those emotional appeals with solid-looking data addressing the common counterarguments, and we'd be eating out of the palm of his hand.[1]
Bad beliefs, beliefs that cause you to serve others without your consent, are parasites, no better than the fungus that takes over an insect's brain.
If beliefs were easy to change, people would be able to trivially recruit anyone to their cause. This would cause a huge negative selection pressure - people who have been trivially recruited into a cause not their own are also much more likely to die, or at least not reproduce.
That is, if an evil, lying leader can recruit people with a minor amount of effort, individuals become much less valuable. Tending to their needs and goals becomes much less important to the leader, and I suspect that would push evolution toward making minds harder to change, as the gullible would long since have been neglected to death.
We see this in real cults. A cult cannot recruit people in one seminar or a few minutes with a charismatic figure. Cults slowly, systematically break you apart from your normal life, friends, and family. They subject you to months or years of psychological torment. The damage they do can take just as long to unlearn as it took to learn. The mind's defenses are formidable.
This is Gonna Hurt
So, what does change people's minds? Few of us are cult leaders, and many of us hold beliefs, in ourselves or in others, that really would be better off abandoned. Consider the poor agoraphobic who spends most of their life in their room, or the sick man who doesn't trust doctors. Consider ourselves: over- or underweight, exercising more or less than we'd like to, caught in relationships we don't like, or too afraid to take risks in new jobs or new cities.
Well, words don't work. Adam Mastroianni has an excellent post on this very topic over on his Substack. Not only does he talk about the brain's natural defense against words, he also talks about how abysmally lossy words are at describing human experience. It's a really good piece, definitely one of my personal favorites among rationality-adjacent writing.
(You may fairly ask, "if words don't work, why do we use them?" Scott Alexander answers that via XKCD: we seem to have an instinct to defend our beliefs, and words are one of the ways we do that. It seems reflexive, and it makes us feel better, even if we don't actually change anyone's mind.)
The brain is malleable, although you may not like the results. Scott has also written about using psychedelics to lower the mind's defenses against belief change. Positive anecdotes abound: people's long-standing ruminations were thrown into the light and dissolved like mist under their own obvious wrongness. Unfortunately, lowering the brain's defenses doesn't just open the door to true beliefs but also to false ones. People, per Scott, would sometimes walk away with weird new beliefs that also didn't match the territory.
I think there's only one thing that actually works. Moments when the territory and the map don't match. Moments where we encounter something that our beliefs say should be extremely rare. Moments with such low prior probability that we are shocked. Moments so undeniable that our brain doesn't even try to defend existing beliefs. Failing that, a constant stream of weaker evidence, like when Daryl Davis convinced 200 KKK members to leave the group over the course of many years.
It has to be real Bayesian evidence, not second- or third-hand accounts. It has to be something they can see with their own eyes.[2] And it either has to be extremely strong evidence, or a lot of it, over a long period of time. And you have to, if you will, dance gracefully with your partner, leaving enough space and time for them to process what they've seen. Emotions like defensiveness can be poisonous here.
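To make "extremely strong evidence, or a lot of it" concrete, here is a minimal sketch of the odds form of Bayes' rule. The likelihood ratios are numbers I made up for illustration, not anything from the essay: one dramatic map-territory collision can move beliefs about as far as a couple dozen mildly surprising encounters.

```python
import math

# A toy illustration of "one strong observation vs. a long stream of weak ones".
# All numbers here are assumptions for illustration only.

prior_odds = 1 / 100          # start at 1:100 odds against the new belief

strong_lr = 100               # likelihood ratio of one shocking, undeniable moment
posterior_after_one = prior_odds * strong_lr
print(posterior_after_one)    # 1.0 -> even odds after a single strong observation

weak_lr = 1.2                 # likelihood ratio of one mildly surprising encounter
n_needed = math.log(strong_lr) / math.log(weak_lr)
print(math.ceil(n_needed))    # ~26 weak encounters to shift the odds as much
```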
This is not easy, and the evidence cannot be generated on demand, otherwise we run into the same parasitism problem.
Is It Any Easier to Convince Myself?
Kind of, yeah. You are the closest thing to an administrator of your mind that exists, even if your control isn't as good as it is over your computer. You benefit from being part of your rich internal experience, the one you can't see in someone else.
I think that changing your own mind is much more about understanding it than about overwriting it. Methods like Cognitive Behavioral Therapy, a lot of stuff here on LessWrong (personal favorite: Multiagent Models of Mind by Kaj Sotala), and meditation help you notice the deeper processes of your mind as something separate from your conscious experience.
I think a human starts out holding their beliefs as equivalent to the way the world really is - they equate the map and the territory. If, say, you don't like a given movie, that means it's objectively bad, and those who like it are Wrong! CBT and meditation give you a small moment to realize that your belief is a product of your brain, and a moment to check whether it actually applies to the world. It's like turning on notifications for your thoughts.
Hopefully you have some beliefs to change in mind. Maybe you want to think of the world as a safer place. Maybe you want to believe that people are nice and kind. Maybe you want to believe that junk food is bad for you, or believe that exercise can be useful and enjoyable.
The next step is to try stuff. Gather the evidence, both in favor of your desired belief and against it. Actually figure out what the territory is, as best you can. If the people in your town really are all jerks, it would be good to believe that! Your desired belief may become "I need to move out of here".
One trick is to write down your hypotheses before heading out. If you want to be more sociable, but your mind believes "I can't talk to anyone", write that as your hypothesis. Then you can compare that to what actually happened. It's important to write this down before you leave, as your brain will forget what it predicted earlier and will fit whatever experience you had into your unwanted belief.
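If it helps to make the trick concrete, here is a hypothetical sketch of such a prediction log; the structure and names are my own illustration, not a method the post prescribes.

```python
from dataclasses import dataclass
from typing import Optional

# A hypothetical prediction log for the "write it down before you leave" trick.

@dataclass
class Prediction:
    hypothesis: str                   # what your unwanted belief says will happen
    came_true: Optional[bool] = None  # filled in only after the event

log = [Prediction("I won't be able to talk to anyone at the party")]

# After the event, record what actually happened and compare it to the written
# hypothesis, before your brain has a chance to rewrite its own prediction.
log[0].came_true = False

for p in log:
    if p.came_true is None:
        verdict = "not yet checked"
    else:
        verdict = "confirmed" if p.came_true else "contradicted by the territory"
    print(f"{p.hypothesis!r} -> {verdict}")
```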
It may also help to try the methods you would use to convince someone else. Your brain has hidden depths, and it can be helpful to approach it as a sort of conversation. Why does your brain hold this belief? What caused it? Why is it so strong? If the belief were challenged, what part of you would feel particularly unsafe? Sotala's excellent Sequence linked above covers this in much more detail.
Changing your own mind is not a matter of overwriting some text in a book. It's a careful, graceful dance through map and territory alike, one that might end up in an entirely different place than you first expected. This is not trivial.
I'm not talking about Eliezer or anyone in particular! This is not an essay about how Rationalism, or any other group, is especially gullible! This is an essay about why gullibility seems like it should be under negative selection pressure!
Reading this made me realize another possible reason why changing minds is hard: the Argument from Low-Hanging Fruit. Easy-to-change beliefs have probably already been changed, leaving only the really tough ones.