I'm putting this through Discussion because I've never written a main-section post before. If you have helpful criticism, please comment with it; if it does well, I'll post it in the main section when I get back from school tomorrow.

Things between the bars are intended to be in the final post; the rest are comments.


There’s lots of things which can end the world. There’s even more things which can help improve or save the world. Having more people working more effectively on these things will make the world progress and improve faster, or better fight existential risks, respectively.


And yet for all of my intention to help do those things, I haven’t gotten a single other person to do it as well. Convincing someone else to work towards something is like devoting another lifetime to it, or doubling your efforts. And you only need to convince them once.

So there’s two things I want to learn how to do:

  1. Convince people to try and save the world
  2. Convince people to use more effective methodologies (especially with regards to world-saving)

I think that the rationalist community as a whole isn't particularly good at either of these. Individuals make small efforts, but I think most of the people who try run into the same problems.

I propose that we do more to centralize and document the solutions to these problems in order for our individual efforts to be more effective. This thread is for people who encounter problems and solutions for convincing other people.


  • I think that the activity of convincing people to try to save the world and to use more effective methodologies should have a word or phrase. Suggestions?
  • Should it just be a thread? I feel like some of the particularly good comments would make good independent posts. Should we just link to the post version from within the thread?
  • I'm a bit worried that this sounds a bit culty… If you disagree, please say so; if you agree, please tell me why.
  • This is partly prompted by Alicorn's post, and some things which have recently happened in my life.

 


This isn't in itself culty -- but it is the cult attractor that's causing your problems, in a roundabout way.

When we hear people talking about some proposition, we normally either throw it out entirely or tentatively integrate its consequences into our thinking, modulo considerations of status and so forth. Normally higher-impact propositions assume higher priority in our thinking: a perfect stranger shouting "My God, the office is on fire!" takes higher priority than a friend telling you your shoelace is untied.

The memetic ecosystem we live in does contain cults and similar predators, though, which are best recognized (almost defined) by wildly overvaluing their core values. That translates into communication as what might be described as great vehemence. Very high-impact propositions, therefore, carry strong if unconscious connotations of MIND-KILLER STAY AWAY STAY AWAY; after some inflection point, they'll start getting discarded often enough that the priority effects are overtaken.

This isn't just theory. We're all constantly bombarded with exhortations to save the world, and for people who are not domain experts or highly skilled rational thinkers there's no good way to differentiate reliable world-saving imperatives from unreliable ones. The obvious priority-preserving move is to make sympathetic noises and refuse to update -- which indeed turns out to be the polite, socially expected response. If you don't want that to happen, expressing your views in terms of saving the world is strongly contraindicated.

So I guess the "save the world" part should get dropped then. Entirely.

Upon further reflection, it seems like a lot of people are already trying to do that (biomedical research, environmental causes, various anti-poverty charities, etc.).

So now the question is: "How do you teach rationality to people, with no strings attached, in a way that helps them do what they're already doing and actually use the information to improve?" People still do whatever they were choosing to do, just more effectively.

Would that work better?

The kind of rationality we're investigating is inextricably bound to improvement; if it's being transmitted effectively, we don't need to attach extra semantic content to it to get people to adopt better practices, look at the future through critical rather than ideological eyes, et cetera. I'd actually strongly advise against attaching that sort of content; doing that would implicitly carry the message that rationality is tribal, like Lysenkoism or intelligent design.

This is true, at least, for improving in terms of habits of thought; improvement in habits of action has to do with instrumental rationality, and hasn't received much attention here. That does seem to be changing, though.

Er, there seems to have been miscommunication.

I'm not suggesting adding semantic content, I'm asking how you transmit rationality effectively.

Helpful comment with regards to cultishness. A pretty large number of people are already working to save the world in various ways. Tentatively, doing 2 without reference to 1 would probably be better.

And that's all I can say before I go to school.

From what I've gathered (can't find the link), Eliezer's attempt to answer this question eventually resulted in him deciding to focus on creating rationalists, rather than getting individual people to support particular causes he believed in.

More things to say, which I'll say later, because I have to go to bed. One question though: which Alicorn post are you referring to?

Ah, sorry. I had meant for that to be a link. Fixed.

And thanks for the link.

My two cents about influencing people to work on a cause:

  • Usually, the most effective way to do so is to engage their emotions. Any discussion of existential risk implicitly engages people's fear, for example. That's a bit problematic in this case, since they'd be working within a community that tends to disparage signaling emotionalism. My guess is a happy medium is to engage people's emotions while signaling alliance to rationality. EY's "rationality dojo" posts are a good example of this, IMHO.

  • Causes may be general, but actions are specific. If I want to encourage action, therefore, I ought to be as specific as possible about what I want people to do. Often a useful combination is to raise a general problem and suggest a specific action people can perform to avoid it. It helps if the two are actually related in some way, though it's disappointingly unnecessary in many cases.

  • Convincing groups is even more effective than convincing individuals, since groups have a way of mutually reinforcing one another. Of course, groups also have a lot more inertia to overcome, for the same reasons.

  • You don't "only need to convince them once." Actual persistent behavior change is not usually a fire-and-forget thing; it's the result of continual effort. One reason so few people manage it is because we aren't willing to do the work.

  • It's generally considered bad form to talk about human operant conditioning, so I will point out the following ostensibly-irrelevant-to-humans fact about animal operant conditioning for no apparent reason whatsoever: it helps to reward compliant behavior and to not reward non-compliant behavior. (Actively punishing non-compliance has negative consequences in many cases, though.) Also, the more obviously the reward is connected to the behavior (for example, the closer they are in time, and the more reliably the latter entails the former, and the more reliably the absence of one entails the absence of the other), the stronger the conditioning effect.

  • Another ostensibly-irrelevant fact about animal behavior conditioning is that intermittent reward establishes conditioning that is harder to extinguish. This also allows for shaping -- once a pattern of behavior is established, reward only compliance that crosses a certain threshold.

My two cents about influencing people to work on a cause:

I've miscommunicated in that most people think I have a particular cause in mind.

Causes may be general, but actions are specific. If I want to encourage action, therefore, I ought to be as specific as possible about what I want people to do. Often a useful combination is to raise a general problem and suggest a specific action people can perform to avoid it. It helps if the two are actually related in some way, though it's disappointingly unnecessary in many cases.

You don't "only need to convince them once." Actual persistent behavior change is not usually a fire-and-forget thing; it's the result of continual effort. One reason so few people manage it is because we aren't willing to do the work.

Good points. That sort of implies that you can't really inspire people to go work on a cause without sticking around to tell them what that entails afterwards.

How helpful do you think it would be just to teach people rationality and let them apply it to whatever they're working on now?

Well, that is EY's ostensible purpose with this whole forum, so it's at least appropriate.

A grammar comment: Each "there’s" in the draft should be "there are".

I think that the activity of convincing people to try to save the world and to use more effective methodologies should have a word or phrase. Suggestions?

Outreach. Possibly either "rationality outreach" or some other prefix.

One thing I do (I don't know if this is the level of specificity you wanted) is mention cost-effectiveness when explaining my (philanthropic) plans or discussing a third party's. It's been informative how many people don't seem to have realized before that thinking about the methodology of their world-saving is a good idea.

Convincing someone else to work towards something is like devoting another lifetime to it, or doubling your efforts. And you only need to convince them once.

Evangelism!

So there’s two things I want to learn how to do: 1. Convince people to try and save the world

It is a tricky proposition. The world has existed for billions of years, and doesn't look as though it is at much risk. There are billions of humans on the planet. Our species is doing spectacularly well.

There are lots of people trying to convince others that the world is at risk. Resource shortages, a climate apocalypse, nuclear warfare, technological meltdown.

One problem is that most of them are transparently after your money and time - and just want to use a superstimulus to get hold of it. Humans are relatively vulnerable to manipulation using fear - and an apocalypse is a fear superstimulus.

Another problem is that most humans - like many other animals - are fairly self-interested. The coming apocalypse is usually a large collective action problem that most individuals are poorly-placed to influence - and poorly-motivated to try to influence.

So, people build up a memetic immune system that rejects this kind of material.

Making messiahs is a traditional problem faced by religions. They have many of the best tricks for overcoming people's natural defenses:

  • Get to people when they are young.
  • Use authority figures.
  • Use sex.
  • Use scripture.
  • Sever ties to the family.
  • Make sure they have no money - and are dependent.
  • Invoke powerful super-beings - and make sure they are on your side.
  • Eliminate doubt.
  • Employ inspirational speaking.
  • Make sure that their eyes shine.

It is a tricky proposition. The world has existed for billions of years, and doesn't look as though it is at much risk. There are billions of humans on the planet. Our species is doing spectacularly well.

On the whole our species is doing well, but if you look at history there's quite a bit of precedent for societies collapsing (Mayans, Mohenjo Daro, Sumerians), being wiped out by disease (Native Americans), and being destroyed by natural catastrophes (Crete). We just don't find them that pressing because our society is still going.