Cross-posted from Putanumonit.
The Worst is to Care a Little
At a party the other day, a woman expressed her displeasure at men who “pretend to be allies of gay rights but don’t really care about it”. Did she mean men who proclaimed their progressive attitudes on LGBTQ issues but voted Republican on Tuesday anyway? Yes, she told me, but not only those. She reserved special scorn for those who vote in support of immigration from Muslim countries, not realizing how Muslim immigrants make the country less safe for gays.
I remarked that she’s setting quite a high bar for these men. Gay rights hardly seem like a central issue in the upcoming elections. In fact, even my gay friends are split among many political views and party allegiances. Considering the second-order impact of every political decision on the lives of gays is a standard that few senators or political pundits meet, let alone guys who have other day jobs and other interests.
“Not good enough,” the woman replied, “if you don’t care enough about gay rights to think it all through you’re not an ally.”
But the other other day, a man at an Effective Altruism meetup expressed his disdain for people who don’t care enough about climate change to vote based on climate policy and said that he doesn’t want them in EA.
And the other other other day, a woman expressed her bafflement at people whose politics aren’t driven by a concern for women’s reproductive rights. She said she doesn’t want to be friends with people who don’t think that abortion is the most important issue in the US today.
How about someone who is pro-choice but voted for Hillary Clinton just because she shares a concern for AI safety? “That just shows you don’t really care about women,” she answered. Not caring enough is just as bad as having the wrong position.
Since I’m an alien, I don’t get to vote at all (check your privilege, citizens, and stop microaggressing against me by sharing exhortations to go out and vote on Facebook). Even if I could vote in New York, I probably wouldn’t. And if I had to, I’d probably vote for Andrew Yang even though I haven’t checked what his positions are on gay rights, climate change, and abortion. I can guess Yang’s positions on these three issues, and I suspect they’re very similar to my own, but I don’t care enough to check.
Here’s a list of other topics I was recently criticized for not caring enough about by people who, for all their caring, didn’t discover anything new and believe pretty much exactly what I believe on each topic: Russian meddling in the elections, wild animal suffering, male circumcision, free speech on campus, Kanye West, Brett Kavanaugh, Alexandria Ocasio-Cortez, anti-Semitism, LeBron James signing with the Lakers.
I’m getting a bit pissed about being told that I’m a bad person for not sharing in somebody’s mandatory obsessions, especially since everyone has different ones. I’m also confused – I personally am very much in favor of those around me being diverse in what they care about. So why isn’t everyone else?
In Praise of Diverse Obsessions
This chart of priorities from the Effective Altruist survey makes me very happy:
Of the 9 EA cause areas, I care deeply about three of them, care a bit about another three, and care not at all about the final three. But every cause area has many people who are obsessed with it and also many people who don’t give a crap.
This is a diversity of priorities, not of opinions. I agree that factory farming and the criminal justice system are terrible – it’s better in general to have fewer sentient beings in cages. I just care about AI and rationality more. And yet, I strongly prefer this sort of EA community to one where the majority cares only about AI and rationality.
The first benefit of fascination diversity is breadth of coverage. I am not very interested in animal welfare because it strikes me as quite intractable at the moment, and I’m very skeptical of the impact of vegan advocacy. But with all the vegan EAs around I know that I’ll quickly learn if anything exciting happens in the space, like innovations in meat replacement technology. Hanging out in a community where 10% of people are obsessed with a topic is a much better way to stay on top of it than dedicating 10% of my own time to it.
Another example: a small part of the rationalist community has recently become fascinated with auras, tarot, chakras and other assorted esoterics. This has gotten to the point where we can’t be 100% sure if they’re still sane or not. I simultaneously hold both of the following:
- Looking for wisdom in chakras is a bad idea. Whatever aspects of chakra theory correspond to reality are hopelessly polluted by mystical hogwash, and the epistemic harm of hanging around chakra believers is greater than the benefits of hidden chakra knowledge.
- As long as it’s a small part of the community that’s obsessed with chakras, I’m happy to have them around and I think that they benefit the group as a whole.
Another benefit of diversity is specialization. The first person to become obsessed with a topic is someone who’s a good fit for it and will contribute a lot of original thinking. The two hundred and first person, not so much. Even in a small community there are diminishing returns to everyone piling into the same obsession.
This also allows the community to diversify status hierarchies. More fascinations mean more opportunities to gain respect as a contributor in the community. If only one thing is allowed to be important, Girardian terror ensues.
I make a point of diversifying interests in communities I help organize, like the New York rationalist meetup. We make sure to alternate statistics workshops with mindfulness exercises, and discussion of botany with deep dives into some abstruse LessWrong post. This invites many more people to get involved, contribute, and feel like they’re important members of the group. This keeps the group strong, vibrant, and less dependent on a handful of leaders.
Finally, diversity of both opinions and levels of caring protects from groupthink. If we all care deeply about X it becomes socially treacherous to question the group’s X-expert on anything. Going back to Effective Altruism, there are people whose main EA obsession is keeping EA honest, and they’re indispensable.
Of course, if a group’s purpose is to be obsessed with a single topic, enforcing this obsession is beneficial to the group. Everyone at Planned Parenthood should care about reproductive rights, everyone at MIRI about AI, etc. But if the group’s telos is something else, whether it’s purely social or just a broader range of issues, enforcing mandatory obsessions is harmful to the group.
Given the benefits of fascination diversity, why do people fight so hard against it in the communities they’re part of? I am going to present two theories. This involves me speculating about the minds of people very different from me, so take both with a salt shaker.
Hanlon’s Razor says: Never attribute to malice that which is adequately explained by stupidity. But Hanson’s Razor inverts it: Never attribute to stupidity that which is adequately explained by unconscious selfishness.
People who exhort you to care more about Issue X rarely follow it up by admitting that they personally don’t know much about X and merely think it’s important to investigate. Not only do they know a lot about the issue we all have to care about, but they’re also quite confident about how to address it. No one says: “We need to focus on climate change to find out if we’re wildly overreacting or underreacting to the issue.” They usually have a specific reaction in mind.
An inevitable outcome of the group shifting focus to Issue X is that the person with strong opinions on Issue X gets a massive boost in status. If the issue becomes a sacred value, an unassailable matter on which no compromises can be made, the status of the sacred value’s guardians also becomes unassailable.
Except for a few Machiavellian types, I suspect that this desire isn’t explicit or necessarily conscious. You notice that when the group’s conversation turns to Issue X you have a lot to say and everyone listens and nods in agreement. This feels good. You start trying to have more conversations about X, and resent people who turn the conversation to Y instead.
Once you’re the expert on X, if someone says outright that Y is more important you get personally offended. This isn’t always wrong. “Let’s shift some focus away from such-and-such” is often a politically savvy way to say “let’s shift some respect and prestige away from so-and-so”.
Of course, a single-issue thinker can point out that I get a status boost from groups having diverse focus. I’m trying to establish a reputation as a purveyor of nuggets of wisdom on dozens of varied topics, but I’m not an expert, let alone a moral leader, on any single one. This accusation wouldn’t be unfair. The point of Elephant in the Brain isn’t that other people do things subconsciously for status and political gain, it’s that we all do.
The more charitable view of people who feel strongly about feeling strongly is that these people are possessed by a mind virus. The fault, dear Brutus, is not in ourselves, but in our memes.
I’ve started to see big arguments, not as arguments over facts, but arguments over whether specific concepts should be worthy of controlling your mind; arguments over which immunities we shouldn’t have. Consider “won’t someone think of the children”, and interpret it as “why are you immune to the ‘children’ meme, you should allow it to take control of your life”; now replace “the children” with “free speech” or “minorities” or “self-defense” or “transgender people” or “the law” or “human rights”.
The right answer to “which immunities should we have” is “all of them”. Every argument is worthy of measure; no argument is worthy of instant and perfect obedience.
In my brain sits the meme: climate change is scary, but there’s little I can personally do about it and there’s no point discussing it ad nauseam because none of my friends impact Chinese energy policy.
For obvious reasons, my version is a lot less virulent than the meme: climate change is scary, and we should all talk about it ad nauseam. The second version contains the means of its own replication, like the flu virus causing a sick person to sneeze on everyone around them.
If you’re hearing about climate change for the first time, you’re almost certainly going to hear the self-replicating version, not the quiet one. This is simply because those infected with it are the ones doing 99% of the talking about climate change. Shifting to the quiet version requires building a general immunity to memes that contain the injunction to breathlessly proselytize them.
No argument is worthy of instant and perfect obedience, and no argument is worth repeating until your friends are sick of hearing it.
Ultimately, convincing people to believe a certain position is difficult enough; convincing them to care more about it than they do is damn near impossible. Communities like Effective Altruism have, for the most part, figured this out. The EA message is: “Hey, if you’re already vaguely utilitarian and you like animals, check out this cool book”. You may get someone to care more about utilitarianism and animals by telling them occasional cool facts and stories about the subject, not by shaming them for not caring enough.
I hope that Putanumonit gets you to care more about rationality, erisology, and the proper use of statistics in social science research. But if that doesn’t happen, the fault is mine, not yours.
For more: Caring Less by eukaryote.