Having weird ideas relative to your friends and associates means paying social costs.  If you share your weird ideas, you'll have more arguments, your associates will see you as weird and you'll experience some degree of rejection and decreased status.  If you keep your weird ideas to yourself, you'll have to lead a double life of secret constructed knowledge on the one hand and public facade on the other.

For people reading this site, the most vivid analogy here might be being forced to live in a town full of religious hicks in the south of the USA, with minimal contact with the outside world.  (I've heard from reliable sources that the stereotypes about the South are accurate.)  Not many of us would choose to do this voluntarily.

The weirder your beliefs get relative to your peer group, the greater the social costs you'll have to pay.  Imagine plotting the beliefs of your associates as points in a multidimensional space and putting a hook at the center of mass of those points.  Picture yourself attached to this hook with an elastic band.  The farther you stray from the center of mass, the greater the force pulling you back towards conventional beliefs.
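
For concreteness, here is a minimal toy sketch of that analogy in Python.  Everything in it is made up for illustration: the `associates` belief matrix, your own belief vector, and the "social stiffness" constant `k` are all hypothetical, and real social pressure obviously isn't a literal spring.  It just shows the shape of the model: find the group's center of mass in belief space, and treat the pull back towards convention as growing with your distance from it.

```python
import numpy as np

# Hypothetical belief vectors: four associates (rows), each assigning a
# probability to the same three claims (columns).  All numbers are made up.
associates = np.array([
    [0.90, 0.20, 0.50],
    [0.80, 0.30, 0.40],
    [0.85, 0.25, 0.60],
    [0.95, 0.15, 0.50],
])

you = np.array([0.30, 0.90, 0.10])  # your comparatively weird beliefs

center_of_mass = associates.mean(axis=0)   # the "hook" the elastic band is tied to
displacement = you - center_of_mass        # how far you've strayed from convention
k = 1.0                                    # arbitrary "social stiffness" constant
restoring_pull = -k * displacement         # spring-like pull back towards the center

print("center of mass:", center_of_mass)
print("distance from it:", np.linalg.norm(displacement))
print("pull towards convention:", restoring_pull)
```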

This theorizing has a few straightforward implications:

  • If you notice yourself paying a high social or psychological cost for your current set of beliefs, and you have reasons to not abandon the beliefs (e.g. you think they're correct), consider trying to find a new set of associates where those psychosocial costs are lower (either people who agree with you more, people who are less judgmental, or some combination) so you can stop paying the costs.  If you can't find any such associates, create some: convince a close friend or two of your beliefs, so you have a new center of mass to anchor yourself on.  Also cultivate psychological health through improving your relationships, meditation, self-love and acceptance, etc.
  • If you're trying to help a group have accurate beliefs on aggregate, stay nonjudgmental so that the forces pulling people towards conventional wisdom will be lower, and they'll be more influenced by the evidence they encounter as opposed to the incentives they encounter.  You may say "well, I'm only judgmental towards people's beliefs when they're incorrect."  But even if you happen to be perfect at figuring out which beliefs are incorrect, this is still a bad idea.  If I'm trying to figure out whether to officially adopt some belief as part of my thinking, I'll calculate my expected social cost of holding the belief using the probability that it's incorrect times the penalty in the case where it's incorrect (a toy version of this calculation appears just after this list).  So even punishing only the incorrect beliefs will counterfactually decrease the rate of people holding unusual beliefs.
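
To make the expected-cost reasoning in the last point concrete, here is a toy calculation in Python (the numbers and the helper function are purely illustrative, not anything from a real model): even if punishment only ever lands on beliefs that turn out to be incorrect, every weird belief carries a nonzero expected cost at the moment you decide whether to hold it openly, because you don't yet know which beliefs those are.

```python
def expected_social_cost(p_incorrect: float, penalty_if_incorrect: float) -> float:
    """Expected social cost of openly holding a belief, assuming punishment
    only happens if the belief later turns out to be incorrect."""
    return p_incorrect * penalty_if_incorrect

# Suppose I'm 80% confident in a weird belief (so a 20% chance it's wrong),
# and being publicly wrong in this group costs, say, 50 units of standing.
print(expected_social_cost(p_incorrect=0.2, penalty_if_incorrect=50))
# 10.0 -- a real deterrent even for beliefs that are in fact correct.
```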

Some more bizarre ideas:

  • Deliberately habituate yourself to/adapt to the social costs associated with having weird ideas.  Practice coming across as "eccentric" rather than "kooky" when explaining your weird ideas, and state them confidently as if they're naturally and obviously true, to decrease status loss effects.  Consider adopting a posture of aloofness or mystery.  Or for a completely alternative approach, deliberately adopt a few beliefs that you suspect are true but your social group rejects, and keep them secret to practice having your own model of the world independent of that of your social group.
  • If you notice a weird idea of yours is either not getting adopted by you because of social costs, or is costing you "rent" in terms of social costs you are having to pay to maintain it, do a cost-benefit analysis and deliberately either maintain the belief and pay the upkeep costs or discard it from your everyday mental life (preferably making a note at the time you discard it).  You have to pick your battles.
  • Start being kinda proud of the weird things you think you've figured out, in order to cancel out the psychosocial punishment for weird ideas with a dose of psychosocial reward.  Keep your pride to yourself to avoid being humiliated if your beliefs turn out to be wrong.  The point is to be guided by the evidence you have, even if that evidence is biased or incomplete, rather than solely by the opinion of the herd.  (Of course, the herd's opinion should be counted as evidence.  But if you're doing it right, you'll err on the side of agreeing with the herd too much about as often as you err on the side of agreeing with it too little... unfortunately, agreeing with the herd too little and being wrong generally hurts you much more than agreeing with the herd too much and being wrong.)

For people reading this site, the most vivid analogy here might be being forced to live in a town full of religious hicks in the south of the USA, with minimal contact with the outside world. (I've heard from reliable sources that the stereotypes about the South are accurate.)

I'd caution against using this as an example. Not just because it's a stereotype, or just because there's wide variation within that stereotype, but because it's neither the most effective nor the most common level of social pressure that inspires conformity. When you say "Bible-Thumping Redneck", readers will jump to some level of coercion between Inherit the Wind and a literal torch-and-pitchfork-wielding mob. That's meaningful, but it's also an iceberg situation: the most obvious answer is not the full answer, and can distract you from the full answer. It tempts folk to think about social incentives that affect the expression of belief, rather than social incentives that alter beliefs themselves.

A good deal of effective social conformity is far more subtle. You need considerably more introspection to counter it than you'd expect, and it's far more pervasive than your example would suggest.

So, the Wikipedia article about the Asch Conformity Experiment says that 25% of study participants never gave an incorrect answer. I'd expect readers of a blog about how to think rationally to plausibly be in the top 10% of the population when it comes to thinking rationally, so I doubt many of us would give incorrect answers in the Asch Conformity Experiment (except as a deliberate choice to tell a lie in order to fit in better).

I agree that a town full of religious hicks is maybe a bad example to anchor from. My thought was that some people are going to pretend that social pressure to conform with their beliefs doesn't exist, so if I convince them that they'd feel uncomfortable in the extreme case, then maybe I can convince them that something subtler happens in less extreme cases. Speaking personally, even though people who read Less Wrong agree with me on much more than a typical person does, I still notice substantial social incentives for me to conform with them more closely... and I had a reputation as a contented misfit in high school. So yes, I agree pressure to conform is very pervasive. My thought was that by deconstructing the psychological and social pressures involved, they'd lose some of their power and we could consciously decide how best to deal with each pressure. I'd love to hear if you've detected additional subtle pressures in your own thinking that I didn't mention in my post.

My strategy is to associate my weird beliefs with the high-status people who hold them and use this to curtail non-productive arguments, e.g. "I don't have much expertise in this area myself, but I trust X, who studies this problem professionally."

Odd. I avoid doing exactly that, because I subconsciously expect people to call me on it, along the lines of "what, you believe X just because so-and-so says so?" It's fairly rare for me to express weird beliefs unless I'm either prepared to defend them in detail, or on such good terms with whoever I'm speaking with that I know they'll cut me some slack.

LW itself is exhibit A, I suppose. I've adopted a fair number of ideas from the Sequences based mostly on Eliezer being extremely convincing -- but I hesitate to put myself in a position where I'd have to admit that, because I can't replicate the argument spontaneously.

"what, you believe X just because so-and-so says so?"

"No, I don't believe X because so-and-so says so, I put a strong weight on X being true based on so-and-so's track record. If you'd like to discuss the evidence for or against this position in more rigor we should do so online so we can link to citations."

This also acts as a great litmus test for people who will actually provide me with high-quality evidence.

Huh. I really like this approach.

The irony here, of course, is that Eliezer has written at length about the importance of being able to reconstruct the argument that convinces one that X is true, not just recite "X is true."

Reconstructing such, given time, is something I can do. But I can't do it in real time for non-trivial arguments. Does that make me a minority here? I have never been able to do that for any abstract argument that I can think of, except maybe in my area of professional expertise where all the relevant information is perpetually in cache.

I doubt it makes you a minority anywhere.

Practice coming across as "eccentric" rather than "kooky" when explaining your weird ideas, and state them confidently as if they're naturally and obviously true, to decrease status loss effects.

This only works if you're already high status. If you're low status you come across as attention-seeking.

If you're low status you come across as attention-seeking.

I'd be more worried about it coming across as mental illness. Being eccentric in the good way isn't easy.

That sounds wrong to me. Why do you think it comes off as attention-seeking if you're not already high status?

1) Given that you exhibit attention-seeking behavior, displaying eccentric beliefs is a reliable way of getting more attention. 2) Attention-seeking behavior can be a pathological response to perceived low status, and egregious attention-seeking may cause you to lose status as well. 3) So when you see a low-status person with eccentric beliefs, you should update more in favor of them doing it for attention.

Of course I now realize I'm privileging the hypothesis. If you're low status with eccentric beliefs you might just come across as a creepy weirdo.

The reason I disagree is that when your goal is attention, you also show other signs (like trying to argue you into agreeing). If you aren't seeking attention, then you don't push your weird views and simply state them matter-of-factly when asked. I don't think someone who keeps their mouth shut until asked and then talks matter-of-factly about weird beliefs comes off as attention-seeking or loses status in most cases.

Being wrong is correlated with using an authority heuristic. If you encounter someone who you're sure is wrong, consider using an authoritative source who agrees with you. Attempt to establish that source's authoritativeness before you reveal that source agrees with you.

Worst case scenario, that other person still disagrees, but you only pay for "listening to the wrong guy", which I expect (confidence 95%) is less expensive in the median case.

For people reading this site, the most vivid analogy here might be being forced to live in a town full of religious hicks in the south of the USA, with minimal contact with the outside world.

A scenario much more available to the Libertarian readers would be living in an area and in broad social circles dominated by Progressives.

Nonjudgemental people may help you socially with trying new ideas, but will not help you epistemically with finding the correct ones. You will have to reinvent every wheel alone. If you have unlimited time, go ahead. Otherwise, it is better to find people who are a bit judgemental -- who have a preference for correct beliefs.

Of course, replacing judgemental people who have a preference for incorrect beliefs with nonjudgemental people is an improvement. But sometimes you can do much better than this.

Maybe this is what happens to many people: they replace judgemental people who have incorrect beliefs with nonjudgemental people, they notice the improvement... but then they can't improve further because their heuristic says that "nonjudgemental is best". Noise can be better than active misinformation, but signal can be even better than noise. But when most of your experience is with active misinformation, all signals seem dangerous.

There is some confusion that pops up whenever there's a discussion of 'being judgmental'. Some people distinguish between disagreement and condemnation and believe that you can strongly disagree with someone in a non-judgmental manner; others think of it as a package deal, where being non-judgmental is a trade-off between niceness and the ability to form correct beliefs.

When I hear people talking about being nonjudgmental I tend to assume the first interpretation (which I also agree with). But being non-judgmental in that way might itself be an example of a weird, costly attitude. If others don't share it, they will think that you are judging them and there's no way of convincing them otherwise.

The two contexts in which I see "judgemental":

1) Making assumptions about people based on incomplete knowledge: judging character from the way someone dresses. Or thinking badly of someone for, say, alcoholism without knowing them and the circumstances they live in. Or judging a lifestyle without understanding it.

I'd define this as: "the sort of person who tends to make moral judgements based on insufficient evidence". Seems like a reasonable accusation.

2) Treating oneself as the final arbiter of right and wrong, deciding morality for others: "Who are you to say what is right?"

I'd code this as "anyone who makes moral judgements". That doesn't seem like a negative trait to me at all... rather, the usage of "don't judge" in this way seems like the moral equivalent of anti-epistemology. What gives?

The amalgam of the two definitions is "One who judges too much", where a judgement is a high-confidence statement about the moral status of a thing. So "you're being judgmental" should be coded as "You are far too confident in your morality-related claim. Shame on you!" This makes sense to me... though it seems less like an actual argument and more like a statement of belief.

The seeming double meaning arises because some individuals believe that no one can make any moral claim with any confidence (especially when it comes to other people), while others believe in absolute God-given morality or absolute self-created morality. In fact, there is only one definition of the word, but the usage varies depending on the moral philosophy of the user.

Unfortunately, in my experience the majority of people who use "don't judge" are using it as a rhetorical device to put a stop to moral conversations that they'd rather not have. It's shorthand for "Oh, you are trying to make a moral judgement? But morality is relative anyway!" from a person who has no strong opinions and/or is largely naive to concepts in moral philosophy, and who is thus able to implicitly switch moral philosophies as it suits them in rhetoric without even realizing that they are doing it.

It's a lot like "faith = trust without evidence" vs "faith = justified trust as a result of evidence" in this regard. The simple definition is "the belief that you can trust someone", but one's epistemology as to how one ought to form beliefs alters one's usage of the word, and most people will use the "trust without evidence" version to implicitly switch epistemological philosophies when it suits them in rhetoric.

Both interpretations are viable and can co-exist -- depending on the matter under discussion.

It's pretty easy for people to strongly disagree about e.g. the merits of a sports team without condemnation being involved.

It's very hard for people to strongly disagree about e.g. slavery without condemnation being involved.

I think the relevant attribute is "seriousness" or importance. If you imagine a spectrum of importance from "I don't really care" on one end to "I will die for this" on the other, then the closer you are to the don't-care end, the easier it is to disagree without judging. But the closer you get to the will-die-for-it end, the harder passionless disagreement becomes.

For people reading this site, the most vivid analogy here might be being forced to live in a town full of religious hicks in the south of the USA, with minimal contact with the outside world. (I've heard from reliable sources that the stereotypes about the South are accurate.)

They're such vapid hicks, I bet they believe negative stereotypes about large groups of people just because someone who shares their worldview assures them they are accurate.

Ugh they're such evil mutants! Good thing nothing like that happens on Less Wrong.

Fight the restoring force with leverage. Make friends with the local priest and affect the congregation through him. Most instrumental rationality techniques are compatible with religion. You can consider your options once you gain status.

I read this as "become the king's most trusted adviser; it's easy."

You can misread it any way you like; that's not what I wrote.

Was I overly hyperbolic, or completely orthogonal to your point? Was I reading in an implication that such things would be (somewhat) trivial that isn't there? (I've noticed that quite a few comments that leave me wanting to reply this way can be read as trivializing something decidedly non-trivial, and I suspect that if I called out the commenters, the response would be that they meant otherwise.)

My point was that you can befriend the person(s) with leverage in the community without having to become their "trusted advisor". Here is one scenario. Befriending a priest is indeed easy: if you come across as a friendly and curious non-believer, it is in his job description to attempt to win you over. If you cheerfully volunteer in the (non-religious) community events organized by the church, you win people's respect and eventually their sympathetic ear. If you know some useful stuff about instrumental rationality people can benefit from, that might be the time to share. Eventually you can start discussing religious beliefs and tenets in a respectful and friendly manner.

For people reading this site, the most vivid analogy here might be being forced to live in a town full of religious hicks in the south of the USA, with minimal contact with the outside world. (I've heard from reliable sources that the stereotypes about the South are accurate.) Not many of us would choose to do this voluntarily.

I have lived in a town full of religious hicks in the South, and I have many weird beliefs. My experience was that while it did cost me social status, it wasn't that important to me, and it would have been significantly more unpleasant to conceal my beliefs.

Does anyone have more bizarre / counterintuitive approaches? Those three seem fairly obvious; indeed, I'm pretty confident I've witnessed the first two in others (and applied all three myself at one time or another).

If you're trying to help a group have accurate beliefs on aggregate, stay nonjudgmental so that the forces pulling people towards conventional wisdom will be lower, and they'll be more influenced by the evidence they encounter as opposed to the incentives they encounter. You may say "well, I'm only judgmental towards people's beliefs when they're incorrect."

The problem with giving rationalists this kind of advice is that it lowers the average sanity of the people defining conventional wisdom.

I don't follow.

If the more rational people in a group are the least judgmental, the only source of peer pressure will be from the less rational people.

That's true only if judgmentalness is the only source of peer pressure.

If, for example, being successful allows one to successfully exert peer pressure and rational people are more successful, then even if the more rational people in a group are less judgmental they might still exert significant peer pressure.

But even if you happen to be perfect at figuring out which beliefs are incorrect, this is still a bad idea. If I'm trying to figure out whether to officially adopt some belief as part of my thinking, I'll calculate my expected social cost of holding the belief using the probability that it's incorrect times the penalty in the case where it's incorrect. So even punishing only the incorrect beliefs will counterfactually decrease the rate of people holding unusual beliefs.

This argument seems to assume, contra the first sentence, that I am coming down on the side of the received opinion each time, rather than on the side of the correct opinion.

Let's say I have an idea X which happens to be correct, although I only think it's correct with 80% probability. I believe that if idea X is incorrect, and I share it, I will be burned at the stake. So I keep idea X, a correct idea, to myself in order to avoid censure. It so happens that if I had shared idea X, the person in charge of handing out punishments would not have punished me because X happens to be a correct idea. But I had no way of knowing that for certain in advance.

Does that answer your question? I'm not sure I understand your objection.

Ahh. That works if you allow 'not holding a belief either way' as an option. If you punish failing to believe the true thing as well, then that problem is avoided.

Yep. People typically don't receive social punishments for agreeing with the group even when the group ends up being wrong.

If you're trying to help a group have accurate beliefs on aggregate, stay nonjudgmental so that the forces pulling people towards conventional wisdom will be lower, and they'll be more influenced by the evidence they encounter as opposed to the incentives they encounter. You may say "well, I'm only judgmental towards people's beliefs when they're incorrect." But even if you happen to be perfect at figuring out which beliefs are incorrect, this is still a bad idea. If I'm trying to figure out whether to officially adopt some belief as part of my thinking, I'll calculate my expected social cost of holding the belief using the probability that it's incorrect times the penalty in the case where it's incorrect. So even punishing only the incorrect beliefs will counterfactually decrease the rate of people holding unusual beliefs.

There are already some heuristics that allow you to nudge people in a direction where they are more likely to accept your arguments. However, these techniques are all about getting people to like you, in effect taking advantage of their cognitive biases, so they might seem to straddle the line into Dark Arts. One was actually posted pretty recently: Have the person tell a self-affirming thing about themselves before trying to convince them of your point of view. Or ask them for a favor or their opinion on something. Another way to get people to like you is to uncover likable things about the person; the Dark Arts version of that would be something like Barnum statements.

If anything, these persuasion techniques will increase your social capital so that you have more to spend on having beliefs that don't quite mesh with the group's center.

One was actually posted pretty recently: Have the person tell a self-affirming thing about themselves before trying to convince them of your point of view.

Hm, complimenting people right before telling them they're wrong about something seems like a good idea and not very dark arts-ish.

In my experience, the 'social cost' tends to be paid by people trying to push (hard) concepts and ideas that are often very true and useful, but in situations where they aren't relevant to the conversation or the other person isn't interested.

No price is due -- if anything, the opposite -- as long as you follow a few basic rules and are able to explain your ideas eloquently and succinctly (and if you can't, should you be talking about them at all?).

A curse of intelligent people seems to be wanting to 'show' everyone just how smart they are by talking about subjects they don't totally understand, or subjects that are irrelevant to whatever situation they are in.

A discussion of Bayesian reasoning may be very appropriate in certain college or high school academic situations, but repeatedly bringing it up in all sorts of casual conversations will cause social harm to the individual refusing to follow social expectations in this area.

If you keep your weird ideas to yourself, you'll have to lead a double life of secret constructed knowledge on the one hand and public facade on the other.

I kinda think most people have to hide some of their beliefs. It doesn't feel like a facade to me, just a fact of life you've got to live with.

[This comment is no longer endorsed by its author]

For sure, but you don't feel like there's any kind of psychological cost you're paying there?