Related to: Branches of Rationality, Rationality Workbook

Changing your behavior to match new evidence can be harder than simply updating your beliefs and then mustering your willpower, because (a) we are in denial about how often we actually change our minds, (b) cognitive dissonance is tolerable in the medium term, and (c) verifying that your actions, and not just your beliefs, have changed requires extra monitoring, so it is easy to pretend that your actions already suit your new picture of reality. It might help to (1) specify a quitting point in advance, (2) demonstrate your new opinion with symbolic action, or (3) activate your emotions by reading non-rational propaganda. Additional solutions are eagerly solicited.

Disclaimer:

This post contains examples drawn from politics and current events. I do not hope to change anyone's mind about any specific political belief, I know that Politics is the Mind-killer, I have tried to use non-inflammatory language, and I have a good faith belief that this post contains actual content on rationalism sufficient to justify its potentially controversial examples. Equally powerful but less controversial examples will be cheerfully substituted if anyone can bring them to my attention.

Review:

As has been amply discussed in the sequences, a key tool for overcoming the human tendency to irrationally defend prior beliefs simply because they are comfortable is to ask what, if anything, would cause you to abandon those beliefs. For example, in the “invisible dragon in the garage” parable, it quickly becomes clear to neutral observers that there is no potential evidence that could convince an invisible-dragon-fundamentalist that the dragon is fictional. If you test for breathing noises, it turns out that the dragon is inaudible. If you test for ecological impact, it turns out that the dragon lives off of invisible hamsters, etc. Thus we say that the belief in the dragon is unfalsifiable; there is no way to falsify your hypothesis that there is a dragon in your garage, and so your belief in the dragon does not pay rent in anticipated experiences.
 
There is a second human bias that causes you to cache an unrealistically high summary statistic for how often you change your mind: you think you change your mind, in general, pretty often, but unless you are an expert, highly-practiced rationalist, odds are that you do not. As evidence, try thinking of the last time you changed your mind about something and force yourself to specify what you believed beforehand and what you believed afterward. Me personally, I haven't changed my mind about anything that I can remember since about November 10th, 2010, and I'm sure I've expressed thousands of opinions since then. The odds are long.

The Problem:

There is a third human bias that causes you to tell yourself that you have successfully changed your mind when you have not really done so. The adherent of the Reformed Church of Dragon leaves the garage door open, and cheerfully admits to anyone who asks that there is probably no such thing as an invisible dragon, yet she is unaccountably cautious about actually parking her car in the garage. Thus it is worth knowing not just how to change your mind, but how to change your habits in response to new information. This is a distinct skill from simply knowing how to fight akrasia, i.e., how to muster the willpower to change your habits in general.
 
One example of this failure mode, recently reported by Slate.com, involves American troops in Iraq: there are at least some regions in Iraq where many people strongly prefer not to have American troops around, and yet American troops persist in residing and operating there. In one such region, according to a former American soldier who was there, the people greeted the incoming foreigners with a large, peaceful protest, politely asking the Yankees to go home. When the request was ignored, locals began attacking the Americans with snipers and roadside bombs. According to the ex-soldier, Josh Stieber, the Americans responded not by leaving the region, but by ordering troops to shoot whoever happened to be around when a bomb went off, as a sort of reprisal killing. At that point, cognitive dissonance finally kicked in for Josh, who had volunteered for the military out of a sense of idealism, and he changed his mind about whether he should be in Iraq: he stopped following orders, went home, and sought conscientious objector status.

The interesting thing is that his comrades didn't, even after seeing his example. The same troops in the same town confronted with the same evidence that their presence was unwelcome all continued to blame and kill the locals. One of Josh's commanders wound up coming around to Josh's point of view to the extent of being able to agree to disagree and give Josh a hug, but still kept ordering people to kill the locals. One wonders: what would it take to get the commander to change not just his mind, but his actions? What evidence would someone in his position have to observe before he would stop killing Iraqis? The theory is that American military presence in Iraq is good for Iraqis because it helps them build democracy, or security, or their economy, or some combination. It's moderately challenging to concede that the theory could be flawed. But, assuming you have the rationalist chops to admit your doubt, how do you go about changing your actions to reflect that doubt? The answer isn't to sit at home and do nothing; there are probably wars, or at the very least nonviolent humanitarian interventions, that are worth sending people abroad for (or going yourself, if you're not busy). But if you can't change your behavior once you arrive on the scene, your doubt is practically worthless -- we could replace you with an unthinking, unquestioning patriot and get the same results. 

Another example was reported by Bill McKibben, author of Deep Economy, who says he happened to be in the organic farming region of Gorasin, Bangladesh the day an international food expert arrived to talk about genetically engineered "golden rice," which, unlike ordinary rice, is rich in Vitamin A and can prevent certain nutritional deficiency syndromes. "The villagers listened for a few minutes, and then they started muttering. Unlike most of us in the West who worried about eating genetically modified organisms, they weren't much concerned about 'frankenfood.' Instead, they instantly realized that the new rice would require fertilizer and pesticide, meaning both illness and debt. More to the point, they kept saying, they had no need of golden rice because the leafy vegetables they could now grow in their organic fields provided all the nutrition they needed. 'When we cook the green vegetables, we are aware not to throw out the water,' said one woman. 'Yes,' said another. 'And we don't like to eat rice only. It tastes better with green vegetables.'"

Bill doesn't say how the story ended, but one can imagine that there are many places like Gorasin where the villagers ended up with GMOs anyway. The November/December 2010 issue of Foreign Affairs has a pretty good article (partial paywall) about how international food donors have persisted in shipping grain -- sometimes right past huts full of soon-to-rot local stockpiles -- when what is really needed are roads, warehouses, and loans. One could argue that the continued funding of food aid at 100 times the level of food-infrastructure aid, or the continued encouragement of miracle mono-crops in the face of local disinterest, simply reflects the financial incentives of large agricultural corporations. Considering how popular farmers are and how unpopular foreign aid is, though, there are doubtless easier ways for Monsanto and ConAgra to get their government subsidies. At least some of the political support for these initiatives has to come from well-intentioned leaders who have reason to know that their policies are counterproductive but who are unable or unwilling to change their behavior to reflect that knowledge.

It sounds stupid when people act this stubbornly on the global stage, but it is surprisingly difficult not to be stubborn. What, if anything, would convince you to stop (or start) eating animals? Not merely to admit, verbally, that it is an acceptable thing for others to do, or even the moral or prudent thing for you to do, but to actually start trying to do it? What, if anything, would convince you to stop (or start) expecting monogamy in your romantic relationships? To save (or borrow) significant amounts of money? To drop one hobby and pick up another? To move across the country?

And, here's the real sore spot: how do you know? Suppose you said that you would save $1,000 a year if the real interest rate were above 15%. Would you really? What is your reference class for predicting your own behavior? Have you made a change like that before in your life? How did the strength of the evidence you thought it would take to change your behavior compare to the evidence it actually took to change your behavior? How often do you make comparably drastic changes? How often do you try to make such changes? Which are you more likely to remember -- the successful changes, or the failed and quickly aborted attempts?
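One way to make the reference-class question concrete is to compute your base rate from your own track record rather than from how decisive you feel. A minimal sketch in Python -- the counts are invented placeholders, not data:

```python
# Illustrative only: estimate the odds that a resolved-upon change sticks,
# using your own history of attempts as the reference class.
attempted_changes = 10  # hypothetical: drastic changes you resolved to make
changes_that_stuck = 2  # hypothetical: how many actually lasted

base_rate = changes_that_stuck / attempted_changes
print(f"P(change sticks | you resolve to change) = {base_rate:.0%}")  # -> 20%
# That 20% is the number to use when predicting yourself, not the
# near-certainty your introspection reports.
```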

Solutions:

Having just recently become explicitly aware of this problem, I'm hardly an expert on how to solve it. However, for whatever it is worth, here are some potential coping mechanisms. Additional solutions are strongly encouraged in the comments section!

1) Specify a quitting point in advance. If you know ahead of time what sort of evidence, E, would convince you that your conduct is counterproductive or strictly dominated by some other course of conduct, then switching to that other course of conduct when you observe that evidence will feel like part of your strategy. Instead of seeing yourself as adopting strategy A and then being forced to switch to strategy B because strategy A failed, you can see yourself as adopting the conditional strategy C, which calls for strategy B in circumstance E and for strategy A in circumstance ~E. That way your success is not dependent on your commitment, which should help bring your commitment down toward an optimal level.

Without a pre-determined quitting point, you run the risk of making excuses for an endless series of marginal increases in the strength of the evidence required to make a change of action appropriate. Sunk costs may be an economic fallacy, but they're a psychological reality.
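To make the idea concrete, here is what conditional strategy C might look like written down before you begin -- a minimal sketch in Python, where the threshold, the strategies, and the "months of losses" measure are all illustrative placeholders for your own E, A, and B:

```python
# A pre-registered conditional strategy: the quitting point is chosen in
# advance, so switching is part of the plan rather than an abandonment of it.

QUITTING_POINT = 3  # E, e.g. "three consecutive months of losses" (placeholder)

def conditional_strategy_c(months_of_losses: int) -> str:
    """Strategy C: follow A until E is observed, then switch to B."""
    if months_of_losses >= QUITTING_POINT:  # circumstance E
        return "strategy B: switch course"
    return "strategy A: continue as planned"  # circumstance ~E

print(conditional_strategy_c(1))  # -> strategy A: continue as planned
print(conditional_strategy_c(3))  # -> strategy B: switch course
```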

2) Demonstrate your new opinion with symbolic action. Have you decided to move to San Francisco, even though your parents and significant other swear they'll never visit you there? Great! We have nice weather here; look forward to seeing you as soon as you can land a job. Meanwhile, buy a great big map of our beautiful city and put it on your bedroom wall. The map, in and of itself, doesn't get you a damn bit closer to moving here. It doesn't purport to influence your incentives the way a commitment contract would. What it does do is help you internalize your conscious decision so the decision is more broadly endorsed by the various aspects of your psyche.

I remember at one point a religious camp counselor caught me using a glowstick on the Sabbath, and advised me to throw the glowstick away, on the theory that kindling a new light on the Sabbath violated the applicable religious laws. I asked him what good throwing away the light would do, seeing as it had already been kindled and would keep on burning its fixed supply of fuel no matter where I put it. He said that even though throwing away the light wouldn't stop the light from having been kindled (there were limits to his magical thinking, despite his religious convictions), it would highlight (har har) my agreement with the principle that kindling lights is wrong and make it easier not to do it again next time. The very sense that it is strange to throw away a lit glowstick helps put cognitive dissonance to work for changing your mind instead of against it: if you didn't strongly believe in the importance of not kindling glowsticks, why on earth would you have thrown it away? But you did throw it away, and so you must believe, and so on. Also, not reaping the benefits of the wrongly kindled light makes kindling lights seem to provide fewer benefits, and makes it easier to resist kindling one the next time -- if you know, in the moment of temptation, that even if you kindle the glowstick you might repent and not let yourself enjoy its light, you'll factor that into your utility function and be more likely to decide that the no-longer-certain future benefit of the light isn't worth the immediate guilt.

Anyway, this is a fairly weird example; I certainly don't care whether people light glowsticks, on a particular tribe's Sabbath or otherwise. I think it probably does help, though, to be a bit of a drama queen. If you buy a cake while you're dieting, don't just resolve not to eat it; physically throw it off the second-story balcony. If you've just admitted to yourself that your erstwhile political enemies actually have some pretty good points, write your favorite ex-evil candidate a letter of support or a $5 check and physically put it in the mail. As much as possible, bring your whole self into the process of changing your actions.

3) Over-correct your opinion by reading propaganda. Propaganda is dangerous when you read it in order to help you form an opinion, and a deontological evil when you publish it to hack into other peoples' minds (which, depending on circumstances and your philosophy, may or may not be justified by the good consequences that you expect will follow). But when you've already carefully considered the rational evidence for and against a proposition, and you feel like you've changed your mind, and yet you're still acting as if you hadn't changed your mind, propaganda might be just what you need. Read an essay that forcefully argues for a position even more extreme than the one you've just adopted, even if the essay is full of logical cul-de-sacs. In this limited context alone, gleefully give yourself temporary permission to ignore the fact that reading the essay makes you notice that you are confused. Bask in the rightness of the essay and the guilt/shame/foolishness/low-status that people who disagree with it should feel. If you gauge the dosage correctly, the propaganda might nudge your opinion just enough to make you actually adopt the new action that you felt would adequately reflect your new beliefs, but not enough to drive you over the cliff into madness.

As an example, I recently became convinced that eating industrially raised animals while living in San Francisco before the apocalypse can't ever be morally justified, and, yet, lo and behold, I still ate turkey sandwiches at Subway 5 times a week. Obviously I could have just used some of the tactics from the Akrasia Review to make eating less factory-meat a conscious goal... but I'm busy using those tools for other goals, and I think that there are probably at least some contexts in which willpower is limited, or at least a variable-sum game. So I read Peter Singer's Animal Liberation, and blamed all the woes of the world on steak for a few hours while slowly eating a particularly foul-tasting beef stew that was ruined by some Thai hole-in-the-wall, to reinforce the message. I'm doing a little bit better... I'm more likely to cross the street and eat vegetarian or pay for the free-range stuff, and I'm down to about 3 Subway footlongs a week, without any noticeable decrease in my willpower reserves.

Your mileage may vary; please use this tactic carefully.

4) Your suggestions.

Seriously; most of my point in posting here is to gather more suggestions. If I really came up with the three best solutions in two hours with no training, I'll eat my shirt. And I will, too -- it'll help me repent.

Comments (154):

I think your examples are terrible, and in part it's because they're political - but for a somewhat different reason than the one elaborated in Politics is the Mind-killer.

First, there's the mismatch between the problem you're addressing and the problem your examples illustrate. The problem you're addressing is how to make sure your behavior changes to match your updated beliefs. In this problem, your beliefs have already updated due to the weight of the evidence, but for some reason (and your list of plausible reasons is compelling) your habitual behavior fails to reflect this change in your beliefs. However, both your examples aren't about that at all - they're about beliefs not changing in the face of the evidence. Josh Stieber's fellow soldiers did not change their minds about whether they should be in Iraq. Your example actually appears to argue that they should have, if they behaved rationally - but whether or not it's true, there's no relevance to the problem your post addresses. At one point, you're doing a sleight of hand of sorts (unintentionally, I'm sure):

One of Josh's commanders wound up coming around to Josh's point of view to the extent of being able to agree to disagree

...
zyxwvutsr (+8, 13y)
"Josh Stieber's fellow soldiers did not change their minds about whether they should be in Iraq." None of us has any idea whether or not they changed their minds about anything. A soldier can hold a fully-formed (and informed) negative opinion about the strategic efficacy of their mission, but still follow orders and complete that mission.
Eliezer Yudkowsky (+6, 13y)
Consider rewriting this as a post?

the story of people failing to account for compelling evidence is by itself a familiar, ubiquitous, low-status specimen of political propaganda.

In fact, one of the most frequent arguments you encounter as you read political discussions is the argument that the other side are ignoring obvious facts, and so failing to behave rationally, because they're blinded by their ideology. To a first approximation, everyone believes that about everyone else.

It seems to me that many of the arguments made on this site based on or referring to the Politics is the Mind-Killer article are based on extrapolations from a single well-known highly-polarized (essentially) 2-party system, i.e. the USA.

I am from a country with many political parties. No party ever gets more than 50% of the votes, in fact it is rare for any party to get over 20% of the votes. The parties are always forced to form a coalition to make a majority government. This system is not without its flaws, and far be it from me to argue that it is superior to the American system.

Nevertheless, it seems to me that many of the failure modes of 'politics', as often described on this site, are actually failure modes of present-day American politics, and not of politics in general. For example, I encounter the argument described above, that the "other side are ignoring obvious facts, and so failing to behave rationally, because they're blinded by their ideology", very rarely, even in political discussions. Politicians saying such things would find it hard to negotiate with other politicians to form a government, and are mostly smart enough to not say such things. They would have no difficulty admitting that other politicians/parties behave differently simply because they have different goals (they represent the interests of a different set of voters), while still acting on almost the same set of evidence.

Tyrrell_McAllister (+7, 13y)
1. I would expect that some parties know that they will never form a coalition with certain other parties. If so, do these "incompatible pairs" show more inclination to accuse each other of ideological blindness?
2. It sounds like people within your country are pretty ideologically homogeneous. But you must differ ideologically from other countries. Your homogeneity leads me to expect that your country is relatively small. This, in turn, means that, relative to a larger country, you probably have less control over the policies of other countries, but those policies have a greater effect on your country's interests. Does the "ideological blindness" explanation sometimes get invoked when talking about why people in other countries chose those policies? (For example, I have seen some people in European countries blame some of their economic problems on a world-wide economic meltdown caused by the free-market ideology of the United States.)
Plasmon (0, 13y)
There is a party that is shunned by most other parties because it is almost universally agreed upon to be a racist party (even by themselves in some cases). To a certain extent, the answer to your question is yes. Nevertheless, the present attempt to form a government involves negotiations between a somewhat right-wing separatist party in one part of the country (got almost 30% of the votes in that part) and a somewhat left-wing socialist (yes, they call themselves socialists; it's not an insult in Europe) party. The negotiations have been going on for many months, and many colourful analogies have been used (yesterday I heard the separatists compared to Hannibal, and the socialists to the Romans), but I have yet to hear either of them accuse the other of ideological blindness.

Perhaps the ideology here is closer to mono-modal than the ideology in the USA. But is this ideological inhomogeneity in the USA a cause or a consequence of the political system? Politicians in a 2-party system have an incentive to polarize: it ensures they get a large number of voters for their party, and then they just have to focus on the small number of "swing voters" remaining in the center.

True. I'm sure the Netherlands have a similar system. I don't know what the largest country with a true many-party system is.

Yes.
Vaniver (+1, 13y)
Doesn't India have a many-party system? And since they're the largest democracy, I think we're done :P
blogospheroid (0, 13y)
This is true. Last 5 governments have been coalition governments.
TheOtherDave (0, 13y)
It's a good question, and the polarizing effect of political parties certainly does work the way you describe. That said, I do think the rural/urban divide in the US is a real split in terms of the kinds of public services and private contributions different communities value and expect, and the political parties have exacerbated that rather than created it. Regardless, I agree with your main point about the polarizing effects of bicamerality.
wedrifid (-1, 13y)
Some people in this country are more inclined to criticize certain failures to implement the free-market ideology.
wedrifid (+5, 13y)
Thanks for pointing out another perspective, there could be something to it. Which country are you from, if you don't mind me asking? (Note that I think politics is always a mind killer, however I usually think of the problem more in terms of social politics and moral wrangling in general than governmental politics specifically.)
Plasmon (+1, 13y)
Belgium
Jack (+3, 13y)
This is an interesting theory, and the two-party system may exacerbate the problem. Great Britain, however, has essentially a two-party system (Clegg's relatively new, barely relevant, ideologically indistinct party doesn't really count) and they seem to have about the same level of rationality in their politics as most of multi-party Europe.

As others suggested, I suspect the difference has much more to do with the United States' cultural, economic and racial diversity than anything else. America is a single tribe to a far lesser extent than other countries -- even our white majority, which is smaller than it is in most of Europe, consists of four genetically and culturally distinct traditions (and that isn't including Hispanic). This kind of diversity means that we have less in common to start from and have resolved fewer basic issues. We've never gotten around to European-style social welfare for much the same reason -- that kind of altruism isn't supported for those outside of the tribe. We're also large enough and wealthy enough to support a more fractured news media environment -- which lets people insulate themselves from opposing viewpoints.

This does suggest that discussion of politics could be more successful on Less Wrong (given how much we all have in common) but having to work over the internet involves other difficulties. I would be interested to see, however, whether the differing political climates influence the way people talk about politics. We could select some posters from Northern Europe and some posters from America. Have them discuss a series of emotional and controversial political issues. Have another group evaluate their comments (with the anti-kibitzer on) and grade them by degree of motivated cognition and mind-killing rhetoric. See if the Europeans do better.
ChristianKl (-2, 13y)
The US is essentially a zero-party system. Passing laws in the Senate requires 2/3 of the votes, which usually means that politicians from both parties have to support the legislation. US politicians have no problem with having discussions in private. They all believe in doing realpolitik. It's their public rhetoric that differs.
jimrandomh (+6, 13y)
Not true; laws can pass with as few as 1/2 of the votes (51). However, this is increased to 60 if the opposing side chooses to filibuster (which non-selectively blocks all legislation), and it's increased to 2/3 if the President chooses to veto it. Use of the filibuster was rare before Obama came into office, at which point the Republican party adopted a policy of using it constantly.
ChristianKl (0, 13y)
Okay, 60 isn't 2/3, but it's still the votes that you need to prevent a filibuster. To prevent the opposing side from filibustering you need to be able to speak with them.
Perplexed (0, 13y)
A good analysis of what it is that makes politics (or at least American politics) a mind killer. In fact, worse than a mind killer. The habit of convincing yourself that those who disagree with you are subrational (and intellectually dishonest to boot) is the community killer - it is the first step in a rationalization of disenfranchisement.

Are there other subjects besides politics which lead to the same dehumanization of the people who disagree? I think so. One sees it frequently in theological disputes, pretty often in ethical disputes, and occasionally when discussing interactions between the sexes. But very rarely in discussions of the arts, music, spectator sports teams, grammar, and even nutritional practices - even though tribalism is common enough in these areas, no one tries to paint their opponents as either fools or knaves. Why the difference - is it just because these topics are less important than politics?

According to Aumann, we should be able to agree to disagree only if one of the following is the case:

* We have different priors (or different fundamental values)
* One of us is irrational
* We don't trust each other to report facts and beliefs truthfully
* We just don't talk enough.

So, if Aumann is to be believed, in those cases where we do talk enough, and in which we claim to share priors and fundamental values, disagreement is likely to turn nasty.

ETA: HT to Plasmon for pointing out the counter-intuitive fact that disagreement may be less nasty when divergence of fundamental values is acknowledged.
NancyLebovitz (+2, 13y)
I don't think the current state of American politics is a result of structural problems-- it's gotten a lot worse as far as I can tell in the past decade or so. I don't know who started it, or who's done the most to amplify matters, but I think Republicans and Democrats have become a lot more contemptuous of each other.
Oligopsony (+1, 13y)
American politics has gotten steadily more partisan over the last several decades, mostly as a result of desegregation. While the south was under an apartheid regime many Republicans ("Rockefeller Republicans") were to the left of Democrats ("Dixiecrats.") This is no longer the case; every Democratic senator is to the left of every Republican senator - if you have strong politics yourself, the absolute distance looks small, but the lack of mixture is an undeniable fact. The decreased importance of regional party machines plays into this as well. Parties now function much more like coherent policy packages, so legislators have less allies outside of their own party.
Jack (+1, 13y)
Desegregation isn't irrelevant to what has happened to American politics- but this doesn't have anything to do with where senators are on an arbitrary political spectrum. The particular manifestation of the left-right political spectrum you have in mind here is the invention of the post-segregation political climate. Pre-desegregation issues didn't break down into positions corresponding to our current political spectrum.
Oligopsony (+3, 13y)
That's probably a better way of phrasing it. Perhaps I should have said that great majority of variance in political opinion today can be explained with one eigenvector while pre-segregation it would have taken two. Either way, the greater level of ideological coherence is responsible, I think.
Eugine_Nier (0, 12y)
I suspect that's just nostalgia filter.
NancyLebovitz (0, 12y)
Hard to prove -- I'm not nostalgic in general, though. For example, I think food's generally gotten a lot better since the 90s. There's a lecture about political rhetoric which shows that the nastiness level can change over time -- in particular, it goes into detail about shifts in which words got used in political discourse during the Nazi era. I can tell you with certainty that Republicans and Democrats didn't used to have nasty names (Rethuglicans, Libtards) for each other.
Perplexed (0, 13y)
I agree it has gotten worse, though I would trace it back at least to the Bork nomination fight. So, if I want to stick to my AAT-based explanation of the facts, I need to claim either that we have only recently started claiming to have the same fundamental values, or that we are talking more. I believe that there has been a convergence regarding claimed values, over that period, but the situation regarding communication is more complicated. Political activists (and they are exactly the people who have poisonous attitudes about the opposition) probably do communicate more, but they do so over completely distorted channels. Democrats learn about what Republicans are saying from the Daily Show, the Onion, and Pharyngula. Republican learn what Democrats are saying from Rush Limbaugh and Glenn Beck. I suppose the real question is why today's activists seem to think that these channels are sufficient. Perhaps people would always have preferred those kinds of channels, but in the past they just weren't available.
Nornagest (+2, 13y)
Talk radio's been around for a while, and TV pundits only a little less so, so I'd hesitate to blame either one. The political blog scene might be more directly involved; it's highly polarized, has excellent visibility among politically aware individuals, tends to be kind of incestuous, and coincides roughly with the 10-year timeframe we're discussing.
Jack (0, 13y)
I think existing structural problems were dramatically magnified by the modern media environment. The growth of politically involved evangelicalism is also relevant.
Jack (+1, 13y)
Except in American politics all of those are always the case. You just can't agree to disagree when the outcome of the argument influences who gets to be in charge of how much people are taxed, how much people get through social welfare, and who gets thrown in prison. Let's not make the mistake of thinking political discourse is in any way about trying to convince your opponents to change their minds -- it's about trying to convince the small portion of the electorate that hasn't made up its mind that your opponents can't be trusted.

Actually... it's a prisoner's dilemma, and that might explain why the problem is worse in the American system. Cooperating would be communicating and debating honestly to sort out who is right. Defecting means using lies, distortions and nefarious tactics to look better than your opponent. Cooperation would make both parties look better, but either party increases its chances of victory by defecting. And if you think the other party is going to defect you have to defect or else you'll lose. So the dominant strategy leads to both parties defecting, as in the prisoner's dilemma.

But in a multi-party system you (a) have other agents that can punish defectors by not forming coalitions with them and (b) have a means by which the electorate can punish defectors... they have someone else to vote for. So the game here is the prisoner's dilemma with additional agents able and willing to punish defectors. This actually seems like a sound structural analysis most of us could agree on -- perhaps these kinds of institutional questions can provide a rational foothold on political questions.
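The payoff structure this comment describes can be written out explicitly. A minimal sketch in Python; the numbers are invented, and only their ordering (temptation > reward > punishment > sucker) carries the argument:

```python
# One-shot campaign game as described above: "cooperate" = debate honestly,
# "defect" = lies and distortion. Payoffs are illustrative.
payoffs = {  # (my_move, their_move) -> my payoff
    ("cooperate", "cooperate"): 3,  # honest debate: both look good
    ("cooperate", "defect"):    0,  # the honest party loses to the distorter
    ("defect",    "cooperate"): 5,  # distortion beats honesty
    ("defect",    "defect"):    1,  # mutual mud-slinging
}

def best_response(their_move: str) -> str:
    """Whatever the other party does, defecting pays more: defection dominates."""
    return max(("cooperate", "defect"),
               key=lambda my_move: payoffs[(my_move, their_move)])

assert best_response("cooperate") == "defect"
assert best_response("defect") == "defect"  # hence the D/D equilibrium
```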
Perplexed (+1, 13y)
Your Prisoner's Dilemma argument seems appealing - until you realize that electoral politics is an iterated game. The two players ought to be able to achieve an agreement. It is definitely not a zero-sum game. Both parties have a shared interest in keeping the country governable. They have apparently already discovered the virtues of Tit-for-Tat retaliation. Now if only the electorate were to provide a little added payoff to whichever side first makes an effort to be 'nice'. I once attended a business (soft skills) training seminar in which a variant of the Prisoner's Dilemma was played. Two teams played PD against each other. But, within each team, it required a consensus decision (100% vote) to cause the team to cooperate. If any team member votes to defect, then the team as a whole must defect. The relevance to the question of civility between political parties should be obvious.
Jack (+2, 13y)
Only if you model each political party as the same entity over time. But Presidents are term-limited and losing in a general election often means a leadership change for the party. For some individual legislators the relevant time horizon is never more than two years away (and as in your training seminar, it only takes a few bad apples). But this is a game-of-chicken-like incentive. They have incentive to swerve when the cars get too close, like maybe they'll sit together for a speech after one of them is nearly killed in an assassination attempt; but that isn't sufficient for general cooperation. Sure, it would be nice if defecting was counter productive- but the fact that the electorate always falls for the defection is what makes it a prisoner's dilemma. In any case, at this point both parties (though, I'd say the Republicans in particular) have pre-committed to defecting for the foreseeable future. When you use dehumanizing rhetoric to describe the opposition your allies will see compromise as treachery. In this case, you'll face a well-funded primary challenge from your party's ideological extreme. This can be useful if you want to be pre-committed into voting a particular way- but obviously it is extremely dangerous when used in a semi-iterated prisoner's dilemma with certain high risks associated with D/D. Every time I interact with you I think for a minute that you must be from Russia... heh.
Perplexed (0, 13y)
Thx for that insight. I'll try to use it in my continuing struggle to promote discounting of expected future utilities. Oh, I'm even more alien than that. I used to be a Republican!
Jack (0, 13y)
Ha! Though just to be clear since I might have gotten a downvote or two for the grandparent... I don't mean to just be trashing Republicans. I think my claim that they are more pre-committed to defecting for the foreseeable future is justified by an objective consideration of the strength and organization of their class of activists and ideologues versus that of the Democrats. I don't think it is mind-killing bias leading me to the conclusion that the Tea-party has had much greater success recently than the netroots or whatever you want to call the equivalent on the Left. I didn't mean anything evaluative beyond that (I have my opinions but those probably are subject to bias). (For the record I used to be a partisan, Left-wing Democrat. Now I'm vaguely aligned with that party but mostly for cultural and foreign policy reasons. Where I live, your vote doesn't count if you're not a Democrat. Ideologically I'm basically at the liberal-libertarian nexus.)
Nornagest (0, 13y)
That's a really interesting question. The Aumann analysis works well for politics. It works well for some theological questions, too: it's a handy explanation for why schismatic branches of a religion often become mutually antagonistic, for example. It isn't quite a complete description of antagonism when conformity with dogma is a fundamental value, but it's easy to augment Aumann with that. When it comes to cultural disagreements, though -- arts, music, sports teams -- there's a tacit understanding that people's priors are different. Appreciating that sort of thing isn't just about the immediate experience; it can vary depending on who you're trying to impress, and also on immutable products of upbringing and convenience. And people accept this. No one expects a resident of Oregon to be a Green Bay Packers fan, unless the Packers have been having a particularly good year -- and even that comes with a status penalty associated with the expectation of future defection.

There is a second human bias that causes you to cache an unrealistically high summary statistic for how often you change your mind: you think you change your mind, in general, pretty often, but unless you are an expert, highly-practiced rationalist, odds are that you do not. As evidence, try thinking of the last time you changed your mind about something and force yourself to specify what you believed beforehand and what you believed afterward. Me personally, I haven't changed my mind about anything that I can remember since about November 10th, 2010, and I'm sure I've expressed thousands of opinions since then. The odds are long.

It is interesting to hear you say that. I would not go as far as to contradict you but I would be equally unsurprised to find out that I changed my mind more than I thought I did. This too is a human bias that crops up all the time, albeit in different circumstances. People are quite capable of completely changing their beliefs to a new belief that they sincerely believe they had all along.

This is a miscalibration that can go either way depending on which way the ego is pulling at the time.

atucker (+1, 13y)
I think that most people don't really update their far beliefs particularly frequently, but when in near mode will completely contradict their far beliefs when acting. Does that count as changing their mind? Or do you also mean that people just consciously change their mind too?
wedrifid (0, 13y)
I think I mean "unconsciously change their conscious beliefs". As an example I have found myself arguing on a different side of a debate to what I had argued in the past and thought "Oh, look at me. I'm all human and stuff with the changing my mind after the fact."
bentarm (0, 13y)
Excellent point! This is one of those things where, on first reading, I just accepted the OP's assertion without question, but now having had it pointed out to me, I want data! So, if anyone knows, it must be someone at LW. Do people change their minds more often or less often than they think? For what values of "change their minds"?
bgaesop (0, 13y)
I would like to read something more in depth about this. Could you write up a post, or link to an article about it or something?

I found the use of political examples grating, and wish we could enforce the "no politics" guideline more consistently.

wedrifid (+4, 13y)
The most grating part was that they relied on entirely naive assumptions. You don't need to posit a 'don't change your mind' bias on the part of Josh Stieber's peers. Just that none of them were under the misapprehension that they had joined the Salvation Army.

Consistently enforced 'guideline'? Something in there verges on oxymoronic.
[anonymous] (0, 13y)
The soldier's notion that he would not be expected to participate in bloody reprisals and violations of other people's preferences was, historically speaking, hopelessly naive.
steven0461 (0, 13y)
Fair enough; when I edited "rule" to "guideline" I should also have edited "enforce" to "follow".
[anonymous] (0, 13y)
Now that is a sentiment that I can endorse.
shokwave (+3, 13y)
The Iraq example was good and added to the post. I could go either way on the agriculture example. "We could replace you with an unthinking, unquestioning patriot and get the same result" could possibly be "unthinking, unquestioning automaton", but wouldn't cause the same feeling for me in the pit of my stomach, the "I really don't want to produce those results" feeling.
BillyOblivion (+2, 13y)
The Iraq example was awful because it is a very charged issue with people lying DEEPLY on both sides. There are a lot of people (myself included) who have been there, and who have either seen the same thing and gotten different impressions (and hence beliefs) about it, or people who have seen very different things and of course come away with different beliefs.

What Stieber did was an example of someone coming to a conclusion that their actions were wrong (not irrational, as a large part of why he thought they were wrong was that people around him were acting contrary to his beliefs and their stated beliefs, and acting from emotion rather than reason) (as an aside, much of his conversion seems due to his Christian beliefs, which I respect more than most people here seem to) and changing what they were doing because of it at a very high cost. However, it is a bad example because there are very logical reasons why what he did was wrong, and those get in the way of understanding what the author's point is.

It would be like me arguing that I realized my diet where I got most of my calories from starches and sugars was wrong, so I switched to a diet much heavier in meat and fresh vegetables, and that eating things like soy and wheat, because of things like gluten, phyto-estrogens, and phytic acid, are bad for you. Now, it is true that I recognized a problem, did some research, evaluated the evidence and made changes to my diet. This will be ignored in certain circles in favor of the position that EATING MEAT IS WRONG.

It is hard to get past the position (in my mind) that what Stieber did was wrong, and just deal with the point the author is making -- that someone came to a decision and then made a change.

There is also the problem that the author slightly misrepresents the facts presented in the article. The people in Baghdad didn't say "Yankees go home" -- they suggested that they did not want Americans in their part of Baghdad. That is a very different thing. This is ac...
Nornagest (+2, 13y)
This must be weighed against the proportion of the audience in whom such a phrase would inspire exactly the opposite reaction (or, more likely, a stronger but opposite one). Though it's not the phrase itself but the associations the phrase triggers that'd do the damage; few people want to be unthinking adherents of anything, but many have heard phrases like "unthinking and unquestioning" used to describe their political allies. No idea what those proportions would be here, though.
bgaesop (-2, 13y)
I find the no politics guideline a bit odd. I mean, shouldn't a rational humanist arrive at certain political positions? Why not make those explicit?
TheOtherDave (+4, 13y)
I agree that the exercise of converging, based on a consideration of plausible consequences of plausible alternatives, on a set of policy positions that optimally support various clearly articulated sets of values, and doing so with minimal wasted effort and deleterious social side-effects, would be both a valuable exercise in its own right for a community of optimal rationalists, and a compelling demonstration for others of the usefulness of their techniques. I would encourage any such community that happens to exist to go ahead and do that. I would be very surprised if this community were able to do it productively, though.
benelliott (+3, 13y)
I don't think you're right about it being a compelling demonstration of their techniques. People who already agreed precisely with the conclusions drawn might pretend to support them for signalling purposes, and everyone else would be completely alienated.
TheOtherDave (+3, 13y)
That's certainly a possibility, yes. For my own part, I think that if I saw a community come together to discuss some contentious policy question (moral and legal implications of abortion, say, or of war, or of economic policies that reduce disparities in individual wealth, or what-have-you) and conduct an analysis that seemed to me to avoid the pure-signaling pitfalls that such discussions normally succumb to (which admittedly could just be a sign of very sophisticated signaling), and at the end come out with a statement to the effect that the relevant underlying core value differences seem to be the relative weighting of X, Y, and Z; if X>Y then these policies follow, if Y>X these policies, and so on and so forth, I would find that compelling. But I could be wrong about my own reaction... I've never seen it done, after all, I'm just extrapolating. And even if I'm right, I could be utterly idiosyncratic.
handoflixue (+3, 13y)
I used to participate in a forum that was easily 50% trolls by volume and actively encouraged insulting language, and I think I got a more nuanced understanding of politics there than anywhere else in my life. There was a willingness to really delve into minutiae ("So you'd support abortion under X circumstances, but not Y?" "Yes, because of Z!"), which helped. Oddly, though, the active discouragement of civility meant that a normally "heated" debate felt the same as any other conversation there, and it was thus very easy not to feel personally invested in signaling and social standing (and anyone that did try to posture overly much would just be trolled into oblivion...)
jfm (+1, 13y)
I used to participate in such a forum, politicalfleshfeast.com -- it was composed mainly of exiles from DailyKos. Is this perhaps the same forum you're talking about?
[anonymous] (+3, 13y)
Politics is nearly all signalling. Positions that send good signals only occasionally overlap with positions that are rational. Also, the other apes will bash my head in with a rock, so I really need to seem to be right even if I'm wrong. Being right on politics and the other side being wrong is a matter of life and death.
bgaesop (+3, 13y)
Talking about politics may be mostly signaling, but politics itself -- that is, the decisions made and policies enacted -- is something else that is really, really important. If you care about the future of humanity and you have examined the evidence, then you should be concerned about global warming. I don't understand how that statement should be any more controversial than being concerned about the Singularity.
[anonymous] (+4, 13y)
Then I will get back to you as soon as I have meaningful influence over any policies enacted.
bgaesop (-2, 13y)
Good point. One interesting thing you can do is advocate for or attempt to participate in a revolution: the odds may be very low of succeeding, but the payoff of succeeding could be almost arbitrarily large, and so the expected utility of doing so could be tremendous.
Dreaded_Anomaly (+3, 13y)
One would think so, but there seem to be many libertarians here.
[anonymous] (-3, 13y)
Upvoted for self-aware irony.
satt (0, 13y)
Which certain political positions did you have in mind?
bgaesop (-1, 13y)
Well, for example, one should oppose the use of torture. Torture is Bad because it in and of itself reduces someone's utility, and because it is ineffective and even counterproductive as a means of gathering information, and so there isn't a trade off that could counteract the bad effects of torture.
wedrifid (+3, 13y)
The word you are looking for is 'nice', not 'rational'.
scav (-1, 13y)
Hmm. I suspect there's a tiny little bias, possibly politically influenced, whereby signalling that you are nice implies signalling that you are irrational: naive, woolly-minded, immature, not aware of how the world really works, whatever. But it is rational for us to oppose torture because public acceptance of torture is positively correlated with the risk of members of the public being tortured. And who wants that? It is also negatively correlated with careful, dispassionate, and effective investigation of terrorism and other crimes. I also oppose it because I love my neighbour, an ethical heuristic I would also defend, but it's not to the point in this case.
bgaesop (-1, 13y)
That was assumed when I said that the person we're describing is a humanist.
wedrifid (0, 13y)
I suppose then that the site that your conclusion would apply to would be humanistcommunity.org, not lesswrong. ;)
ArisKatsaris (+2, 13y)
If you could convince people that it's ineffective and counterproductive, they wouldn't even need to be rationalists or even humanists in order to oppose it. So your opposition to torture (which I also oppose btw) doesn't seem like a conclusion that a rationalist is much more likely to arrive at than a non-rationalist -- it seems primarily a question of disputed facts, not misapplied logic. There's one point that seems to me a failure of rationalism on the part of pro-torture advocates: they seem much more likely to excuse it away in the case of foreigners being tortured than in the case of their own countrymen. If the potential advantages of torture are so big, shouldn't native crimebosses and crooks also be tortured for information? This to me is evidence that racism/tribal hostility is part of the reason that they tolerate the application of torture to people of other nations. Btw, I find "reduces someone's utility" a very VERY silly way to say "it hurts people".
Vaniver (-1, 13y)
Indeed, as revealed preferences show us that not torturing people reduces many people's utility. It is a stretch to say it hurts them, however.
shokwave (-1, 13y)
It would be trivial for me to construct a hypothetical where torture is unambiguously a good idea. It wouldn't even be hard to make it seem a realistic situation; I might even be able to use a historical example. To call something generally irrational, or to claim that rationality is opposed to a thing, you have to make the argument that in principle it's not possible for this to be either a terminal goal or the only available instrumental goal.
scav (-1, 13y)
I think the original claim was that political opposition to torture was rational, assuming we are talking about the use of torture by the state to investigate crimes or coerce the population, domestic or abroad. That's a less strong claim, and fairly reasonable as long as you allow for the unstated assumptions. A much stronger claim, IMO
bgaesop (-5, 13y)

Nice.

I would add to your list: choose an appropriate community.

If I wanted to stop/start eating animals, I think the single most effective thing I could do would be to start hanging out in a community of vegetarians/omnivores. (Especially if I considered it the moral/prudent thing to do, though it would work about as well either way.)

Similarly, my social circle is at this point largely polyamorous. My own relationship is not, essentially because neither I nor my husband have any particular interest in inviting a third person into it -- we barely manage to...

Nisan (0, 13y)
Yep, I was going to suggest this. What does this mean?
TheOtherDave (+4, 13y)
It was a reference to the original post; the story about Josh Stieber. What I mean is, if I choose a community in order to reinforce a lifestyle, I make it more difficult to extract myself from that lifestyle if I later change my mind. It's a powerful solution, but it's not a flexible one.

Off-topic halachic minutiae:

I remember at one point a religious camp counselor caught me using a glowstick on the Sabbath, and advised me to throw the glowstick away, on the theory that kindling a new light on the Sabbath violated the applicable religious laws.

It sounds to me like your camp counselor was ignorant of the actual halachah, but had some vague idea of how the relevant halachot worked and tried to construct his own rationale for them. A glow stick does not produce significant quantities of heat, so a glowstick is probably at most Rabbinically proh...

I don't think I can even begin to comprehend the kind of bizarre law-fetishism that could lead to this runaway ridiculous situation - where the answer to "can I move this candle" is "it's complicated".

JoshuaZ (+6, 13y)
There are already a lot of comments in this subthread addressing these issues, but I'm going to comment on one other issue that's worth bringing up. There's a common belief among Orthodox Jews that the rule system reflects reality at some level. This is most common among certain chassidic groups, especially the Lubavitch, who believe that doing mitzvot (commandments from God) actively makes the world a better place (less disease, fewer natural disasters, etc.) and that doing bad things has the opposite effect. In the context of that belief, understanding the exact boundaries of the laws is similar to understanding the exact boundaries of the laws of physics. Whether a given mass of enriched uranium will go critical is complicated, a function of the exact shape, the U-235/-238 ratio, the presence and types of trace impurities, and other factors. We don't mind that because we all see the results. To some believers, whether religious Jews or members of other highly legalistic religions such as some branches of Catholicism, this feels very similar. Caring about the minutiae is an example of really acting like there's a dragon in the garage.
TheOtherDave (+3, 13y)
This was very much not the case within the Orthodox tradition I was raised in. Something similar was true for mishpatim (1), I guess -- in the same way that secular communities frequently assume that their preferred policies make the world a better place -- but chukim (1) were presented entirely deontologically. Sure, one could make an argument to the effect that God was omniscient and benevolent, and wanted these rules followed, and therefore it was likely that the effect of following the rules would be beneficial... but mostly nobody did; the more common stance was that obedience to God was the proper terminal value, and God wanted these rules followed, and therefore compliance with the rules was a proper instrumental value. Likely consequences didn't enter into it at all. (1) Jewish tradition divides the commandments derived from the Old Testament, which are by tradition understood as coming directly from God (as distinct from the ones that are understood as coming from later rabbinical bodies), into two classes, chukim and mishpatim. Roughly speaking, mishpatim have a reason given and chukim don't.
TheOtherDave (+5, 13y)
(shrug) I think what I said here applies. Try explaining the social rules governing successfully navigating your own community to a complete outsider -- or even better, someone with Asperger's -- and you may find it easier to comprehend how one gets into such a ridiculous situation.
Costanza (+5, 13y)
I'm a gentile atheist, and I find that halachic debate and reasoning totally appeal to the detail-oriented fanboy in me. Ultimately, it was a barren intellectual exercise with respect to the real world, but a hugely challenging intellectual exercise -- a game -- nonetheless. Maybe one of the great games of history. Who can tell how many brilliant minds wasted their lives building this enormously refined system of law, based on the myths of one of many, many barbaric tribes? But with that said, in the modern day, many of the best jurists and legal scholars today are Jews who owe some debt to this cultural inheritance.
shokwave (+5, 13y)
This says some good things about the cultural laws, but it also says some bad things about our legal systems.
Costanza (+2, 13y)
I think this must apply to every legal system which has governed humans so far. If laws are to be made known to everyone, and generally comprehensible, then they can't be too complicated. As it is, they tend to be plenty complicated. Even so, great numbers of people in aggregate are still far, far more complicated than any human system of laws. They will do things unanticipated by the lawmakers, and not exactly covered by the words of the lawmakers. Then, a court of law may be required to decide whether or how an inherently ambiguous law applies to an unanticipated fact pattern.
5shokwave13y
I agree, and it's factually true; my concern was that if training in Halachic law is good practice for common law, then our legal systems suffer too much from complication. I think the Halachic system is bad, and to the extent that our legal system resembles it enough to measurably advantage Halachic scholars, our legal system is bad too. There was a move at one point to write laws in Python or some other programming language; I would then argue that if thinking like a programmer made you a better jurist or legal scholar, that would say good things about both systems.
3Costanza13y
I am seriously interested in more information about this approach. I think that right now there are two major modern systems of law: Roman-derived ("civil") law and English-derived ("common") law. Sharia law might count as a close runner-up. I think Halacha is well-developed but not widely enforced, so I would not count it as a major modern legal system. With that said, and admitting I don't know much about civil law or the religious laws, my impression is that all of the above are similarly complicated, and have been for centuries. I doubt that human behavior and its ambiguities could be simplified by being encoded in Python. I think it's a really, really hard problem, at least as long as humans remain as unpredictable as they are.
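As a concrete sketch of what statute-as-code might look like -- a toy illustration only, using the classic "no vehicles in the park" hypothetical rather than anything from the actual proposal mentioned above -- note that the Boolean logic is trivially precise, while the ambiguity simply migrates into the input predicates:

    # Hypothetical statute rendered as executable code: "No vehicles in the
    # park," with an emergency exception. The rule itself is exact; deciding
    # whether a bicycle or an ambulance counts as a "vehicle" is where the
    # ambiguity (and hence the litigation) comes back in.
    from dataclasses import dataclass

    @dataclass
    class Situation:
        is_vehicle: bool      # who decides whether a skateboard counts?
        in_park: bool
        is_emergency: bool

    def violates_statute(s: Situation) -> bool:
        return s.is_vehicle and s.in_park and not s.is_emergency

    print(violates_statute(Situation(True, True, False)))  # True
    print(violates_statute(Situation(True, True, True)))   # False (exception)

In other words, encoding the rule in Python sharpens its structure but does nothing, by itself, about the unanticipated fact patterns discussed above.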
5Will_Sawin13y
Off-topic: Why does everyone on lesswrong say Python when they need to mention a programming language?

Rule 46b: I will not turn my programming language into a snake. It never helps.

3shokwave13y
It has a very high ease of learning to usefulness ratio? edit: It seems to come highly recommended as a first programming language (certainly it was such to me).
2scav13y
Do you mean a high usefulness to difficulty of learning ratio? Atari BASIC had a nearly infinite ease of learning to usefulness ratio. :)
1shokwave13y
Right.
2Normal_Anomaly13y
Python is my first (and currently only) programming language. It's easy to read, easy to learn, and useful.
2Risto_Saarelma13y
Python code is also reasonably easy to read. It's sometimes called executable pseudocode.
1timtyler13y
I did a Google duel - and it appears that "Java" beats "Python" for mentions around here.
0[anonymous]13y
I don't get it either; I'm more of a C guy.
3bogus13y
David Friedman has taught a course in "Legal Systems Very Different From Ours" in both 2008 and 2010. See these course pages: [1] [2]
1ShardPhoenix13y
I think the Python thing was just for the payoff functions of securities, not for laws as such.
1shokwave13y
That is disappointing. Lawmakers who think like programmers seem like they would be a huge improvement on the current system.
6nerzhin13y
Lawmakers who think like programmers might be an improvement. But I'm not sure. On Less Wrong, this almost reads as "if only lawmakers were more like me, things would be okay." I'm skeptical.
3Sniffnoy13y
It would probably have to be coupled, though, with a state where laws are actually enforced consistently, and can be changed quickly if they end up screwing things up massively.
2[anonymous]13y
They may have wasted their minds on it, but the better they were at wasting their minds, the higher their status was, the likelier they were to marry a girl from another respected or wealthy family, and consequently the more they got to reproduce. Were their minds truly wasted? Or did this, by happy accident -- a hack of our out-of-date reward systems -- manage to produce more brilliant, if deluded and blinded, minds? History has also since shown that the minds aren't irreversibly deluded. I can't help but wonder if we would have had quite as many wonderful minds like Bohr, Einstein, Hertz, or Nobel Prize winners like Richard Phillips Feynman or Isidor Isaac Rabi (!), if those minds in the late Middle Ages or early modern period weren't wasted.
3JoshuaZ13y
Possibly, but at the same time, a lot of those people in the Middle Ages were still wasting time, and people are still doing so today. There's no question, for example, that Maimonides was brilliant. He was impressive for his accomplishments in philosophy, medicine, and even in other areas that he only dabbled in (such as math). That he spent most of his time on halachah certainly held back society. And he's not the only example. Similar remarks would apply to many of the great Rabbis in history and even some of the modern ones.
2TheOtherDave13y
I'd be interested in seeing how you draw the line between Maimonides' work in halachah and in philosophy. I can certainly identify outputs that I would classify as one or the other, but I would have a very hard time drawing a sharp line between the processes.
0JoshuaZ13y
I agree that there isn't a sharp line. But if we just look at the material that falls unambiguously into halachah, as opposed to all the material that falls into philosophy or the borderline, there's a lot more halachic material.
2TheOtherDave13y
Sure. Again, classifying the outputs isn't too hard. Philosophical and halachic writing are different genres, and it's relatively easy to class writing by genre. Sure, there's a fuzzy middle ground, but I agree that that's a minor concern. But your argument seems to depend on the idea that if he spent a year thinking about stuff and at the end of that year wrote five thousand words we would class as halachah and five hundred words we would class as philosophy, that means he wasted that year, whereas if it had been the other way around, that would advance society. Before endorsing such an argument, I'd want to know more about what was actually going on in that year. I could easily see it going either way, simply because there isn't a clear correlation in this context between how useful his thinking was vs. what genre he published the results in.
0JoshuaZ13y
Regarding fanboyism, there's certainly some similarity there: among the more self-conscious Orthodox Jews there's a feeling that they understand it as an intellectual game. And for what it is worth, when I've written blog entries about things like the halachot of making a horcrux, or the kashrut status of a Star Trek replicator, most Orthodox readers have been interested and generally not offended.
0Alicorn13y
I want a link to that.
0JoshuaZ13y
Discussed in this entry.
0[anonymous]13y
I think Lawyers are like Warriors in this regard.
4[anonymous]13y
What you call law fetishism I call ensuring high fidelity in meme transmission while at the same time trying to make irrational but adaptive memes easier to rationalise. The whole collection of memes that led to this "ridiculous situation" was fitness-enhancing, even from a genetic perspective, but the people who transmitted it didn't know why it worked (note: we probably still don't appreciate how culture modifies behaviours and expectations in unknown ways). It was improved mostly via groups that were unfortunate enough to stumble on a variation that didn't work dying off or being out-shined. The Shakers are an example of a Christian group that stumbled upon a variation (or mutation) of Christianity that unfortunately doesn't work. However, over time people found it harder and harder to execute all these seemingly random instructions (junk memes build up over time), and that is where scholarship comes in. Sometimes you can streamline the rules, at the price of adding a few more junk memes; other times you can create convoluted rationalizations that most never study but that help those in positions of authority be more or less sincere in promoting the rules. Abrahamic faiths are basically elaborate collections of scripts that, when executed in the "ancestral" environment, helped the entire list of memes come down to the present. As I previously stated, they often even helped general genetic fitness (just look at the increase in the number of Jews in Eastern Europe from the 16th century onwards, or the increase in Amish numbers since the early 20th century). They could employ a much longer list of behaviours than the limited taboos and rituals of older traditions because of writing, and because of the boon that is the idea of an omnipotent, omniscient agent, which basically acts as a universal solvent for the feeling of cognitive dissonance. Religious scholars and leaders were, in times when they were greatly respected, basically social engineers who tried to tweak the DNA of their society.
3Costanza13y
I think there's a principle at work here. I suspect that this has been expressed more formally. But laws proposed to govern human behavior -- which is complicated -- can only anticipate a portion of that behavior. Lawmakers may enact the most rigid, black-and-white, unambiguous law you could imagine, but it must be expressed in words more ambiguous than the words used in mathematics. There will be a grey area, and human action will find that grey area. It will be complicated on the fringes. This applies to any system of law by which humans are to govern themselves, Halacha just as much as the United States Code of Federal Regulations.
2shokwave13y
That sounds related to Goodhart's Law. It could reasonably be called "Costanza's Corollary to Goodhart's Law".
0[anonymous]13y
I see Wikipedia says Goodhart's Law may mean: "that once a social or economic indicator or other surrogate measure is made a target for the purpose of conducting social or economic policy, then it will lose the information content that would qualify it to play such a role." I tentatively think prescriptive laws do not correspond to measures, whether surrogate or direct. Right now, I think surrogate measures are like maps, which may or may not match the territory. On the other hand, I think laws are not like maps. Rather, they are like plans, especially like plans made for exploring territory that has not yet been mapped. Every so often, explorers must revisit their plans in the face of reality. My metaphor may be breaking down a bit here, but imagine law as a single set of instructions issued by the King to innumerable groups of hopeful colonists setting out to explore a new world. The instructions may suffice for many or most, but some will have to make some creative interpretation.
7komponisto13y
Out of curiosity, do people who grow up under this sort of regime end up thinking it's normal, similarly to the way people raised in Christianity end up desensitized to the absurd-sounding nature of the beliefs about virgin birth and so on? Does it cause them to e.g. be more accepting of government regulation than average? Or is there some kind of compartmentalization going on where they continue expecting rules in general to make some sort of sense (and not interfere with practical functioning), just not those labeled "religious"? My suspicion, of course, is the latter (just as people compartmentalize their epistemic beliefs, and allow their absurdity heuristic to function more-or-less normally outside of the religious domain), but I'd be curious to hear reflections from those who were raised in strict legalistic religions about the extent to which such practices actually struck them as absurd inside their own minds (even allowing for belief in the empirical claims of the religion about the nature of the universe).

I can't speak for anyone else, but I was raised an Orthodox Jew and I basically took to treating it as "normal" in the same sense that any set of arbitrary social rules is "normal." It was no weirder than the rules governing, say, when it was OK to wear a T-shirt and sneakers vs. when it wasn't, or when it was OK to eat the last piece of cake, or whatever.

And I still basically think that. It's not that there's some default state where there aren't any arbitrary rules to follow, against which I can compare the rules of Orthodox Judaism. There are just different cultures, each with its own set of rules.

I suspect that, again as with any set of social norms, the key distinction is between people who are raised with only one such set of norms, compared to people who are raised having to navigate among several. The former group can treat their culture's rules as invisible and default and "common sensical"; the latter group can't get away with that so easily.

Anyone interested in pointing Less Wrong out to Josh Steiber, from the linked Slate article? I'll contact the author.

This paragraph:

What is your reference class for predicting your own behavior?

and this one:

I think it probably does help, though, to be a bit of a drama queen.

crossed the line from good to awesome for me. Thanks for the post!

There is a third human bias that causes you to tell yourself that you have successfully changed your mind when you have not really done so. The adherent of the Reformed Church of Dragon leaves the garage door open, and cheerfully admits to anyone who asks that there is probably no such thing as an invisible dragon, yet she is unaccountably cautious about actually parking her car in the garage. Thus it is worth knowing not just how to change your mind, but how to change your habits in response to new information.

Related: The Mystery of the Haunted Rationalist.

What, if anything, would convince you to stop (or start) eating animals? Not merely to admit, verbally, that it is an acceptable thing for others to do, or even the moral or prudent thing for you to do, but to actually start trying to do it?

In my case: adequate alternatives. I tried to become a vegetarian once before I succeeded. However, this was before the day I spontaneously woke up one morning with a taste for vegetables (it happened, it was weird), so I ate grilled cheese every day for a few days and then gave up. Later, when I a) liked vegetable... (read more)

If you gauge the dosage correctly, the propaganda might nudge your opinion just enough to make you actually adopt the new action that you felt would adequately reflect your new beliefs, but not enough to drive you over the cliff into madness.

This sounds difficult enough to do reliably that I have to question whether it's actually a good tactic.

One thing I think may be helpful, and that I've noticed some people here seem to practice: if someone says something to you which makes you think about revising your opinion, tell them so. You'll have forced yourself... (read more)

One of the best articles here lately. The first two pieces of advice are very good, even if probably not new, but you have formulated the point very persuasively. I would also not worry about the political example: in spite of the mind-killing abilities of politics, the way you have stated your examples is unlikely to incite a flame war in this community (if it does, I will be afraid that our level of rationality is not much higher than that of the average folk, despite our aspirations).

I have a little problem with the third piece of advice, though. I suspect it would not w... (read more)

-3Will_Sawin13y
Is having correct political beliefs important to you? Because it seems like you have a serious deficiency there that, since you are aware of it, you may be able to correct. For instance, exposing yourself to lots of high-quality arguments from both sides might help. But we have no theory of correct political beliefs, so you might be kind of helpless here.
1prase13y
They used to be more important than they are now. I didn't intend to imply that I have avoided good arguments in favour of poor propaganda. I think I have heard most of the good arguments too (and the stupidity of the poor arguments is more apparent when compared to the good ones). I have only described the effect which propaganda has on me. This effect is irrational, since it activates the stupidity-reversing reflex, so I try to avoid it; I wanted to point out that using propaganda as a mind hack may work differently for different people.
0nazgulnarsil13y
I'd start with coherence if you're looking for correct beliefs of any sort. centrism certainly doesn't meet this guideline.
2prase13y
It may certainly depend on what exactly you mean by centrism, but can you be more explicit in your statement about its lack of coherence? Also, I am more likely to form my political beliefs based on my actual values than on elegant philosophical principles. Human values are complicated and unlikely to be expressible in a succinct, coherent belief system.
0nazgulnarsil13y
is the centrist position today the same as the centrist position of 10 years ago? what about 100? what about the centrist position in germany in 1942? taking the average of two wrong positions is unlikely to produce a correct one. and you admit that your values are incoherent so readily? that is unusual but highly beneficial as a starting point.
0prase13y
I have written that I am now closer to centrist (as the word is defined now and in my country) views than I was a few years ago, when I was sympathetic to somewhat more radical leftist (once again, as defined now and in my country) opinions. I did not include the clarifications in the parentheses because I found that interpretation obvious. Since your replies imply that my words can be interpreted differently from what I meant, I should perhaps have been more clear. So, I do not say that I average the extreme positions, that I am close to the centrist position just because it appears to lie in the centre, or that I will shift my opinions when the centre moves. That my values are incoherent or inconsistent doesn't mean that they are so in an obvious manner. Values are complicated, not all conflicts are easy to see, and even after being seen they are not easy to resolve. Think about the trolley problem, for example. Edit: just to be more clear, I have to add that the (approximately) centrist position I hold means sharing some opinions which are more common on the right and others which are prevalent on the left, not being close to average on each opinion separately.
0Will_Sawin13y
No, he said that they're probably either incoherent or not succinct. There should exist coherent positions that are roughly in the center of the two parties/ideologies. One can argue that libertarianism is one, for instance.
1nazgulnarsil13y
directional metaphors may fail if looked at with reasonable rigor.
2prase13y
Any statement may appear incoherent if looked at with unreasonable rigor.
[-][anonymous]13y20

The same troops in the same town confronted with the same evidence that their presence was unwelcome all continued to blame and kill the locals.

Generally, when occupying another country or supporting its government with your troops, you only care what the frak the locals think in a very limited, well-defined, and, may I say, small sense.

If you have decided that, all things considered, you want your troops in a location, that decision generally already takes into account the locals not wanting you there. The most one can say to local opposition is "noted".

The theory is t

... (read more)
-1katydee13y
This post basically boils down to "MOST PEOPLE ARE STUPID AND EASILY TRICKED, BUT I'M NOT." Probably true, but do you have to be so overt about it?
-2[anonymous]13y
Upvoted for bringing my attention to this. I didn't feel so, but reading my response to the second comment I see how one can get that impression. I've edited that bit of the text while trying to keep its original meaning. Does it come off any better? Part of the reason I perhaps came off the wrong way might have been that I was mistaken in thinking that not many people are genuinely fooled by the rationale, and that most are aware of an ulterior motive that makes it awfully convenient to "help" that particular group of people, unless they also seek to ensure universal adherence to their values. In which case, I also thought it was obvious to most that when they cheer for "spreading democracy" or things like that, what is happening on a basic level is the satisfaction of the urge to convert the infidels, not a rational judgement based on an unbiased consideration of what is best for them. If it was the first part of my post that bothered you, perhaps I should emphasise that I don't object to not caring what the locals think to an extent; I just object to not being honest with oneself about it. I also implicitly stated ("small") that, by the standards we apply to some other situations concerning government and violence, an occupying force cares a little bit less than one might first assume.
-2katydee13y
The new version does indeed seem better, though the second part of the post seems less clear and perhaps overly general now -- I'm extremely confident that violence is applied in at least some cases primarily to help others.
-3zyxwvutsr13y
"...a occupying force cares a little bit less than one might first assume" I don't mean to be overly critical of your imprecise language, but in this context I think it is important to note that a "force" does not care at all. More to the point, a military force comprises individuals who hold a whole range of opinions and who may act in ways that are contrary to those opinions.

rationalism

This prompted me to post this article, where I write:

I feel that the term "rationalism", as opposed to "rationality", or "study of rationality", has undesirable connotations.

(Discuss there, not here.)

Off-topic: Meatless (and pattyless) sandwiches are surprisingly good if you load them up with most of the vegetables. I go to Subway a few times a month but haven't had a meat sub there in years.

I think the examples used here are absolutely terrible, and I think they indicate a fundamental flaw underlying this theory. Basically, what you call "irrational" in this context, I'd call "rational but dishonest about its motives."

The purpose of having US troops in an area is not to make the locals happier. I don't see much of a reason military leadership should care about local opinion except insofar as it advances their actual objectives. This is true in both the sense that a mugger shouldn't care about his target's feelings, and a p... (read more)

Your "sunk costs" link is broken. You maybe want to link here, here, or if you're feeling evil, here.

1apophenia13y
It cost me three willpower points or so not to click the third.

On the symbolic action point, you can try making the symbolic action into a public commitment. Research suggests this will increase the strength of the effect you're talking about. Of course, this could also make you overcommit, so this strategy should be used carefully.

"Out of curiosity, do people who grow up under this sort of regime end up thinking it's normal, similarly to the way people raised in Christianity end up desensitized to the absurd-sounding nature of the beliefs about virgin birth and so on? Does it cause them to e.g. be more accepting of government regulation than average?"

Why not look at relatively more secular Western Europe versus the relatively more religious US and see which population is more accepting of government regulation? That is to say, either you have it precisely backwards or there is no observable correlation.

2wedrifid13y
That does not actually follow.
0zyxwvutsr13y
Why not?
2shokwave13y
There are three possibilities: Mass_Driver has the causal flow right, Mass_Driver has the causal flow wrong, or there is no causal flow. Pointing out the existence of the last two doesn't mean those are the only options. It is still entirely possible, after your comment, that Mass_Driver has it the right way around. Therefore, it doesn't follow.
[-][anonymous]13y00

3) Over-correct your opinion by reading propaganda

Your mileage may vary; please use this tactic carefully.

I already have problems that could easily be made worse by this despite your warning. I have a hunch many on LW do.

[-][anonymous]13y00
  1. There's significant ambiguity about what counts as "changing" a belief. If you look at belief in the only way that's rational -- that is, as coming in degrees -- then you "change" your belief whenever you alter your subjective probability. Your examples suggest that you're defining belief change as binary. I think people's subjective probabilities change all the time, but you rarely see a complete flip-flop, for good reason: significant beliefs often rest on vast evidence, which one new piece of evidence, no matter how striking, won't be apt to r

... (read more)
0shokwave13y
If you update on the evidence from 0.3 to 0.5, and then later evidence shows you that you still act as if you believe the probability is 0.3, then you should consider irrationally over-correcting your belief. Of course, you risk over-updating to 0.7 or 0.9, but that is a question of expected utility, not a point against the entire concept. It may be possible to be more accurate with over-correction than by simply pushing in the right direction; inundate yourself with "0.5" propaganda or something.
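A toy numeric sketch of this idea (my own illustration, under an assumed model, not anything proposed in the thread): suppose the belief you actually act on only moves some fraction of the way toward whatever update you verbally endorse. Under that assumption you can solve for the over-corrected target that lands your behavior at the value the evidence supports:

    # Toy model (an assumption for illustration): the belief you act on moves
    # only a fraction `damping` of the way toward any update you endorse.
    def acted_on(old, stated_target, damping=0.5):
        # The probability you actually act on after updating toward stated_target.
        return old + damping * (stated_target - old)

    def overcorrection_target(old, desired, damping=0.5):
        # The stated target to aim for so the acted-on belief lands at `desired`.
        return old + (desired - old) / damping

    old, desired = 0.3, 0.5
    target = overcorrection_target(old, desired)
    print(round(target, 2), round(acted_on(old, target), 2))  # -> 0.7 0.5

The catch, of course, is that the damping factor is unknown in practice; misjudge it and the same tactic overshoots, which is exactly the expected-utility question raised above.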

1) Specify a quitting point in advance.

Along this same line I try and always keep my beliefs and actions under the banner of a more general ideal or goal. For instance, if I wanted to help decrease existential risk and decided that the best way was to move to San Francisco to be closer to SIAI, then instead of simply caching the goal 'Move to SF' in my mind, I would try and cache 'Reduce existential risk by moving to SF'.

This takes extra memory, but it serves to remind you to question the validity of your subgoals in the context of your supergoals. I al... (read more)

Over-correct your opinion by reading propaganda

You could also try creating your own propaganda (also useful against akrasia). You should have a good idea of the types of things that motivate you, so you can use that knowledge to make very focused adverts (e.g. basic posters) for yourself.

There's more on this kind of thing, advertising to yourself, over at http://www.takebackyourbrain.com/ - but it looks like it hasn't been updated in a while.

4) Admit that you're wrong to other people, whether publicly or to close friends who are in a position to catch you not having updated your behavior. This adds social pressure to continue the change, and more people to notice when you mess up. (This could go under one or two, though.)

[-][anonymous]13y00

the Reformed Church of Dragon

Hilarious. I nearly choked on my sandwich.

I don't have much of substance to add, but I want to say: this is an excellent post, and I think it deserves front page status.

1AnnaSalamon13y
Which details did you find excellent or helpful? What might you do differently, now that you've read the post? In terms of adding substance, it generally helps my own reading to know pieces of content others are stealing, since often those help me, too. (Though there's nothing wrong with just saying what you said.)
0DSimon13y
I enjoyed the specific examples; I was a little wary after the Watch Out Politics Ahead disclaimer, but the actual examples chosen were presented in a way that made their applicability to the topic obvious, and reduced their potential impact as mind-killers. (However, take this evaluation with a grain of salt: most of the examples reflected my current political positions, so they might not seem as ideal to another reader.) I also liked that the proposed solutions took human biases into account, with suggestions that go beyond just identifying a common error. The first solution puts forward a specific suggestion for working around the given bias. The others propose doing some bias jujitsu, and putting the akratic parts of our minds to work for us. This can be easy to overdo, but despite that I think it's a very useful technique, especially for newcomers who may not be as used to trying to pick apart their own thought processes. On that note, I really want to see this article on the front page because I think the topic overall would be of particular interest to newcomers. It requires no prerequisites (though it also links liberally to related sequence and non-sequence posts) or unusual terminology, and provides concrete near-mode problems and solutions.