Related: Cached Thoughts
Last summer I was talking to my sister about something. I don't remember the details, but I invoked the concept of "truth", or "reality" or some such. She immediately spit out a cached reply along the lines of "But how can you really say what's true?".
Of course I'd learned some great replies to that sort of question right here on LW, so I did my best to sort her out, but everything I said invoked more confused slogans and cached thoughts. I realized the battle was lost. Worse, I realized she'd stopped thinking. Later, I realized I'd stopped thinking too.
I went away and formulated the concept of a "Philosophical Landmine".
I used to occasionally remark that if you care about what happens, you should think about what will happen as a result of possible actions. This is basically a slam dunk in everyday practical rationality, except that I would sometimes describe it as "consequentialism".
The predictable consequence of this sort of statement is that someone starts going off about hospitals and terrorists and organs and moral philosophy and consent and rights and so on. This may be controversial, but I would say that causing this tangent constitutes a failure to communicate the point. Instead of prompting someone to think, I invoked some irrelevant philosophical cruft. The discussion is now about Consequentialism, the Capitalized Moral Theory, instead of the simple idea of thinking through consequences as an everyday heuristic.
It's not even that my statement relied on a misused term or something; it's that an unimportant choice of terminology dragged the whole conversation in an irrelevant and useless direction.
That is, "consequentialism" was a Philosophical Landmine.
In the course of normal conversation, you passed through an ordinary spot that happened to conceal the dangerous leftovers of past memetic wars. As a result, an intelligent and reasonable human was reduced to a mindless zombie chanting prerecorded slogans. If you're lucky, that's all. If not, you start chanting counter-slogans and the whole thing goes supercritical.
It's usually not so bad, and no one is literally "chanting slogans". There may even be some original phrasings involved. But the conversation has been derailed.
So how do these "philosophical landmine" things work?
It looks like when a lot has been said on a confusing topic, usually something in philosophy, there is a large complex of slogans and counter-slogans installed as cached thoughts around it. Certain words or concepts will trigger these cached thoughts, and any attempt to mitigate the damage will trigger more of them. Of course they will also trigger cached thoughts in other people, which in turn... The result is that the conversation rapidly diverges from the original point to some useless yet heavily discussed attractor.
Notice that whether a particular concept will cause trouble depends on the person as well as the concept. Notice further that this implies that the probability of hitting a landmine scales with the number of people involved and the topic-breadth of the conversation.
Anyone who hangs out on 4chan can confirm that this is the approximate shape of most thread derailments.
Most concepts in philosophy and metaphysics are landmines for many people. The phenomenon also occurs in politics and other tribal/ideological disputes. The ones I'm particularly interested in are the ones in philosophy, but it might be useful to divorce the concept of "conceptual landmines" from philosophy in particular.
Here are some common ones in philosophy:
Landmines in a topic make it really hard to discuss ideas or do work in these fields, because chances are, someone is going to step on one, and then there will be a big noisy mess that interferes with the rather delicate business of thinking carefully about confusing ideas.
My purpose in bringing this up is mostly to precipitate some terminology and a concept around this phenomenon, so that we can talk about it and refer to it. It is important for concepts to have verbal handles, you see.
That said, I'll finish with a few words about what we can do about it. There are two major forks of the anti-landmine strategy: avoidance, and damage control.
Avoiding landmines is your job. If it is a predictable consequence that something you could say will put people in mindless slogan-playback-mode, don't say it. If something you say makes people go off on a spiral of bad philosophy, don't get annoyed with them, just fix what you say. This is just being a communications consequentialist. Figure out which concepts are landmines for which people, and step around them, or use alternate terminology with fewer problematic connotations.
If it happens, which it does, as far as I can tell, my only effective damage control strategy is to abort the conversation. I'll probably think that I can take on those stupid ideas here and now, but that's just the landmine trying to go supercritical. Just say no. Of course letting on that you think you've stepped on a landmine is probably incredibly rude; keep it to yourself. Subtly change the subject or rephrase your original point without the problematic concepts or something.
A third prong could be playing "philosophical bomb squad", which means permanently defusing landmines by supplying satisfactory nonconfusing explanations of things without causing too many explosions in the process. Needless to say, this is quite hard. I think we do a pretty good job of it here at LW, but for topics and people not yet defused, avoid and abort.
ADDENDUM: Since I didn't make it very obvious, it's worth noting that this happens with rationalists, too, even on this very forum. It is your responsibility not to contain landmines as well as not to step on them. But you're already trying to do that, so I don't emphasize it as much as not stepping on them.
I love the landmine metaphor - it blows up in your face and it's left over from some ancient war.
This is insightful. I also think we should emphasize that it is not just other people, or silly theists and epistemic relativists who don't read Less Wrong, who can get exploded by Philosophical Landmines. These things are epistemically neutral, and the best philosophy in the world can still become slogans if it gets discussed too much. E.g.
Now I wasn't there and I don't know you. But it seems at least plausible that that is exactly what your sister felt she was doing. That this is what having your philosophical limbs blown off feels like from the inside.
I think I see this phenomenon most with activist-atheists who show up everywhere prepared to categorize any argument a theist might make and then give a stock response to it. It's related to arguments as soldiers. In addition to avoiding and disarming landmines, I think there is a lot to be said for trying to develop an immunity, so that even if other people start tossing out slogans, you don't. I propose that it is good policy to provisionally accept your opponent's claims and then let your own arguments do th... (read more)
"Hmm ... what do you mean by 'randomly'?"
YES. The most effective general tactic in religious debates is to find out what exactly the other guy is trying to say, and direct your questions at revealing what you suspect are weak points in the argument. Most of this stuff has a tendency to collapse on its own if you poke it hard enough -- and nobody will be able to accuse you of making strawman arguments, or not listening.
Of course that goes for all debates, not just religious ones.
An extremely low probability of the observation under some theory is not itself evidence. It's extremely unlikely that I would randomly come up with the number 0.0135814709894468, and yet I did.
It's only interesting if there is some other possibility that assigns a different probability to that outcome.
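That point can be put in Bayes' odds form. Here's a minimal sketch (the function name `posterior_odds` is mine, just for illustration): evidence is the likelihood *ratio* between hypotheses, so a tiny probability under one theory moves nothing unless some rival theory assigns a different probability.

```python
def posterior_odds(prior_odds, p_obs_given_h1, p_obs_given_h2):
    """Bayes' rule in odds form: posterior odds = prior odds * likelihood ratio."""
    return prior_odds * (p_obs_given_h1 / p_obs_given_h2)

# Both hypotheses assign the same tiny probability to the observation,
# so the observation is "surprising" but moves the odds not at all.
print(posterior_odds(1.0, 1e-16, 1e-16))  # -> 1.0

# Only a rival hypothesis that makes the observation more likely
# (here, by a factor of 100) shifts the odds, and by exactly that factor.
print(posterior_odds(1.0, 1e-14, 1e-16))  # roughly 100
```

So "I'd never have picked that number by chance" is inert on its own; it only becomes evidence once you name the alternative that predicts it better.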
Probable, given the background information at the time.
Before Darwin, remember that the only known powerful optimization processes were sentient beings. Clearly, Nature is the result of an optimization process (it's not random by a long shot), and a very very powerful one. It's only natural to think it is sentient as well.
That is, until Darwin showed how a mindless process, very simple at its core, can do the work.
Have you ever come up with what appeared (to you) to be a knock-down argument for your position; but only after it was too late to use it in whatever discussion prompted the thought? Have you ever carried that argument around in the back of your skull, just waiting for the topic to come up again so you can deploy your secret weapon?
I have. Hypothesis: At least some of the landmines you describe stem from engaging someone on a subject where they still have such arguments on their mental stack, waiting not for confirmation or refutation but just for an opportunity to use them.
The French, unsurprisingly, have an idiom for this.
The English have a word for that.
I was aware of that and was waiting for an excuse to use the idea.
If I ever had this tendency it was knocked out of me by studying mathematics. I recommend this as a way of vastly increasing your standard for what constitutes an irrefutable argument.
"Well, that's a pretty controversial topic and I'd rather our conversation not get derailed."
Sometimes you can tell people the truth if you're respectful and understated.
I find it annoying when people say that sort of thing. I want to respond (though don't, because it's usually useless) along the lines of:
"Yes, it is a controversial topic. It's also directly relevant to the thing we are discussing. If we don't discuss this controversial thing, or at least figure out where we each stand on it, then our conversation can go no further; discussing the controversial thing is not a derailment, it's a necessary precondition for continuing to have the conversation that we've been having.
... and you probably knew that. What you probably want isn't to avoid discussing the controversial thing, but rather you want for us both to just take your position on the controversial thing as given, and proceed with our conversation from there. Well, that's not very respectful to someone who might disagree with you."
My ideal approach (not that I always do so) is more to stop taking sides and talk about the pluses and minuses of each side. My position on a lot of subjects does boil down to "it's really complicated, but here are some interesting things that can be said on that topic". I don't remember having problems being bombarded with slogans (well - except with a creationist once) since usually my answer is on the lines of "eh, maybe". (this applies particularly to consequentialism and deontology, morality, rationality, and QM).
I also tend to put the emphasis more on figuring out whether there is actually a substantial disagreement, or just different use of terms (for things like "truth" or "reality").
I use a similar approach, and it usually works. Make clear that you don't think you hold the truth on the subject, whatever that means, but only that the information in your possession has led you to the conclusion you're presenting. Show curiosity about the position the other side is expressing, even if it's cached, and even if you've heard similar versions of it a hundred times before. Try to draw them out of cached mode by asking questions and forcing them to think for themselves outside of their preconstructed box. Most of all, be polite, and point out that it's fine to have some disagreement, since the issue is complicated, and that you're really interested in sorting it out.
Right. I suppose refusing to chant slogans is a good way to avoid the supercritical slogan ping-pong.
I lean towards abort mostly out of learned helplessness, and because if I think about it soberly enough to strategize, I see little value in any strategy that involves sorting out someone's confusion instead of disengaging. Just my personal approach, of course.
Thanks for bringing up alternate damage control procedures; we'll need as many as we can get.
Taking the outside view, what distinguishes your approach from hers?
nyan's cached thoughts are better? I mean, it would be nice if we could stop relying on cached thoughts entirely, but there's just not enough time in the day. A more practical solution is to tend your garden of cached thoughts as best you can. The problem is not just that she has cached thoughts but that her garden is full of weeds she hasn't noticed.
"Outside view" refers to some threshold of reliability in the details that you keep in a description of a situation. If you throw out all relevant detail, "outside view" won't be able to tell you anything. If you keep too much detail, "outside view" won't be different from the inside view (i.e. normal evaluation of the situation that doesn't invoke this tool). Thus, the decision about which details to keep is important and often non-trivial, in which case simply appealing to something not being "outside view" is not helpful.
I agree with all that: my point was just that the question you were replying to asked about the outside view (which in this context I took to mean excluding the fact that we think our cluster of ideas is better than Nyan's sister's cluster of ideas). I'm just saying: rationalists can get exploded by philosophical landmines too and it seems worthwhile to be able to avoid that when we want to even though our philosophical limbs are less wrong than most people's.
Or to put it another way: philosophical landmines seem like a problem for self-skepticism because they keep you from hearing and responding adequately to the concerns of others. So any account of philosophical landmines ought to be neutral on the epistemic content of sloganeering since assuming we're on the right side of the argument is really bad for self-skepticism.
Um, I just wanted to parachute in and say that "cached thought" should not be a completely general objection to any argument. All it means is that you've thought about it before and you're not, as it were, recomputing everything from scratch in real time. There is nothing epistemically suspect about nyan_sandwich not re-deriving their entire worldview on the spot during every conversation.
Cached thoughts can be correct! The germane objection to them comes when you never actually, personally thought the thought that was cached (it was just handed to you by your environment), or you thought it a long time ago and you need to reconsider the issue.
This happened to me two days ago, in a workshop on inter-professionalism and conflict resolution, with the word "objective". Which, in hindsight, was kind of obvious, and I should have known to avoid it. The man leading the discussion was a hospital chaplain–a very charismatic man and a great storyteller, but with a predictably different intellectual background than mine. It wouldn't have been that hard to avoid the word. I was describing what ties different health care professionals together–the fact that, y'know, together they're taking care of the same patient, who exists as a real entity and whose needs aren't affected by what the different professionals think they are. I just forgot, for a moment, that people tend to have baggage attached to the word "objective", since to me it seems like such a non-controversial term. In general, not impressed with myself, since I did know better.
Have mainstream philosophers come up with a solution to that? Can LessWrongians learn from them? Do LessWrongians need to teach them?
Try to come up with the least controversial premise that you can use to support your argument, not just the first belief that comes to mind that will support it.
To use a (topical for me) example: someone at Giving What We Can might support effective charities "because it maximises expected utility so you're obliged to do so, duh", but "if you can do a lot of good for little sacrifice, you should" is a better premise to rely on when talking to people in general, as it's weaker but still does the job.
Here's a heuristic: There are no slam-dunks in philosophy.
Here's another: Most ideas come in strong, contentious flavours and watered-down, easy-to-defend flavours.
But water things down enough and they are no longer philosophy.... (read more)
Worse, the idea of consequentialism (or "utilitarianism") is often taken by people with a little philosophy awareness to mean the view that "the ends justify the means" without regard for actual outcomes — that if you mean well and can convince yourself that the consequences of an action will be good, then that action is right for you and nobody has any business criticizing you for taking it.
What this really amounts to is not "the ends justify the means", of course, but the horrible view that "good intentions justify actions that turn out to be harmful."
Ambiguity on "ends" probably does some damage here: it originally referred to results but also came to have the sense of purposes, goals, or aims, in other words, intentions.
I wish we had a communal list of specific phrases that cause problems, along with verbiage substitutes that appear to work. I have been struggling to avoid phrasing with landmine-speak for... I don't know how long. It's extremely difficult. I would be happy to contribute my lang-mines and replacements to a communal list if you were to create it. I'd go make the post right now, but I don't want to steal karma away from you, Nyan, so I leave you with the opportunity to choose to do it first and structure it the way you see fit.
I dunno about LW, but that's how legitimacy works in most technical volunteer communities (hackerspaces, open source, etc). I also think it's a good idea.
This is good personal advice but a bad norm for a community to hold. Can you see why?
I suspect this is best to do before the "landmine" (which I read as being similar to a factional disagreement, correct me if I'm wrong) gets set off... that is, try to explain away the landmine before it's been triggered or the person you've talked to has expressed any kind of opinion.
WTF!? Nearly every consequentialist I know is in favor of the Capitalized Moral Theory while admitting that virtue ethics and deontology might work better as everyday heuristics.
I'm not sure what that would look like. If consequentialism and deontology shared a common set of performance metrics, they would not be different value systems in the first place.
For example, I would say "Don't torture people, no matter what the benefits of doing so are!" is a fine example of a deontological injunction. My intuition is that people raised with such an injunction are less likely to torture people than those raised with the consequentialist equivalent ("Don't torture people unless it does more good than harm!"), but as far as I know the study has never been done.
Supposing it is true, though, it's still not clear to me what is outperforming what in that case. Is that a point for deontological injunctions, because they more effectively constrain behavior independent of the situation? Or a point for consequentialism, because it more effectively allows situation-dependent judgments?
So when you see a moral theory, think heuristic, not dogma.
The more I think about this the more I realize I need a term that means "Even though the people who are now jumping all over me about some philosophical subject are talking about a memetic war that's currently raging, and even though they're not mindless zombies saying completely useless things in support of their points, they've chosen to hyper-focus on some piece of my conversation not essential to the main point, and have derailed the conversation."
I want to use "philosophical landmines" for that, but now that I re-read your article,... (read more)
Compare to this other kind of landmine.
This can be a substantial problem in psychology, even when the original "memetic war" has real scientific content and consequences. Debates between behaviorism and cognitive psychology (for example) are often very attractive and useless tangents for anyone discussing the validity of clinical treatments from cognitive and/or behavioral traditions, when the empirical outcomes of treatment are the original topic at hand.
Yes, I invented the idea of "philosophical hurdles" ages ago. For every "topic" there is some set of philosophical hurdles you have to overcome to even start approaching the right position. These hurdles can be surprisingly obscure. Perhaps you don't have quite enough math exposure and your concepts of infinity are a bit off, or you aren't as solidly reductionist as others and therefore your concept of identity is marginally weakened - anything like these can dramatically alter your entire life, goals, and decisions - and it can t... (read more)
I call it something like correctly identifying the distinction(s) relevant to our contextually salient principles. That's more general and the emphasis is different, but it involves the same kind of zeroing in on the semantic content that you intend to communicate.
Sure, you can have principles like consequentialism, but why bring them up unless the principles themselves are relevant to what you're saying at this moment? Principles float in the conversational background even when you're not talking about them directly. Discussing the principle of the thing ... (read more)
My current strategy is that I just don't talk to most people about confusing things in general. If I don't expect that they'll actually do anything differently as a result of our having talked, all that's left is signaling, and I don't have a strong need to signal about confusing things right now.
Another excellent post, Ryan! (sorry, I blame the autocorrect)
I like your list of the common landmines (actually, each one is a minefield in itself).
Right, modeling other people is a part of accurately mapping the territory (see how I cleverly used the local accessible meme, even though I hate it?), which is the first task of rationality.... (read more)