Epistemic Status: I'm highly confident this is a phenomenon that occurs with a lot of advice people give, but I'm quite uncertain about the best way to deal with it when trying to give advice to more than one person.


The main thing people fail to consider when giving advice is that someone with ostensibly the same problem may require a vastly different solution than they did. The underlying cause of the problem may, in many cases, be the exact opposite of what it was for them.

Many issues in life are caused by being too extreme. Either extreme is problematic: too much of something or too little. Often people only think about one extreme (or one possible failure mode, we're not limited to only two!) when they give advice, because that is the problem they themselves had to overcome. It fails to occur to them that not only might this advice be unhelpful, it might be actively detrimental to a person struggling with the opposite problem.

To use a concrete example, imagine a runner trying to improve their 5K time asks a more accomplished runner for advice. The faster runner suggests that if the questioner wants to break through their plateau, they will probably need to do more high-intensity interval training. After all, the faster runner remembers that this is what got them past their own plateau. Unfortunately, the slower runner's problem is that they are overtraining: their muscles do not have sufficient time to recover and grow stronger, which is what limits their improvement. This advice will thus be completely counterproductive, and will probably lead to injury if the runner tries too hard to follow it.

The issue is that it is instinctive to ask "how would this advice have affected me?" when evaluating possible advice to give, rather than "is this the sort of scenario in which that sort of advice would be useful, accounting for the individual receiving it?" From this one might be tempted to derive the following lesson: never give anyone important advice unless you have thoroughly questioned what their problem is and are very confident you understand both the problem and its underlying cause.

If you have the resources and time to give people individual advice, I think this is a reasonable principle to abide by. But we often do not have this luxury; sometimes we want to give advice to multiple people at once, and sometimes we simply lack the time or resources to inquire deeply into the specifics of someone's problem. This difficulty is exacerbated by the fact that even when you try to give advice that doesn't accidentally hurt someone, you may fail to imagine all the ways it might do harm, because you underestimate how different other people are from yourself.

So how do we avoid giving people bad advice? One solution is to adopt a policy of never giving unpersonalized advice, of course, but assuming we still believe we have useful things to say to people, how should we proceed? For audiences who understand this problem with advice, one might avoid a lot of potential damage by opening with a discussion of the ways you imagine the advice might go wrong, and asking readers or listeners to consider their own situation before applying it. Unfortunately, for broader audiences this technique will probably not work unless you can take the time to explain all this, because it will look like you aren't sure of your own advice and are hedging your bets or some such. And you certainly will not always have the time to explain it. A simple disclaimer that most good advice is situational and depends on the person may help some people avoid harm, especially if you are giving advice you know to be potentially dangerous from a position of expertise or authority, though I suspect most people would ignore such warnings.

Does anyone have any other strategies to avoid/minimize the unintentional harm advice may cause? 

7 comments

Scott Alexander wrote a post related to this several years ago: Should You Reverse Any Advice You Hear? | Slate Star Codex

I wonder whether everyone would be better off if they automatically reversed any tempting advice that they heard (except feedback directed at them personally). Whenever they read an inspirational figure saying “take more risks”, they interpret it as “I seem to be looking for advice telling me to take more risks; that fact itself means I am probably risk-seeking and need to be more careful”. Whenever they read someone telling them about the obesity crisis, they interpret it as “I seem to be in a very health-conscious community; maybe I should worry about my weight less.”

Of course, some comments noted that this meta-advice is also advice that you should consider reversing - if you're on LessWrong, you're already in a community that's committed to testing ideas, perhaps to an extreme degree.

For myself, when it comes to advice, I usually try to inform rather than to persuade. That is, I present the range of opinions that I consider reasonable, and let people make their own decisions. Sometimes I'll explain my own approach, but for most issues I just hope to help people understand a broader range of perspectives.

This does occasionally backfire - some people are already committed strongly to one side, and a summary of the opposite perspective that sounds reasonable to me sounds absurd to them. In some cases, I've trapped myself into defending one side, trying to make it sound more reasonable, while I actually believe the exact opposite. And that tends to be more confusing than helpful.

But as long as I stick to following this strategy only with friends that are already curious and thoughtful people, it generally works pretty well.

(And did you catch how I followed this strategy in this comment itself?)

Thanks for the thoughtful post. I’m not too worried about this problem because I tend to assume that people will evaluate my advice in the light of their own circumstances and needs. I have not had a problem with people blindly accepting my advice on faith, without critical thought. Maybe this would be more of a problem for a teacher of young children or famous person or CEO or someone else with a lot of prestige.

Agree, I think the problem definitely gets amplified by power or status differentials. 

I do think that people often forget to think critically about all kinds of things because their brain just decides to accept them on the 5-second level and doesn't promote the issue as needing thorough consideration. I find all kinds of poorly justified "facts" and advice in my mind, picked up from something I read or someone said that I failed to properly consider.

Even when someone does take the time to think about advice, though, I think it's easy for things to go wrong. The reason someone is asking for advice may be that they simply do not have the expertise to evaluate claims about challenge X on their own merits. Another possibility is that someone correctly recognizes the advice is good for them but overcorrects, essentially trading one problem for another.

I think most of the setup of this post, and some other points, are covered here:
https://www.lesswrong.com/posts/6NvbSwuSAooQxxf7f/beware-of-other-optimizing

The main thing people fail to consider when giving advice is that advice isn't what's wanted.

I fully agree; this post was trying to get at what happens when people do want advice and thus may take bad advice.


Advice comes with no warranty. If some twit injures themselves doing what I told them to (wrongly) then that's 100% on them.

I think this is a fair stance in some cases (though I would still like to prevent people from misapplying my advice if possible), but if you are in a position of power or influence over someone, I'm not sure it applies (e.g. sports coaches telling all their players to work harder, without taking the time to make sure that some of them aren't being pushed into overtraining by this advice).


Failing all of that, ask "What choice would you make if I wasn't here?" and then, barring them saying something outlandish, just say "Then do that." One way or another they'll get better at thinking for themselves.

That sounds like a very reasonable approach.