Dumbledore's Army

We should not raise awareness

I think the issue is that 'raising awareness' is used to mean three separate things. (I agree that the extra simulacra levels aren't a helpful explanation.) I'll use awareness of breast cancer as a reasonably non-controversial example:

  1. Give people some useful knowledge or skill to help reduce the problem. Eg teach women how to examine themselves for lumps and when to seek medical advice.
  2. Raise the profile of the issue (or in some cases inform people that the issue exists) such that more resources will be devoted to solving it. Eg publishing opinion pieces informing the public how many people are affected by cancer and calling for more facilities for treatment, or encouraging people to donate money to charities researching cures.
  3. Signal that you are a virtuous person who is concerned about socially-approved causes. Eg wear a pink ribbon, or like Facebook pages from breast cancer charities.

I agree that there are a lot of people practising level-3 virtue-signalling while kidding themselves that they are doing level-2 profile-raising, and I also agree that a lot of the profile-raising is transparently ineffective. But I think that there are useful level-1 activities which also come under the banner of 'raising awareness', and I wouldn't want to stigmatise those.

There are also some situations in which the level-2 activities are useful. I suspect you would disagree, but I think sexual assault is a fairly good example: a lot of people have gone to great efforts to explain to the general public that there is a widespread problem that needs action. The result has been an in-progress and partial change in social norms which may well succeed in reducing the levels of sexual assault. 

What's your best alternate history utopia?

I may be stretching the point about changing human psychology here, but: 

Education is widely considered to include learning how to be an emotionally well-adjusted and responsible adult. Schools teach things like mindfulness and intra-personal conflict resolution. For example, kids learn how to recognise when they are reacting from anger, and therefore how to take a breath and try a more mature reaction. They learn about concepts like how stress can make you start catastrophising, and how to apply some cognitive behavioural therapy to yourself or a friend to head off mental health problems before they get started. All of this is considered as normal and as basic as learning how to read: it is assumed that almost any functional adult will have these life skills.

The consequences for later life are immense and flow through every part of society. A more responsible and well-adjusted population is happier. Workers are more productive (although this may be expressed as the same amount of work being done faster, so everyone has more time for the important things in life). Every kind of destructive behaviour is much less likely, from mild forms like chronic worrying through to extremes like domestic violence or addiction to hard drugs. People expect political leaders to be sane and reasonable, and will strongly reject those who pander to the worst human tendencies, meaning that most countries are better-run and have wiser policies.

Overconfidence is Deceit

I was thinking about this a little more, and I think that the difference in our perspectives is that you approached the topic from the point of view of individual psychology, while I (perhaps wrongly) interpreted Duncan's original post as being about group decision-making. From an individual point of view, I get where you're coming from, and I would agree that many people need to be more confident rather than less. 

But applied to group decision-making, I think the situation is very different. I'll admit I don't have hard data on this, but from life experience and anecdotes of others, I would support the claim that most groups are too swayed by the apparent confidence of the person presenting a recommendation/pitch/whatever, and therefore that most groups make sub-optimal decisions because of it. (I think this is also why Duncan somewhat elides the difference between individuals who are genuinely over-confident about their beliefs, and individuals who are deliberately projecting overconfidence: from the point of view of the group listening to them, it looks the same.)

Since groups make a very large number of decisions (in business contexts, in NGOs, in academic research, in regulatory contexts...) I think this is a widespread problem and it's useful to ask ourselves how to reduce the bias toward over-confidence in group decision-making.

Overconfidence is Deceit

Does anyone have a clear example to give of a time/space where overconfidence seems to them to be doing a lot of harm? I would say making investments in general (I am a professional investment analyst). This is an area where lots of people are making decisions under uncertainty, and overconfidence can cost everyone a lot of money.

One example would be bank risk modelling pre-2008: 'our VaR model says that 99.9% of the time we won't lose more than X', therefore this bank is well-capitalised. Everyone was overconfident that the models were correct; they weren't, and chaos ensued. (I remember the CFO of Goldman Sachs bewailing that they had just experienced 25-standard-deviation moves, several days in a row, which is basically impossible. No mate, your models were wrong, and you should have known better, because financial systems have crises every decade or two.)
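To make that concrete, here's a minimal sketch (my own illustration in Python with scipy, not anything from the banks' actual models) of what a 99.9% VaR claim means under a Gaussian assumption, and why a 25-standard-deviation move is 'basically impossible' if that assumption holds:

```python
from scipy.stats import norm

# Under a Gaussian model of daily returns, '99.9% VaR' is the loss threshold
# exceeded on only 0.1% of days - roughly one trading day in four years.
z_999 = norm.ppf(0.999)
print(f"99.9% VaR threshold: {z_999:.2f} standard deviations")  # ~3.09

# If returns really were normal, the probability of even one 25-sigma daily move:
p_25_sigma = norm.sf(25)  # survival function, P(Z > 25)
print(f"P(single 25-sigma move): {p_25_sigma:.1e}")  # ~3.1e-138
```

A probability of around 1e-138 tells you the problem wasn't a freak draw: the Gaussian tail assumption itself was wrong, because real financial returns have much fatter tails.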

Speaking from personal experience, I'd say a frequent failure-mode is excessive belief in modelling. Sometimes it comes from the model-builder: 'this model is the best model it can be, I've spent lots of time and effort tinkering with it, therefore the model must be right'. Sometimes it's because the model-builder understands that the model is flawed, but is willing to overstate their confidence in the results, and/or the person receiving the communication doesn't want to listen to that uncertainty. 

While my personal experience is mostly around people (including myself) building financial models, I suggest that people building any model of a dynamic system that is not fully understood are likely to suffer the same failure-mode: at some point down the line someone gets very over-confident and starts thinking that the model is right, or at least everyone forgets to explore the possibility that the model is wrong. When those models are used to make decisions with real-life consequences (think epidemiology models in 2020), there is a risk of getting things very wrong once people start acting on the basis that the model is the reality.

Which brings me on to my second example, which will be more controversial than the first one, so sorry about that. In March 2020, Imperial College released a model predicting an extraordinary death toll if countries didn't lock down to control Covid. I can't speak to Imperial's internal calibration, but the communication to politicians and the public definitely seems to have suffered from over-confidence. The forecasts of a very high death toll pushed governments around the world, including the UK (where I live) into strict lockdowns. Remember that lockdowns themselves are very damaging: mass deprivation of liberty, mass unemployment, stoking a mental health pandemic, depriving children of education - the harms caused by lockdowns will still be with us for decades to come. You need a really strong reason to impose one. 

And yet, the one counterfactual we have, Sweden, suggests that Imperial College's model was wrong by an order of magnitude. When the model was applied to Sweden (link below), it suggested a death toll of 96,000 by 1 July 2020 with no mitigation, or half that level with more aggressive social distancing. Actual reported Covid deaths in Sweden by 1 July were 5,500 (second link below).
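As a back-of-the-envelope check on that 'order of magnitude' claim (my own arithmetic; the 48,000 figure is just my reading of 'half that level'):

```python
# Figures as cited above (see the two links at the end of this comment)
predicted_no_mitigation = 96_000    # Imperial model: Sweden deaths by 1 July 2020
predicted_with_distancing = 48_000  # 'half that level' with aggressive distancing
actual_reported = 5_500             # reported Swedish Covid deaths by 1 July 2020

print(f"Unmitigated forecast vs actual: {predicted_no_mitigation / actual_reported:.0f}x")   # ~17x
print(f"Distancing forecast vs actual:  {predicted_with_distancing / actual_reported:.0f}x")  # ~9x
```

Even on the most charitable comparison, the forecast overshot reality by roughly a factor of nine.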

So it's my contention - and I'm aware it's a controversial view - that overconfidence in the output of an epidemiological model has resulted in strict lockdowns which are a disaster for human welfare and which in themselves do far more harm than they prevent. (This is not an argument for doing nothing: it's an argument for carefully calibrating a response to try to save the most lives for the least collateral damage.)

 

Imperial model applied to Sweden: https://www.medrxiv.org/content/10.1101/2020.04.11.20062133v1.full.pdf 

Covid deaths in Sweden by date: https://www.statista.com/statistics/1105753/cumulative-coronavirus-deaths-in-sweden/ 

Overconfidence is Deceit

Thank you for an interesting article. It helped clarify some things I've been thinking about. The question I'm left with is: how, practically, can someone encourage a culture to be less rewarding of overconfidence?

I guess I'm feeling this particularly strongly because in the last year I started a new job in a company much more tolerant of overconfidence than my previous employer. I've recalibrated my communications with colleagues to the level that is normal for my new employer, but it makes me uncomfortable (my job is to make investment recommendations, and I feel like I'm not adequately communicating risks to my colleagues, because if I do, no-one will take up my recommendations; they'll buy riskier things which are pitched with greater confidence by other analysts). Other than making sure I'm the least-bad offender consistent with actually being listened to, is there something I can do to shift the culture?

And please, no recommendations on the lines of 'find another job', that's not practical right now.