Social & Cultural Dynamics · Social Reality · Tribalism · Rationality
Curated
How anticipatory cover-ups go wrong

by Kaj_Sotala
8th Aug 2025
7 min read

25 comments, sorted by top scoring
[-] [anonymous] · 2mo · 59

The fundamental problem[1] with employing these kinds of cover-ups in political or politicized public debates is that, as Matt Yglesias has convincingly explained, misinformation mostly confuses your own side.

Relevant excerpts:

It’s pretty easy to persuade a large minority of the public of something, but the people you persuade are almost certainly going to be people who would vote for you anyway. Convincing high-value persuasion targets is a lot harder. And there can be huge second-order downsides to convincing your supporters of things that are not actually true. It may feel savvy to support sloppy, misleading, or inaccurate work from your own side, but it’s often counterproductive.

[...]

My bottom line on this is that saying things that are true is underrated and saying things that are false is overrated.

We’re all acutely aware of the false or misleading things our political opponents say, and it’s easy to convince yourself in the spirit of “turnabout is fair play” that the key to victory is to play dirty, too. The real problem, though, is that not only does your side already say more false and misleading things than you’d like to admit, but they are almost certainly saying more false and misleading things than you realize. That’s because your side is much better at misleading you than they are at misleading people outside of your ideological camp, and this kind of own-team deception creates huge tactical and strategic problems.

Note that I recognize your post goes beyond mere partisan politics, but I think Matt's writing compactly summarizes the fundamental issue at play in that specific case.

  1. ^

    Beyond violating deontological commitments to truth-seeking and truth-telling and the wizard's code of honor, which only really matter to the extent you care about them in the first place.

Reply
[-] dr_s · 1mo · 15

This really strikes me as something that has gone especially wrong with (especially liberal) politics in the last 15-ish years: communication aimed almost entirely at riling up your own side rather than growing its ranks, which is perceived as successful but in fact does nothing other than make the loud minority louder.

Reply
[-] localdeity · 1mo · 29

If you think your opponent will take statistics out of context, then it makes sense to try to keep those statistics hidden.

In the absence of other considerations, sure.  Some would say you have an obligation to a third party (taxpayers, citizens who have nominally put their trust in the executive branch which has in turn created the CDC), and/or to a scientist's ethos of giving the most accurate possible data, to publish the statistics regardless of whether it helps your political opponents.  Some would say that if you break that obligation, then you have betrayed the citizens and the ideals of science, and should be thrown out of your job at the very least.

Reply
[-] habryka · 12d · 18

Promoted to curated: As with many good posts on LW, the most valuable thing I got from this post was the concept handle of "anticipatory cover-up". It feels like an important thing to have a handle for, and the post provided enough good examples to make what it's pointing at reasonably clear. I also kind of wish it had some examples (if they exist) of times when anticipatory cover-ups work (though of course that wasn't the aim of the post). 

Reply
[-] Dagon · 2mo · 18

I think all of these indicate insufficient strategic thinking on behalf of the information-controlling party.

If you think your opponent will take statistics out of context, then it makes sense to try to keep those statistics hidden

No.  Only if you ALSO think the statistics will stay hidden, and also somehow think that the hiding can be done well enough that it won't be worse than the data.  These are patently false in most cases.  It is both a moral and a strategic failure to attempt this.   

 It ends up going well.  ... She would like to avoid this kind of problem in the future

This would benefit from a clear problem statement.  And there are plenty of avenues for Alice to be more deeply involved in the prep which are NOT simply feedback that this was uncomfortable.  Ask for the outline, the prelims, offer to help early.   Alternately, if this is something that can be done well in a day, don't give him weeks to do it.

A small group (subculture, religious sect, police force, etc.) within broader society sees one of their own doing something bad, such as engaging in abuse.

This is trickier, especially for groups already facing persecution. Probably the correct reaction is expulsion from the group, and letting the broader society handle the individual as someone no longer supported by the group.

This led to a dynamic where I became reluctant to acknowledge her having any correct insights into me that I had initially disagreed with. ... we're no longer friends.

You could have saved some time by skipping to the end, perhaps by confronting/discussing with her that you hate it when she seems to be manipulating you, even if she thinks it's for your benefit.   

“if you are withholding information because of how you expect the other party to react, be aware that this might just make everything worse”.

I'd be stronger in my recommendation.

In all of these cases, cover-up is a short-term attempt to avoid a conflict, which cannot work for very long, and impedes actual shared knowledge and discussion about any underlying disagreement.  It's almost always a mistake which reduces overall trust and erodes future ability to resolve disagreements.

Note: I admit that I do this myself sometimes - I hate direct conflict, and for low-stakes things it often does work, by allowing the topic to pass by without having to deal with it. I'm not saying "this is always wrong", I'm saying "it's usually wrong for high-stakes topics that won't go away until addressed". I endorse lying and evasion in many situations, but only as a last resort when it's clear that other communication mechanisms are unworkable.

Reply
[-] Kaj_Sotala · 2mo · 13

Only if you ALSO think the statistics will stay hidden, and also somehow think that the hiding can be done well enough that it won't be worse than the data.  These are patently false in most cases.  It is both a moral and a strategic failure to attempt this.   

There's an obvious selection bias in that we hear about the cases where people fail to cover up something, and don't hear about the cases where they succeed. (The bit where the anti-vaccine group filed a FoIA lawsuit to compel release of internal documents is a good example - I'd expect that in the vast majority of cases, an organization's internal files merely remain internal, without any court compelling them to be released.)

I would like to believe that the truth always comes out, so one might as well always be transparent and tell the truth. But that belief feels like it would be a little too convenient, and I don't know of a strong reason to believe that it's true.

You could have saved some time by skipping to the end, perhaps by confronting/discussing with her that you hate it when she seems to be manipulating you, even if she thinks it's for your benefit.   

Sometimes confronting the person or leaving the relationship works. Sometimes confronting just makes them more hostile (as was true in this case), and/or there are reasons why it's difficult or impractical to leave.

Reply
[-] Dagon · 2mo · 8

Acknowledged that it's often a bit less obvious what will work or not work than I wish.  I do think it's often a matter of timeframes, though - tactics that may work in the short term are usually different from what's durable.  It's of course up for debate in each situation how to weigh the short- and long-term results when they're different.

One thing I should mention - I often frame this in terms of "is it in my interest to make this private knowledge into common knowledge (as in, I get to know that you know, and you know I know you know, etc.)?" In many examples, this reduces the conflict to actual points of disagreement over values, rather than over private estimates of a fact (whether that fact is a government statistic, a belief about a friend's agency, or a preference in work style).

Reply
[-] Kaj_Sotala · 1mo · 12

It's of course up for debate in each situation how to weigh the short- and long-term results when they're different.

So I'm not a Wikipedia editor or otherwise familiar enough with its politics to know how influential or canonical these essays are, but it has two articles that look like they've been around for quite a while and seem to basically be saying "sometimes it's better for trusted Wikipedia admins to cover up information from other people". I'd guess that the fact that these pages have been around for a while is some evidence for such policies at least sometimes working in the long term.

Wikipedia:Don't stuff beans up your nose

As an old story goes:

The little boy's mother was going off to the market. She worried about her son, who was always up to some mischief. She sternly admonished him, "Be good. Don't get into trouble. Don't eat all the chocolate. Don't spill all the milk. Don't throw stones at the cow. Don't fall down the well." The boy had done all of these things on previous market days. Hoping to head off new trouble, she added, "And don't stuff beans up your nose!" This was a new idea for the boy, who promptly tried it out.

In our zeal to head off others' unwise actions, we may put forth ideas they have not entertained before. As the popular saying goes, "don't give 'em any ideas".

For example, don't give potential vandals examples of how to cause disruption. This may well tempt them to do it.[Note 1]

In a similar vein, there are many areas of the encyclopedia that rely on, or benefit from, some level of security through obscurity, such as WP:SPI. For this reason, specific cases and abuse mitigation are often left undiscussed on-wiki, and this essay is sometimes cited in such situations (often using the shortcut WP:BEANS) to drop the hint that further public explanation of a matter could be unwise. An essay explaining this in more detail is Wikipedia:There's a reason you don't know.

Wikipedia:There's a reason you don't know

Certain things on Wikipedia happen with little or no explanation. These include suppressions, checkuser blocks, many revision deletions, and many actions of the Arbitration Committee. Because most actions are logged publicly on Wikipedia, and their rationales well-documented, some users may get upset that they don't know why these things happen.

But things are only obscured for a small number of reasons. Sometimes they concern private information that, for legal reasons, only certain users can access. Sometimes excessive detail would help bad actors do more bad things or reward malfeasance with attention. And sometimes it's for the best interests of the editors affected, such as minors or those experiencing mental health issues. If you don't know why something happened, there's probably a reason. And it's probably a good reason. And by butting in without knowing full context, you could cause serious harm.

If you have concerns about the reasoning for something, there are procedures for questioning a decision without publicizing private information. Assume Good Faith applies to opaque actions too. If, say, an admin blocked someone without stating a reason, and this concerns you, usually the best course of action will be to email them and politely ask why. If you are unable to resolve your concerns this way, the Arbitration Committee is empowered to resolve any disputes involving private evidence. On a global level, the Ombuds commission also can hear complaints involving alleged misuse of checkuser and oversight privileges.

Don't be the jerk who prolongs someone's mental anguish, or gives attention to a long-term abuser, or helps a vandal evade their block, just because you were curious or leapt to the conclusion of admin abuse. Follow the proper channels to inquire about something or appeal a decision, and assume good faith of the user who made it.

Also, if we consider the category of anticipatory cover-ups to include things like "company trade secrets or information that is classified for national security reasons" then those kinds of policies have also been around for a long time.

Reply
[-] Dagon · 1mo · 7

It's worth considering, in each of the examples in the post, how different it might have been if the policy had been "withhold the object-level data, but be explicit and forthright about the fact that you're hiding it and the reasons for doing so".

Reply
[-] Screwtape · 1mo · 3

First time I'd seen There's A Reason You Don't Know. I'd be fascinated to hear how well that works for them, because oh boy does it not really work for some groups. (It's rationalists; we're some groups.)

Reply
[-] David Davidson · 1mo · 13

>both sides were acting reasonably, given the assumption that the other side is untrustworthy.

Insofar as the possibility that a liar is a "trustworthy liar" is not a common consideration for people who value honesty, that they are untrustworthy would seem to directly follow from the fact that they were lying. It is less of an "assumption" and more of a conclusion.

Trust is much more easily lost than earned. 

Reply
[-] Kaj_Sotala · 1mo · 2

Good point, "untrustworthy" wasn't the best term to use there. Maybe something like "malicious" would be better? Since lying makes you untrustworthy kind of by definition, as you point out, but it doesn't necessarily make you malicious (lying to the Gestapo about the Jews in your attic and so on - though I guess that from the Gestapo's point of view, this is also being malicious).

Reply
[-] Screwtape · 1mo · 3

There's something here about how mistakes get treated, though I don't know if I can articulate it well at the moment.

Like, if I quote two or three sentences of a report, did I pick a reasonable piece of the concluding paragraph or did I cherry pick the best possible sentences for my argument? If one column got left out of the dataset and it turned out that column made my case weaker, was that a data parsing mishap or a deliberate attempt to suppress information?

People often remember best the parts of information that agree with them. (That's an assertion I don't have citations for at the moment, but I think it's true.) So a little bit of that kind of thing doesn't mean they're doing it deliberately. But if I pick up on a lot of it, more than usual, then I get more suspicious.

Reply
[-] dr_s · 1mo · 11

If you think your opponent will take statistics out of context, then it makes sense to try to keep those statistics hidden.

 

Only if you apply the most naive first-order considerations, which is what this article is about. I don't expect much of antivaxxers, but if the average Less Wrong user can understand this, so should an organization chock full of the supposedly smartest and most capable medical scientists in the US. Yet here we are.

Reply
[-] Ashin · 10d · 2

I dunno, it's not obviously the case, because I expect the average LWer to be more competent in this than the aggregate of an organization made up of the supposedly smartest and most capable medical scientists in the US. To be clear, not that individuals in that organization are stupider than the average LWer, but that the organization as a whole might not be nimble enough to thread the needle when it comes to this kind of policy-making.

Reply
[-] plex · 2mo · 5

This is a special case of Control Vs Opening, which I wrote up badly with some Claude help and never got it published, but the doc has been transformative to at least one person. It covers ways out of the dynamic across a few domains. Hint: how would you disentangle two parts which had this kind of dynamic :)

(you're very welcome to take the idea and run with it/borrow bits of that post, I think you've put a lot more skill points into writing than me and I'd love your audience to have these ideas. Multi-agent models of mind was a major part of its inspiration.)

Reply
[-] Kaj_Sotala · 1mo · 4

Cool! I might do something with that. :)

Reply
[-] Clichedude · 12d · 4

For number 4, did you ever make deductions about her behavior and see whether she agreed you had insight into her?
If not, it sounds like a simple ego game where they were trying to play higher than you.
If they can't acknowledge that you can have insight into them, there's little reason to do the opposite either.
And if these intuitions are correct, it's good you're no longer friends.

Reply
[-] Jasnah Kholin · 3d · 2

"By acting on their assumptions, both confirmed the opposing side's existing interpretation of being untrustworthy."

That's the Truth Or Dare dynamic, or a defect-defect equilibrium.

It also comes from a manipulative and antagonistic relationship to people - the cooperative thing is to inform them and let them form their own opinion. Giving people only the information you think will make them think the right thoughts is not a cooperative thing to do.

Which is to say - I disagree about the reasonableness of the anticipatory cover-up side.

It seems you think all those examples are examples of the same pattern, but it doesn't look like that to me. The first example is the best, and it's a defect-defect equilibrium. The second looks to me first and foremost like a failure of exploration, of trying different things and seeing what happens, and secondarily a failure to come to the right conclusions from the evidence.

The third is not a valid map; that is, I don't think this is what is happening in those cases. I will not write out my full model of those situations here, but reputation-to-outsiders plays a very small part in that model, and I'm not sure it's really needed for the explanation at all, when status and scapegoat dynamics explain it well enough. On the other hand, reputation-to-outsiders is needed to explain why the offender sometimes does get punished. So my model here is sort of the opposite of yours.

4 is just a bucket error, and also a sort of uncooperativeness that is prevented by Law, in a way similar to 1.

(I had a somewhat similar dynamic with my parents, and then tried it, and it turned out that my parents don't use any acknowledgment of their rightness against me, and that most people I've encountered don't. But that's another story.)

Also, 4 is a failure of the virtue of empiricism, but I don't think I could have gotten over the uncooperativeness and lack of respect from the friend to try to solve it the empirical way. And I don't think it would have been a good idea to do that, in any case.

So, as for part 5, I'm not convinced that "anticipatory cover-up" is a real cluster. Or, to be more precise: the first example points to a real cluster in politics. The other parts have a weak component of that, and my clustering algorithm clusters them with other things, or refuses to cluster them at all. Also, the dynamic in (1) looks to me like an example of the Truth Or Dare dynamic (per Duncan's post), and Duncan's description of Trust Equilibrium and Mistrust Equilibrium is a better model for those situations, as anticipatory cover-up is only one way that self-fulfilling mistrust can manifest.

Reply
[-] CronoDAS · 11d · 2

I'm reminded of a related reason for "anticipatory cover-ups"...

Remember, you can't be wrong unless you take a position. Don't fall into that trap.

-- Dogbert's Top Secret Management Handbook

Reply
[-] topherhunt · 11d · 2

It would be nice to end this post with a recommendation of how to avoid these problems. Unfortunately, I don’t really have one, other than “if you are withholding information because of how you expect the other party to react, be aware that this might just make everything worse”.


Maybe this is me being naive, but this seems like a topic where awareness of the destructive tendency can help defeat the destructive tendency. How about this, as a general policy: "I worry that this info will get misinterpreted, but here's the full information along with a brief clarification of how I feel it should and shouldn't be interpreted"?

To hostile listeners, you've given slightly less ammo than in the likely scenario where they caught you concealing the info. To less-hostile listeners, you've (a) built credibility by demonstrating that you'll share info even when it doesn't strengthen your cause, and (b) made listeners more resilient against falling for the misinterpretation you're anticipating, by explicitly calling it out (inoculation / prebunking).

- By erring on the side of transparency while publicly acknowledging certain groups' likelihood of coming to a distorted conclusion, I bet the CDC would have avoided a disastrous erosion of public trust and reinforcement of the "don't trust the experts" vibe.
- By bringing up Bob's evasive communication during the client prep and the anxiety it created for her, Alice would have deepened trust between them (granted, at the risk of straining the relationship if he did turn out to be irredeemably thin-skinned).
- ...OK, actually the cult/sect situation seems more complex; it seems to have more of the multipolar-trap (?) quality of "maybe no single individual feels safe/free to make the call that most people know would collectively be best for the group".

It still seems to me that awareness of this trap/fallacy and its typical consequences can help a person or group make a much less fatal decision here.

Reply
[-] Steven · 12d · 2

You can keep talking more. You can repeat the proper analysis for your vaccine, talk about your own behavior, and talk about why other people analyzing your behavior is either good or bad. You don't have to concede the public square to someone else because you're concerned they will misinterpret things; in fact, these examples seem like situations where you can and should talk your way out of them.

Reply
[-] Aden · 1mo · 2

I see some comments here that include something roughly like, "the author's premise in the first paragraphs, that the prophylactic concealment of information from untrustworthy parties is reasonable, is false and here is why ...". 

For one thing, I think refuting that premise is a large part of the point of this post. 

For another, I think that the author's comments and examples are pretty leading, and would have done a great deal to assist the reader in concluding that this premise is false without reading very much of the post.

Sometimes there is discussion on this website about various infohazards and what should be done about them; those are perfect circumstances for applying both the overall conclusions of this post and the nuances it alludes to, for more rational discussion. I am nearly certain I recall this not always happening.

So please don't confuse the obviousness of the idea with the obviousness of applying it in practice. Certainly most of the conclusion of this post was obvious to me before even finishing the first example, but I am still determined to make use of having read this post to be more rational in the future than I otherwise would have been.

Reply
[-] HoVY · 11d · 1

This reminds me of https://www.lesswrong.com/posts/5FAnfAStc7birapMx/the-hostile-telepaths-problem

Reply
[-] dscotese · 12d · 1

A recommendation for avoiding these problems is to tolerate smaller versions of them, in the interest of using the fallout to demonstrate the importance of avoiding the errors the hiding is meant to prevent. The position of any authority in this kind of matter is tenuous because they have coercion to back them up, and this causes some resistance to anything they say or do - a relatively large portion of which is itself hidden. See plex's Control Vs Opening, mentioned in his comment.

Reply
[+] [comment deleted] · 11d · 1

How anticipatory cover-ups go wrong

1.

Back when COVID vaccines were still a recent thing, I witnessed a debate in which something like the following seemed to be happening:

  • Some official institution had collected information about the efficacy and reported side-effects of COVID vaccines. They felt that, correctly interpreted, this information was compatible with vaccines being broadly safe, but that someone with an anti-vaccine bias might misunderstand these statistics and misrepresent them as saying that the vaccines were dangerous.
  • Because the authorities had reasonable grounds to suspect that vaccine skeptics would take those statistics out of context, they tried to cover up the information or lie about it.
  • Vaccine skeptics found out that the institution was trying to cover up/lie about the statistics, so they made the reasonable assumption that the statistics were damning and that the other side was trying to paint the vaccines as safer than they were. So they took those statistics and interpreted them in exactly the way that the authorities hadn't wanted them to be interpreted, ignoring all protestations to the contrary.
  • The authorities saw their distrust in the other side confirmed - the skeptics took the statistics out of context, just as predicted - and felt like their only mistake had been in not covering up the information well enough.

I’ve lost the link to the original discussion, but searching for cases fitting this pattern afterward brought up several examples:

  • In February 2022, the CDC released data on booster shot effectiveness but excluded data for adults aged 18 to 49, and a CDC spokesperson specifically mentioned a fear of misinterpretation as the reason.
  • Also in February 2022, Public Health Scotland announced that it would stop releasing vaccine status data due to the statistics being overly simplistic and being taken out of context.
  • In October 2022, a Freedom of Information Act lawsuit filed by an anti-vaccine group forced the CDC to release information from an internal safety monitoring system. The group’s analysis of the data claimed that vaccines were causing a huge number of hospitalizations. Health authorities responded that the data did not establish such a causation, and that proper analyses indicated much lower rates of adverse effects.

What's notable to me is that both sides were acting reasonably, given the assumption that the other side is untrustworthy.

If you think your opponent will take statistics out of context, then it makes sense to try to keep those statistics hidden. And if your opponent is hiding some statistics, then it makes sense to assume that they're doing it because those statistics contain truths that are inconvenient for them.

By acting on their assumptions, both confirmed the opposing side's existing interpretation of being untrustworthy. They treated the other as a hostile actor and took hostile actions in return, which turned the opponent even more hostile.
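
(An illustrative aside: this spiral can be caricatured as a toy simulation - two parties who each hide information whenever their trust in the other is low, and who each lose trust when they see the other hiding. The 0.5 sharing threshold and the update numbers below are arbitrary; this is a minimal sketch of the feedback loop, not a model of any real case.)

    # Toy model of the mistrust spiral. All numbers are arbitrary
    # illustrative choices, not estimates of anything real.

    def updated_trust(trust: float, other_hid: bool) -> float:
        """Update one party's trust based on the other's last move."""
        if other_hid:
            return max(0.0, trust - 0.2)  # hiding reads as hostility
        return min(1.0, trust + 0.1)      # openness slowly rebuilds trust

    def simulate(trust_a: float, trust_b: float, rounds: int = 8) -> None:
        for r in range(rounds):
            a_hides = trust_a < 0.5  # each side hides when it distrusts the other
            b_hides = trust_b < 0.5
            trust_a = updated_trust(trust_a, other_hid=b_hides)
            trust_b = updated_trust(trust_b, other_hid=a_hides)
            print(f"round {r}: A hides={a_hides}, B hides={b_hides}, "
                  f"A's trust={trust_a:.2f}, B's trust={trust_b:.2f}")

    # One side starting just below the sharing threshold is enough to drag
    # both sides into a stable hide/hide equilibrium within a few rounds.
    simulate(trust_a=0.45, trust_b=0.6)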

2.

Alice is the manager for Bob, who is working on an important presentation for a client. Whenever Alice asks how the presentation is coming along, Bob fails to answer or avoids the question.

Alice starts getting increasingly stressed out about this, but then Bob pulls the presentation together at the last minute. It ends up going well.

Alice can tell that Bob is very anxious and sensitive to negative feedback. She would like to avoid this kind of problem in the future, but she gets the sense that he is already feeling guilty about the way he handled the issue. Alice is concerned that if she were to bring it up, then Bob would feel disproportionately bad about it. He did manage to do the job at the end, after all.

So when she gives him feedback, she holds back and avoids saying anything about how things leading up to the presentation went.

Bob on his part can tell that Alice is holding back something. It’s obvious to him that he handled the work leading up to it poorly, but Alice seems to avoid that topic. That must be because she doesn’t expect that he could handle being told just how badly he handled it. So, he reasons, she must think that his performance was completely unacceptable. Probably Alice just wants to avoid an unpleasant confrontation now, but tomorrow security will show up at his desk to let him know that he’s been fired and escort him out.

Alice sees Bob getting more anxious after their discussion, and worries that maybe she accidentally let slip something critical-sounding about the lateness after all. She concludes that she needs to be even more careful to only stick to positive feedback in the future.

Because they never talk about Bob’s lateness and what he might do differently in the future, his pattern of only completing important presentations at the last minute continues.

3.

A small group (subculture, religious sect, police force, etc.) within broader society sees one of their own doing something bad, such as engaging in abuse.

The people within the group think, "This was an isolated exception, most of us would never do anything like this. But if the rest of society hears about this, they will think that all of us are bad. So we need to deal with this internally and not reveal it to outsiders."

The attempt to keep it internal fails and outsiders find out about the case. They think that the group is all bad, since it was trying to cover up abuse.

Alternatively: the group actually does have a systemic problem, this isn't just an isolated incident. The cover-up succeeds for now, even from most other members of the group. Because most of the group is kept in the dark, the next time something like this happens, the people who find out about it go "This has never happened before, we need to cover this up so this isolated incident doesn't ruin our reputation".

As a result, more people get harmed because most people in the group don’t realize that there’s a problem they should do something about. The original perpetrators might even be allowed to stay in the group. When things finally blow up, the group is rocked not only by one scandal, but a series of them.

4.

There was a time when my relationship with a particular friend had turned very adversarial.

From my perspective, she sometimes had good insights into my behavior. At the same time, she also had a strong need to have me do things that she wanted, and an inability to understand why I didn't want to do them. As a result, she would often take her correct insights about me to argue that she understood me better than I understood myself and that it would be good for me to do what she wanted.

I’ve forgotten the exact details and don’t particularly want to go digging for them, but the rough shape of the examples was something like this. She might notice that I was repeating a particular pattern in my romantic relationships that I was reluctant to acknowledge. Later, she might argue that I was irrational and self-deluded when I thought that the two of us shouldn’t be housemates, or when I didn’t want to engage in a co-writing project together with her. As a part of her argument, she would reference the pattern that she’d noticed earlier as evidence of my general irrationality.

This led to a dynamic where I became reluctant to acknowledge her having any correct insights into me that I had initially disagreed with. This was the case even when it was clear to me that she had been correct. I felt that if I acknowledged any single insight, she would weaponize it to hold that she was always correct about what I should do whenever we disagreed.

(We’re no longer friends.)

From her perspective, the situation was simpler. To her, it must have looked like she had good insights into my behavior, which let her see how irrationally and self-destructively I was behaving when I didn't listen to her and do what she told me to do. Even when it became clear that she’d been right about something, I continued to obviously self-deceive and rationalize reasons not to accept her insights!

Therefore, I was too irrational to understand my own behavior, proving that she did know better than me what I needed.

5.

You could call this an anticipatory cover-up:

  • One party anticipates that another will misuse or misinterpret some piece of information.
  • Because of this expectation, they try to withhold or cover up the information.

It’s easy for it to go poorly:

  • The other party sees the cover-up and interprets it as confirming their negative expectations.
  • There's a resulting spiral of mistrust that makes everything worse.
  • Important information being left unshared might also have other negative consequences, such as problematic behaviors continuing or other experts being unable to fully evaluate the effectiveness of vaccines.

In some cases, this is relatively benign and could be cleared up with further discussion. Bob would probably be relieved to hear the true reasons for Alice's actions. In other cases, the cover-up actively makes each party model the other as an adversarial actor. This makes them unlikely to trust even truthful explanations of the other party's actions afterward.

Sometimes, the party covering up the information might say something like “I’ve got good reasons to hide this information, trust me”. Unfortunately, it is hard to trust someone when they clearly don’t trust you. It could be that they have a good reason to cover up the information, but it could just as well be that the information is genuinely damning. Why should you trust them when they are clearly trying to control your actions by restricting the information that you get?

On the other hand, sometimes people will misinterpret or misuse information that they get. Just including an explicit warning against possible misinterpretations won’t help if the other party has a motive (justified or not) to ignore those warnings. There may also be legal reasons for why some information cannot be released, such as the possibility of a libel suit that may be expensive and effortful to fight even if the information is true.

It would be nice to end this post with a recommendation of how to avoid these problems. Unfortunately, I don’t really have one, other than “if you are withholding information because of how you expect the other party to react, be aware that this might just make everything worse”.

This article was first published as a paid piece on my Substack one week ago. Most of my content becomes free eventually, but if you'd like me to write more often and to see my writing earlier, consider getting a subscription! If I get enough subscribers, I may be able to write much more regularly than I've done before.