So a contemporary example is the current Israel-Palestine conflict. (I was playing board games with some people recently; one person brought it up and another responded with "please let's not talk about that" in the style of 3, since it was basically "only downside" for us to know what each other thinks about that.)
I wonder how much of this is desiring to not collapse the wave-function? There are certain contexts where I wouldn't want to talk about Israel-Palestine because I might get rounded off to "pro-Israel" or "pro-Palestine", including by myself, in a way that I wouldn't endorse.
I mean that they probably have heard prescriptions from their leaders like "He who does not believe in God and does not accept God into his heart, immediately accepts the Devil into his heart, and by doing so is in a conflict with you, and you must fight this sinner at every opportunity."
I'm pretty sure this is wrong as a statement about the distribution of Christians.
I agree, but is it a wrong statement about the distribution of Christians who would be unwilling to work with me if I mentioned that I don't believe in any theistic gods?
I know sincere, intelligent Christians who would just be relieved and would respect that you've actually thought about the question of deism, and would see that as a positive sign of intelligence, maybe even of truth-seeking?
I certainly think there are ways to comport yourself as a professional which have very little in common with Scott's conception of a blankface, although pretending to professionalism is a classic blankface strategy.
A blankface is anyone who enjoys wielding the power entrusted in them to make others miserable by acting like a cog in a broken machine, rather than like a human being with courage, judgment, and responsibility for their actions. A blankface meets every appeal to facts, logic, and plain compassion with the same repetition of rules and regulations and the same blank stare—a blank stare that, more often than not, conceals a contemptuous smile.
A professional may be caught in a broken machine or bureaucracy; your responsibility (and the call for courage and judgment) then is to choose voice, and eventually exit, rather than loyal complicity.
For most of the history of EA, I have found myself with few allies in the comment sections of the EA Forum when I tried to stand up for letting people speak their minds and not be subject to crippling PR constraints.
Also, I have found myself supportive of people who I thought were in the EA ecosystem for the values and for doing what's right; then, when I pointed out times that the ecosystem as a whole was not doing that, someone said to me "I try with a couple of percentage points of my time to help steer the egregore, but mostly I am busy with my work and life", and I had the wind knocked out of my sails a bit.
And yet I think it often has outsized effects and the costs aren't that big.
I think I never really responded to this, but it was also probably the main generator of Ben's opinion?
I'm not sure whether I would have said my initial "we" statement about EAs. (Part of this is just being less confident about what EA social dynamics are like; another part is thinking they are less fractious than rationalists.)
There are pieces of evidence that are not problematic for consensus reality; people disagree over these based on differing priors for how to handle certain forms of evidence. Then there are secrets. Secrets aren't random: they are overrepresented in the category of things that are uncomfortable tensions in consensus reality. Disagreements over differing secrets, and over differing interpretations of secrets (because, by definition, there can't be widespread consensus about secrets), are difficult because they mix together the factual nature of the secrets with differing opinions about:
What the relationship of the secret to consensus reality is.
What the relationship of the secret to consensus reality ought to be.
Both of these, both in particular and in general.
From these conversations going poorly, I (and, I imagine, others) wind up conditioned to have fewer disagreements, especially outside of high-trust contexts, which tends to keep the secrets smaller.