Your experience also fits nicely with Robin Hanson's description of how a wider range of abilities (and the interest in those) is a marker of high status:
Great idea and nicely put, thanks!
I understand the objection of previous commenters that the post's idea seems a bit backwards because people with impostor syndrome themselves think that they have *too little* skill for their position, not too much.
But I think these objections take the self-made narrative of those experiencing impostor syndrome too seriously. Our instincts for navigating power hierarchies are arguably much, much older than our ability to spin elaborate self-concepts. I imagine a causal chain like this:
skill/dominance-mismatch -> fear (flight impulse) -> elaborate explanation for the felt fear
We are just very bad at explaining our basal feelings. Those explanations are usually overfitted.
Here's another example that often makes me laugh: The employees of my local organic grocery store have this habit of signaling strongly to each other how little they know about handling the shop's technical devices.
Fascinating! Since falling asleep is arguably a bodily process as well, I wonder if you also have observations about the bodily sensations during the stages? Or do you try to be exclusively aware of the visuals and try not to be aware of the body?
Is the article a fair and much-needed outside piece of criticism that we should take seriously?
I’m still thinking about the question of whether (or on which level) Cade Metz’s criticism of the Rationality scene could be right, because the counter-position that he’s wrong in every regard, on all levels of analysis, seems too strong.
Scott Aaronson summarized the NYT article’s central thesis as a warning against the Rationality scene, with its openness to ideas, as a kind of “gateway drug” to dangerous beliefs. And generally it doesn’t seem too controversial to assume that ideas can be interesting and potentially valuable as well as dangerous [vaguely gesturing in the direction of history]. It just feels so off to be warned against someone like Scott Alexander. But could the warning be steelmanned somehow?
Openness as a personality trait is not only associated with openness to new ideas but also with a pronounced sense for aesthetics. I imagine aesthetics to be a kind of rather generic, low-level heuristic for what’s good for us - like our built-in appreciation of abundance in nature. (I would suspect aesthetics to be evolutionarily tuned as well as culturally honed.) If my heuristics are functioning well, then I can afford to open up to a lot of new experiences and ideas, because my time-honoured aesthetics will tend to guide me to the good ones - even before memetic evolution has run its course.
But the reverse can be argued as well: If my aesthetics are not reliable, that may not only lead to arguably questionable but harmless choices in music, clothes, and home decor, but may actually make me quite helpless in a marketplace of ideas, potentially resulting in the adoption of destructive ideologies.
Personally, I find that observation surprising, now that I think about it. It means that the Rationality community may actually be spoiled with its pool of sharp thinkers and aesthetes. Spoiled not only with respect to stimulating discussions, but also with respect to the openness the community can afford without degenerating into ideology. It could mean that not every audience can be trusted in the same way. I don’t like that conclusion very much, politically and culturally. But I see how you could make a case for it. And how, as a consequence, there may even be some value, on a societal level, in being wary of a group of extraordinarily open people - paradoxically, even when that group tends to be right a lot, because you need to trust your aesthetics a lot to open up to them.
Is the article a fair and much-needed outside piece of criticism that we should take seriously? We talk a bigger game about accepting and integrating outside criticism than many communities. Maybe this is our chance to really put that into practice?
A "fair and much-needed outside piece of criticism" would arguably take advantage of its outside perspective to point out community taboos and blind spots. Reading about your blind spots should, almost by definition, make your reading stumble in strange and unpredicted ways. But the NYT article is depressingly predictable in its attempt to discredit by alluding to vague links to right-wing positions and figures. The predictability reaches almost comical levels where the author isn't even shy about quoting the very sentences that Scott had already highlighted and tagged as "These are the sentences that can be taken out of context to discredit me if you are insincere. Please don't do it. But honestly, we all know you will do it. So whatever."
But apart from the politics and the signaling games, it still seems like a worthy exercise to look for object-level claims in the article. I found one:
> Slate Star Codex was a window into the Silicon Valley psyche. There are good reasons to try and understand that psyche, because the decisions made by tech companies and the people who run them eventually affect millions.
That might be a valid point.
I like that question. It seems related to the question of "integration" in psychotherapy. Like when you made a valuable System I / bottom-up experience, how can you support remembering it? One technique I remember is to connect the felt sense with a more symbolic anchor - a gesture, image, sound, bodily position, place, situation, etc. And it sounds like a nice experiment to try "spaced integration" by repeatedly recalling the felt sense through the anchor.
As often with these religious stories and images, I find that I can more easily relate to their (assumed) essence by applying them not so much to the large-scale outer world of people, politics, groups, etc. as to a much more fine-grained world of inner mental experiences.
The described kind of faith reminds me of situations experienced through meditation or with psychoactive substances. Like when there is a lot of terrifying mental turmoil that takes you to places about "atrocities" that you supposedly have committed, are committing, and are going to commit. It's agonizing and you don't want to go there, obviously. But for some reason, at some point, a part of your mind breaks down and you surrender to the horrors. That very moment everything stops, nothing happens, and it turns out it was only a thought. A thought that was the world to you at that moment - but in the end just a thought.
Now, the more often you experience such a situation, the more faith you may have that you can actually follow the "commands" of the mind, even when they appall you, somehow knowing in the back of your mind that in the end it will hopefully turn out alright again. Inshallah!
And starting from that extreme case, this quality of faith then becomes a more gradual one that can be re-discovered and experienced with more and more subtlety in everyday life. Let's say, while cooking, you get the idea of adding some fancy new spice, but a part of you is afraid you are going to ruin your dinner. I suspect that this is also the kind of faith described in the Abraham story?