In Think Like Reality, I put forth the astonishing and controversial proposition that when human intuitions disagree with a fact, we need to either disprove the "fact" in question, or try to reshape the intuition. (Well, it wouldn't have been so controversial, but like a fool I picked quantum mechanics to illustrate the point. Never use quantum mechanics as an example of anything.) Probability theory says that a model which is consistently surprised on the data is probably not a very good model.
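That probability-theory claim can be made concrete with a small sketch (my own illustrative example, not part of the original post): measure a model's surprise on each observation as -log2 of the probability it assigned to what actually happened, and sum it over the data. A model that is consistently surprised racks up a much larger total than one whose predictions fit.

```python
import math

def total_surprisal(probs, outcomes):
    """Sum of -log2 P(outcome) under a model: lower means less surprised overall."""
    return sum(-math.log2(p if o else 1 - p) for p, o in zip(probs, outcomes))

# A coin that in fact comes up heads 90% of the time, observed ten times:
outcomes = [1, 1, 1, 0, 1, 1, 1, 1, 1, 1]

good_model = [0.9] * 10  # assigns high probability to heads each flip
bad_model = [0.1] * 10   # expects tails each flip, so it is surprised almost every time

print(total_surprisal(good_model, outcomes))  # small total surprise
print(total_surprisal(bad_model, outcomes))   # large total surprise
```

The good model is surprised once (the lone tail); the bad model is surprised nine times, and its total surprisal is several times larger - which is exactly the sense in which consistent surprise marks a poor model.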
Matt Shulman pointed out in personal conversation that, in practice, we may want to be wary of people who don't appear surprised by surprising-seeming data. Some people affect to be unsurprised because it is a fakeable signal of competence. Well, a lot of things that good rationalists will do - such as appearing skeptical and appearing to take other people's opinions into account - are also fakeable signals of competence. But, in practice, Matt's point is still well-taken.
People may also appear unsurprised (Matt points out) if their models are so vague that they don't understand the implications one way or the other. (Rob Spear: "It doesn't matter to the general public whether reality has 11, 42, or 97.5 dimensions... The primary good that most modern physics provides to the people is basically light entertainment.") Or they may appear unsurprised if they fail to emotionally connect to the implications - "Oh, sure, an asteroid is going to hit Earth... but personally I don't think humanity really deserves to survive anyway... are you taking Sally to her doctor's appointment tomorrow?"
Or Cialdini on the bystander effect:
We can learn from the way the other witnesses are reacting whether the event is or is not an emergency. What is easy to forget, though, is that everybody else observing the event is likely to be looking for social evidence, too. Because we all prefer to appear poised and unflustered among others, we are likely to search for that evidence placidly, with brief, camouflaged glances at those around us. Therefore everyone is likely to see everyone else looking unruffled and failing to act.
So appearing unsurprised, or pretending to yourself that you weren't surprised, is both personally and socially detrimental. By saying that a consistently surprised model is a poor model, I didn't intend to make it more difficult for people to admit their surprise! Even rationalists are surprised sometimes - the important thing is to throw away the model, reshape your intuitions, and otherwise update yourself so that it doesn't happen again.
Think Like Reality wasn't arguing that we should never admit surprise, but that, having been surprised, we shouldn't get all indignant at reality for surprising us - that just keeps us in the mistaken frame of mind that was surprised in the first place. Instead, we should try to adjust our intuitions so that reality doesn't seem surprising the next time. That doesn't mean rationalizing the events in hindsight using your current model - hindsight bias is detrimental to this process because it leads you to underestimate how surprised you were, and hence to adjust your model less than it needs to be adjusted.
we shouldn't get all indignant at reality for surprising us
A feeling I entirely agree with. Reality is out there, and finding a way of dealing with it is essential - whether through updated intuition, or conscious reasoning.
Never use quantum mechanics as an example of anything.
I'd heard that saying before, but never truly realised why until your post...
It is a good point that how often we are surprised signals the low quality of our initial model, so people might want to pretend not to be surprised in order to give the impression that they had a good model. So if you catch someone playing this faking game - acting unsurprised when they had no plausible way to have expected what they saw - you should take that as a bad sign about them.
In the previous thread I think the controversy arose because so many people who do understand physics still think QM is weird. As someone pointed out, just because we have not yet reshaped our intuition doesn't mean we can't disregard that intuition and understand the subject at hand.
That point, coupled with what I think is your point - namely that if we are presented with a "fact" and our intuition disagrees with it, one of them is wrong - is a fairly obvious conclusion to draw, which probably confused people as to what you were actually saying. (It's like a post on a construction blog pointing out that nails are used to hold wood together. If the author made drawn-out, poorly fitting analogies, and the blog is known for well-thought-out points and arguments, readers would have a hard time realizing that the whole point was that nails hold wood together - they'd get lost in the trees looking for the deeper meaning of a post where the appropriate response is "duh".)