
With bigotry, I think the real problem is confirmation bias. If I believe, for example, that orange-eyed people have an average IQ of only 99, and that's true, then when I talk to orange-eyed people, that belief will prime me to notice more of their faults. This would cause me to systematically underestimate the intelligence of orange-eyed people I met, probably by much more than 1 IQ point. This is especially likely because I get to observe eye color from a distance, before I have any real evidence to go on.

In fact, for the priming effect, in most people ...
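The overshoot described above can be made concrete with a toy simulation (the model and all numbers here are invented for illustration, not taken from the comment): suppose each interaction gives a noisy reading of someone's intelligence, and the prior belief makes below-average readings twice as salient as the rest.

```python
import random

def estimate_iq(true_iq, n_obs=100, fault_salience=1.0, seed=0):
    """Toy model of a primed observer: each observation is true_iq plus
    noise, and observations that look like faults (below true_iq) are
    weighted fault_salience times as heavily as the rest."""
    rng = random.Random(seed)
    total = weight = 0.0
    for _ in range(n_obs):
        obs = rng.gauss(true_iq, 15)
        w = fault_salience if obs < true_iq else 1.0
        total += w * obs
        weight += w
    return total / weight

unbiased = estimate_iq(99)                    # plain average of observations
primed = estimate_iq(99, fault_salience=2.0)  # faults doubly salient
# The primed estimate lands below the unbiased one, even though both
# observers saw exactly the same evidence.
```

Under this (made-up) model the primed observer's estimate falls well below the unbiased one, dwarfing the one-point gap the belief was actually about.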

Those are real and important effects (that should probably have been included in the original post).

A problem with avoiding knowledge that could lead you to discriminate is that it makes some situations hard to judge: did James Watson, Larry Summers, and Stephanie Grace deserve a public shaming?

lmnop: This is exactly the crux of the argument. When people say that everyone should be taught that people are the same regardless of gender or race, what they really mean isn't that there are no average differences between women and men, etc., but that being taught about those small differences will cause enough people to significantly overshoot via confirmation bias that it will lead to more misjudgments of individuals overall than if people weren't taught about those small differences at all; hence people shouldn't be taught about them. I am hesitantly sympathetic to this view; it is borne out in many of the everyday interactions I observe, including those involving highly intelligent aspiring rationalists.

This doesn't mean we should stop researching gender or race differences, but that we should simultaneously research the effects of people learning about this research: how big is the gap between the perception and the reality of those differences? Is it big enough that anyone taught about gender and race differences should also be taught about the risk of systematically misjudging many individuals because of that knowledge, and warned to remain vigilant against confirmation bias? When individuals are told to remain vigilant, do they still overshoot to the point of becoming less accurate at judging people than they were before obtaining the knowledge? I would have a much better idea how to proceed, both as a society and as an individual seeking to maximize my accuracy in judging people, after finding out the answers to these questions.

Some Thoughts Are Too Dangerous For Brains to Think

by WrongBot · 3 min read · 13th Jul 2010 · 318 comments



[EDIT - While I still support the general premise argued for in this post, the examples provided were fairly terrible. I won't delete this post because the comments contain some interesting and valuable discussions, but please bear in mind that this is not even close to the most convincing argument for my point.]
A great deal of the theory involved in improving computer and network security involves the definition and creation of "trusted systems", pieces of hardware or software that can be relied upon because the input they receive is entirely under the control of the user. (In some cases, this may instead be the system administrator, manufacturer, programmer, or any other single entity with an interest in the system.) The only way to protect a system from being compromised by untrusted input is to ensure that no possible input can cause harm, which requires either a robust filtering system or strict limits on what kinds of input are accepted: a blacklist or a whitelist, roughly.
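The blacklist/whitelist distinction can be sketched concretely (a toy illustration; the patterns and function names here are invented, not drawn from any real security library):

```python
import re

# A blacklist enumerates known-bad input; everything else is accepted.
BLACKLIST = [re.compile(p) for p in (r"<script", r"DROP\s+TABLE")]

def blacklist_filter(data: str) -> bool:
    """Reject input matching any known-bad pattern.
    Fails open: an attack not on the list gets through."""
    return not any(p.search(data) for p in BLACKLIST)

# A whitelist enumerates known-good input; everything else is rejected.
WHITELIST = re.compile(r"^[A-Za-z0-9 _.-]{1,64}$")

def whitelist_filter(data: str) -> bool:
    """Accept only input matching a strict known-good form.
    Fails closed: anything unanticipated is rejected."""
    return bool(WHITELIST.fullmatch(data))
```

The asymmetry is the point: a blacklist must anticipate every attack, while a whitelist only has to describe the input the system actually needs, which is why trusted-system design favors the latter.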
One of the downsides of having a brain designed by a blind idiot is that said idiot hasn’t done a terribly good job with limiting input or anything resembling “robust filtering”. Hence that whole bias thing. A consequence of this is that your brain is not a trusted system, which itself has consequences that go much, much deeper than a bunch of misapplied heuristics. (And those are bad enough on their own!)
In discussions of the AI-Box Experiment I’ve seen, there has been plenty of outrage, dismay, and incredulity directed at the underlying claim: that a sufficiently intelligent being can hack a human via a text-only channel. But whether or not that is the case (and it seems likely), the vulnerability is trivial next to a machine that is completely integrated with your consciousness and can manipulate it, at will, towards its own ends and without your awareness.
Your brain cannot be trusted. It is not safe. You must be careful with what you put into it, because it will decide the output, not you. We have been warned, here on Less Wrong, that there is dangerous knowledge; Eliezer has told us that knowing about biases can cause us harm. Nick Bostrom has written a paper describing dozens of ways in which information can hurt us, but he missed (at least) one.
The acquisition of some thoughts, discoveries, and pieces of evidence can lower our expected outcomes, even when they are true. This can be accounted for; we can debias. But some thoughts and discoveries and pieces of evidence can be used by our underhanded, untrustworthy brains to change our utility functions, a fate that is undesirable for the same reason that being forced to take a murder pill is undesirable.
(I am making a distinction here between the parts of your brain that you have access to and can introspect about, which for lack of better terms I call “you” or “your consciousness”, and the vast majority of your brain, to which you have no such access or awareness, which I call “your brain.” This is an emotional manipulation, which you are now explicitly aware of. Does that negate its effect? Can it?)

A few examples (in approximately increasing order of controversy):

Identity Politics: Paul Graham and Kaj Sotala have covered this ground, so I will not rehash their arguments. I will only add that, in the absence of a stronger aspect of your identity, truly identifying as something new is an irreversible operation. It might be overwritten again in time, but your brain will not permit an undo.
Power Corrupts: History is littered with examples of idealists seizing power only to find themselves betraying the values they once held dear. No human who values anything more than power itself should seek it; your brain will betray you. There has not yet been a truly benevolent dictator and it would be delusional at best to believe that you will be the first. You are not a mutant. (EDIT: Michael Vassar has pointed out that there have been benevolent dictators by any reasonable definition of the word.)
Opening the Door to Bigotry: I place a high value on not discriminating against sentient beings on the basis of artifacts of the birth lottery. I’ve also observed that people who come to believe that there are significant differences between the sexes/races/whatevers on average begin to discriminate against all individuals of the disadvantaged sex/race/whatever, even when they were only persuaded by scientific results they believed to be accurate and were reluctant to accept that conclusion. I have watched this happen to smart people more than once. Furthermore, I have never met (or read the writings of) any person who believed in fundamental differences between the whatevers and who was not also to some degree a bigot.
One specific and relatively common version of this involves people who believe that women have a lower standard deviation on measures of IQ than men do. That belief is not incompatible with believing that any particular woman might be astonishingly intelligent, but such people all seem to have a great deal of trouble applying the latter to any particular woman. There may be exceptions, but I haven’t met them. Based on all the evidence I have, I’ve made a conscious decision to avoid seeking out information on sex differences in intelligence and other, similar kinds of research. I might be able to resist my brain’s attempts to change what I value, but I’m not willing to take that risk; not yet, not with the brain I have right now.
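A back-of-the-envelope sketch shows why a variance belief like this is so easy to over-apply (the parameters below are entirely hypothetical, chosen only for illustration): even a modest SD gap produces a lopsided ratio at the far tail, while the two distributions overlap almost completely, so the belief licenses almost no inference about any particular individual.

```python
from statistics import NormalDist

# Hypothetical IQ distributions: equal means, slightly different SDs.
men = NormalDist(mu=100, sigma=15)
women = NormalDist(mu=100, sigma=14)

# Ratio of the two upper tails, three shared SDs above the mean.
threshold = 145
ratio = (1 - men.cdf(threshold)) / (1 - women.cdf(threshold))

# Fraction of probability mass the two distributions share.
overlap = men.overlap(women)

# ratio comes out near 2, yet overlap exceeds 0.95: the extreme tails
# look lopsided while typical individuals are nearly indistinguishable.
```

This is exactly the trap the post describes: the brain takes a fact about extremes and quietly applies it to every individual it meets.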
If you know of other ways in which a person’s brain might stealthily alter their utility function, please describe them in the comments.

If you proceed anyway...

If the big red button labelled “DO NOT TOUCH!” is still irresistible, if your desire to know demands you endure any danger and accept any consequences, then you should still think really, really hard before continuing. But I’m quite confident that a sizable chunk of the Less Wrong crowd will not be deterred, and so I have a final few pieces of advice.
  • Identify knowledge that may be dangerous. Forewarned is forearmed.
  • Try to cut dangerous knowledge out of your decision network. Don’t let it influence other beliefs or your actions without your conscious awareness. You can’t succeed completely at this, but it might help.
  • Deliberately lower dangerous priors, by acknowledging the possibility that your brain is contaminating your reasoning and then overcompensating, because you know that you’re still too overconfident.
  • Spend a disproportionate amount of time seeking contradictory evidence. If believing something could have a great cost to your values, make a commensurately great effort to be right.
  • Just don’t do it. It’s not worth it. And if I found out, I’d have to figure out where you live, track you down, and kill you.
Just kidding! That would be impossibly ridiculous.
