by [anonymous]
2 min read · 8th Jan 2011 · 16 comments


Are there any occasions when it's a good idea to avoid exposing yourself to a certain piece of information?  As rationalists, we probably do a lot less of this than the average person (because we're curious about reality and we don't mind having our preconceptions destroyed by new knowledge) but is all self-exposure to information safe?

Possible reasons, from least to most controversial.

1.  Exposing yourself to information that's distracting; the act of reading the information is not the current best use of your time (TvTropes.)

2.  Exposing yourself to information that puts you or others in direct danger because you know too much (illegally reading classified gov't secrets; personally investigating a crime)

3. Exposing yourself to information that's likely to cause dangerous psychological damage (graphic depiction of rape if you're a rape survivor; writing that romanticizes suicide if you're depressive; pro-anorexia blogs)

4.  Exposing yourself to information that violates someone's privacy or rights (reading a secret diary; going through someone's mail)

5. Exposing yourself to information that might alter your character in a way you don't currently want (the argument that repeatedly seeing violence in video or photographic form makes us less compassionate)

6.  Exposing yourself to "escapist" media that make the real world seem less appealing by comparison, and thus make you less happy (porn; some science fiction and fantasy; romance; lifestyles of the rich and famous.  A variant: plot spoilers, which can also ruin future enjoyment.)

7.  Exposing yourself to persuasive arguments that might make you do things you currently consider morally wrong (Mein Kampf, serial killers' manifestoes)

8. Exposing yourself to content that might persuade you to do things you don't currently want to do (advertising, watching the Food Network if you're dieting/fasting; the sirens' song, if you're Odysseus)

9.  Exposing yourself to effective, emotionally manipulative arguments for things you're currently confident are false (possibly religious apologetics)

10.  Exposing yourself to "cynical" true information that lowers your utility/motivation/happiness in everyday life (public choice theory, if you're a civil servant; accounts of unsuccessful and dissatisfied grad students/law students, if you're a student)

11.  Exposing yourself to content that's "disgusting" or "degrading" in your view (Two Girls One Cup; Tucker Max; gangsta rap)

 

I think 1-4 are no-brainers, and 5-11 are possibly good ideas but I'm less confident.  I think 7, 10, and 11 can have negative consequences. I think 9 is rarely if ever necessary.

Do you do any of these things?  Which do you think are good reasons to self-censor?  Any other ones?

I don't think we can really discuss censorship until we know what we think about self-censorship.  I'd want to know what kinds of information people don't want to be exposed to before I started restricting other people's access to information.  Arguments for censorship often reduce to arguments for self-censorship (claims that there are some kinds of content that people regret being exposed to).

There are semi-voluntary methods for enabling self-censorship that stop short of actual censorship.  For instance: trigger warnings, rot13, site-blocking software, MPAA ratings.  Whether or not to censor something (where by "censor" I just mean "restrict access to"; private websites "censor" when they delete or hide information) depends both on how much harm it's likely to cause if read, and on how able and likely people are to voluntarily avoid it.
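As a small aside on one of those semi-voluntary methods: rot13 just shifts each Latin letter 13 places, so text is unreadable at a glance but trivially decodable by anyone who chooses to look.  A minimal sketch using Python's standard library (the spoiler string is made up):

```python
import codecs

# rot13 shifts each letter 13 places; applying it twice is a no-op,
# so the same call both hides and reveals a spoiler.
spoiler = "Rosebud is the sled."
hidden = codecs.encode(spoiler, "rot13")

print(hidden)                          # -> Ebfrohq vf gur fyrq.
print(codecs.encode(hidden, "rot13"))  # -> Rosebud is the sled.
```

The point of the scheme is exactly the "semi-voluntary" property: the reader has to take a deliberate extra step to be exposed, which is much weaker than censorship but stronger than nothing.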

 


Exposing oneself to information which might cause time-wasting. I really didn't need to know that my karma score is over half of what's needed to get on the top ten list.

It would also be just as well if I don't know the ten best flash games.

I'd remove #4, not only because piracy is a controversial issue (in an inefficient market) but also because it's not the information that hurts you, but the act of theft. Seems like it belongs in a different bucket.

[-][anonymous] · 13y · 30

Exposing yourself to information that might alter your character in a way you don't currently want (the argument that repeatedly seeing violence in video or photographic form makes us less compassionate)

Exposing yourself to persuasive arguments that might make you do things you currently consider morally wrong (Mein Kampf, serial killers' manifestoes)

Exposing yourself to content that might persuade you to do things you don't currently want to do (advertising, watching the Food Network if you're dieting/fasting; the sirens' song, if you're Odysseus)

This is too vague. I think you don't realize what most people will use this way of reasoning for.

example: X is bad because the Koran said so and I don't want to learn positive information about behaviour X lest I be tempted to engage in it.

It's easy to fit this as apologia for religious prescriptions, world-views (blank-slatism, antivax, ...), and ideologies of all stripes (luddite environmentalism, communism, fascism).

Exposing yourself to effective, emotionally manipulative arguments for things you're currently confident are false (possibly religious apologetics)

One should think really hard about whether he isn't already in an emotional state that might make him more prone to biases in this kind of thinking. Anyway, to nitpick: all arguments are emotionally manipulative, including those made in such a way as to avoid an emotional response.

Exposing yourself to "cynical" true information that lowers your utility/motivation/happiness in everyday life (public choice theory, if you're a civil servant; accounts of unsuccessful and dissatisfied grad students/law students, if you're a student)

If you know it's cynical but true and therefore don't want to explore it, aren't we really talking about belief in belief? I don't know how much of that I want, regardless of its effects on my happiness.

Exposing yourself to content that's "disgusting" or "degrading" in your view (Two Girls One Cup; Tucker Max; gangsta rap)

I could easily imagine gleaning some insight from Tucker Max or gangsta rap (well, sort of).

Overall I don't by default see anything wrong with the list as a set of value-preserving methods, but I just think people aren't really as attached to all of their values as they think. I'm pretty sure we humans are easy to fool about what our "real" values are.

Interesting topic! Here's my thinking:

3, 11, and possibly 5: These are cases where the damage is caused by aspects of your hardware you can't currently change, and so self-censorship is good. The usual points about "information shouldn't have a negative value" don't apply because the negative value is not due to your decision theory or how you consciously handle evidence.

2 and 4 should be avoided because they have instrumental harm that outweighs the epistemic gain (except remove the IP bit for the reason Dr_Manhattan gave).

1, 6, and 8 have in common that they provoke the dynamic inconsistency behind akrasia, and are good candidates for self-censorship on this basis.

7, 9, and 10 should definitely not be avoided, as doing so would knowingly leave open a "security hole" in your reasoning.

Also, where would PUA go? I'd say 5 or 7 for men, 9 or 11 for women.

[-][anonymous] · 13y · 10

I was thinking of Roissy's website for both 9 and 11, actually. I used to read it, in the interest of "exposure to different views." Then I quit, reasoning that by now I understand his viewpoint pretty well, but the website continues to make me nauseous, and continues to prime me to emotionally believe things that I know are not true (e.g. "All men think I'm ugly!") At some point, the amount I'm learning is too small to justify other harms.

[-]Jack · 13y · 00

At some point Roissy started having interns write a lot of the posts, and they got more political and way less interesting while staying just as offensive. That led me to stop reading altogether.

When I did read it, it primed me to believe some troubling, false things too, though since I'm a straight male it was things like "I'm pathetic for not sleeping with more beautiful women." I don't regret reading enough to learn his perspective, though.

Bottom line: when you take the time to learn piece of information A, you have sacrificed learning B, C, D, … in the same time slot. This opportunity cost means you have to prioritize your learning. If you let someone censor you, you are letting them set your priorities. Rarely can other people be trusted to know which priorities are best for you; at least until we have a better understanding of neurology.

  1. Exposing yourself to information that's distracting; the act of reading the information is not the current best use of your time (TvTropes.)

I think of this not as a case for censorship but as a case of priorities. TvTropes is fine as long as you have worked through everything that ranks above it first.

  2. Exposing yourself to information that puts you or others in direct danger because you know too much (illegally reading classified gov't secrets; personally investigating a crime)

Well, the importance of the information knocks it up the priority list and the potential downsides knock it down. If methods/techniques to circumvent the downsides are learned, then it gets knocked back up.

  4. Exposing yourself to information that violates someone's privacy or rights (reading a secret diary; looking at pirated content)

There are good reasons in normal circumstances not to pry into other people's privacy. In general it is so low a priority that finding out what life is like without breaking person Y's privacy has a higher priority.

  6. Exposing yourself to "escapist" media that make the real world seem less appealing by comparison, and thus make you less happy (porn; some science fiction and fantasy; romance; lifestyles of the rich and famous. A variant: plot spoilers, which can also ruin future enjoyment.)

You are human; you need to enjoy yourself to be productive, and some amount of escapist media is just fine. I am not going to claim it is only OK if it directly increases your productivity; I think it is a complex issue that I am not going to get into further.

  3. Exposing yourself to information that's likely to cause dangerous psychological damage (graphic depiction of rape if you're a rape survivor; writing that romanticizes suicide if you're depressive; pro-anorexia blogs)

  5. Exposing yourself to information that might alter your character in a way you don't currently want (the argument that repeatedly seeing violence in video or photographic form makes us less compassionate)

Does information X bring benefit to you? Or rather, the real problem is whether you can tell that it brings you benefit before it causes you damage. If you don't know whether it can cause you damage, then reading about X is probably not as important as finding out how susceptible you are to psychological damage. Once you have answered that question, you can determine the priority of information X.

  7. Exposing yourself to persuasive arguments that might make you do things you currently consider morally wrong (Mein Kampf, serial killers' manifestoes)

  8. Exposing yourself to content that might persuade you to do things you don't currently want to do (advertising, watching the Food Network if you're dieting/fasting; the sirens' song, if you're Odysseus)

  9. Exposing yourself to effective, emotionally manipulative arguments for things you're currently confident are false (possibly religious apologetics)

If you think that your morality can be so easily changed, then learning not to be unduly influenced by arguments should be a high priority for you. You are exposed to arguments that, while irrational, can be convincing if you do not think them through properly.

  10. Exposing yourself to "cynical" true information that lowers your utility/motivation/happiness in everyday life (public choice theory, if you're a civil servant; accounts of unsuccessful and dissatisfied grad students/law students, if you're a student)

If you are demotivated by reality, then it should be a high priority to learn how to be more realistic.

  11. Exposing yourself to content that's "disgusting" or "degrading" in your view (Two Girls One Cup; Tucker Max; gangsta rap)

Same as 1.

Approximate Groupings

1, 11

2

3, 5, 7, 8, 9

4

6

10

[-]Jack · 13y · 00

Great post.

  9. Exposing yourself to effective, emotionally manipulative arguments for things you're currently confident are false (possibly religious apologetics)

How about exposing yourself to effective, emotionally manipulative arguments for things you're currently confident are true? This seems equally dangerous.

Currently I only really do 1 and 11, though mostly 1 works by making my procrastination more valuable, like spending it on Less Wrong instead of TVTropes or political blogs. I still watch way too much TV.

I have a really nasty habit of harmful self-censorship, like not reading bills because they'll make me unhappy/uncomfortable. As you can imagine this is a really bad form of irrationality, so I'm wary of finding excuses to self-censor. But I wonder if paying more attention to what information I allow myself to consume would help me prevent this.

I agree with 1-4, 8, and 11, disagree with 6, 7, and 10, don't see much reason to worry about 9, and am ambivalent about 5.

Anecdote relating to 7 and/or 9: I recently noticed that I had major motivated cognition and related problems with my political views, in the form of being biased towards the left. As an experiment and an attempt to fix the problem, I read The Fountainhead and tried to do so neutrally or favorably, fully aware that it's propaganda. It shook up my beliefs a bit, and now I'm more able to consider both left- and right-wing opinions without making everything favor my current beliefs. From the inside view at least, it feels like exposing myself to propaganda paid off.

Haha, I found this thread to be both useful and funny.

In any case, there are some cases where such information can lead your "map" of the territory astray. But many of us have numerous strategies to prevent this information from leading our "map" astray.

Personally, I actually like to expose myself to a lot of this information. I generally trust myself not to let these sources affect me in a negative way, but this depends on my using the Internet to find other sources that criticize them, which assumes those critical sources are accurate and findable. I often like to consume this information because a lot of it has to do with the "boundary values" of society, so it's an interesting way to see how social norms react to unusual stimuli they ordinarily aren't used to (many legal cases, for example, are highly unusual boundary-value cases that really test the laws).

Most of these sources of information are about people with "extreme" values, after all, so it's easy not to be affected by them. But there are other sources of information that aren't as extreme. For example:

That being said, there is one potentially dangerous type of information to fall for: nootropic supplements/medications. In this case, I almost always rely on the experiences I read on imminst.org and other drug forums. But in many cases there is vigorous debate, and I'm not sure what exactly to do. Someone on imminst.org suggested mixing Adderall and piracetam, but someone else there said that the combination was a recipe for glutamate neurotoxicity (at a massive level), so there's a chance I might cause myself substantial brain damage. I'm not sure what to do now, but I do feel guilty about wasting my piracetam and might ultimately bring myself to take it with Adderall, based on those imminst posts I've read (just because I probably trust imminst forums more than I should).

The demands for scientific rigor force science to be conservative. The FDA requires that drugs be tested only for a specific treatable ailment, but not for enhancing cognition. As a result, some (brave) people self-experiment, and post their experiences on places like imminst.org. And since I might be too impatient to wait for results of safety from scientific journals (safety usually only tests acute effects, not the chronic ones that may be more harmful), I might end up taking a nootropic before I know that it's really safe for me. It's a risk, of course, and one that could easily come from falsified information from a place like imminst (which, again, I probably trust more than I should).

For example, if I had a monthly supply of modafinil (back in my more impulsive days), then I probably would have pulled all-nighters with it more than half the year. This may have had the potential to impose some extremely negative (and lasting) effects on my cognition (not the modafinil itself, but pulling all-nighters with it).

And of course, since each drug affects different people differently, and there aren't many people who self-experiment with drugs AND post about it on imminst (so N is very small), I can't be completely confident that these drugs will be as harmless to me as they are to other people. But I'm prone to impulsivity, so I may be better off censoring myself from such things. That being said, I'm still convinced that most nootropics are completely safe.

In other words....

Even for a rationalist, it's hard to make decisions about actions where the utility is, MOST OF THE TIME, a slight positive, but where there is a very small chance of a VERY STRONG negative. Driving is one of those actions: most of the time, driving at will incurs a slight positive utility, yet you're far more likely to die in a driving accident than you would be if you avoided driving whenever possible. Many novel actions that transhumanists like to pursue also fit in this category, like taking nootropics (especially in combination with other nootropics or medications).

And here's the thing: the human brain is horrible at quantifying events of very low probabilities. Even rationalists are horrible at doing this. While we, at least, know A LOT of people who drive, and few of us know people who ended up killed or horribly disfigured through car accidents (so we can estimate some probability of dying from an accident), we don't have such large sample sizes when it comes to trying other new things (like nootropics). We might know that P(disaster|novel action like taking nootropics) < 0.05. But it might be much lower than that, or it might be slightly lower than that.
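To make that last point concrete, here's a toy expected-utility calculation (all the numbers are made up for illustration) showing why the exact value of a small disaster probability dominates the decision:

```python
# Sketch with hypothetical numbers: an action whose payoff is usually a
# small positive but, with small probability p, a large negative.
def expected_utility(p_disaster, u_disaster, u_normal):
    """E[U] = p * u_disaster + (1 - p) * u_normal."""
    return p_disaster * u_disaster + (1 - p_disaster) * u_normal

# With a mild upside of +1 and a disaster of -1000, the sign of E[U]
# flips at p = 1/1001 -- so whether p is 0.05 or 0.0005 changes everything.
print(expected_utility(0.05, -1000.0, 1.0))    # -> -49.05
print(expected_utility(0.0005, -1000.0, 1.0))  # -> 0.4995
```

The two probabilities differ by a factor of 100, yet both would register as "small" to intuition; only the arithmetic shows that one makes the action clearly negative-value and the other clearly positive.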