One of the great things about the internet is that there is a social group for almost every interest.  Pick an unusual hobby or ideology, and there is probably an online community centered around it.  This is especially wonderful for those of us who never quite fit into mainstream society.

But there's also a downside to this aspect of the internet, which is that the more we immerse ourselves in these small online communities, the less exposure we get to the rest of the world.  And the less exposure we get to the rest of the world, the easier it is for us to hold onto false beliefs that the rest of the world rejects.  (Of course, it's also easier to hold onto true beliefs that the rest of the world rejects.)

For instance, suppose you believe that pasteurizing milk makes it less healthy, and we should all drink our milk raw.  (I picked this example because it's something decidedly non-mainstream that I believe with high probability.)  I'm fairly susceptible to social pressures, so at least for me, my belief in this proposition goes up when I'm hanging out with intelligent people who agree with me, and it goes down when I'm hanging out with intelligent people who look at me like I'm insane when I claim such a thing.  They don't need to state evidence in either direction to influence my belief-probability, though that certainly helps.  The important thing is that I think they're smart and therefore I trust their opinions.

Unsurprisingly, if I spend most of my time hanging out with normal, intelligent, scientifically-minded Americans, I start to question my beliefs regarding raw milk, but if I spend all my time on raw-milk-promoting websites, then my belief that raw milk is good for us is reaffirmed.

We like having our beliefs affirmed; it makes us happy when other people think we are right about things.  We'd rather seek out people who agree with us and can relate to our mindsets than seek out groups where everyone disagrees with us strongly.  This is normal and reasonable, and it's why all of us rationalists are hanging out here on LessWrong instead of lurking in creationist forums.  However, it does put us at risk of creating feedback loops: unusual ideas are proposed by people we respect, we affirm those ideas, others repeat them, and their growing prevalence causes them to be repeated still more.  Many of those who disagree are hesitant to voice their disagreement for fear of rejection.  As a result, LessWrong perpetuates many ideas that the rest of the world considers somewhat odd.  Also, the rest of the world perpetuates many ideas that we at LessWrong consider extremely odd.

I'm not saying anything new here, I know.  Everything I've written so far has been discussed to death on LessWrong, and if I were less lazy this article would be full of links to the sequences.  If I recall correctly, the sequences recommend countering this problem by recognizing that we have these biases, and consciously trying to correct for them.

I try to do this, but I also tend to employ an additional solution to this problem.  Because I recognize that I'm easily influenced by others' beliefs, I make sure to expose myself to a myriad of different belief systems.  For instance, in politics, I read blogs by liberal feminist scientists as well as conservative anti-feminist traditionalists.  Since I respect the authors of all the blogs I read, and recognize that they are intelligent people who have thought deeply about their perspectives, I can't easily dismiss either perspective outright as lies spoken by a moron.  Since their beliefs differ so radically, I also can't just fall into the trap of believing everything I read.  So I'm forced to really think about the ideas, and question why their proponents believe them, and consolidate them (and other thoughts I might have) into my own coherent worldview.

Thus, I consider it important to be exposed to the ideas of people I disagree with.  Meeting intelligent people who think differently than I do keeps my mind open, and reminds me that there are things about the world that I don't know yet, and keeps me from overestimating the probability that my beliefs are true.

Unfortunately, search engines like Google are making it more difficult for me to do so.  About a week ago, I attended a lecture on information retrieval, and I was shocked to find out exactly how much our Google searches are customized to our own preferences.

Suppose John and Mary both Google something like "creationism".  Now suppose that John is an atheist who reads a lot of atheist forums, and Mary is a fundamentalist Christian who spends most of her time on Christian forums.  John's Google results might contain a lot of links to people on his favorite atheist website talking about how much creationism sucks, and Mary's Google results might contain a lot of links to her friends' blogs talking about how God created the earth.

In this example, John and Mary are both having their beliefs reaffirmed, because Google is presenting them with things they want to hear.  They will not be exposed to opposing viewpoints, and will be much less likely to change their minds about important issues.  In fact, their beliefs in their own viewpoints will probably grow stronger and stronger each time Google gives them back these results, and they will become less and less aware that another viewpoint exists.
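The dynamic John and Mary face can be sketched as a toy simulation.  Everything here is hypothetical: it assumes only a ranker that boosts whatever gets clicked, and a user who clicks only results that affirm their view.

```python
# Toy model of the personalization feedback loop described above.
# Two result types compete for the top slot; the user clicks only
# results that agree with them, and every click boosts that result.

def run_feedback_loop(rounds=20, boost=1.0):
    scores = {"agrees": 1.0, "disagrees": 1.0}  # the ranker starts unbiased
    shown = []
    for _ in range(rounds):
        top = max(scores, key=scores.get)  # the ranker shows its top result
        shown.append(top)
        if top == "agrees":                # the user clicks only affirming results
            scores["agrees"] += boost      # ...and the ranker rewards the click
    return scores, shown

scores, shown = run_feedback_loop()
print(scores)                    # "agrees" pulls further ahead every round
print(shown.count("disagrees"))  # the opposing result is never shown
```

Even with the ranker starting from a dead tie, a single click is enough to tip the balance, after which the affirming result wins every subsequent round and the opposing viewpoint never surfaces at all.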

Of course, this might happen without Google filtering its search results.  John might deliberately avoid reading the views of creationists, or dismiss them outright as moronic, or not ever Google anything that might lead him to their webpages, because he is convinced of his beliefs and would rather have them affirmed than contradicted.  Since he would skip past the fundamentalist Christian blog results anyway, Google is doing him a service by ranking the stuff he cares about higher.

But at least for me, this Google filtering is a bad thing.  I want to see other webpages which present other viewpoints, instead of being led back to the same places over and over again.  And if Google doesn't show them to me when I search for them, and I don't realize that my Google search results are being customized, I might never realize there's something I'm missing, or go to look for it.

I'm probably making this sound more dire than it actually is.  Obviously, I can try other search terms, or just ignore websites I've already been to.  Or I can follow links on other websites and wander off into regions of the internet without the help of Google.  But I still have a visceral reaction against search engines customizing their results to fit my individual ideological preferences, because they are perpetuating my biases without giving me any direct control over which pieces of information I receive.

What do you guys think?


In English class I read an article arguing that increased mobility in the United States was leading to further political fragmentation and division. Basically, the idea was that after college people move to areas with lifestyles and views that they like more, and that as a result some areas are becoming super liberal while others are becoming more conservative.

It was backed up by data about how elections used to be closer on the district-by-district level, and how people could reasonably expect their next-door neighbors to hold political opinions different from their own.

The scariest part of the article was when it stated that, when you get a group of somewhat ideologically homogeneous people together, rather than tending toward the center of the group, they tend toward a more extreme version of the group's beliefs.

It seems like the internet would have a similar effect, on a finer scale. Rather than packaged ideologies being reinforced, it seems easier to reinforce things on an idea-by-idea level.

elections used to be closer on the district-by-district level

Alternate explanation for part of this effect: the losers in gerrymandering will lose whatever barely-dominant districts they had, in exchange for a mix of fewer strongly dominant (nearly unanimous) ones and barely-losing (45%-55%) ones.


Somewhat related: an excellent book on this phenomenon is The Big Sort.


For instance, suppose you believe that homogenizing milk makes it less healthy, and we should all drink our milk raw. (I picked this example because it's something decidedly non-mainstream that I believe with high probability.)

Homogenization or pasteurization? If it's the former, that's... unusual. I don't think I've seen anyone complain about that before. If it's the latter, I submit for your consideration that if you weren't aware of the difference between these two things (at least, not sufficiently aware to avoid mixing them up), then perhaps you don't know as much about them as you thought you did.


Oops, you are right; I meant to type pasteurization! I also think that homogenizing milk is bad, but I believe that with lower probability. I'll edit my post, and thanks for the correction. =)

If you really don't want Google to optimize its results to your biases, you could sign out of your Google account when you make a search. Or, if you look at enough sites that oppose your beliefs, Google will probably start putting them in your search results.

On a related note, one of the great things about LW is the many posts criticizing our favorite beliefs. It's possible to read LW extensively and not be led into conformity as strongly as with other sites, because several of our members comment primarily to speak against our core ideas and mission. While that can be annoying at times, I appreciate it in the long run for acting as epistemic hygiene.

Yep. Look up "filter bubbles".

Also, you can turn off the tailoring of search results based on your past history on Google.

Thanks! I did that and now my results are the same when I'm signed out as when I'm signed in, so I think it worked. But Less Wrong still comes up as the fifth or so hit for "rationality", so I'm not sure.

I think it helps a lot to notice when one's belief seems to have drifted as a result of social pressure, and then either do in-depth research or decide to remain ignorant about the belief and hold no opinion.

Here are some possibly useful comments by Matt Cutts (a Google employee) about search personalization.

Thanks for posting this, I didn't know Google did this. Next time I'm looking for opposing views to my norm I'll be sure to sign out.

That's probably not enough. I don't have a source on me now, but search engine results are customised even for people who aren't logged in. Your browser and IP address amongst other factors are used to customise your search results.


Here we go:

Signed-out customization: When you're not signed in, Google customizes your search experience based on past search information linked to your browser, using a cookie. Google stores up to 180 days of signed-out search activity linked to your browser's cookie, including queries and results you click.

Information on disabling customisation when signed out.

[Further edit] Apparently there is a link at the bottom of a search results page which tells you how your search was customised, and allows you to re-run your search uncustomised.


But at least for me, this Google filtering is a bad thing. I want to see other webpages which present other viewpoints...

I agree with you. Which is part of why I use a different search engine, and keep Google in my bookmarks for when I want it.

Unsurprisingly, if I spend most of my time hanging out with normal, intelligent, scientifically-minded Americans

Why specifically Americans?

Still, an interesting argument - although one should in general be wary of using popularity in search engines as an accurate gauge of anything at all, even of popularity among the masses.

If ey spends most of eir time in America, ey will be hanging out with Americans.

If you're going to use plural verbs with your nonstandard pronouns, you might as well just use the singular they.

Fixed; I'm much more used to singular 'they' than Spivak pronouns.