Today's post, Mutual Information, and Density in Thingspace, was originally published on 23 February 2008. A summary (taken from the LW wiki):

 

If you draw your boundary around a volume of space where there is no greater-than-usual density, the associated word does not correspond to any performable Bayesian inferences. Since green-eyed people are not more likely to have black hair, or vice versa, and they don't share any other characteristics in common, why have a word for "wiggin"?
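As an illustrative sketch of that point (with made-up probabilities, not numbers from the post): when eye colour and hair colour are statistically independent, the mutual information between them is zero, so knowing that someone is a "wiggin" licenses no inference about their hair; a slight correlation is exactly what would give the boundary extra density.

```python
# Illustrative only: mutual information of two binary traits (eyes, hair),
# computed from hypothetical 2x2 joint distributions.
import math

def mutual_information(joint):
    """Mutual information in bits of a 2x2 joint distribution joint[x][y]."""
    px = [sum(row) for row in joint]        # marginal P(X)
    py = [sum(col) for col in zip(*joint)]  # marginal P(Y)
    mi = 0.0
    for x in (0, 1):
        for y in (0, 1):
            p = joint[x][y]
            if p > 0:
                mi += p * math.log2(p / (px[x] * py[y]))
    return mi

# Rows: green eyes / not; columns: black hair / not (made-up numbers).
independent = [[0.16, 0.04],   # exactly P(green) * P(black) in every cell
               [0.64, 0.16]]
correlated  = [[0.18, 0.02],   # green eyes slightly raise P(black hair)
               [0.62, 0.18]]

print(mutual_information(independent))  # 0.0  -- "wiggin" supports no inference
print(mutual_information(correlated))   # > 0  -- the boundary now has extra density
```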


Discuss the post here (rather than in the comments to the original post).

This post is part of the Rerunning the Sequences series, where we'll be going through Eliezer Yudkowsky's old posts in order so that people who are interested can (re-)read and discuss them. The previous post was Entropy, and Short Codes, and you can use the sequence_reruns tag or rss feed to follow the rest of the series.

Sequence reruns are a community-driven effort. You can participate by re-reading the sequence post, discussing it here, posting the next day's sequence reruns post, or summarizing forthcoming articles on the wiki. Go here for more details, or to have meta discussions about the Rerunning the Sequences series.

3 comments

Since green-eyed people are not more likely to have black hair, or vice versa, and they don't share any other characteristics in common, why have a word for "wiggin"?

I suppose it could still be a useful word if you bring human values into the mix. Suppose I really like people with green eyes and black hair; it might be nice to have a word for them.

[anonymous]

You can take "being liked by you" as a characteristic of "wiggins", so the underlying message of the article still holds. What the original article didn't make quite clear is that the Bayesian inference you can make does not necessarily have to be perceived by humans as being about the object that you name: in this case the word allows someone to make a Bayesian inference about you. So the associated word does correspond to a performable Bayesian inference and is thus justified, as the article claims.

Talk of "density in thingspace" sweeps a lot of interesting problems under the rug. Thingspace is very high-dimensional. What distance metric are we supposed to use? EY writes:

I believe that in the field of statistical learning, for algorithms that actually do depend on distance metrics, the standard cheap trick is to "sphere" the space by making the standard deviation equal 1 in all directions.

An alternative (which I use at work) is to make the overall range 1 in all directions. Doubtless there are other, equally arbitrary choices available.
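A minimal sketch of both choices, assuming NumPy and per-dimension rescaling before any distances are measured (the arrays and numbers are purely illustrative):

```python
# Illustrative only: two equally arbitrary ways to put axes on a comparable
# footing before measuring distances/"density" in a high-dimensional space.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3)) * [1.0, 10.0, 0.1]  # features on wildly different scales

# "Sphere" the space dimension-wise: subtract each column's mean and divide by
# its standard deviation, so every axis ends up with standard deviation 1.
X_sphered = (X - X.mean(axis=0)) / X.std(axis=0)

# Alternative: rescale each dimension so its overall range is 1 (min-max scaling).
X_ranged = (X - X.min(axis=0)) / (X.max(axis=0) - X.min(axis=0))

print(X_sphered.std(axis=0))                        # ~[1. 1. 1.]
print(X_ranged.max(axis=0) - X_ranged.min(axis=0))  # [1. 1. 1.]
```

Either way the choice of units stops dominating the distance metric, which is the point of the trick; a full whitening transform would additionally rotate away correlations between dimensions, but that goes beyond what's described above.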