It is so painful that there is an easily available possible world in which you find LessWrong earlier than you did in the real world. I ran into LW/OB five times starting when I was 16, and didn't stick around until I was 21. I can't imagine what I would be like with five years of exposure to the important things that I've been exposed to in the past six months, as well as having grown alongside the community, seeing as I came around near the time that LW began.

I also didn't stick with LW the first time. I found an article linked from somewhere (I believe it was "Well-Kept Gardens Die By Pacifism"), was impressed, but then left. A year or two later, I again randomly found an article, and when I saw it was on the same website as the previous one, I was like "Oh, this website contains multiple interesting articles" and started clicking on random links in the text. Then I cautiously posted a few comments in the Open Thread -- some got downvotes, some got upvotes -- and kept reading...


Beyond Statistics 101

by JonahS · 2 min read · 26th Jun 2015 · 132 comments

Is statistics beyond introductory statistics important for general reasoning?

Ideas such as regression to the mean, the fact that correlation does not imply causation, and the base rate fallacy are very important for reasoning about the world in general. One gets these from a deep understanding of statistics 101 and the basics of the Bayesian statistical paradigm. Up until one year ago, I was under the impression that more advanced statistics is technical elaboration that doesn't offer major additional insights into thinking about the world in general.
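
Before getting to advanced statistics, here is a quick worked example of the last of those basic ideas, the base rate fallacy, as a minimal Python sketch (the prevalence and test accuracy figures are invented purely for illustration):

```python
# Hypothetical numbers, chosen only for illustration: a condition with a
# 1% base rate, and a test that is 90% sensitive and 90% specific.
base_rate = 0.01      # P(condition)
sensitivity = 0.90    # P(test positive | condition)
specificity = 0.90    # P(test negative | no condition)

# Bayes' rule: P(condition | positive) =
#   P(positive | condition) * P(condition) / P(positive)
p_positive = sensitivity * base_rate + (1 - specificity) * (1 - base_rate)
p_condition_given_positive = sensitivity * base_rate / p_positive

print(f"P(condition | positive test) = {p_condition_given_positive:.1%}")
# Prints about 8.3% -- far below the ~90% that base-rate-neglecting
# intuition suggests.
```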

Nothing could be further from the truth: ideas from advanced statistics are essential for reasoning about the world, even on a day-to-day level. In hindsight my prior belief seems very naive – as far as I can tell, my only reason for holding it was that I hadn't heard anyone say otherwise. But I hadn't actually looked at advanced statistics to see whether or not my impression was justified :D.

Since then, I've learned some advanced statistics and machine learning, and the ideas that I've learned have radically altered my worldview. The "official" prerequisites for this material are calculus, multivariable differential calculus, and linear algebra. But one doesn't actually need detailed knowledge of these to understand ideas from advanced statistics well enough to benefit from them. The problem is pedagogical: I need to figure out how to communicate them in an accessible way.

Advanced statistics enables one to reach nonobvious conclusions

To give a bird's eye view of the perspective that I've arrived at: in practice, the ideas from "basic" statistics are useful primarily for disproving hypotheses. This pushes in the direction of a state of radical agnosticism: the idea that one can't really know anything for sure about lots of important questions. More advanced statistics enables one to become justifiably confident in nonobvious conclusions, often even in the absence of formal evidence from standard scientific practice.

IQ research and PCA as a case study

In the early 20th century, the psychologist and statistician Charles Spearman discovered the g-factor, which is what IQ tests are designed to measure. The g-factor is one of the most powerful constructs to have come out of psychology research. Many factors played a role in enabling Bill Gates' ability to save perhaps millions of lives, but one of the most salient is that his IQ was in the top ~1% of his class at Harvard. IQ research helped the Gates Foundation recognize iodine supplementation as a nutritional intervention that would improve socioeconomic prospects for children in the developing world.

The work of Spearman and his successors on IQ constitutes one of the pinnacles of achievement in the social sciences. But while Spearman's discovery of IQ was a great discovery, it wasn't his greatest. His greatest discovery was about how to do social science research: he pioneered the use of factor analysis, a close relative of principal component analysis (PCA).

The philosophy of dimensionality reduction

PCA is a dimensionality reduction method. Real-world data often has the surprising property of being effectively low-dimensional: a small number of latent variables explain a large fraction of the variance in the data.

This is related to the effectiveness of Occam's razor: it turns out to be possible to describe a surprisingly large amount of what we see around us in terms of a small number of variables. Only, the variables that explain a lot usually aren't the ones that are immediately visible; instead they're hidden from us, and in order to model reality we need to discover them, which is the function that PCA serves. The small number of variables that drive a large fraction of the variance in data can be thought of as a sort of "backbone" of the data. That enables one to understand the data at a "macro / big picture / structural" level.
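
To make this concrete, here is a minimal sketch in Python (assuming numpy and scikit-learn are available; the code and numbers are illustrative, not from the post). It generates data from a small number of hidden variables and lets PCA rediscover that structure:

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)

# Synthetic data: 1000 observations of 10 visible variables that are
# secretly driven by just 2 hidden latent variables, plus a little noise.
n_samples, n_visible, n_latent = 1000, 10, 2
latent = rng.normal(size=(n_samples, n_latent))      # the hidden "backbone"
loadings = rng.normal(size=(n_latent, n_visible))    # how the latents show up
X = latent @ loadings + 0.1 * rng.normal(size=(n_samples, n_visible))

pca = PCA()
pca.fit(X)

# The first two components should account for almost all of the variance,
# even though PCA was never told that only two latent variables exist.
print(np.round(pca.explained_variance_ratio_, 3))
print("Variance explained by the first 2 components:",
      round(float(pca.explained_variance_ratio_[:2].sum()), 3))
```

In this construction the first two explained-variance ratios sum to nearly 1, which is the quantitative sense in which a handful of latent variables can serve as the "backbone" of high-dimensional data.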

This is a long story that will take time to flesh out, and doing so is one of my main goals.
