# The central limit theorems

There are multiple versions of the central limit theorem. One common version says that, as *n* goes to infinity, the distribution of sample means looks like a Gaussian distribution. People in industry sometimes ignore the "goes to infinity" part and assume that if *n* is large, that's good enough. But is that true? When is it safe to assume that your means have converged? I heard of a case recently where the distribution of means was assumed to have converged, hadn't, and the application it was a part of suffered accuracy problems for months without anyone noticing. This sequence will explore: under what conditions does the central limit theorem "fail" like this?
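As a minimal sketch of the problem (my own illustration, not from the case described above): draw *n* samples from a heavily skewed distribution, take their mean, repeat many times, and check whether the resulting distribution of means looks symmetric yet. The parent distribution (lognormal) and the choice of *n* = 30 are assumptions for illustration only.

```python
import random
import statistics

def sample_mean_distribution(draw, n, trials, seed=0):
    """Simulate the distribution of the mean of n i.i.d. draws.

    draw: a function taking an RNG and returning one random sample.
    Returns a list of `trials` sample means.
    """
    rng = random.Random(seed)
    return [statistics.fmean(draw(rng) for _ in range(n))
            for _ in range(trials)]

# A heavily skewed parent distribution: lognormal with sigma = 2.
# (Chosen for illustration; real workloads have their own shapes.)
skewed = lambda rng: rng.lognormvariate(0, 2)

means = sample_mean_distribution(skewed, n=30, trials=10_000)

# If the CLT had effectively "kicked in" at n = 30, the distribution of
# means would be roughly symmetric, so its mean and median would nearly
# coincide. For a skewed parent they can still differ noticeably.
print(f"mean of the sample means:   {statistics.fmean(means):.3f}")
print(f"median of the sample means: {statistics.median(means):.3f}")
```

Running this, the mean of the sample means sits visibly above their median, i.e. the distribution of means is still right-skewed at *n* = 30 rather than Gaussian.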