The notion of intelligence as compression is an old one; I believe Marcus Hutter was the first to formalize it, back in the early 2000s (this is also where AIXI comes from). The problem with Hutter's formalism is that his measure of compressibility (Kolmogorov complexity) is uncomputable: "find the shortest Turing machine that outputs X" requires unbounded resources, even if you have a halting oracle.
I believe that, in this paradigm, the NAH (natural abstraction hypothesis) is fundamentally saying: well, for "natural" data, compressibility *is* computable; there's some minimal representation to which any sufficiently powerful (yet still finite) model will converge. The problem, therefore, is to figure out what a sufficiently powerful model looks like.
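To make the "computable compressibility" idea concrete, a standard stand-in for Kolmogorov complexity is the length of a string after running it through an off-the-shelf compressor (the same trick behind normalized compression distance). This is my own illustrative sketch, not anything from Hutter's or the NAH's formalism:

```python
import os
import zlib

def compressed_size(data: bytes, level: int = 9) -> int:
    """Length of the zlib-compressed form of `data`: a crude,
    computable upper bound on its Kolmogorov complexity."""
    return len(zlib.compress(data, level))

# Highly regular "natural" data compresses to a tiny representation...
structured = b"abab" * 256          # 1024 bytes of pattern
# ...while incompressible noise does not shrink at all.
random_ish = os.urandom(1024)       # 1024 bytes of noise

print(compressed_size(structured))  # far below 1024
print(compressed_size(random_ish))  # around 1024 or slightly above
```

The gap between the two numbers is exactly the sense in which "natural" (structured) data has a short representation that a finite model can actually find, while the true shortest-program length stays out of reach.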
Predictions over the course of my reading:
- Guessed "dath ilan" from the title.
- Switched prediction to "Amenta" at the mention of red hair.
- Switched back to "dath ilan" (with much higher certainty) at the first use of capital-C "Civilization".
While I like his idea, it's still not computable.