IMO the main concept to deeply understand when studying information theory is the notion of information content/self-information/Shannon information. Most other things seem to be applications of or expansions on this concept. For example, entropy is just the expected information content when sampling from a distribution. Mutual information is the information content shared between two random variables. KL divergence is the extra information content you pay per sample when your encoding is optimized for a different distribution than the one you're actually sampling from. Information gain is how much the expected information content (entropy) drops after you draw a sample.
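
To make those relationships concrete, here's a minimal Python sketch (my own illustration, not from any of the sources linked below; the coin distributions and function names are assumed for the example) showing how each quantity reduces to self-information, -log2 p(x):

```python
import math

def self_information(p: float) -> float:
    """Information content of an outcome with probability p, in bits."""
    return -math.log2(p)

def entropy(dist: dict) -> float:
    """Expected self-information when sampling from dist."""
    return sum(p * self_information(p) for p in dist.values() if p > 0)

def kl_divergence(p: dict, q: dict) -> float:
    """Extra bits per sample paid for encoding draws from p with a code
    optimized for q instead of p: sum of p(x) * log2(p(x)/q(x))."""
    return sum(pp * (self_information(q[x]) - self_information(pp))
               for x, pp in p.items() if pp > 0)

def mutual_information(joint: dict) -> float:
    """Bits shared by X and Y; joint maps (x, y) -> p(x, y).
    Equals KL(joint || product of the marginals)."""
    px, py = {}, {}
    for (x, y), p in joint.items():
        px[x] = px.get(x, 0) + p
        py[y] = py.get(y, 0) + p
    return sum(p * math.log2(p / (px[x] * py[y]))
               for (x, y), p in joint.items() if p > 0)

fair = {"H": 0.5, "T": 0.5}
biased = {"H": 0.9, "T": 0.1}

print(entropy(fair))                # 1.0 bit: a fair coin flip
print(entropy(biased))              # ~0.469 bits: less surprising on average
print(kl_divergence(fair, biased))  # ~0.737 bits wasted per flip by the wrong code
```

Note how entropy, KL divergence, and mutual information are all just averages of (differences of) self-information terms, which is the point above.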
For this I would recommend this essay written by me. I would also recommend Terence Tao's post on internet anonymity, or, if you've seen Death Note, Gwern's post on the mistakes of Light. There's also this video on KL divergence, and this video by the Intelligent Systems Lab.
The biggest problem I've had with spaced repetition and mind mapping is that it's very difficult and time-consuming to represent non-trivial information in a way that won't fatigue you over time (both in the creation and in the re-studying). In my experience they're both skills you have to spend a lot of time on before they become time- and energy-efficient, and often it's a better use of your time to just read more and think more.
I think SRS especially is a crazy good learning hack, and it's a curious question why such seemingly low-hanging fruit hasn't been picked by more people. I think one large reason is that using SRS comes...