Levi Finkelstein
Interested in AI, math, philosophy, psychology. Website: http://levifinkelstein.xyz/
IMO the main concept to deeply understand when studying information theory is the notion of information content/self-information/Shannon information. Most other things seem to be applications of or expansions on this concept. For example, entropy is just the expected information content when sampling from a distribution. Mutual information is the shared information content of two distributions. KL divergence describes how much information content you're getting relative to your choice of encoding. Information gain is the difference in information content after...
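As a minimal sketch of how the other quantities reduce to self-information (my own illustration, not from the comment): entropy is the expectation of self-information, and KL divergence is the expected extra cost of using a code built for the wrong distribution.

```python
import math

def self_information(p: float) -> float:
    """Information content of an outcome with probability p, in bits."""
    return -math.log2(p)

def entropy(dist: list[float]) -> float:
    """Expected self-information when sampling from dist."""
    return sum(p * self_information(p) for p in dist if p > 0)

def kl_divergence(p: list[float], q: list[float]) -> float:
    """Average extra bits paid when encoding samples from p
    with a code optimized for q."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# A fair coin flip carries exactly 1 bit per outcome, so its entropy is 1 bit.
print(self_information(0.5))   # 1.0
print(entropy([0.5, 0.5]))     # 1.0

# Encoding a biased coin with the fair-coin code wastes ~0.53 bits per flip.
print(kl_divergence([0.9, 0.1], [0.5, 0.5]))
```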
The biggest problem I've had with spaced repetition and mind mapping is that it's very difficult and time-consuming to represent non-trivial information in a way that doesn't fatigue you over time (both in creating the material and in re-studying it). In my experience they're both skills you have to spend a lot of time on before they become time- and energy-efficient, and often it's a better use of your time to just read more and think more.
I think SRS especially is a crazy good learning hack, and it's a curious question why such seemingly low-hanging fruit hasn't been picked...