If you look at top chess players, they can tell you all the moves that were played in a huge number of historical chess matches. However, the way they get there is not by putting all the moves through Anki cards or doing anything that would look to a layperson like memorizing. It's more that, for them, going through the moves of a game is like reading a highly memorable story full of drama, where each move carries a lot of meaning. Their brains see so much meaning in the individual moves that the moves are remembered easily.
I think a lot of the time rote memorization is recommended because people don't see a way to order the information in a way that's actually meaningful.
Alternatively, I most often see rote memorization recommended by people studying fields that are inherently somewhat organised.
It's easy to see why Anki might work well for something like memorizing lots of words in kanji, because the work of organising concepts into buckets is already embedded in the kanji and kanji radicals.
It's less obvious to me how you could, for example, learn optimal riichi mahjong play with this type of method; and probably because of that, I've never seen anyone recommend it.
Having a large pool of specific information available for effective recall is a sign of mental health and quite useful. I've noticed that various successful and charismatic commentators appear to have talent in this area. It's possible that, as well as being a sign of health, it buffers brain abilities generally, and that modern recall-augmenting tools will atrophy the native faculty. It seems you can score pretty high on an IQ test as long as you're capable of remembering what words mean, without otherwise being guaranteed to have exceptional long-term memory capacity.
And making only strictly true (i.e. p>0.99) factual claims, or signaling appropriately when your p(true) is low
I would reserve lowercase p for the statistical p-value and use capital P here. I got confused for a moment before reading the second clause of the sentence.
I generally agree with having a weakened version of memorization: it is important to be able to (a) reasonably quickly (on the order of 5-100 seconds) retrieve what you have learned and (b) make connections between different concepts or facts, but memorizing exact facts is not that important for pure understanding of the world. It is important for talking to people in real time, though.
Having facts readily available in your brain (not just "Google-able") enables real-time bullshit detection
This doesn't require you to memorize exact facts.
(I think you agree with these two points, but I'd already written them down before reading the whole post, and they still seem like a good supplement to the question of what to remember.)
TLDR: Western education creates a false dichotomy between memorization and understanding. I believe we should expect both. Having facts readily available in your brain (not just "Google-able") enables real-time bullshit detection, helps you calibrate who to trust, holds your own beliefs accountable, and provides the raw material for insight and critical thought. I offer some concrete suggestions (spaced repetition via Anki, tracking unfamiliar terms, connecting new facts to existing knowledge, etc.). Rationalists need to be careful not to focus purely on epistemics. We also need lots of knowledge. There's no way around memorization.
I believe memorization is unfairly maligned. It is on the shortlist of things I think are required for becoming a rational intellectual. Besides curiosity, these things are:
Good epistemics: a reliable process for obtaining, vetting, and updating your knowledge. How do you know a claim is true? That a study is well-designed? That an observation licenses a general induction? You need to recognize and avoid fallacies and cognitive biases, understand the probabilistic nature of knowledge, follow complicated chains of reasoning, responsibly evaluate both qualitative and quantitative evidence, etc.
Good knowledge. You need a wide range of properly-vetted, high-confidence information readily available in your mind. This includes brute facts (When was the Song Dynasty? What is silicate weathering?) and contested knowledge (Why did the Song Dynasty collapse? Will silicate weathering slow with climate change?). The key phrase here is “readily available”—these are not facts you could understand if you looked them up, but knowledge actually present in your brain. These are facts available to be thought with, not merely comprehended.
Intelligence. You can have excellent knowledge and rigorous epistemics but lack the ability to do anything interesting with them. You need the spark that connects disparate ideas, sees patterns, generates novel solutions. Creativity, insight, synthesis.
"Being intelligent" as an ingredient for being a good intellectual is so obvious that it’s almost trivial. Similarly, in the culture I grew up in (and on a blog about rationality...), good epistemics needs no theoretical defense as part of education. Every institution I’ve ever attended emphasized "fostering critical thinking" as its central goal. They may not have taught epistemics particularly well (or at all), but at least it was valorized. I understand this isn't universal—friends from Latin America, India, and elsewhere tell me large parts of their education was based on pure rote memorization, and critical thinking was sometimes even actively discouraged. Obviously, this is bad. If I’d been educated in one of those systems, this essay would probably be titled “In Defense of Critical Thinking."
But I wasn't educated in Latin America, India, or elsewhere. I was educated in (wealthy) schools in America and the UK. There, "good knowledge" — the actual retention of factual information — is surprisingly neglected as an ingredient of education.
This sounds counterintuitive. What teacher would claim that knowledge acquisition is unimportant? But if you call “the acquisition of knowledge that you retain and can readily access” by its pithier title, “memorization,” the discussion immediately becomes more contentious. How often have teachers told you, “I don’t need you to memorize this material, I just want you to understand it”?
In the US and UK, at least, “memorization” has become synonymous with the worst kind of rote learning: students committing tracts of information to memory without understanding how to use those facts. Memorizing synopses of Shakespeare without reading the plays, reciting Gauss’s law without understanding electromagnetism, etc. I agree this is bad education, and that it happens constantly. In fact, when people bring up this critique of memorization, they're often surprised by the extent to which I immediately agree with them. I often reference the classic essay from the Sequences, “Guessing the Teacher’s Password,” about how much of schooling, even in the West, is essentially an elaborate memorization ritual in which students guess what the teacher wants to hear (“light is both a wave and a particle!”) without truly understanding what they’re saying.
I would much prefer medical students spend more time developing clinical reasoning, learning bedside manner, and understanding how healthcare systems actually work, rather than memorizing minutiae about the nephron, especially if they have no intention of becoming nephrologists.
But people use this common failure to construct a false dichotomy between memorization and understanding. They believe rote memorization is necessarily the enemy of critical thought, that anyone memorizing large amounts of information is no better than a robot, or a fool who hasn’t realized that any fact is just a Google search away. Why memorize the capitals of the world when you carry the sum of human knowledge in your pocket?
I think the critics of memorization have gone too far. I think we should have a much greater expectation, both in school and of ourselves, to actually know things. To have facts in our brains, not just opinions. Here are a few reasons why.
Memorized facts let you detect bullshit in real time
I was in a lecture last October by an Oxford professor, a biologist who specializes in occupancy modeling. Essentially, he uses math and camera trap data to create spatial models that predict where certain animals are likely to be. He was discussing the odds that a large carnivore in Southeast Asia would soon go extinct, when he claimed that “urban sprawl is a primary driver of land-use change around the world.”
This sounds plausible. We hear about urban sprawl constantly: Los Angeles, London, Chongqing, all sprawling endlessly. What used to be biodiverse California coastline or good old English bog has become Whole Foods and Tescos. This Oxford professor, a world-leading expert on the question “where are the animals?”, was making this fairly basic claim to a room of other Oxford professors. Surely it must be true.
Here are the actual numbers: 1–3% of Earth’s land surface is human settlement. 11% is crop agriculture. Around 26% is livestock grazing. The claim that urban sprawl is a “primary” cause of land-use change is pretty hard to make when all current human settlements account for roughly 2% of land use.
These facts are easy to look up. You could verify them in 30 seconds on your phone. But in the middle of a lecture, you can’t pause to think “hmm, what statistics would confirm or undermine this claim?” and then spend 30 seconds Googling them while missing the next point. It’s not just the time to physically look something up, it’s the mental energy of identifying what to search for.
If you don't know that total human settlement occupies an order of magnitude less land than multiple other land-use categories, the professor's claim about urban sprawl sounds perfectly reasonable. And that fact doesn't even definitively disprove the statement. I think most of us can imagine this Oxford professor retreating to the semantics of the word "primary" when challenged by something as inconvenient as actual facts. "Well, by my new definition of 'primary' that I've just invented, I'm allowed to say whatever I want, regardless of the facts of the matter. Also, stop being pedantic!"
But a passing familiarity with the dangers of semantics will inoculate you against this evasion. And having just a few facts about actual land use in your head lets you hear alarm bells and start asking tough follow-up questions. Without an arsenal of facts to protect you, you’re at the mercy of any effective rhetor with a plausible-sounding claim, regardless of whether or not it’s true.
Memorized facts help you calibrate who to trust
Most facts we receive from outside sources. Those land-use statistics? I learned them from an FAO report. How does the FAO know? I have no idea. I assume satellite-based models, which I suppose I could track down, but I have a life to live. You can’t fact-check everything. Often you’re forced to just trust people.
There are useful heuristics when deciding who to trust. You should probably trust UN agencies’ official data. Oxford professors (hopefully) know enough to be accurate in their particular subfield. Someone with extreme political beliefs is probably not a reliable factual source. But these heuristics are imperfect. The UN is sometimes wrong, Oxford professors are often wrong, and some apparently controversial causes are overwhelmingly one-sided when you examine the evidence.
A way around this is having a large corpus of properly-vetted, high-confidence information already in your brain that you can compare against people’s claims. When someone says something false, or something seemingly contradicted by a fact you know to be true, you can ask follow-up questions immediately. If their responses fail to convince you, you can start attaching doubt to their other claims. In extreme cases, you can simply discard their authority altogether. If an Oxford professor throws lots of facts at you and two or three are incorrect or dubious, you know they’re a less reliable source.
And making only strictly true (i.e. p>0.99) factual claims, or signaling appropriately when your p(true) is low, is way harder than it sounds. Most people, even those arguing in good faith, fail to clear that bar. So if the facts in your brain are actually properly vetted and high-confidence, you have a useful filter. When someone says something counterintuitive or contrary to your priors, you can check: are they only making factually true claims, as far as you can tell? If so, it might be worth taking them more seriously, maybe even investigating their argument in good faith. Facts don't only tell you who to distrust; they also offer clues about who deserves special consideration.
As a final note, educated people like Oxford professors almost never say things which would be obviously false to an average college-educated person, and usually don’t say things that are obviously false to a member of their own field. You’ll need facts slightly off the beaten path to catch errors. But not that off the beaten path. It’s shocking how few people have basic statistics, dates, or history readily available in their brains. A few memorized facts go a long way toward recognizing who has real epistemic authority.
The more facts you remember, the easier remembering becomes
There’s a famous psychology study from the 1970s (Bransford and Johnson, 1972). Half the participants were given a deliberately vague paragraph without a title and asked to remember as much as possible. It begins, “The procedure is actually quite simple. First you arrange things into different groups...” and continues in the same opaque vein.
Most people in this group did poorly. The other group, who were given the title “Washing Clothes,” did much better.
Having a schema on which to hang information makes it significantly easier to retain. This happens for two reasons. First, it helps organize information in your brain, making it easier to remember. Second, the more connected a piece of information is to something you already know, the easier it is to recall later. If someone tells you the Mongols conquered the Jin dynasty in northern China in the 13th century, you might forget within a week. But if you also know the Mongols invaded Eastern Europe and reached Hungary in the same period, it’s much easier to remember what they were up to in East Asia around the same time.
Information begets information. If you already have lots of facts in your brain, a new fact will have plenty of niches to fit into. If someone mentions that Hamnet was directed by Chloé Zhao, it’s much easier to remember if you already know who Chloé Zhao is. Fact 1 (Chloé Zhao is a director) is necessary to remember Fact 2 (Hamnet was directed by Chloé Zhao). In a week, someone who already knew Fact 1 will probably still remember Fact 2. Someone who didn’t will have forgotten. The more you already know, the easier it is to learn more.
I think this partly explains why some people seem vastly more knowledgeable than others. There’s a cluster of people way off the scale who are total steel traps: remembering random facts from years ago, recalling them instantly, possessing an astounding quantity of general knowledge. I’m sure this comes from multiple things (high curiosity, better-than-average memory, etc.), but I suspect one underrated factor is a kind of threshold effect: once you reach a certain level of knowledge in a particular field, it becomes significantly easier to retain and process new knowledge in it.
Memorized facts help you hold your own beliefs accountable
If you pay attention to most people’s arguments, especially extemporaneous oral arguments, they usually have literally no supporting evidence that isn’t anecdotal. Occasionally someone trots out a lone pet statistic that they keep in their back pocket and deploy whenever the topic arises, but otherwise their opinions, even their cherished beliefs, are held together mostly by vibe.
This is true of almost everyone, including me and probably you. Test it: think of a cherished belief. Something contentious, like the question “Are immigrants dangerous?” You almost certainly have a strong opinion about that topic that you’re confident is right. If you had to argue for your position, how many actual facts would you have? Note that the phrase “studies show...” followed by a vague conclusion gets partial credit at best. What studies? What exactly did they show? Who carried them out? Why do you trust them over contradictory studies? Did you actually read those studies, or did you hear someone else say that studies showed whatever your position is? Why do you trust that person?
If you’re honest, you’ll probably find your argument disturbingly devoid of evidence.
But if you’re right, then your opinions are supported by facts. They’re only a Google search away! It’s not effective to angrily Google evidence mid-argument, but you can do it right now. And if you memorize that information, i.e. actually have it available the next time the question arises, you’ll be able to make an argument supported by real facts, real statistics, real studies you can name. (Note: beware confirmation bias here!)
More importantly, if you make it a habit to know the information supporting your beliefs, rather than relying on the idea that you could look it up if you had to, it becomes obvious when you don’t actually have any support for what you’re saying.
I had a belief that as a vegan, I didn’t need B12 supplements. I argued about this constantly with my mother, who insisted I did. Eventually, I started noticing that I had no facts to support my position. My general points boiled down to “vitamins are a scam!” and “The Jains were vegan way before B12 supplements existed!” The first claim is an opinion, not a fact, and the second claim, while true, is completely insufficient to conclude anything about whether I should take B12 in 2026.
It’s extremely hard to change your mind, especially while arguing with your mother. It took many rehearsals of this debate before the cognitive dissonance of my factlessness got to me and I finally actually looked up facts to support my argument.
Turns out there were none. I should definitely be taking B12.
It seems obvious that you should find facts for your beliefs, but it’s shockingly hard to actually do it. Being accustomed to having facts available when challenged makes it easier to recognize when you have none. And, hopefully, this motivates you to find them. Either you were right all along and can now prove it, or you were wrong and get to take B12 and maybe live longer. Win-win.
Facts are the grist of critical thought
You can’t think critically about something you know nothing about. This sounds obvious, but I’m often surprised by how many people hold strong opinions on technical questions without knowing technical details.
Take the lumper/splitter debate in biology: at what point should a group of organisms be considered separate species? This is ultimately a semantic question. The species concept is a category, and categories can only be useful or not, not true or false. But whether the species concept is useful, and under what conditions, is a genuinely technical conversation. It requires knowledge of statistical mechanisms of evolution, horizontal gene transfer, gene flow, population dynamics, biogeography. If you don’t actually remember what Hardy-Weinberg equilibrium is and when it applies, you can’t even begin to evaluate statistical evolution arguments about where species boundaries should fall.
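For reference, since it is doing real work in this example: Hardy-Weinberg equilibrium says that in an idealized population (random mating; no selection, mutation, migration, or drift), allele frequencies $p$ and $q = 1 - p$ settle into stable genotype frequencies

$$p^2 + 2pq + q^2 = 1$$

so observed deviations from those proportions are evidence that one of the idealizing assumptions is violated. That is the kind of detail you have to actually remember for the species arguments to be evaluable at all.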
You need knowledge to have something to think about.
This is how insights happen. Darwin’s Origin of Species marshals an enormous range of facts about natural history, biogeography, embryology, artificial selection, the fossil record, and more. The insight of natural selection clearly emerged from thinking across all of these facts simultaneously. The same pattern holds for anyone pushing a field forward, both scientific and artistic: Wegener synthesized geology, paleontology, and climatology to argue for continental drift; Eliot pulled together obscene amounts of myth, history, language, and literature in The Waste Land. These weren’t people reasoning from first principles. They had vast stores of memorized knowledge, making connections no one else could see because no one else had all the pieces loaded into working memory at once.
Memorization is the foundation of creativity and insight, not its enemy.
What to do about it
If memorization matters, how do we actually do it? Some suggestions.
Be honest about your goals.
The goal of learning isn’t always to memorize. It would be absurd to demand you remember every detail of every novel you read. But you should be clear about what you’re trying to get out of any given learning experience.
If you’re reading a science fiction novel because you believe it will deepen your insight into human nature, or the relationship between technology and society, that’s totally fine. It’s probably not that important to actually remember the plot. Accept that in a year or two you’ll have forgotten almost everything but the broadest outline, and move on.
But if your goal is also to remember the plot and characters, be honest about that and put a system in place to do so.
The most important application of this principle is recognizing when you’re wasting your time. If you’re sitting through an hour-long lecture on Old English, ask yourself “what is my goal here?” If it’s to actually obtain knowledge about Anglo-Saxon vocabulary, history, and grammar, next ask yourself, “how many facts am I actually learning and retaining?” If you think you’re walking away with three facts, all of which you’re likely to forget by next month, you might want to find a more efficient method of learning the information than going to class. A textbook you can annotate and revisit is usually significantly better than a bad lecturer.
Ask yourself after any learning experience: What will I actually remember from this? If the answer is “almost nothing,” consider the possibility that you haven’t learned anything, but have instead just performed learning. If you care about retaining the information, something needs to change.
Use spaced repetition.
Anki is a free flashcard app that uses spaced repetition. It shows you cards right before you’d forget them, which is the most efficient way to move facts into long-term memory. Cards you know well appear less frequently; cards you struggle with appear more often. As you learn information, the intervals between reviews grow longer. Once you've mastered a deck, you might see individual cards every 5 years, every 11 years, etc. The result is that reviewing a large body of information eventually takes only minutes or seconds a day. I have a deck with all the world's country flags that takes an average of 4 seconds a day to maintain, and I almost never forget any of them.
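To make the mechanics concrete, here is a minimal sketch of the scheduling idea in Python. This is an illustration, not Anki's actual algorithm: the numbers (a starting ease of 2.5, an ease floor of 1.3) are borrowed from the classic SM-2 scheduler that Anki descends from, and the Card class is hypothetical.

```python
from dataclasses import dataclass

@dataclass
class Card:
    interval_days: float = 1.0  # current gap between reviews
    ease: float = 2.5           # multiplier; shrinks when you struggle

def review(card: Card, remembered: bool) -> Card:
    """Update a card's schedule after one review, SM-2 style."""
    if remembered:
        # Successful recall: the interval grows geometrically,
        # e.g. 1 day -> 2.5 -> 6.25 -> 15.6 -> ...
        card.interval_days *= card.ease
    else:
        # Lapse: start over tomorrow, and show this card more often
        # in the future by lowering its ease (floored at 1.3).
        card.interval_days = 1.0
        card.ease = max(1.3, card.ease - 0.2)
    return card

card = Card()
for _ in range(10):
    card = review(card, remembered=True)
print(round(card.interval_days))  # ~9537 days, i.e. decades out
```

Ten successful reviews push the next due date decades into the future, which is why a mature deck, like the flag deck above, ends up costing only seconds a day.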
Here’s one of the best ways I use Anki: whenever I encounter a word, concept, or cultural reference I don’t know, I write it down. Once you start paying attention, you’ll be shocked how often your brain simply edits out unfamiliar terms. They’re everywhere. At the end of each month, I upload these to an Anki deck that I review every day.
This has two benefits beyond the obvious. First, it functions as a frequency-weighted filter. The more common a term I don't know, the more likely I am to encounter it and add it to my deck. Since it's hard to judge the importance of unfamiliar terms, this frequency-weighted approach does the sorting for you. You know you’re likely to come across the terms in the wild because that’s how you chose them in the first place.
Second, tracking what I add each month gives me a rough metric for how much I’m learning and exploring. If I get to the end of a month and I have relatively few terms to add to my New Terms Anki deck, that’s good evidence I’m in a rut. I’m probably consuming mostly familiar media, or talking mostly to people with similar knowledge bases. If I start reading a book or article that is dense with unfamiliar vocabulary, this signals I’ve found someone who swims in unfamiliar intellectual waters. This is a good sign I will learn a lot if I keep reading. It’s unfamiliar intellectual territory where the most valuable new ideas often live.
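If you keep the running list as a plain text file, the monthly upload can be a few lines of scripting. Here is a sketch, assuming a hypothetical new_terms.txt with one tab-separated term/definition pair per line; Anki's importer (File > Import) accepts tab-separated text files directly.

```python
import csv
from pathlib import Path

SRC = Path("new_terms.txt")         # hypothetical: "term<TAB>definition" per line
OUT = Path("new_terms_import.tsv")  # ready for Anki's File > Import dialog

rows = []
for line in SRC.read_text(encoding="utf-8").splitlines():
    if not line.strip():
        continue  # skip blank lines
    term, _, definition = line.partition("\t")
    rows.append((term.strip(), definition.strip() or "TODO: look this up"))

# Write front/back pairs as tab-separated values.
with OUT.open("w", newline="", encoding="utf-8") as f:
    csv.writer(f, delimiter="\t").writerows(rows)

print(f"Wrote {len(rows)} cards to {OUT}")
```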
Write things down.
You’re not going to remember it otherwise. Have a notebook, an app on your phone, scrap paper in your pocket, anything. When a piece of information you want to remember enters your short-term memory, write it down immediately. Then have a system (Anki, Zettelkasten, etc.) for moving it into long-term memory.
Connect new facts to existing knowledge.
Remember the “Washing Clothes” study: information sticks when it attaches to a schema. When you learn something new, consciously ask where it fits in your existing knowledge. What does it relate to? What does it contradict? The more connections you build, the more durable the memory, and the more likely you are to recall it when it’s relevant.
This is also why breadth of knowledge feeds on itself. The more you know, the more hooks you have for new information to attach to. Reaching a critical mass in any domain makes further learning in that domain significantly easier.
Brute memorize useful frameworks.
Want to learn about geopolitics? Memorize the world map. Want to learn about chemistry? Memorize the periodic table. Want to learn about the history of England? Memorize the monarchs in order.
If having a fundamental schema makes remembering everything else easier, you should invest in learning that schema as soon as possible. Often there’s no way around brute memorization. After you have the framework, you can start learning and sorting facts into their places one by one.
Choose what to memorize with care.
The critics of memorization are right that facts are useless without comprehension or context. They are wrong to identify memorization itself as the problem, but we would be equally wrong not to recognize the danger they are rightly pointing to.
Memorize with intention. If you want to learn about the Roman Empire, memorizing the emperors in order is probably a great place to start. But if you insist on memorizing all thirty minor emperors of the Crisis of the Third Century, you’re probably just memorizing for memorizing’s sake. It’s useful to know most of the world capitals, but even if your main interest is geopolitics, it’s still probably trivia to know that Alofi is the capital of Niue. There’s nothing wrong with trivia if that’s what you’re into; just make sure to be honest with yourself.
Only memorize information that is genuinely useful for your goals.