I will soon be teaching a "critical thinking" class for undergraduates. Feel free to mentally replace "critical thinking" with "epistemic virtue". I would appreciate answers to any of these questions:

What would you do if you only had one three hour class period to teach a group of 15-30 undergrads "critical thinking"?

If you know me (Ronny Fernandez), what one educational objective would you give me to come up with a plan to achieve, knowing it will not be my only one?

How would you, given three hours or less, make it so that 15-30 students are all absolutely sure of something, and then realize they were all wrong, without them feeling like you had cheated or done anything shady?

What are the most transferable rationality skills that are abundant in our community and rare elsewhere and how do you transfer them in a classroom setting?

How do you teach the things in the vicinity of scout mindset vs. warrior mindset, arguments as soldiers, and politics is the mind-killer, without making your students temporarily dumber by giving them a fully general excuse not to engage with any disagreement they feel like ignoring?

How do you teach the difference between the kind of cognition you use to figure out how to get to your friend's house and the kind of cognition Malfoy used in HPMOR to think about the heritability of magic?

How do you teach the virtue of lightness, the virtue of curiosity?

How do you teach fallacies and cognitive biases without making your students temporarily dumber by giving them a fully general excuse to disregard any position or thinker they disagree with?


If you have a three hour one-shot, then I would be strongly inclined to focus on as few things as possible, while still pointing to the shape of the skillset. I expect the central challenge will be getting them to internalize the idea that different ways of thinking even exist. I would break it down into three things, which can each occupy ~1 hour:

  • Demonstrate how the way they think right now is wrong, via a cognitive bias.
  • Demonstrate a specific technique for overcoming that bias.
  • Argue that reality in fact has joints, and they can in fact be cleaved. Preferably with examples.

I expect the best results will come from hammering on the notion that thinking is a thing you can do on purpose, doing it by reflex leads to predictably wrong answers in a lot of cases, and doing it better is possible. If the choice of bias and technique for overcoming it is something they can immediately apply, so much the better.

Not sure I understand the "reality has joints that can be cleaved" thing, but it sounds like a possibly valuable framing.

Do you mean that reality can be broken down into different gears and one can find out how the gears interact?

Would an illustration of this be a look at how humans, on a biological level, could be described as "selfish-gene"-style driven and, possibly, on a mental level modeled as multi-agent minds?

It's a reference to some posts in Rationality: A-Z [https://www.lesswrong.com/posts/82eMd5KLiJ5Z6rTrr/superexponential-conceptspace-and-simple-words]

There used to be a set of Walter Lewin's physics 101 lectures on MIT OpenCourseWare; they're probably still floating around on YouTube somewhere. In the very first lecture, he explained that his grandmother used to argue people were taller lying down than standing up - y'know, because there's less weight compacting them when lying down. And of course this is completely ridiculous, but he does the experiment anyway: carefully measures the height of a student lying down, then standing up. On the surface, he's using this to illustrate the importance of tracking measurement uncertainty, but the ultimate message is epistemic: it turns out people are a bit shorter standing up.

He talks about this as an example of why we need to carefully quantify uncertainty, but it's a great example for epistemological hygiene more generally. It sounds like something ridiculous and low-status to believe, a "crazy old people" sort of thing, but it's not really that implausible on its own merits - and it turns out to be true.

Anyway, besides that one example, I'd say it's generally easy to make people believe something quickly just by insinuating that the alternative hypothesis is somehow low-status, something which only weird people believe. Heck, whole scientific fields have fallen for that sort of trick for decades at a time - behaviorism, frequentism, Copenhagen interpretation... Students will likely be even more prone to it, since they're trained to tie epistemics to status: "truth" in school is whatever gets you a gold star when you repeat it back to the teacher.

One of the better skills you can impart to them is internalizing their own fallibility, and not through a one-time exercise that leaves them feeling it was all a trick that does not apply to their day-to-day life. Share a personal reflection on how you came to realize your own limits, and how you grappled with being overconfident in something you believed.

Maybe have them describe, to the class or in small groups, a case where they were 100% sure of something and then realized they were wrong, and discuss the reasons they ended up being wrong and what lesson they learned, or didn't learn. Then do another exercise about what they are currently 100% sure of, and why, and which of their implicit assumptions would have to be invalid for this certainty to decrease to something reasonable. Bonus points to those who, after doing the exercise, actually change their mind about something important and dear to them. Extra bonus points to those who conclude that they need to learn more and ask for more information on how to do that. Just off the top of my head, anyway.

the kind of cognition Malfoy used in hpmor to think about whether magic is heritable

Magic is heritable in HPMOR. The only question was whether it's binary (one gene) or continuous (many genes, leading to a spectrum of blood purity). If you want to teach that kind of critical thinking, get ready for the fireworks when your students start asking which abilities in our world are heritable, binary or continuous.

Luckily, I don't know much about genetics. I totally forgot about that; I'll edit the question to reflect it.

To be sure though, did what I mean about the different kinds of cognition come across? I do not actually plan on teaching any genetics.

Yeah, it came across.

After you're done with the class, do you think you could post a summary of what you ended up going with?
