Two More Things to Unlearn from School

In Three Things to Unlearn from School, Ben Casnocha cites Bill Bullard's list of three bad habits of thought: Attaching importance to personal opinions, solving given problems, and earning the approval of others. Bullard's proposed alternatives don't look very good to me, but Bullard has surely identified some important problems.

I can think of other school-inculcated bad habits of thought, too many to list, but I'll name two of my least favorite.

I suspect the most dangerous habit of thought taught in schools is that even if you don't really understand something, you should parrot it back anyway. One of the most fundamental life skills is realizing when you are confused, and school actively destroys this ability: it teaches students that they "understand" when they can successfully answer questions on an exam, which is very, very far from absorbing the knowledge and making it a part of you. Students learn the habit that eating consists of putting food into the mouth; the exams can't test for chewing or swallowing, and so they starve.

Much of this problem may come from needing to take three 4-credit courses per quarter, with a textbook chapter plus homework to be done every week. The courses are timed for frantic memorization; it's not possible to deeply chew over and leisurely digest knowledge in the same period. College students aren't allowed to be confused; if they started saying, "Wait, do I really understand this? Maybe I'd better spend a few days looking up related papers, or consult another textbook," they'd fail all the courses they took that quarter. A month later they would understand the material far better and remember it much longer - but one month after finals is too late; it counts for nothing in the lunatic university utility function.

Many students who have gone through this process no longer even realize when something confuses them, or notice gaps in their understanding. They have been trained out of pausing to think.

I recall reading, though I can't remember where, that physicists in some country were more likely to become extreme religious fanatics. This confused me, until the author suggested that physics students are presented with a received truth that is actually correct, from which they learn the habit of trusting authority.

It may be dangerous to present people with a giant mass of authoritative knowledge, especially if it is actually true. It may damage their skepticism.

So what could you do? Teach students the history of physics, how each idea was replaced in turn by a new correct one? "Here's the old idea, here's the new idea, here's the experiment - the new idea wins!" Repeat this lesson ten times and what is the habit of thought learned? "New ideas always win; every new idea in physics turns out to be correct." You still haven't taught any critical thinking, because you only showed them history as seen with perfect hindsight. You've taught them the habit that distinguishing true ideas from false ones is perfectly clear-cut and straightforward, so if a shiny new idea has anything to recommend it, it's probably true.

Maybe it would be possible to teach the history of physics from a historically realistic point of view, without benefit of hindsight: show students the different alternatives that were considered historically plausible, re-enact the historical disagreements and debates.

Maybe you could avoid handing students knowledge on a silver platter: show students different versions of physics equations that looked plausible, and ask them to figure out which was the correct one, or invent experiments that would distinguish between alternatives. This wouldn't be as challenging as needing to notice anomalies without hints and invent alternatives from scratch, but it would be a vast improvement over memorizing a received authority.

Then, perhaps, you could teach the habit of thought: "The ideas of received authority are often imperfect but it takes a great effort to find a new idea that is better. Most possible changes are for the worse, even though every improvement is necessarily a change."


In school, there are right answers. Copying from a reference work with known solutions is forbidden. Copying from someone else is forbidden. Asking someone who knows is forbidden. Working with others is often forbidden. Testing out the answer against reality is forbidden or impractical. You are expected to find the right answer by rooting around in your own head.

It would be difficult to find more crippling and maladaptive habits to instill in a mind that wanted to deal with reality.

Testing out the answer against reality is forbidden or impractical.

Not to mention counterproductive. You don't want right answers, you want 'Right' answers. Reality is far too narrow minded to be a good authority-figure-satisfier.

There goes my foolishness again. When am I going to get with the program? I'll try that last sentence again.

You are expected to find the right answer by rooting around in your own head for what the teacher said.

Better?

The pendulum seems to swing between the extremes of teaching that there are always right answers and that there are never right answers. It also varies from place to place.

The swing of that pendulum always misses what's important in the real world - getting a better answer. Or, for some shameless pandering, getting lesswrong.

It is obvious to most teachers, and to many students, that school tests and rewards are often quite at odds with the usual stated purposes of school. It often seems like there are other ways we could teach and test that would be more in line with those stated purposes. You seem to be suggesting such alternatives.

But I think we have to take very seriously the fact that schools have long had the option to switch, and have chosen not to. I conclude that the real purpose of school is somewhat different from the stated purpose, and that the things taught are in fact more useful for the real purpose.

Well, this much is clear: The people who run the schools are acting as if schools have a different purpose (like turning out good workers, or impressing bureaucrats). This does not necessarily mean that schools themselves have this other purpose (at least I don't think it means that).

University English Lit departments should be closed down for teaching appalling habits of thought to impressionable young people.

You read the books, and then you pick up elements from them and turn them around a bit until they line up nicely to form a pleasing argument. The more tenuous (sorry, 'sensitive') your reading, the more marks you get. The more 'powerful' the story you weave, the more marks you get. Especially if it chimes with the prevailing intellectual fashions. Extra points also for being subversive or challenging the (straw man) orthodoxy. Looking behind the superficial to decode the deeper truth is, of course, compulsory. Marks deducted for anything as neolithic as thinking literature might teach us anything about the human condition.

Never do you weigh the merits of your chosen interpretation against other available interpretations - in fact the question is nonsensical, because there are no criteria for comparison. There is no analog of testing whether your hypothesis is consistent with facts. Never do you consider how the elements of your 'reading' hold together or relate to the real world - that is to say you can employ any half comprehended 'philosophy' without being held to task if that 'philosophy' is a poor description of reality. Internal logical consistency is not required.

Once you learn the tricks, it is child's play to get a first class degree.

Then you go out into the world and start applying your mental habits to the real world. For the results, see newspaper columnists, novelists and playwrights taking on topics such as economics, politics and foreign policy.

I am aware the above might make me look a bit like a nutjob ... perhaps I just had a particularly unpleasant match with my Eng Lit faculty. But I reckon there's something in it.

Re: "Wait, do I really understand this? Maybe I'd better spend a few days looking up related papers, or consult another textbook," they'd fail all the courses they took that quarter. A month later they would understand the material far better and remember it much longer - but one month after finals is too late; it counts for nothing in the lunatic university utility function.

This line of thought reminded me of Robert Frank's The Economic Naturalist: "When students are given tests designed to probe their knowledge of basic economics six months after taking the course, they do not perform significantly better than others who never took an introductory course. This is scandalous."

I gather the goal of Frank's student assignments is to have them think, even if imperfectly, rather than to parrot well.

I think we should do that testing on a random basis, and only the ones who pass actually get degrees. Retention is everything; fail to retain and you get no degree.

You seem to be under the impression that modern college degrees are mostly certifications of knowledge rather than long, expensive, politically correct intelligence/conscientiousness tests. What gives you that idea?

Depends what your degree is in. I'm studying nursing, and it seems to me that the point of it is to finish with a licence, i.e. a piece of paper proving that you know how to perform a certain range of skills and can do so safely. (Actually, I guess that's the point of the provincial exam...the point of the degree itself is to prepare you for the provincial exam, but just graduating from a nursing program doesn't automatically give you a licence.)

I'm guessing this is true for engineering, too, and any other degree where you're going to end up doing a job that affects people's safety and survival. And I think it's true for a lot of community college programs. I don't know what percentage of all college and university degrees these kind of degrees represent, but it's non-zero.

To a degree, this is true of any technical/science degree. An industry lab would hire someone with a degree in chemistry, but probably not someone with a degree in English. So that does make the degree proof of knowledge, not just conscientiousness.

I recall reading, though I can't remember where, that physicists in some country were more likely to become extreme religious fanatics. This confused me, until the author suggested that physics students are presented with a received truth that is actually correct, from which they learn the habit of trusting authority.

You're probably thinking of the engineering (and hard sciences in general) correlation; see http://www.eetimes.com/news/latest/showArticle.jhtml?articleID=205920319 or http://www.theatlantic.com/magazine/archive/2008/01/primary-sources/6559/2/ and the original paper, http://www.nuff.ox.ac.uk/users/gambetta/Engineers%20of%20Jihad.pdf

Biologists are more likely to be rationalists! (Maybe we should all be learning biology?)

I think I'd phrase it the other way as 'biologists who do not compartmentalize are more likely to be atheists' (since if religious claims are literally true, terrorism actually looks like a pretty good idea).

Possible purposes of school include: 1) babysitting, 2) social mixing, 3) sorting by intelligence and/or conscientiousness, 4) imprinting work habits, 5) learning specific useful skills or knowledge, etc. If you know what general skills tend to be useful in typical office jobs in our economy, you will see the relevance of the typical work habits imprinted and the characteristics sorted for in school.

Considering the number of complaints I hear about recent graduates not being good at work, it's possible that schools aren't doing a good job of preparing people for typical office jobs; after all, there isn't reliable feedback from graduates or employers to the schools.

I suspect the short-run goals (baby-sitting, status enforcement vs. children and teenagers, acquisition of easily checked credentials) are the ones mostly being served.

Considering the number of complaints I hear about recent graduates not being good at work, it's possible that schools aren't doing a good job of preparing people for typical office jobs...

In my experience, schools aren't doing a good job of preparing them for software engineering jobs, either. Most of the candidates I've seen (and I've seen quite a few) run the gamut:

  • Has heard the term "linked list" before (just for example). Doesn't know what it means.
  • Has heard the term "linked list" before. Knows what it means. Doesn't know what it's for.
  • Can answer basic questions about data structures and algorithms. Knows what they're for, in theory. Doesn't know how he'd actually use them.
  • Knows how to use basic data structures and algorithms. Can apply them, but only if he is given a source code file with clearly labeled "YOUR CODE HERE" sections. Doesn't know how to get help, or how to ask for help. If he gets stuck, just sits there, staring forlornly at the screen.
  • Knows how to write programs. Knows how to ask for help directly. Doesn't know how to find help on his own.
  • Knows how to write programs and how to look up answers to questions. Forgets the answers as soon as he looks them up; doesn't know how to correlate them into a general picture. Does not believe that a general picture exists.
  • Knows how to write programs, how to look up answers, how to ask for help, and is able to actually learn on his own.

The last category is actually employable, and makes up maybe 5% of college graduates. The other 95% either lack the basic CS vocabulary needed for learning, or are unable to learn at all, which renders them quite ineffective as far as real-world software engineering is concerned.
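Since "linked list" is the running example in the ladder above, here is roughly what a candidate in the last category should be able to produce and explain from scratch. This is a minimal illustrative sketch in Python; the language choice is mine, not the commenter's:

```python
# A minimal singly linked list: each node holds a value and a
# reference to the next node (or None at the end of the chain).

class Node:
    def __init__(self, value, next=None):
        self.value = value
        self.next = next

def to_list(head):
    """Walk the chain from head and collect values, to show traversal."""
    out = []
    while head is not None:
        out.append(head.value)
        head = head.next
    return out

# Build the chain 1 -> 2 -> 3 and traverse it.
head = Node(1, Node(2, Node(3)))
print(to_list(head))  # [1, 2, 3]
```

The code itself isn't the point; the "knows what it's for" rung means being able to say, e.g., that this structure gives constant-time insertion at the front without shifting any elements, at the cost of linear-time access by index.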

Does this mean that those of us who are close to the bottom category actually have a really good chance of getting software engineering jobs? Or is the job selection process equally defective?

My own job-seeking data is stale, since I do have a job currently. From what I've seen, though, there's always a need for the following two categories of software engineer:

1). Someone with a lot of experience, who can easily pick up (and extend) whatever eclectic patchwork of frameworks and home-grown code exists at the current company, and
2). A smart programmer with little to no experience who could work for (relatively) little pay, while learning how to solve real-world problems.

The big problem with category (2) is that these programmers are most often employed as interns, either in a temp position, or with an option to get hired full-time. Unfortunately, interns get paid very poorly.

The job selection process itself can indeed be very defective, but the advantage here lies with the job-seeker. People who have actual skills, or the ability to acquire such, are relatively rare, and thus there's always some degree of demand for them -- which means that they can pick and choose among multiple employers. Chances are that at least one of these companies would have a decent job application process, where they let you talk to actual developers, solve small programming problems, etc. -- as opposed to answering inane questions about the shape of manhole covers, or what your biggest weakness is, etc.

(Replying again to old post)

How do you get past the resume keyword-scanning process where you worked with language X for the past 10 years and, even if you are in category 1, nobody will hire you unless the job is for language X? Even if there is not literally a computer scanning resumes, a prospective employer should do a Bayesian update that compares the chance that someone without Y on their resume can pick up Y and the chance that someone with Y on their resume is skilled at Y. You would never get to the interview stage unless the total number of resumes is small enough that the employer is willing to interview even applicants with low probabilities of being suitable candidates.
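The Bayesian update described above can be made concrete with a toy calculation. Every number here is an invented assumption for illustration, not real hiring data:

```python
# Toy screening model: how much does seeing language Y on a resume
# raise the probability that the applicant is skilled in Y?
# All probabilities below are made up for illustration only.

p_skilled = 0.30            # prior: applicant is skilled in Y
p_y_given_skilled = 0.90    # skilled applicants usually list Y
p_y_given_unskilled = 0.20  # some unskilled applicants list Y anyway

# Total probability of seeing Y on a resume (law of total probability)
p_y = (p_y_given_skilled * p_skilled
       + p_y_given_unskilled * (1 - p_skilled))

# Bayes' theorem: P(skilled | Y on resume)
p_skilled_given_y = p_y_given_skilled * p_skilled / p_y
print(round(p_skilled_given_y, 3))  # → 0.659
```

Under these made-up numbers, the keyword roughly doubles the screener's estimate (0.30 prior to about 0.66 posterior), which is why an applicant without Y listed rarely reaches an interview unless the applicant pool is small.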

One popular method, if you've got a programming job, is to write some automation tools or other minor, relatively language-agnostic projects in whatever language is buzzword-compliant at the moment. Wrote a few build scripts in Ruby? Congratulations, you've deployed Ruby infrastructure in a mission-critical environment.

This will usually come out once you're talking to a human, but at that point you can talk about your personal projects and show off your actual knowledge of the language.

Yeah, that's tough. The only way out of this that I can think of is to keep practicing other languages and frameworks in your free time, by building your own projects. This way, you could put these languages on your resume and be fully honest about it.

Alternatively, you can do what apparently 99% of job applicants at our company are doing, and lie through your teeth. Normally I'd argue against this approach, but the fact that the vast majority of applicants are doing it is evidence for the viability of the strategy; the fact that people like me actually say stuff like, "It says here you know C#, so let me ask you a basic C# question" is probably just bad luck for them.

I have read the opinion that the invention of public schooling in its current form was designed to create a more agreeable populace and workforce. People who will do what they are told, basically.

The exceptional would rise to the top naturally, and overcome the barriers set in place, but it would ensure the less than gifted stayed mediocre.

I haven't read any studies to this effect, but it seems plausible, if rather conspiracy-theory-ish. I don't think it was quite so intentional, but it seems to be the result, and it has a great deal of momentum for those who run things to allow it to continue if they ever recognize the truth of it.

Could you imagine the nightmare it would be to be a politician in a country where everybody is skeptical of authority as a matter of course? It's a politician's worst nightmare to have someone question his reasoning if it is based on something flimsy or non-existent, what if that happened by default for everyone in the country?

I have read the opinion that the invention of public schooling in its current form was designed to create a more agreeable populace and workforce.

On the one hand, I wouldn't say the modern U.S. public school system was "designed" at all. Rather, it arose over the course of generations from disparate 19th-century origins. Even now, in spite of the federal government, it's hugely local.

On the other hand, a lot of parents would be thrilled if the schools they have to send their children to were actually good at keeping students "agreeable" (as opposed to violent) and equipped them to join the workforce in some way -- any way.

If you want an equally cynical, but much more sophisticated view of the American public school system, I'd suggest something from public choice theory

It was certainly modeled after the Prussian style schools, so it was designed in a certain sense. That a group of movers and shakers actually sat down and said "this is how we want our schools to be and why, what can we do about it?" seems far fetched to me, but not impossible. More likely the Prussian style was much more impressive, and politicians pushed for it without regarding whether or not it was actually a superior system.

The Prussian system was specifically designed to displace the local aristocracy by instilling obedience to the crown through indoctrination. That was one of its primary goals, the other being to prepare the population with the skills needed to operate in an industrialized society - namely reading, writing and arithmetic.

I don't believe the American version of the same was as focused in its goals. There was no single entity that could drive such goals (we had no King), so it seems unlikely that the Whigs (the major proponents of the system) were trying to indoctrinate students into supporting the Whig party. I think it was more generally thought of as a more efficient way to educate children.

The general effect is that students learn to do as they are told by whoever is in authority, so much so that this has become a virtue. In contrast, questioning authority was one of the founding principles of the country. Since the old system was community driven, often with one teacher teaching all subjects for all ages, the students were required to learn some things for themselves, and often also required to teach younger students what they knew.

Also, just because an agreeable workforce was a goal does not mean it actually succeeded. In general, however, I think it has done a reasonable job - much better than allowing students to think for themselves would have.

I'm somewhat familiar with education in the U.S., and I perceive a lot of heterogeneity. Public schools vary widely, to say the least. Aside from that, there are alternatives such as parochial schools and home schooling.

I'm not so familiar with schools outside the U.S. Which modern systems would you say are less authoritarian than the prevailing U.S. system?

Which modern systems would you say are less authoritarian than the prevailing U.S. system?

Finland comes to mind. Interesting system from what I've read about it.

How does Finland do things? Are there links?

P.S.

I have been informed that the PISA scoring system is a good metric for international comparisons. Finland seems to do really well! I truly know nothing about their educational system, however, or how authoritarian it may be.

I can only find http://news.bbc.co.uk/2/hi/8601207.stm in my Evernote, but it's the BBC so it should be reliable enough.

Then those most able to deal with the skeptics would rise to the top. It wouldn't get rid of politicians, but it would change the sort of politicians who have a relative competitive advantage.

Certainly, and a significant portion of the current crop of politicians would not be among the new group of politicians.

Do you see the problem?

Of course. But at least there are prospective politicians who could benefit from it. It's not as if they would uniformly stand to lose.

Yes, but that means changing the status quo, which means redistributing power, and if there's one thing people in power do not like, it is redistributing power.

We might ultimately have to make this happen by force. (I hope not... but I can't rule it out.)

Then, I suspect, whoever was familiar with a particular subject would still be massively outnumbered by everyone who wasn't. So you'd get a lot of junk data/bad reasoning thrown out, that on even light investigation turned out to be nonsense but that would determine the way the majority voted.

it's possible that schools aren't doing a good job of preparing people for typical office jobs

The high school I went to attempts to prepare students for modern jobs. I hear that the educational model (project-based learning) is spreading to other local schools.

status enforcement vs. children and teenagers

This is going to sound very naive, I suspect, but I'm trying and failing to imagine how this came about: what the decision process that gave this result looked like, and what the people who shaped schools' goals, acting on this motivation (among others, of course), were actually thinking about their own motivations. I mean, I can't see myself designing an education reform and thinking that "teenagers need to be shown their place".

I mean, I can't see myself designing an education reform and thinking that "teenagers need to be shown their place".

If you haven't met any adults who think precisely that way, you've led a very lucky life. ;-)

It's easy. You just assign different rights and responsibilities to teachers and students. Which rights and responsibilities get assigned are going to come partly from 'common sense' and implicitly encode things about their relative age and status; you don't need to think about that part explicitly at all.

And sometimes it's semi-explicit. For example, there's a lot of evidence that many teenagers have sleep schedules that run late. They may not be able to get to sleep before midnight or 1 AM, but it's very difficult to get schools to give them later schedules.

Waking up early is thought of as virtuous, and letting teenagers get enough sleep is thought of as coddling them.

And study after study shows that students do better when school starts later... yet they hardly ever actually implement a later start time. Apparently something else is more important to the powers-that-be than actually making students do better.

Even a well-run school that's actually directed at useful teaching requires some structure. And that structure must be enforced on the children because only the most incredible children naturally follow the rules all the time.

Some of the rules that could be imposed don't really have correct answers. (I'd offered something like dress code as an example, but I predict that there has been some research on what dress codes are "best" for education outcomes). But having no rule is strictly worse than any plausible rule. So the principal picks a rule. The teacher doesn't agree with the chosen rule, but enforces it anyway for one reason or another (he doesn't object that strongly, he is concerned about punishments for his deviance, he is logrolling to get support for some other issue, etc).

And that's one way status enforcement rules get into the education system. Since the usefulness of education is not easily measured, there's a significant risk (as others have noted) that this and similar issues become more important than useful education.

Even a well-run school that's actually directed at useful teaching requires some structure. And that structure must be enforced on the children because only the most incredible children naturally follow the rules all the time.

This presumes that the structure needs rules. Some "free" schools get by with very few, and little or no status impositions. Also, homeschooling is another case where "structure" can be arbitrarily simple and tailored to the needs of a single child.

Really, the only reason rules are required to be "enforced on the children" is when you have the same rules for every child, regardless of fit. This is not a problem if your school has a 1:1 teacher:student ratio. ;-)

So there's an argument for going into the classroom (as a teacher) completely unprepared: stumbling through the material, reasoning things out in front of the students, going down hopeless calculations for a while, then saying "scratch that", "let's see ... hmmmm", etc. Nothing would be clear, and the students would have to make huge efforts just to find out what's on the homework. I know some people like this (not by design). I wonder if their students learn some important life-skills, though.

In my own experience, this can work well in a small group with engaged students. I had an excellent optics class where we would try to derive a known result as a group: the professor would explain the experiment, draw a picture, and then ask us to help. If we got him going, he would take a few steps, then ask again. Now, I remember next to nothing of equations for optics, but I have a very good idea of how to go about figuring out the outcomes for various experiments theoretically.

On the other hand, I've had professors stop referring to notes partway through a derivation or proof, get dreadfully confused, and simply frustrate themselves and their students. So this may be an all-or-nothing: for a given day or proof or class, either do a group derivation or present the material on a platter.

I will say that I also had a high school English teacher who would use the wrong word or give a ridiculous interpretation in the hopes that a student would correct him and learn to not always trust authority. I liked the theory, but in practice it meant that the attentive students had to do work that was frequently repetitive and irritating, such as correcting word choice or grammar (as these were students who were already thinking) while those who could learn most from such a lesson never noticed it.

I will say that I also had a high school English teacher who would use the wrong word or give a ridiculous interpretation in the hopes that a student would correct him and learn to not always trust authority.

I had a teacher somewhat similar to that my freshman year in high school, except she was a last-minute replacement and was not really an English teacher. Her grammar was atrocious, and I ended up getting detention for correcting her too often (interrupting class or lack of respect or some such was the reason given on the detention). It was probably my first real experience with an authority figure being so utterly and obviously wrong, and I wasn't sorry at all for the detention. It was well worth it.

Here's my bad teacher story:

When I was 13 or 14, my physical science teacher was talking to the class about space probes with trajectories that take them outside the solar system. He said that such probes get faster and faster as they go. Thinking he either had misspoken or was intentionally being wrong to see who would catch his error, I corrected him. To my surprise, he said he had not misspoken and that he was correct. We argued about it a bit then he told me to write down a defense of my position.

Later that day, kids came up to me and said, "Why are you arguing with Mr. S? You know he's right!"

I wrote a weak attempt at a defense of the law of inertia (using a reductio ad absurdum argument if I remember correctly). When I gave it to him the next day, he praised it and conceded the argument -- but only privately. He never admitted he was wrong in front of my classmates.

I argued publicly with my German teacher about the derivation of 'case' in class. At the beginning of the next lesson, she started with an admission that she'd been wrong and I'd been right. In conceding to a twelve year old on her home ground in front of a class of other children that her job was to control, she taught me an awesome lesson about honesty and humility. I held her in huge respect after that and was her ally ever after. Thank you Ms Eyre.

Yeesh, that's terrible. It kind of figures that he'd rather mislead a class full of students about the way physics works than own up to his mistake.

It reminds me of an error I had been taught about the way airfoils work that wasn't corrected until I read a flippin comic strip on the subject almost a decade after I graduated high school.

I was stunned, and spent the rest of the afternoon learning how airfoils really work. What makes this particular example so tragic is it leverages another principle of physics that you won't realize doesn't fit if you are taught to accept everything the teacher says as gospel. What's worse is I'm pretty sure the mistake is still there in the vast majority of textbooks.

Evidence-based education research does suggest that teaching in order to learn is great for learning.

I sort of unintentionally had this happen to me. When I was 13 I moved to a new school which was a little less competent than my previous school. In my previous school all the information was packaged and delivered to us to memorize. In this new school the teachers would just roughly go over the topic and it required a lot of independent effort on my part to understand the subject matter.

This led to this mini explosion of clarity in my head. While in my previous school I was bored of learning and depended almost exclusively on rote memorization, this new school forced me to think about why I know what I know. This transformed me into an incredible math/science nut. Just taking the information into your own hands and thinking about it turned me into someone who constantly raved about the beauty of mathematics.

But you are probably one in a hundred, if not even rarer. I am sure that most students who have the misfortune of being taught by an ill-prepared/incompetent/overly spontaneous teacher would just perform poorly in their studies rather than turning out the way you have.

Bad Habit #1) Don't notice when you're confused.

Bad Habit #2) All authoritative ideas / all new ideas / all ideas that have a few plausible reasons to support them, are true.

All textbooks should contain a few deliberately placed errors that students should be capable of detecting. This way if a student is confused he might suspect it is because his textbook is wrong.

Starting that in the current culture would be...interesting, to say the least.

I still recall vividly a day that I found an error in my sixth grade math textbook and pointed it out in class. The teacher, who clearly understood that day's lesson less well than I did, concocted some kind of just-so story to explain the issue, one with clear logical inconsistencies, which I also pointed out, along with a plausible just-so story of my own about how the error could have happened innocently.

I ended up being mocked by both teacher and students as someone who "thinks he knows everything". Because of course, we all know that the textbook author not only does know everything, but is incapable of making typographical errors.

Oddly, at the time I was remarking on the error to stand up for a classmate who was expressing confusion. She couldn't understand why her (correct) answer to a question was wrong.

Great post, Eliezer (you've earned my approval). I think tied for worst school-nurtured habit, along with parroting things back, is the emphasis on what we think we know, as opposed to what we don't know. I think school science and history subjects would be a lot more interesting, and more accurately presented, if at least equal time were given to all the problems and areas where we don't know what's going on, and for which there are various competing theories. Unfortunately one doesn't usually get this presentation of the state of things until one is working as a research assistant in college or grad school.

Maybe I was lucky to have "better than average" teachers, or maybe the French school system is quite different from the US one, but I remember several counter-examples to those problems from my high school and university years, in maths, physics, chemistry and biology. I'll give one example from each.

In maths, we were often asked to figure out by ourselves (intuitively at least) whether a "theorem" would be true or not, before being shown a proof of it being true or false.

In physics, we were given experimental results and asked to draft what law the results could follow. It lacked the "devise new experiments to test your law" part, but it's still better than nothing.

In chemistry, we were once given a substance (potassium permanganate, but we weren't told what it was) and a set of solutions, and we were told the substance was used to test solutions, but not how, and we had to figure out what it could test (acidity).

And in biology, in genetics, it wasn't uncommon to give us experimental results over generations, and ask us to work out how a given characteristic was encoded in the genes (using one gene or two, on a sex chromosome or not, dominant or not, ...). I remember even being told "try to make a law on part of the data, and then test it on the rest of the data", which is as close as we can get to the real experimental method on paper.
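That kind of exercise boils down to: pick a candidate inheritance model, compute the phenotype ratios it predicts, and compare them with the observed counts. A minimal illustrative sketch of the prediction step for a single autosomal gene with full dominance (the function names and the specific model are my own choices, not from the original exercise):

```python
from collections import Counter
from fractions import Fraction

def cross(parent1, parent2):
    """Enumerate offspring genotype probabilities for a one-gene cross,
    given parent genotypes as two-character allele strings (e.g. "Aa")."""
    outcomes = Counter()
    for a in parent1:          # each parent contributes one allele,
        for b in parent2:      # each combination equally likely
            outcomes["".join(sorted(a + b))] += 1
    total = sum(outcomes.values())
    return {g: Fraction(n, total) for g, n in outcomes.items()}

def phenotype_split(genotype_probs, dominant_allele):
    """Collapse genotypes into (dominant, recessive) phenotype
    probabilities under a simple dominant/recessive model."""
    dom = sum(p for g, p in genotype_probs.items() if dominant_allele in g)
    return dom, 1 - dom

# Cross two heterozygotes (Aa x Aa): the model predicts the classic
# 3:1 phenotype ratio, which can then be checked against the data.
dom, rec = phenotype_split(cross("Aa", "Aa"), "A")
print(dom, rec)  # 3/4 1/4
```

If the observed generation deviates badly from the predicted ratio, the model (one gene, autosomal, fully dominant) is wrong and you try the next hypothesis, which is exactly the fit-on-part-of-the-data, test-on-the-rest loop the exercise described.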

Another one in biology was a very interesting "proof" of evolution: we had two boxes; in each we put cotton with water and sugar, a pill of antibiotics on one side, and some bacteria on the other side. One box was to be exposed to UV light for a little while every day, the other not. Then we had two weeks to explain what would happen and how, and after we explained the predicted outcome, we would look at the boxes. (In the box that was exposed to UV light, the bacteria colonized everything, but in the one not exposed to UV light, the bacteria couldn't get near the antibiotics. After a few more weeks, the bacteria did spread everywhere in the two boxes.)

Also, most of the teachers I had were very receptive and encouraging when a mistake of theirs was pointed out (as long, at least, as it was pointed out politely, not aggressively), mitigating somewhat the "authority effect".

But I agree that those were rare: not exceptionally rare, but still much less common than the "here are the laws of Newtonian mechanics, now compute the movement of a projectile with that initial speed and direction" or "here are the laws of thermodynamics and the gas state equation, now compute the final temperature of that system in which that compression was done". Those are better than pure "guessing the password", since you have to apply the laws and do computation, but they are still "here is the truth, apply it".

And I definitely wish we had had more of those few examples; they taught much more than just giving the answer.

Also, sometimes teachers try to teach that way and fail miserably.

Like my genetics class: I did the experiments as well as I could and got a result different from what the teacher expected. This was marked as "wrong". Yet... was it actually wrong? Perhaps I did the experiments wrong, I'm not sure. But if so, that should have been pointed out. I actually think the fruit flies just didn't behave as neat little models but instead as complicated, messy lifeforms, and as a result I got the "wrong" answer by being too close to reality.

I feel similarly about my high school general science class. The teacher tried to make it hands-on etc. by having us run experiments and do simple engineering within the theme of the current topic, but a. he rarely explained the prerequisites to the projects and b. he still treated them as "one right answer", and you would receive a poor grade if your experiment had odd results or your project didn't meet the standards.

As you say Joshua, ad hominem. Since you ask, it's from providing therapy to friends who were damaged by the school system. But nobody here has alleged that I'm off-base in my description (as opposed to suggestions and conclusions), and therefore it doesn't matter how I got an accurate description, only that I did. As I recently told a schooled friend who was taught silly rules, "The first rule of math is that it doesn't matter how you get the correct answer, so long as it is correct."

(I also explained that "Math is what you do when you don't know what to do next. If you already know exactly how to solve a problem, it's not math, it's computation.")

Wait, don't leave us hanging! What's the real purpose?

I think the worst thing I learned in school was how to kill time.

On the proposition that 'knowing that you are confused is essential for learning': there is a structural equation model, tested empirically on 200+ subjects, which concludes that the ability of knowing-that-you-don't-understand is an essential prerequisite for learning, in the sense that people who have that ability learn much better than those who do not. Three other individual difference variables are also involved, but they only come into play after the person realizes that they don't understand something. It's called 'Learning from instructional text: Test of an individual differences model' and is in the Journal of Educational Psychology (1998), 90, 476-491.

Another well-known study was of students learning a computer language from a computer tutoring program, in which all their keystrokes during learning were captured for analysis; the biggest correlation with successful learning was the number of times they pushed a button labeled 'I don't understand.' (From John Anderson's group at Carnegie Mellon.)

Another famous result was from the notorious California State Legislature-mandated study of self-esteem: among high school seniors, it was found that the students with the highest self-esteem when they graduated (they thought they already knew everything) were those with the lowest self-esteem the next year: they couldn't keep a job, because they thought they already knew everything.

The problem with self esteem is that you need a middling amount. Too little can lead to depression, too much can lead to narcissism and intractable ignorance.

Too little can lead to depression, too much can lead to narcissism and intractable ignorance.

Yes, and to make this more concrete, there are studies which show that since the self-esteem movement has grown, college students have become much more narcissistic. See this article.

There's also conflicting data about how criminality and self-esteem correlate. With some metrics self-esteem is inversely correlated with criminality, but under other metrics it is positively correlated. The positive correlation seems to win out when one looks at repeat violent offenders. See 1, 2, 3, 4. That last source is an extensive review of the literature on self-esteem and violent behavior, and suggests that a fair number of the studies which show an inverse correlation have methodological flaws or other problems. They make what seems to be a strong case that the correlation is actually positive (I haven't read most of the sources they cite).

That article reads like it has a very large political axe to grind. While empathy may have decreased due to some large-scale social changes, blaming the "self-esteem movement" is confusing correlation with causation. I'd be curious to know, for instance, whether people in urban communities score lower on empathy than people in rural communities.

It seems reasonable that a lack of empathy and grandiosity would be associated with violent behavior, but I don't think it's meaningful to call this "self-esteem" or blame a movement that tries to make people feel better about themselves. There's a problem with your measure of self-esteem if it correlates with not being able to admit when you're wrong: that shouldn't be called self-esteem! A secure person is more likely to admit when they're wrong.

The survey in the first article measures empathy; I don't see the self-esteem surveys anywhere, but that last link says

it may be more correct to say that a form of high self-esteem -- more precisely, a highly favorable and possibly inflated view of self that is confronted with an external threat -- leads to violence.

That final article also refers to 'egotistical' and 'arrogant' as terms of "high self-esteem". While it makes sense that egotistical and arrogant people may be more likely to be violent, it's highly misleading to call that having high self-esteem. The article seems to be talking more about lacking the ability to react well to criticism, which sounds more like low self-esteem, not high. (That final article does note that many of the scales that measure self-esteem may be biased either negatively or positively.)

(Edited to make clear which article I mean in the last paragraph.)

While it makes sense that egotistical and arrogant people may be more likely to be violent, it's highly misleading to call that having high self-esteem. The article seems to be talking more about lacking the ability to react well to criticism, which sounds more like low self-esteem, not high.

So, the once standard explanation of bullying was "they have low self-esteem, and do it to feel better about themselves." This is not an explanation that plays well with actual bullies. The better explanation is "their self-esteem is higher than it should be, and they need to use violence to make up the gap." If I think I'm level 6, but really I'm level 4, it makes sense to say my self esteem is high (too high), and whenever someone criticizes me I'll blow up because to me it looks like they're trying to reduce my status from level 6 to 4 (even if they just wanted to fix my spelling this one time and don't know who I am).

People with low self esteem (you're level 4, but think you're level 2) aren't likely to be violent because they don't have anything to protect / uphold by that violence. If you criticize them, you're reinforcing their low-status view, not contradicting it.

I think of self-esteem as a thermometer. It's a measure of something (Your value as a person? More narrowly, your deserved social status?) A thermometer should be high when it's hot out, and low when it's cold.

So should someone who is a famous scientist or entrepreneur have high self-esteem? Definitely. If they don't, they're doing something wrong.

But for the same reasons, a 14-year-old who is an idiot and bullies other kids should not be very proud of himself, and should instead be trying to change himself into something worth being proud of.

I think of self-esteem as a thermometer. It's a measure of something (Your value as a person? More narrowly, your deserved social status?)

What do you mean by "deserved" social status? Status is decided by those that grant it. And if people decide to grant status to Joe because Joe coerces them to, Joe's status is granted by his peers and thus deserved. That is, self-esteem is your vision of the esteem others give you.

Now, is Joe someone you or I would like to be around? No. But that doesn't mean he has low self-esteem, or that he isn't proud of himself.

Along slightly different lines, look at self-esteem as self-description. If I describe myself as "good-looking," and someone points out that my ears are grotesquely large, that will conflict with my self-description. If I describe myself as "bad-looking," the same comment will reinforce my self-description. If I describe myself as "assertive" and someone cuts in front of me, in order to maintain that description I need to rebuke them. If I describe myself as "submissive," then when someone cuts in front of me I might sigh, but if I do more it'll conflict with the self-description.

Typically, when converting self-description to self-esteem, one would say that good-looking is higher than bad-looking, and assertive is higher than submissive. A bully halts criticism and commands respect- the features of being held in high esteem- but obtains that esteem through violence and domination. At each instant, when someone is deciding how to respond to praise or an insult, they don't have time to run a calculation of which response will work better for them: they consult their self-esteem and see if what's happening matches what they expect to happen.

When I say "deserved", I mean MORALLY deserved. And yes, this is a shorthand for a mind-bogglingly complex set of concepts... but the same goes for most words. If you really want to get into what sort of characteristics would make one deserving of social status, we could do that; but I really think it's a waste of time.

It should really be enough to point out some obvious examples where actual status does not equal deserved status. Alan Turing deserved more social status than he had: After making some of the most important contributions to scientific knowledge in history and at the same time helping to literally save the world from fascism, he was driven to suicide in prison after being chemically castrated. Donald Trump has more social status than he deserves: He is a famous billionaire and TV star even though he is an incompetent narcissist born into wealth who has never made a real contribution to humanity in his life.

It should really be enough to point out some obvious examples where actual status does not equal deserved status.

If you couldn't use the word "deserved," could you still write this sentence? Easily: "Here are some examples of people that I hold in higher or lower esteem than I think society in general holds them."

You could gloss it that way, but you'd miss something very important: I think I'm RIGHT to do so. I don't think it's just some subjective esteem that I randomly happen to hold for some people over others. I think that it is IRRATIONAL to esteem Donald Trump (and yes, I think that most people are irrational; why else would I be on Less Wrong?).

I think I'm RIGHT to do so.

Do capitalized words, and the confidence they represent, result in a more precise map of the territory? Or do they convince us to draw our map to suit them, rather than to suit the lay of the land?

My understanding is that the 'self-esteem movement' tends to go for relentless, effectively information-free affirmations, based on an ideology that people need to be told they've done a good job whether or not they actually have. Handing out halos like nametags, in other words. It is not hard for me to imagine that such a thing could lead to unwillingness to accept criticism, in the same sense that obsessively sheltering children from any possible irritant leads to allergies.

That's a very good point. Part of the issue may be connected to the fact that no one seems to have an agreed definition of self-esteem. You seem to be doing the same thing here when you say "There's a problem with your measure of self-esteem if it correlates with not being able to admit when you're wrong: that shouldn't be called self-esteem!" We need to be careful to not argue over definitions.

Yes, tabooing "self-esteem" might be useful. Knowing when you're confused, being able to admit when you're wrong, and being able to handle criticism are important characteristics that I value, and these characteristics seem to be tied to learning.

I would suspect that these characteristics are associated with having a stable sense of your own value: that last article mentions a study that associates high and stable self-esteem with being less violent, but high and unstable self-esteem with being more violent.

It is certainly more ideal for a person to have high self-esteem and also the security to admit fallibility, but the two are not mutually exclusive. Self-esteem is exactly what it sounds like: how highly a person values themselves, regardless of whether that belief is justified in the right context, morally or otherwise (what Vaniver says goes into this better). Self-esteem that is incongruent with reality or the context is the issue here, which is why programs that simply seek to boost self-esteem, without also teaching proper skills that can justify high self-esteem, can create narcissistic individuals. Your comment below identifies this by indicating instability.

This also means that being unable to react well to criticism does not indicate low self esteem - you cannot assume a connection, let alone directionality based on your purported view of self esteem.

Its called 'Learning from instructional text: Test of an individual differences model' and is in the Journal of Educational Psychology (1998), 90, 476-491.

I have a copy of this if anyone wants it.

I quote from one of my favorite authors, Jamie Whyte:

Alas, most know next to nothing about the ways reasoning can go wrong. Schools and universities pack their minds with invaluable pieces of information--about the nitrogen cycle, the causes of World War II, iambic pentameter, and trigonometry--but leave them incapable of identifying even basic errors of logic. Which makes for a nation of suckers, unable to resist the bogus reasoning of those who want something from them, such as votes or money or devotion.

Perhaps I'm naive, but I think the problem can be alleviated by making the introductory logic course a requirement for all students. Such a course could include elements such as formal logic, inductive reasoning, or more specifically, how the scientific method is practiced. Perhaps it could even include some simple psychology so students can learn about our inherent biases in cognition, and then some statistics so they can learn about how data can elucidate the truth. Does this sound too ambitious?
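As a taste of what the formal-logic part of such a course covers: a propositional argument is valid exactly when its conclusion holds in every row of the truth table where the premises hold, and this can be checked mechanically. An illustrative sketch (the function names are my own), contrasting a valid form with a classic fallacy:

```python
from itertools import product

def implies(p, q):
    # Material implication: P -> Q is false only when P is true and Q is false.
    return (not p) or q

def entails(premise, conclusion):
    """True iff the conclusion holds in every truth-table row
    where the premise holds (i.e. the argument is valid)."""
    return all(conclusion(p, q)
               for p, q in product([True, False], repeat=2)
               if premise(p, q))

# Valid: modus ponens -- from (P -> Q) and P, infer Q.
modus_ponens = entails(lambda p, q: implies(p, q) and p,
                       lambda p, q: q)

# Invalid: affirming the consequent -- from (P -> Q) and Q, infer P.
# The row P=False, Q=True satisfies the premises but not the conclusion.
affirming = entails(lambda p, q: implies(p, q) and q,
                    lambda p, q: p)

print(modus_ponens, affirming)  # True False
```

The point of the exercise is that "affirming the consequent" feels persuasive in everyday rhetoric, yet a four-row table settles the question in seconds.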

I know this was posted a long time ago, but I just want to note that when I took an introductory logic course in college, nearly every student came out of it thinking that introductory logic courses should be required at the high school level, if not earlier. It didn't include basic psychology or statistics, but an introduction to formal logic and the basics of inductive reasoning was still enough to transform the thought processes of most of the students who went through it.

Is the second thing to unlearn how to count? My traditional, school learned way of counting only finds one thing you want to unlearn, not two.

"..show students different versions of physics equations that looked plausible, and ask them to figure out which was the correct one": I used to do this. Professor Rosencrantz favours this, I wrote, and Dr Guildenstern that. I stopped when we started getting students who didn't know who R and G were.

I agree with the English lit comment. I know someone who went through that and she was able to pass classes without reading the books. In fact, the less she read, the better she did!

Cliff Notes (and Spark Notes) and the like are really spectacular at teaching you the kinds of things that literature teachers want to hear parroted back on tests. They aren't good at teaching you genuine understanding of literature, but that's not what's being tested for anyway.

Literature in English class generally serves as reading practice, and as an odd excuse to practice composing thoughts for other people to read. Literature is the vehicle rather than the purpose, unless you're looking at a literature degree.

I'm curious how to test an understanding of literature, and what purpose one serves. Intuitively, a person well-versed in literature should be better equipped to write or recommend fiction than a person who is not well-versed in literature. Is there another benefit one might test?

A good understanding of literature would be like having TV Tropes inside your head. Understanding literature lets you appreciate more complicated literature: if you're familiar with a literary device, you can recognize when it's being subverted or played with.

Also, simply having read a lot lets you recognize and make quotes and references, which can be fun.

I think the purpose of testing an understanding of literature is testing critical thinking and argument skills (when it's not guessing the password, of course). With that said, I think philosophy classes can facilitate a more rigorous learning environment for that purpose.

I never went to school. Bill Bullard seems to assume that without the indoctrinating influence of school, we'd be prissy self-effacing socialists. He's wrong, because I'm an individualist and I think his first two points are garbage.

Willingham alluded to the fact that critical thinking courses depend largely on the skill of teachers. From my personal experience, some teachers are excellent critical thinkers, but a majority of them are very bad...which is why I disagree with him when he states that critical thinking should not be taught on its own. Willingham proposes that critical thinking should be taught in the context of subject matter but I just don't think we have enough qualified teachers to do this.

Not long ago Doug S pointed us to this article suggesting a general failure of courses that attempt to teach "critical thinking." It's just a lot harder than it might seem.

That's a reason to do it better, not a reason to give up on doing it.

Teach a class about Michael Polanyi. Problem solved better. ;-)

Though I certainly take your point, I think giving tests to students is actually meant to combat the problem of parroting rather than understanding information. In high school I often complained that we wasted time taking tests when we could be learning new information, but if teachers determined when to move on to new material just by asking students whether we understood, I'm sure we would have always just nodded and parroted back the teacher's sentences. Knowing that you'll have to take a test on material (that you'll be asked to answer different problems using the same methods) encourages students to make sure they actually do understand the material. It might be that the best way to solve the problem of parroting rather than understanding material isn't to get rid of exams but to have more of them, or, really, to have better ones.

Can you imagine the reductions in the number of students able to handle coursework if professors actually made their students think rather than memorize?

Unfortunately, it seems that most universities are obsessed with making money and thus need to address the abilities and intellect of a wider audience... not everyone is capable of the upper level critical thinking suggested here.

I see the purpose of a BSc or MSc as learning to make an analysis of a given subject (related to your particular field) and to write a structured and coherent report of it. To do this you need to learn how to use the tools of analysis particular to your field (e.g. calculus, physics equations, balance sheets, schools of thought in philosophy, etc.). So when you are done with school and have a BSc or MSc, if somebody gives you data and a research question, you can apply the tools and write a report/essay.

To be a researcher you need to learn how to make and enhance those tools, and you need to see the problems and limitations of the tools currently in use. This means you need to learn critical thinking, skepticism, and to take wider perspectives. Besides listening and doing, you need to learn to ask, propose and argue.

It is a problem that some BSc or MSc (or BA, MA, whatever) graduates may never be exposed to this culture of critical thinking, and become experts/pundits with narrow perspectives and a lack of skepticism.

Robin, good point. At the same time, there might be a large functional vs. optimal gap in the degree to which school is fulfilling its real purposes. Although the best way to optimize it might not be to brainstorm about how to get it closer to its stated purposes -so point well-taken on that end.

Robin,

and what are the stated purposes you're specifically thinking of?

A good post, Eliezer, but it brings to mind that quote about the horse, and the water -- you know the one I mean. In my college years (as a philosophy major) it became clear that there were students who actually went through the process of digesting, seeking broader context, checking out other sources, and so on. And there were students who were there to get a BA. I don't recall either group doing much better or worse on exams, papers, etc. But perhaps this is more common in the humanities, where reading is the main activity, than in the sciences...

As far as high school goes, Robin's point about the true purpose of school is on target -- it's obvious that the primary function of high school is keeping rowdy, hormonal, unstable adolescents under control and out of everyone's way until they stop being crazy. Also as a way to fill space between extracurricular activities.

My father is a college professor and he's going to be teaching an introduction to engineering course to future electrical engineering students. He's planning on making the students learn basic electromagnetic theory by forcing them to try to perform their own experiments with a pile of stuff that would have existed around 1900 or so.

"Today's assignment: In 1820, Hans Christian Ørsted discovered a relationship between electricity and magnetism. Replicate his experiment and demonstrate that a relationship exists."

Hopefully, some student will eventually connect a wire up to a battery and put a compass near the wire, causing the compass needle to deflect. (The compass is included in the collection of stuff the students will be given.)
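For anyone wondering why that demonstration works at all: the field a long straight wire produces at distance r is B = μ₀I/(2πr), and even a modest current puts it in the same ballpark as Earth's field, so the compass needle visibly swings. A back-of-the-envelope sketch (the 1 A current and 1 cm distance are illustrative values I chose, not from the course):

```python
import math

MU_0 = 4 * math.pi * 1e-7  # vacuum permeability, T*m/A

def field_from_long_straight_wire(current_amps, distance_m):
    """Magnitude of the magnetic field at distance r from a long
    straight wire: B = mu_0 * I / (2 * pi * r)."""
    return MU_0 * current_amps / (2 * math.pi * distance_m)

# A modest 1 A current, compass held 1 cm from the wire:
b_wire = field_from_long_straight_wire(1.0, 0.01)
b_earth = 5e-5  # Earth's field is roughly 25-65 microtesla; 50 uT is typical

print(f"wire: {b_wire*1e6:.0f} uT, Earth: {b_earth*1e6:.0f} uT")
# wire: 20 uT, Earth: 50 uT
```

Since the wire's field is the same order of magnitude as Earth's, the needle deflects noticeably rather than being swamped, which is presumably why Ørsted could spot the effect with 1820-era equipment.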

College students aren't allowed to be confused; if they started saying, "Wait, do I really understand this? Maybe I'd better spend a few days looking up related papers, or consult another textbook," they'd fail all the courses they took that quarter.

My education was not like this. My teachers and lecturers delighted in asking tricky questions to people who thought they understood because they had memorised gibberish.

School systems may be mainly rubbish, and school is slavery for children, but just routinely bashing it ignores the fact that we're the most knowledgeable generation there has ever been, and no-one knows how to do it better.

School systems may be mainly rubbish, and school is slavery for children, but just routinely bashing it ignores the fact that we're the most knowledgeable generation there has ever been, and no-one knows how to do it better.

Yes. Sometimes in discussions about the school system I am not sure whether the message is "schools are imperfect" or "schools are obviously worse than X" (and what this X is, specifically). Because I fully agree that the school system should be improved, and in my opinion we should try many experiments and measure the outcomes. (Also there would be some discussion about goals, like: do we want the best education ever, or just a decent education for a reasonable price? How much utility do we give to kids having knowledge versus kids feeling happy? I agree that both are important, but what exactly is the desired exchange ratio?)

But when people start suggesting their improvements, most of the suggested improvements would actually make things worse, because they ignore some existing constraints, such as: human nature, limited budgets, lack of "perfect" teachers, limited time, etc. As the article says: "Most possible changes are for the worse, even though every improvement is necessarily a change." That's also true for changes to education. So we have a meta-problem: how to teach people to think rationally about learning?

Because I fully agree that school system should be improved, and in my opinion we should try many experiments and measure the outcomes.

One can start cheaply by comparing outcomes. (There's a huge amount of not-invented-here bias in politics.) Unfortunately, it is likely to turn out that in order to have a good public education system, you need to spend money.

Comparing outcomes of existing systems would be good, assuming that you have multiple systems used by the same population. Some countries have this data, other countries don't. For example, if the majority of schools in a country follows a government blueprint, and only a few alternative schools are allowed to coexist, it is not obvious whether the differences between their results are caused by different education, or simply by selection bias (alternative schools are chosen by parents who are more interested in their child's education). So if you already have data, definitely use it; but many countries don't.

it is likely to turn out that in order to have a good public education system, you need to spend money

A decent quality requires some financial threshold, but mere money does not guarantee quality. You can't have a great school if most teachers need to take a second job to pay their mortgage, or if the school cannot afford to buy any educational tools or cover any trivial extra expense which could solve problems. On the other hand, it is possible to burn huge amounts of money without getting any improvement. (For example, you spend the money on thousands of education-related government employees and expensive fashionable educational tools of dubious quality, and the schools get only a small part of the budget.) In my experience, when someone proposes giving more money to education, they usually have a very specific idea about how that money should be spent, and it usually requires mandatory buying of something they produce. (This part may be country-specific.)

Comparing outcomes of existing systems would be good, assuming that you have multiple systems used by the same population. Some countries have this data, other countries don't. For example, if the majority of schools in a country follows a government blueprint, and only a few alternative schools are allowed to coexist, it is not obvious whether the differences between their results are caused by different education, or simply by selection bias (alternative schools are chosen by parents who are more interested in their child's education).

If one is trying to improve the public education system in one country, one can compare it to the public systems in other countries, which will take in a broad swathe of the population.

I think you have to present people with authoritative knowledge, though... without it, you are forced to re-develop the entire history of science in one lifetime, which humans just aren't smart enough to do. Maybe an ideal AI could manage it, but we aren't ideal AIs, so we can't.

I think a better plan is this: Give authoritative knowledge that tells you to distrust authoritative knowledge. This forces the mind into cognitive dissonance, which then gets resolved by saying "Authoritative knowledge is useful -- but not absolutely certain."

How reasonable can we be about school?

I share your feelings of dissatisfaction (and disgust?) with education as it stands. What would a rational school be like?

There isn't a consensus on its purpose (what utility it is optimising), but at least it could be predictable. Suppose some external group (parents? pupils?) identified a set of desirable properties to optimise for, such as income, happiness, etc. Then the role of educators would be to build automated classifiers that predict the distribution of these outcomes for each individual. Using these predictions, they would adjust a set of experiences in order to shift the predicted outcomes. Ideally the experiences would be repeatable (so probably automated programs). Could a meaningful and verifiable chain of reasoning be constructed that made school more than a glorified historical legacy? Would any of the existing subjects survive this kind of scrutiny? And finally, what would the social consequences be of society viewing education as a system to optimise rather than a tool for the judgement of worth?
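The predict-then-adjust loop described above can be sketched in a few lines. This is purely illustrative: the experience names, the per-experience effects, and the stand-in "classifier" are all hypothetical, where a real system would fit a model to outcome data.

```python
# A minimal sketch of "predict outcomes, then pick the experience
# that shifts them most favourably". All names and numbers are
# hypothetical placeholders, not a real educational model.

def predict_outcome(student_features, experience):
    # Stand-in classifier: average the student's features, then add
    # a fixed per-experience effect. A real system would learn both
    # parts from data instead of hard-coding them.
    effects = {"lecture": 0.1, "project": 0.3, "tutoring": 0.5}
    base = sum(student_features.values()) / len(student_features)
    return base + effects[experience]

def choose_experience(student_features, experiences):
    # Select the experience with the highest predicted outcome.
    return max(experiences, key=lambda e: predict_outcome(student_features, e))

student = {"prior_grades": 0.6, "engagement": 0.4}
best = choose_experience(student, ["lecture", "project", "tutoring"])
```

Under these made-up numbers the loop simply picks whichever intervention the model predicts helps most; the hard part the comment gestures at is validating such predictions at all.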

Excellent post! For those looking for an alternative to school for their children, I highly recommend The Unschooling Unmanual.

http://www.naturalchild.org/unmanual/

floccina, perhaps the real purpose of schools is sorting. Perhaps the idea that children must be formed into educated people by schools is just part of what Judith Rich Harris calls the "nurture assumption". Schools have an incentive to promote that assumption, as it gives them more reason to exist. However, if they don't actually know how to educate children (and as you note, it is hard to test whether they actually teach), why would we expect them to?

Not all schools/universities are as grim as all that. I went to a small liberal arts university where research professors taught small classes, and although it wasn't perfect, the students who wanted to learn critical thought were encouraged by many professors to do so.

trusting authority. Learning the critical thinking of great minds is a decent start on developing one's own, and closer than most students will ever get.

"Maybe I'd better ... consult another textbook, they'd fail all the courses they took that quarter."

I did that occasionally, and passed!

with a textbook chapter plus homework to be done every week - the courses are timed for frantic memorization, it's not possible to deeply chew over and leisurely digest knowledge in the same period.

Almost no one would have the mental discipline to use the extra time to digest the knowledge.

And don't put memorization down -- deeper thought needs to complement memorization, but cannot replace it.

A month later they would understand the material far better and remember it much longer - but one month after finals is too late

The ideal approach would have graded work on a course spread out over as long a time as possible: twice-weekly exercises, weekly quizzes, monthly midterms, semester exams, year-end finals, and a summary test at the end of the degree, as at many European universities. There would also be papers and reports. All of these would carry significant weight in the final grade.

This would encourage students to continually re-learn the material. Also, tests and other graded work are great ways to learn in themselves -- at least while they are doing them, students are exercising their brains to some extent.

And of course, it would not harm the students' grades, since grades can be made to fall into any curve, high or low as you please, whether or not you make the students re-learn the material.

The main barrier to this proposal is that educators don't want to put the effort into grading.

Eliezer, I hate to raise an ad hominem point, but how do you know what you know about formal schooling?

Joshua

Eliezer promised us two school-inculcated bad habits of thought, and I count only one.

One problem with a professor telling students "I may be wrong" is that many of the students will hear it as "You must be right."