I'm fortunate enough to go to a high-caliber American university. I study math and economics, so not fields that are typically subject to funding constraints or have some shortage of experts. The incentives to become a professor here seem pretty strong--the median professor at my school made over $150k last year, and more than two-thirds of the faculty have tenure. There is, as far as I can tell, little to no oversight as to what they research or what they teach. From the outside, it seems like a great gig.

And yet, most of my professors have been really bad at teaching. It's weird. And I don't just mean that they could be doing a little better. I mean they consistently present things in unclear or inconsistent ways, write exams that are extremely subject to test-taking skill shenanigans, and go off on rambling tangents that lead nowhere. My classes are often poorly designed with large gaps in the curricula or inconsistent pacing throughout the semester. I feel like I've actually learned something in maybe 1/3 of my courses.

I don't want to come across as someone who's just ranting--I'm legitimately confused by this phenomenon and want to figure it out.

Are my standards unreasonably high? I was fortunate enough to go to an excellent high school with some truly fantastic teachers. I also spent a little time in the US competitive math scene and encountered lots of wonderful and smart tutors along the way. And, most of all, I've been fortunate to be involved with a number of extremely good teachers in my time at rat camps, many of whom have pushed the boundaries of pedagogy and content quality. I realize I've been living in quite the intellectual bubble, and I don't want to overlook that.

My counterargument is that there are such strong incentives for professors that schools should be able to find good ones. My high school had a median teacher income of about $55k, and yet I would consider something like 20% of my high school teachers to be better than my university ones. University professors enjoy shorter working hours, the ability to do research, higher job security, and other benefits like housing. I don't understand why the bar for competence is so low.

It's also possible that my teachers just haven't been good for me. Again…maybe? But most of the areas in which my professors have underwhelmed me are really basic things, like not interacting with the audience, arriving late to lectures, or giving exams that are horrible proxies for understanding. I can't imagine any students preferring those to be the case.

Finally, I've heard the explanation that professors are there to do research, not to teach. While this might be true, I don't understand the decision from an institutional perspective. The school makes money and builds a brand name on students/alumni and loses utility on research (if you don't think that's true, name 5 MIT alumni then try to name 5 current professors there). If anything, it seems like universities should only be using research as an incentive to draw great professors, not vice versa.

I know that there are great teachers out there--I've had the pleasure of working with many of them at rat camps, Olympiad camps, and other cool places like that. Why aren't top universities filled with these people?

Professors are selected to be good at research, not good at teaching. They are also evaluated on their research, not their teaching. You are assuming universities primarily care about undergraduate teaching, but that is very wrong.

(I’m not sure why this is the case, but I’m confident that it is)

niplav:

Being nitpicky: Professors are selected to be legibly good at research.

Which means getting government grants, from which the university takes a cut as overhead.

And then the university brags about how much research funding it brings in, which is heavily used in creating a "ranking" of that school within that research area. Which isn't really wrong, if you're evaluating the school's ability to get someone from undergraduate to graduate research in that area, or from grad to post-grad.

That ranking gets the school (or at least is thought to indirectly get it) student dollars and donor dollars. And sometimes state funding dollars.

So following the money and prestige, it's a simple story I think.

Agree in general, but there is an ecosystem of mostly-small colleges where teaching has higher priority, and most ambitious American students and their parents know about it. Note for example that Harvard, Yale, Princeton and Stanford do not appear in the following list of about 200 colleges:

https://www.usnews.com/best-colleges/rankings/national-liberal-arts-colleges

I agree that this is the case (and indeed, a quick Google search of even my worst professors yields quite impressive CVs). I don't understand why that's the case. Is it, as ErickBall suggests, simply cheaper to hire good researchers than good teachers? I find that a little unlikely. I also find it unlikely that this is more profitable--surely student tuition + higher alumni donations would be worth more than whatever cut of NIH/NSF/etc. funding they're taking.

My question is: who does this system leave better off? Students get worse professors, good researchers have to waste their time teaching, and good teachers have to waste their time researching. Other than maybe the science journals or something, who has a stake in perpetuating this?

My question is: who does this system leave better off?

A natural equilibrium of institutions doesn't have to leave anyone better off. Excellence at research is the most legible prestige-carrying property of professors, being good teachers is harder to observe. As Viliam points out, the purpose of raising researchers is best served by teachers who are good researchers, and also otherwise there is risk of content drifting away from relevance or sanity. So even for students, orgs with good researchers are more credible sources of learning, given the current state of legible education quality indicators.

My question is: who does this system leave better off? Students get worse professors, good researchers have to waste their time teaching, and good teachers have to waste their time researching.

I am quite curious about this, too.

I suspect there might be some kind of fallacy involved, something like "if we make a job that is for both research and teaching, we will automatically get people who are good at both research and teaching... even if we actually evaluate and reward them only for the research". Maybe, if someone sucks at teaching, it is assumed that they would never apply for such a job in the first place -- they could get a job at some purely research institution instead. (So why does this not happen? I suppose that even for a researcher without teaching skills, work at a university can be preferable for some selfish reasons. Or they can be overconfident about their teaching skills.)

And the following step is that someone who is good at both research and teaching is obviously better than someone who is merely good at teaching, because such a person will be able to teach the latest science. Which ignores the fact that a lot of what is taught at universities is not the latest science. But it is still better to have someone who has the ability to get the latest science right.

To steelman this position, imagine the opposite extreme: a university where all teachers are great at teaching, but suck at research. It would be a pleasant experience for the students, but I would worry that a few decades later what the professors teach could be obsolete, or even outright pseudoscience. Also, teachers who are not themselves good researchers might have a problem bringing up a new generation of researchers; and where else would we get them?

I'd offer the counterpoints that:

a) Even at high levels, professors are rarely teaching the absolute cutting edge. With the exception of my AI/ML courses and some of the upper-level CS, I don't think I've learned very much that a professor 10-20 years ago wouldn't have known. And I would guess that CS is very much the outlier in this regard: I would be mildly surprised if more than 5-10% of undergrads encounter, say, chemistry, economics, or physics that wasn't already mainstream 50 years ago.

b) Ballpark estimate based on looking at a couple specific schools--maybe 10% of undergrads at a top university go on to a PhD. Universities can (and should) leverage the fact that very few of their students want to go on to do research, and the ones that do will almost all have 4-5 more years of school to learn how to do good research.

If I were running a university, I would employ somewhat standardized curricula for most courses and stipulate that professors must test their students on that material. For the undergraduate program, I would aim to hire the best teachers (conditioned on a very strong understanding of the material, obviously), while for the graduate school I would aim to hire the best researchers, who would have to teach fewer courses since they would never teach undergrads. Top researchers would be attracted by not having to teach any intro courses, top teachers by not being pressured to constantly put out research, undergrads by having competent teachers, and PhD students by the more individual attention they'd get from research faculty's full focus. And as a university, the amount of top-tier research output would probably increase, since those people don't have to teach Bio 101 or whatever.

I contend that this leaves all the stakeholders better off without being more expensive, more difficult, or more resource-intensive. Obviously I'm wrong somewhere, or colleges would just do this, but I'm unsure where...

I would aim to hire the best teachers (conditioned on a very strong understanding of the material, obviously), while for the graduate school I would aim to hire the best researchers, who would have to teach fewer courses since they would never teach undergrads.

This seems like an obvious solution, so I wonder whether some institutions are already doing it, or there is a catch that we didn't notice.

(This is just a wild guess, but perhaps a university that only does half of that -- i.e., hires the best teachers and mediocre researchers, or the best researchers and mediocre teachers -- would be just as popular, for half the cost. You cannot get unlimited numbers of students anyway, so if you already get those who want the best teaching, you don't need to also attract the ones who want the best research, and vice versa.)

I was thinking from the opposite direction, whether it would make sense for the professors to make pairs -- one who wants to teach, plus one who wants to do research -- and trade: "I will teach your lessons, if you write my thesis and add me as a co-author to your publications". Not sure if this is legal. (Also, it seems fragile: if one decides to quit or gets hit by a bus, the other's career is also over.)

There are in fact many universities that have both "research faculty" and "teaching faculty".  Being research faculty has higher prestige, but nowadays teaching faculty can have almost the same job security as research faculty.  (This is for permanent teaching faculty; sessional instructors have very low job security.)

In my experience, the teaching faculty often do have a greater enthusiasm for teaching than most research faculty, and also often get better student evaluations.  I think it's generally a good idea to have such teaching faculty.

However, my experience has been that there are some attitudinal differences that indicate that letting the teaching faculty have full control of the teaching aspect of the university's mission isn't a good idea.

One such is a tendency for teaching faculty to start to see the smooth running of the undergraduate program as an end in itself.  Research faculty are more likely to have an ideological commitment to the advancement of knowledge, even if promoting that is not as convenient.

A couple of anecdotes (from my time as research faculty at a highly-rated university):

At one point, there was a surge in enrollment in CS. Students enrolled in CS programs found it hard to take all the courses they needed, since those courses were full.  This led some teaching faculty to propose that CS courses (after first year) no longer be open to students in any other department, seeing as such students don't need CS courses to fulfill their degree requirements. Seems logical: students need to smoothly check off degree requirements and graduate. The little matter that knowledge of CS is crucial to cutting-edge research in many important fields like biology and physics seemed less important...

Another time, I somewhat unusually taught an undergrad course a bit outside my area, which I didn't teach again the next year.  I put all the assignments I gave out, with solutions, on my web page.  The teaching faculty instructor the next year asked me to take this down, worrying that students might find answers to future assigned questions on my web page. I pointed out that these were all my own original questions, not from the textbook, and asked whether he also wanted the library to remove from circulation all the books on this topic... 

Also, some textbooks written by teaching faculty seem more oriented towards moving students through standard material than teaching them what is actually important. 

Nevertheless, it is true that many research faculty are not very good at teaching, and often not much interested either.  A comment I once got on a course evaluation was "there's nothing stupid about this course".  I wonder what other experiences this student had had that made that notable!

Re institutional incentives: I've heard that part of the US News rankings is based on asking survey respondents to evaluate other universities by reputation. Professors elsewhere can only evaluate (and do evaluate) other professors based on the quality of their research, not their teaching.

I'm curious, did you check what the quality of teaching would be like at your university before you went? If not, why? If so, why did you pick it anyway?

Professors being selected for research is part of it. Another part is the tenure you mentioned: some professors feel like once they have tenure they don't need to pay attention to how well they teach. But I think a big factor is another one you already mentioned: salaries. $150k might sound like a lot to a student, but to the kind of person who can become a math or econ professor at a top research university this is... not tiny, but not close to optimal. They are not doing it for the money. They are bought into a culture where the goal is building status in academic circles, and that's based on research.

I also think you've had some bad luck. I had a lot of good professors and a handful of bad ones as an undergrad (good school but not a research university); in grad school it was maybe closer to an even split between good professors and those who didn't care much. But even in the latter cases, I rarely felt like I didn't learn anything. It just took a little more effort on my part to read the book if the lectures were a snooze (and yes, there were a few profs whose voices could literally put me to sleep in an instant).

some professors feel like once they have tenure they don't need to pay attention to how well they teach.

I imagine that if they taught well before, they would still teach well by sheer force of habit. Maybe slightly worse because they no longer bother to do it perfectly, but not "consistently present things in unclear or inconsistent ways".

Those who are good teachers will continue to be good teachers. An example: a prof I know won teaching awards and continues to teach basically the same way now that she's gotten tenure. She likes teaching--that's a big part of why she's good at it--and she's not about to phone it in. I think what's slipped a bit post-tenure is the amount of resources she devotes to publishing her research. I don't think her actual research has slowed down any, because she also likes that part; she's just not focused on getting papers out the door ASAP because her continued employment no longer depends on it.

There are universities with better teachers, but they tend to be those that focus on their undergrad programs, and not the big prestigious ones with massive endowments. They'll hire people who aren't as prestigious in their fields but are good at teaching (and get a nice discount on staffing costs). The prof above works at such a university; part of her job interview was giving a lecture to a room of students and faculty, and how well she did was IIRC part of the reason why she was hired in the first place. The same tiny university was a pioneer for major chunks of the current paradigm for undergrad engineering programs, and the program director spent a sabbatical at the University of Waterloo improving its undergrad engineering program. That being said, even at a more teaching-focused university like that one, there were still some real bad profs (and not necessarily people with tenure or who did a lot of good research, just inexplicably bad teachers).

That's fair, most of them were probably never great teachers.

That's not luck. Non-research universities do select faculty by teaching skill.

I'm not fully convinced by the salary argument, especially with quality-of-life adjustment. As an example, let's imagine I'm a skilled post-PhD ML engineer, deciding between:

Jane Street Senior ML Engineer: $700-750k, 50-55hrs/week, medium job security, low autonomy

[Harvard/Yale/MIT] Tenured ML Professor: $200-250k, 40-45hrs/week, ultra-high job security, high autonomy

A quick Google search says that my university grants tenure to about 20 people per year. Especially as many professors have kids, side jobs, etc., it seems unlikely that a top university really can't find 20 good people across all fields who are both good teachers and would take the second option (in fact, I would guess that being a good teacher predisposes you to taking the second option). Is there some part of the tradeoff I'm missing?
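
As a rough sanity check, here's a minimal sketch of what those two packages imply in effective hourly pay, taking the midpoints of my ranges at face value (these are ballpark guesses, not verified compensation data):

```python
# Effective hourly pay under the ballpark figures above.
# Assumes 48 working weeks per year; all inputs are rough estimates.

def effective_hourly(salary_usd, hours_per_week, weeks_per_year=48):
    """Salary divided by total annual hours worked."""
    return salary_usd / (hours_per_week * weeks_per_year)

jane_street = effective_hourly(725_000, 52.5)  # midpoint of $700-750k, 50-55 hrs
professor = effective_hourly(225_000, 42.5)    # midpoint of $200-250k, 40-45 hrs

print(f"Jane Street: ~${jane_street:.0f}/hr")  # ~$288/hr
print(f"Professor:   ~${professor:.0f}/hr")    # ~$110/hr
```

Even at a ~2.6x hourly discount, the job security and autonomy seem like they should be worth it to somebody.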

For a professor at a top university, this would be easily 60+ hrs/week. https://www.insidehighered.com/news/2014/04/09/research-shows-professors-work-long-hours-and-spend-much-day-meetings claims 61 hrs/week is average, and something like 65 for a full professor. The primary currency is prestige, not salary, and prestige is generated by research (high-profile grants, high-profile publications, etc.), not teaching. For teaching, they would likely care a lot more about advanced classes for students getting closer to potentially joining their research team, and a lot less about the intro classes (where many students might not even be from the right major)--those would often be seen as a chore to get out of the way, not as a meaningful task to invest actual effort into.

Yeah, the joke for professors is that you can work any 60-70 hours of the week you want, so long as you show up for lectures, office hours, and meetings. It's got different sorts of pressures than a corporate or industry position, but it's not low-pressure. And if you're not at the kind of university that has a big stable of TAs handling a lot of the grunt work, you're gonna have a number of late nights marking exams and papers or projects every semester, unless you exclusively give students multiple-choice questions.

Also, getting to the point of being a tenured professor is a process in and of itself. Not getting tenure means you likely get laid off.

One other thing a lot of people are missing here is that most "professors" at universities today are not tenured, or even tenure-track. They're adjuncts or sessional lecturers, who are paid more along the lines of $70k a year (often less) for what is in practice a similar workload with similar education requirements, except consisting entirely of teaching, with literally zero job security. Sessional lecturers sometimes find out only a couple of days or weeks in advance what they are being asked to teach for the semester, if anything.

Hm... I seem to have mistaken "flexibility" for low hours and underestimated how much professors work. Is "teaches math at Stanford" really viewed as much lower-status than "researches math at Stanford" (or whatever college)? It seems like universities could drum up some prestige around being a good teacher if prestige is really the main incentive.

From where do you get the 40-45hrs/week number?

Jane Street is a pretty extreme comparison. An easier one is that a good software engineer at Google can, in their late 20s, make 2x what a tenured professor makes by the end of their career, with similar or better work/life balance. Tenure becomes irrelevant when you can retire by 40.
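
To make "retire by 40" concrete, here's a toy compounding sketch; every input is a made-up illustrative assumption, not data about actual Google compensation:

```python
# Toy "retire by 40" arithmetic under illustrative assumptions:
# save $250k/year for 12 years (e.g. ages 28-40) at a 4% real return.

def nest_egg(annual_savings, years, real_return=0.04):
    """Future value of saving a fixed amount each year at a real return."""
    total = 0.0
    for _ in range(years):
        total = total * (1 + real_return) + annual_savings
    return total

savings = nest_egg(annual_savings=250_000, years=12)
print(f"Nest egg at 40: ~${savings / 1e6:.1f}M")                  # ~$3.8M
print(f"At a 3.5% withdrawal rate: ~${savings * 0.035:,.0f}/yr")  # ~$130k/yr
```

On those (generous) assumptions, the engineer hits financial independence around 40, which is roughly when an academic on the standard track would be getting tenure.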

Update: someone IRL gave me an interesting answer. In high school, we had to take a bunch of standardized tests: AP tests, the SAT and ACT, national standardized tests, etc. My school was a public school, so its funding and status were highly dependent on these exam results. This meant that my teachers had a true vested interest in the students actually understanding the content.

Colleges, on the other hand, have no such obligation. Since the same institution is the one administering classes and deciding who gets a degree, there's super low incentive for them to teach anything, especially since students will typically be willing to teach themselves the skills they need for a job anyway (e.g. all the CS kids grinding leetcode for a FAANG internship). There's actually so little accountability it's laughable. And with that little oversight, why would anyone bother being a good teacher?

God, I hate bad incentive structures.

I always thought it would be great to have one set of professors do the teaching, and then a different set come in from other schools just for a couple weeks at the end of the year to give the students a set of intensive written and oral exams that determines a big chunk of their academic standing.

Great point. There is, e.g., the GRE, but it doesn't test anything from college.

Yeah it's pretty bad. There are some professors who are better than just reading the textbook, but unfortunately they're the exceptions. My undergrad experience got a lot more productive once I started picking my courses based on the *teacher* more than on the *subject*.

Ben:

I know the UK system; the US is probably the same.

Whenever a professor applies for a research grant and gets it, the university gets a slice of the money. Whenever a professor publishes a paper in a fancy journal, the university gets a bit of prestige (and, in the UK, every few years some computer evaluates all those papers by some criteria of fanciness and doles out money to universities in proportion to how many good papers their employees have produced).

Two people apply for a professorship at the university. One of them last year secured a load of grants, which they translated into a lot of papers in Nature journals. The other one didn't. That is a measurable, quantifiable difference. You can write down the numbers and see that $X > $Y.

Maybe at interview you ask them both to give a dummy lecture. It's all a bit subjective, though; nothing you can pin a hard and fast number on.

So any of the following is sufficient reason to hire the one with the grants and papers:

  • They are not that much worse at teaching.
  • The institution is cynical and wants money. The bottom line isn't directly damaged by bad teaching; it is damaged by missed grants. Teaching feeds back on revenue only very weakly.
  • You (consciously or not) weigh objective measures more highly than subjective ones.
  • You believe in "the game". It wouldn't be fair to hire person B when everyone knows the game is research grants, and person A has played the game better. If the game doesn't matter then why do you deserve to keep your own job that you got by playing it so well?

You hire the grant-machine. Do they suddenly put all their time into teaching as best they can? Well, to get promoted they need... more grants! The teaching is something that pulls time away from the activities that the system rewards.

Yes, it's an awful system. How to fix it, I don't know.

A few aspects of my model of university education (in the US):

  • "Education" isn't a monolithic thing, it's a relation between student, environment, teachers, and body of material for the common conception of that degree.  Particularly good (or bad) professors can make a big difference in motivation and access to information, and can set up systems and TAs well or poorly to make it easier or harder for the median student.  That matters, but variance among students overwhelms variance in teaching ability.
  • "Top" universities are generally more focused on research, publication, and prestige than on undergraduate education.  Professors are tenured for research and prestige, not for teaching ability.  Many of them think of their jobs as 'run my lab/work on papers with grad students first.  Do the minimum for most students, identify the future stars to get them into the "real work"'.
  • Much of the alumni value from the institution is about reputation, not about the quality of the education they got.  If a school is optimizing for donations 15 years on (when the median successful student is getting rich enough to donate), they care about prestige and top outcomes, not median education.  
  • Quality of undergrad education is actually unimportant for most students.  If you're not staying in academia, you need the degree to get in the door of many jobs, but your actual skill and value will come from how well you can learn the actual job and apply what you've internalized in school.  This will be more about how far beyond the coursework minimum you've gone, and how much you've "played with" and gotten good at stuff you've tried on your own.  The actual material is the bare minimum, usually outdated and incomplete.
  • For law and medicine, undergrad is only about placement in the "real" school you get your final degree from.  For other advanced degrees, undergrad is really pre-grad school, and tends to be research-focused with fairly minimal effort into other classes.  Oh, and about washing out the students who want advanced degrees but aren't actually able to get themselves there.
  • For most degrees, the first 2 years are just plain worse than the higher-level courses.  If you're just starting, your current experience will likely get better.  But still not great if you only look at the coursework rather than all the resources for challenging yourself.
  • Most of the learning doesn't happen in lectures.  Find the study groups, TA sessions (and TAs willing to spend 1:1 time on interesting topics), and labs where you can really think and learn.
  •  I suspect the vast majority of students would be better off at a lower-ranked school or community college for the first 2 years, and then transfer to a middle-ranked (or top, if your goals and results match that way) university for the degree.  

You don't have much of a LW history, so I can't guess at your thoughts, goals, level of thinking, etc.  My recommendation for the median LW poster (interested in some fairly deep topics, top 20% IQ) who finds themselves at a top university and disappointed by the coursework would be to do enough studying of assigned and optional reading so you just don't worry about grades - get to the point where you just know this stuff.  Identify the outside-of-class reading and groups that challenge you on topics you want to understand more deeply.  It'll vary widely based on your ability, your professors' attitudes, and the institution's policies, but you may be able to take the more advanced/interesting classes sooner than most, and get more than most out of the overall experience.  

It does appear that the process to become a certified K-12 teacher is more rigorous than that for a university professor. The way K-12 schools track progress is also very different from how colleges do it.