The Wisdom of Crowds of Frat Boys

Over at evolgen, RPM is indignant about being rated by students, citing some pig-ignorant comments from RateMyProfessors. Interestingly, someone brought this up to the Dean Dad a little while ago, and he had an interesting response:

A reader wrote to ask a dean’s-eye perspective on ratemyprofessors.com.

The short version: I consider it electronic gossip.

The slightly longer version: I’ve had enough social science training to know that small, self-selected samples lead to skewed outcomes.

The long version: and yet, from what I’ve seen, the ratings usually aren’t that far off. Which is fascinating, given that the methodology is so bad.

I don’t have a lot of ratings on RateMyProfessors (and I don’t want a bunch of fake ones, either, so don’t bother), but this description fits my experience with student evaluations in general.

I’ve generally found that, when I can manage to step back and look at them dispassionately, I have the same reaction to the comments I get on teaching evaluations. There are usually one or two outliers– there’s always somebody who thinks it’s the Worst Class Ever– but for the most part, the aggregate comments tend to correlate pretty well with how successful the class really was. The classes where I’ve been happy with my own performance have generally gotten me good evaluations. The classes where I’ve gotten mostly negative comments have been classes in which I wasn’t at my best– the first term with a new syllabus, the term when I got in a shouting argument with the guy teaching one of the other sections, etc.

The numerical scores are a little dodgier, in part because the questions on the bubble-sheet form aren’t all that well formed, but the written comments taken as a whole have been fairly accurate. The hard part is managing to step back a bit and see that– it’s much too easy to take the negative comments personally, particularly as junior faculty, when your whole career seems to be riding on the poorly-expressed opinions of adolescents.

A couple of important disclaimers, here: I have the obvious advantage of being white and male and teaching in the physical sciences. I have heard from female colleagues and colleagues who teach classes that touch on race and gender issues that they get some astonishingly vile comments on the anonymous forms. I’ve also had the good fortune to mostly teach in sections for science and engineering majors, rather than general education classes full of economics majors who are just trying to fill a requirement.

That said, my experience with evaluations in general has been fairly similar to the Dean Dad’s experience with RateMyProfessors: I think the student evaluation system we use is terrible in many ways, not least the fact that it allows childish sophomores to write hateful messages to faculty who challenge their worldviews, but it can also be remarkably accurate.

I’ll also note that we don’t rely heavily on the written evaluation forms for the student portion of tenure reviews. During reappointment and tenure reviews, the review committee conducts in-person interviews with at least 20 students chosen at random from the candidate’s classes. Those give a much more accurate picture of what’s going on, from all reports, and are widely agreed to be vastly superior to the standardized comment forms.