Faculty evaluations, an afterthought to most undergraduates, are a dicey subject among college faculty across the country. Opinions about them range from outright hatred to an abiding respect, with most educators falling somewhere in between. But at almost every American university, they are an unavoidable part of the educational landscape. Like grade point averages, they have become an unpopular but indispensable barometer of performance.

At Georgetown, these scores are an important component in evaluating a professor’s effectiveness as a teacher and weigh heavily in promotion and tenure decisions. The university has found in the Middle States Re-accreditation Process that student evaluations are a “reliable and valid” indication of teaching ability – more so even than peer reviews, the method preferred by professors skeptical of student opinion. In selecting a teacher for their faculty, the various departments probably rely heavily on these numbers. But when you are selecting a teacher for your classes, those same departments apparently feel it is less important that you have access to these scores.

On its online Course Review, The Hoya has been posting summarized student evaluation scores since the fall semester of 1998. And virtually anyone who has shopped for classes at Georgetown has stopped there to browse. The information is limited. It gives, after all, only a raw score and some cursory information about the work required. But for a quick snapshot of the quality of a professor, it is a valuable resource. Unfortunately, the Course Review provides only a fraction of the total scores, and the sample it does provide is significantly distorted.

The reason is simple. Last Friday, when professors handed in their Request for Course Evaluation Forms, many of them probably checked a little box that keeps their scores between them and their department head. Judging by the number of course reviews available, about one-third of professors exercise this option – a figure that may surprise many. We as undergraduates do not have access to a full one-third of the teacher evaluations we author. Undoubtedly, many of the published professor and department averages are thus skewed upward as poorer class scores are censored. These professors might defend their decision by insisting that the scores do not accurately reflect the quality of the class but rather the grade expected by the student, the difficulty of the material or even the class’s time of day. Further, they might say, these scores symbolize the “commoditization of education.” By ending the semester with a customer satisfaction survey, these forms put pressure on a professor to do the popular thing rather than the right thing. And they encourage students to think of education as a product to be purchased rather than a challenge to be overcome.

These objections are reasonable. The best professors are not always the most entertaining. And being pushed is often painful but constructive. But by conducting these surveys and then putting such emphasis on their results, the university tacitly admits that it trusts that most of its student body understands this. And the fact that such a significant portion of the faculty keeps its results private bespeaks the legitimacy these numbers hold with the student body. After all, if nobody trusted the scores, why would anyone bother suppressing them?

Most undergrads think nothing of the absence of a teacher’s evaluation numbers on The Hoya’s Web site. They do not realize that, more often than not, a professor may be attempting to hide below-average scores from prospective students. Of course, I am sure there are some faculty members with solid evaluation scores who boycott the process for purely ideological reasons, but surely few would give up the heavy enrollments and general student esteem that come with high averages. Better to participate and say you think the numbers are meaningless, as many popular and politically astute professors do.

Many faculty members would like a complete overhaul of the course evaluation form, perhaps weighting the results based on the student’s expected grade. Others would like to scrap it entirely. There may be compelling reasons for this. But with the reformers and their scores hiding in the shadows, there is little chance for a university-wide assessment of the system. When choosing classes this semester, consider a way to coax these unlisted professors into engagement – perhaps a little benign neglect.

Kevin Rubino is a senior in the College.

