Two years after the online course evaluation system was first introduced in spring 2010, the University Registrar has completely eliminated paper evaluations this semester and begun posting evaluation results alongside course descriptions on MyAccess.

According to Manager of Faculty Evaluations Caitlin Harding, the online evaluation system was optional for faculty until spring 2011. In fall 2011, the online evaluation became the default, though paper surveys remained an option for faculty. This is the first semester in which all evaluations must be conducted online.

University Registrar John Pierce explained that the Center for New Designs in Learning and Scholarship recommended the transition to improve the quality of course assessments.

“There are more opportunities for error with paper,” he said.

Pierce added that online evaluations allow professors to add customized questions, which generate more relevant responses.

Associate professor of French Andrew Sobanet, who had used paper evaluations until this semester, added specialized questions to his online course evaluations this year but had mixed feelings about the online system.

“The only negative thing I see in the online-only evaluation is that there has to be a mechanism put in place where we maximize student participation,” Sobanet said. “Otherwise you get the weird self-selection you see on Rate My Professor, where evaluations are more polarized.”

Another source of concern is that since the web option was launched, response rates for web-based evaluations have been lower than previous response rates for paper-based evaluations.

In spring 2011, paper-based evaluations yielded an 86.9 percent response rate while web-based evaluations yielded a 77 percent response rate. In fall 2011, the respective numbers were 92 percent and 71.1 percent.

Georgetown University Student Association President Clara Gustafson (SFS ’13) pointed out that the decline in response rates elicited concerns among faculty.

“A lot of faculty and staff were worried that they wouldn’t get the kind of percentage responses that they had in the past, since when they required us to do it in class it’s basically 100 percent participation,” Gustafson said.

Despite the drop in response rates for web-based course evaluations during the transition period, Harding expressed optimism that they will eventually increase.

“Nearly all institutions with an online course evaluation system report seeing a drop in response rates when the online system [is] initially rolled out, but these institutions also report a gradual return to prior response rates or higher over the years following the conversion,” Harding wrote in an email.

According to Harding, Cornell University saw an increase from 50 percent to 70 percent after the full transition, while Brigham Young University also had a similar increase from 50 percent to 72 percent.

Harding pointed out that other universities’ experiences indicate that the most important factor in raising response rates is high-quality and frequent communication from instructors directly to their students.

“The idea is that if instructors encourage, welcome and explain their use of student feedback, the students will understand that their contributions make a difference and will be more likely to respond,” she wrote in an email.

As part of an effort to encourage students to complete course evaluations online, the registrar has created prize raffles offering iPads, iPod Shuffles and iPod Touches.

In order to make the survey process easier and increase student awareness about the importance of evaluations, the registrar has also sent out regular reminder emails to students with log-in information and links to the evaluations included.

Students were mostly in favor of the online evaluations, but also agreed that they could lead to a lower response rate.

“People who do fill them out are going to fill them out more honestly and put a little more thought into it than they would if they were just trying to do it at the end of class,” Bo Julie Crowley (COL ’15) said. “They might get a lower overall turnout but the responses will be more accurate.”

Pierce stressed the importance of maintaining response rates, as faculty members and academic departments rely heavily on course evaluations to improve their teaching and programming.

“The faculty in general takes evaluations seriously and uses responses to make continual improvements to courses. The institution relies on numerical results in merit review and tenure. We take it as a serious opportunity to solicit thoughtful feedback from students and actually use it to inform important decisions,” Pierce said.
