At the April 13 faculty meeting, Richard De Veaux, chair of the ad hoc committee on teaching evaluation and professor of statistics, presented his committee’s findings. The other members of the committee are Antonia Foias, professor of anthropology; Luana Maroja, assistant professor of biology; and Gage McWeeny, associate professor of English.
The College first implemented the student course survey (SCS) system in the 1960s. The system, which centers on students filling out bubble sheets for each course at the end of the semester, has remained essentially unchanged to this day.
The results of the SCS carry great weight for faculty. Faculty members themselves rely on the SCS, as well as the blue sheets, where students leave comments, to evaluate their own teaching. Departments and administrators also depend on the SCS to evaluate professors’ teaching and effectiveness.
In the 2005-2006 academic year, a committee met to review the SCS and implement changes as needed; however, that committee ultimately made only minor technical adjustments, opting to retain the basic structure of the SCS and tweaking the wording of a few questions. Its other recommendation was that a new committee reconvene in 10 years’ time to reassess the SCS.
It was in this context that De Veaux and his committee met to reassess the teaching evaluation system in September of 2015. The goal of their ad hoc committee was not to propose changes to the SCS, but rather to scrutinize how students evaluate courses and whether an update to the system was necessary.
In order to determine the appropriateness and effectiveness of the SCS, the committee examined relevant literature, as well as information about how the College’s peer institutions approach course and teaching evaluations. The committee found that the College is one of the few institutions that still hands out paper course evaluation forms in class, rather than administering them online, and still relies heavily on bubble sheets.
“One of the things we saw from other places was that there was much more context,” De Veaux said. “You would talk more about your involvement in the course, and how much time and effort you spent in it; how much you cared; whether it met your expectations. It turned it much more into a retrospective about you and the course, and how the course worked for you.”
This lack of a narrative component on the SCS was one of the main problems the committee identified with the current system. It is difficult for students to rate aspects of a course, such as quality of instruction, on a numerical scale. Quantifying what one has gained from a class is hard, and oftentimes arbitrary; there is little difference between a score of four and a score of five, for example. Similarly, it is difficult to compare scores across disciplines.
A ratings system also brings the potential for faculty members to fixate on their scores, sometimes at the expense of their teaching. “There’s some concern among the faculty that these things tend to make us teach in too homogeneous a way, because we all want to do well,” De Veaux said.
The consensus among the faculty was that the SCS forms did not help them evaluate their own teaching, which is one of the overarching goals of the SCS itself. In a survey the committee conducted, faculty members overwhelmingly reported that they had learned a lot from the blue sheets, but had learned less from their numerical scores. When considering promotions, however, departments consider only the numerical scores.
The SCS was initially devised to protect junior faculty. At the time, professors worried that, when being considered for tenure, their teaching would be criticized without substantial evidence. The SCS then came about as a way for students to evaluate teaching, providing this evidence. According to the survey the committee ran, junior faculty members no longer believe that the SCS protects them.
The committee also found that faculty tend to pay greater attention to students’ answers for the final two questions on the SCS, which pertain to quality of instruction, educational value and intellectual engagement. However, the placement of these questions at the end of the extensive survey troubled the committee.
“I did some analyses to see about the questions that lead up to the two big ones, and it turns out that they’re incredibly redundant. They’re all correlated, as we suspected,” De Veaux said. “We’re just worried these things aren’t helping us improve our teaching, which is what the point is.”
In light of its findings, the committee suggested incorporating a narrative element into the SCS, supplementing the numerical scores. The committee also considered the possibility of introducing an online SCS, although it acknowledged possible problems accompanying this solution as well, such as timing.
Even if specific solutions remain undecided, the notion that the SCS should change was met with overwhelming approval and support from the faculty. “Eighty-nine percent of the faculty was ambivalent or agreed that the SCS should change,” De Veaux said. The survey had 250 faculty respondents.
While many concurred that the SCS should be updated, the responsibility of enacting concrete changes will be left up to another committee. The ad hoc committee will publish a full report of its findings at the end of this academic year.