The blue sheets and student course surveys that students fill out during the last week of each class are among the few constants across all Williams courses.
The student course survey (SCS) system, which was instituted in the 1960s, was the subject of a Gaudino forum last week when Assistant Professor of Political Science James McAllister raised the issue of the purpose and efficacy of the system.
McAllister explained that the SCS system was created to provide students with a voice in the faculty evaluation process.
“The SCS system was established here to protect junior faculty members from senior faculty members,” he said. “Before this, the students didn’t have any voice.”
The purpose of the surveys is twofold. Individual faculty members use the SCS results and their blue sheet comments to evaluate their own courses. In addition, the administration uses the SCS results as empirical data to evaluate a professor’s teaching when making tenure decisions.
“The blue sheets are really for the professors,” said Assistant Provost Richard Myers. “What goes to the Dean of the Faculty is a summary analysis of a faculty member’s student course surveys.” Myers said the results are contextualized within the results for the department, the division, other professors of the same level of tenure within the division, and the entire faculty. The Provost’s Office produces this information for each faculty member every semester.
Myers added that when a faculty member is being considered for tenure, the Provost’s Office provides the Dean of the Faculty with an even more comprehensive report. “[The Dean of the Faculty] will ask for a historical summary of the SCS results,” Myers said.
“[The SCS results] are the major component to evaluate the teaching of junior faculty for tenure,” said Associate Professor of Mathematics Thomas Garrity.
Garrity estimated that roughly half of the tenure decision is based on teaching, although he added that the percentage varies in different departments.
“Probably 80 to 90 percent [of the teaching evaluation] comes from SCS forms,” he added.
While SCS forms are clearly an important part of tenure evaluation, Associate Professor of Political Science Mark Reinhardt raised concerns about their use in post-tenure decisions. “It is not clear to me to what extent teaching evaluations are used for promotion within the tenure ranks and merit pay,” he said. “I think it would be peculiar if the College did not consider teaching an important criteria for tenured faculty.”
Dean of the Faculty David L. Smith said the SCS forms do have a role in post-tenure decisions. When asked about the role of the surveys in merit pay, Smith said, “the numerical results are reviewed when salary times come.”
However, Smith said the surveys are often less important for making decisions regarding promotion within the tenured ranks.
For example, he said that chair positions usually come with certain descriptions and guidelines.
Myers noted that the Dean of the Faculty does not routinely request historical summaries of SCS results for promotion decisions.
“To my knowledge, there’s not a systematic process of reviewing the SCS results for every full professor every semester,” Myers said. “I don’t know that there is a systematic structure for post-tenure review here at Williams.”
Do tenured professors care about the surveys?
Garrity said tenured professors decide as individuals whether to use the SCS results and blue sheets in shaping their courses.
“They [student course surveys] are very good to get feedback from students, and I take them very seriously. . . . Sometimes themes come out that I don’t realize.”
However, Doug Cohan ’00 said he had a professor who joked with his students that they may as well not write anything critical, because he had long had tenure, and he did not care what they said anyway.
“If they care about doing their job, tenured professors will listen to their blue sheets,” said Sam Daigneault ’00.
Ephraim Williams Professor of American History Robert Dalzell said he cares about the surveys. “I take them seriously and pay attention to the results, and think about them, and in some cases alter my courses based on them,” Dalzell said.
“I think senior faculty members care a lot about SCS forms,” agreed McAllister. He noted that Professor of English Robert Bell emphasized at the Gaudino Forum that Williams would not be such a strong college if senior faculty disregarded student surveys.
However, professors also noted several flaws in the current SCS system.
“I think they’re a horribly flawed system, but as Winston Churchill said, ‘Democracies are a horrible form of government, except for all the others,’ ” said McAllister. But he later retracted the phrase “horribly flawed” and said the system could be better described as problematic. Specifically, he said, good teaching may not be conveyed by the surveys, and teachers may try to slant their courses to get good ratings. Yet he observed that the surveys are a better means of evaluation than peer review or exit interviews.
Garrity expressed similarly mixed feelings. “I have conflicting opinions about course surveys,” he said. “Good teaching should shape how you look at the world. Can someone effectively assess the impact of a course at the end of the semester?” Garrity also wondered if high scores necessarily correlate with good teaching.
But Garrity said he is reluctant to be too critical because he cannot come up with an appropriate replacement.
“I’m torn because I don’t see anything that can replace them,” he said.
However, Dalzell proposed some solutions to the problems he perceives in the evaluation system. “We are probably in some cases over-evaluated,” he said. “I think the way courses are evaluated puts a premium on playing it safe in courses.”
To solve both problems, Dalzell suggested that two or three of a professor’s courses per year (out of an average of five) not be evaluated. “I think that in courses which are not evaluated, faculty would feel freer to take risks, and that would be a good thing,” said Dalzell.
Students also had concerns about the efficacy of the surveys.
“I think it’s really hard to tell if they listen to what we say, because a lot of people don’t have the same professors again,” said Marissa Kreh ’00. “I also think they try to listen, but once you have a teaching style, it’s hard to change.”
Daigneault said she also wonders how much of an impact the forms have on teachers.
“I think that if there’s an overall general theme that students have conveyed, they might take it into consideration, but it’s hard to consider every student’s opinion,” she said.
Cohan agreed that, especially in large courses, it is difficult for a professor to consider all student input. “I think the professor is only going to take what they want to hear,” he said. “I would say any serious negative critiques are probably not taken.”
For now, however, the student course surveys are here to stay, and this week all Ephs will have the chance to make their voices heard.