Student Course Survey undergoes changes

Joey Fox

Following a years-long process that went through four separate faculty committees and multiple all-faculty votes, changes to the Student Course Survey (SCS), first approved in 2017, will be implemented at the end of this semester. The formerly 23-question paper form, which is completed by students in all of their classes at the end of each semester, will be shortened to seven questions and will be filled out entirely online.

The process of reworking the SCS was first initiated for the 2015-2016 academic year, when the ad hoc committee on the evaluation of teaching, then chaired by Professor of Statistics Richard De Veaux, identified several potential changes to improve the survey. The College had last revised the SCS in fall 2006; the survey was initiated in 1972 and made mandatory for all faculty in 1988.

“We were charged with looking at our evaluation of teaching in the large sense, but we certainly spent a lot of time on the SCS in particular,” said De Veaux of his committee. “We found that the old form was too long and too redundant. There was also no way for students to explain their scores.”

The committee released six recommendations on the SCS: reducing the number of questions, integrating quantitative and qualitative responses, recognizing inherent bias, understanding what conclusions can be drawn from the SCS, moving towards online surveys and considering changing the numerical scale on quantitative questions. According to Director of Institutional Research Courtney Wade, the committee “recommended the formation of another committee to further pursue these ideas,” given the breadth and complexity of the recommendations.

The following academic year, the ad hoc committee on the evaluation of teaching was tasked with drafting specific changes to the SCS based on the previous year’s recommendations. Eiko Maruko Siniawer ’97, professor of history and the committee’s chair for 2016-2017, said that “over the course of this year, ideas were shaped and reshaped, dropped and reworked according to all of the feedback that the committee heard.” 

In addition to consulting with other committees and faculty, Siniawer’s committee reached out to other institutions of higher learning about their evaluations. 

The committee brought two motions to the May 2017 faculty meeting, both of which were approved. The first, according to Wade, “required that all units evaluate untenured faculty members’ teaching using three different methods: the SCS, another method for gathering student opinion and a method of peer review that involves observation of teaching.”

The second motion shortened the SCS, required that it be moved online and specified other implementation details. It also stipulated that the changes would go into effect in fall 2019. 

According to Provost Dukes Love, the vote on the second motion revealed significant disagreements among faculty over the changes.  

“It was a relatively close vote,” Love said. “I wouldn’t say that this is one of the most controversial issues on campus. But really smart faculty have different views about the most effective ways of evaluating teaching performance, effective teaching.” Love clarified that some faculty wanted only one of the two main changes – reducing questions and moving online – while others wanted no changes whatsoever. 

Wade agreed, adding that still others wanted even more drastic changes. “Some faculty feel that we should get rid of student evaluations altogether — that they’re biased, and that they’re measuring student satisfaction more than teaching quality,” she said. “Others feel that while flawed, student course evaluations are the one opportunity for all students to weigh in on their experiences in the classroom, and that involving fewer people in the evaluation process might lead to even more bias.” 

In the two-and-a-half years between the May 2017 vote and the fall 2019 implementation, the SCS implementation committee met to discuss reporting and implementation details. Lee Park, professor of chemistry and chair of the committee for the 2018-2019 academic year, said that her committee has “been working through some of the complex logistical details associated with moving to the new online forms, and we are finally now able to roll them out.” Professor of Political Science Cathy Johnson chaired the committee for the 2017-2018 academic year.

Wade and Assistant Director of Institutional Research James Cart ’05 were also involved in the implementation process. “We’ve played a large role in choosing an online vendor, setting up the system, designing and creating the reports faculty will receive, providing documentation of the process and communicating all the changes out to faculty and students,” Wade said. 

For students, the most immediately visible effect of the changes will be the move to an online reporting system. While the blue response sheets intended for professors will remain on paper, the SCS itself will now be administered via Glow. Professors are instructed to give students time to fill out the online survey in class along with their blue sheets, but students may fill it out at other times if they wish.

According to Wade, “Williams has been an outlier among its peers in continuing to administer the evaluations on paper.” 

In order to make sure as many students as possible complete the forms, students who have not finished their SCS by the end of reading period will receive email reminders. “We’re really interested in supporting as high a participation rate as we can possibly get,” said Love of the new system. “The trial runs we did in psychology had participation response rates up in the 88-percent range.” 

The switch to online administration has also meant that all classes will now automatically be given Glow pages, even if a class’s professor chooses not to use them.

The SCS process has also been streamlined by reducing the number of questions from 23 down to seven, with an eighth for lab courses. All of the questions regarding a student’s demographics and personal information, as well as some evaluative questions, have been eliminated, in order to shorten the time taken to complete the survey and to remove potential sources of bias. 

Siniawer said that “many questions on the previous form have been cut or reworded because they were especially prone to bias in the responses; elicited responses that were highly correlated with each other; and/or made assumptions about normative pedagogy.” 

For Love, reducing bias was one of the primary goals of the process. “It’s worth noting that all evaluation systems are going to be subject to various forms of bias,” he said. “The most important thing is for us to be self-aware. That is, to understand what the sources of biases really are, and then to use that information as best we can to make sense of scores and feedback that we get.” 

The order of the questions has also been changed. The two most important questions in the SCS — overall evaluations of the course and instructor — have been moved from the end to the front of the survey. 

At the end of the survey, there will now be an additional text box for comments, which will be read by the instructor, their department chair and members of the committee on appointments and promotions. These comments will be displayed alongside numerical answers to the first two questions.

“That means that you can tell, if you get low scores, there’s some commentary that helps explain the context behind why you just got low scores,” Love said. “And if you get really high scores, vice versa, you can see what it is you did really well.” 

Finally, there will now be a preamble to the SCS, which Siniawer said will encourage students “to reflect on what they learned in a course and how they were challenged intellectually before responding to the questions that follow.” 

In April 2019, the faculty voted once again to make three changes to the original motion. The vote eliminated one provision that would have withheld grades from students who had not completed their SCS forms, and another that would have created different response periods for different classes. It also added a provision making in-class completion of the survey the default.

Love concluded that, despite the many years the changes have taken to be created and implemented, the gradual process was worth it. “My own statement in the end is, this was a long, exhausting process working through the committee structure, and we ended up with a much better product than we would have had if we’d moved earlier in the process,” Love said.