“A very solid and admirable C-”: Instructors respond to the use of ChatGPT at the College
February 15, 2023
At the start of the spring semester, students noticed something different about their professors’ syllabi and welcome-to-class announcements: Many included guidance on, prohibitions against, or encouragement of the use of ChatGPT — as long as it’s cited.
“If ChatGPT were a student in my intro philosophy class, it would earn a very solid and admirable C-,” Chair and Professor of Philosophy Joseph Cruz ’91 wrote in an email to the Record. “Ask me again in 15 years, though. I expect that my answer will be different.”
ChatGPT is a chatbot released by OpenAI, an artificial intelligence research laboratory, in November 2022. Chatbots — types of software that simulate conversations with their users — have existed since the 1990s, but ChatGPT has gained recent popularity due to its ability to create syntactically sound sentences and paragraphs in response to user-inputted prompts.
Trained on large sets of data, ChatGPT works by predicting what words typically follow each other in sentences. It can also rewrite paragraphs, compose poems, and help users generate code. Its work, however, is sometimes inaccurate, since it lacks a mechanism to fact check the content it provides.
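The prediction idea described above can be illustrated with a toy sketch. The snippet below is only a loose analogy, not how ChatGPT actually works: it counts which word follows each word in a tiny made-up corpus and "predicts" the most frequent follower, whereas ChatGPT uses a large neural network trained on vastly more data.

```python
from collections import Counter, defaultdict

# Toy corpus for illustration only (not real training data).
corpus = (
    "the cat sat on the mat . "
    "the cat ate the fish . "
    "the dog sat on the rug ."
).split()

# Count how often each word follows each other word (a bigram model).
followers = defaultdict(Counter)
for current, nxt in zip(corpus, corpus[1:]):
    followers[current][nxt] += 1

def predict_next(word):
    """Return the word most often seen after `word`, or None if unseen."""
    counts = followers.get(word)
    return counts.most_common(1)[0][0] if counts else None

print(predict_next("the"))  # "cat" follows "the" most often here
print(predict_next("sat"))  # "on" is the only word seen after "sat"
```

A model like this also shows why such systems can be confidently wrong: it will always produce a statistically plausible next word, with no mechanism for checking whether the result is true.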
“Some folks in the computer science community describe ChatGPT humorously as ‘mansplaining as a service,’” Assistant Professor of Computer Science Daniel Barowy wrote in an email to the Record. “This is actually not far off: ChatGPT will confidently answer most of the things you ask it, even when those answers are completely wrong.”
Since its release, ChatGPT has been banned from public school WiFi networks in New York City and Seattle, and some college professors across the country have redesigned their assignments in response to it, the New York Times reported in January.
The Record spoke with five instructors at the College about their thoughts on the use of ChatGPT in the classroom at Williams and beyond.
Does using ChatGPT constitute a violation of the Honor Code?
Justin Shaddock, the chair of the Honor and Discipline Committee and an associate professor of philosophy, views syllabi as contracts between professors and students. If professors hope to bring ChatGPT cases before the committee, it is critical that they’ve clearly communicated that expectation to students, he told the Record in an interview.
“You’ve got to have it on the syllabus — or maybe on the syllabus and on the individual assignment — just to spell out if there is any way that you’re allowed to use this thing,” he said.
If professors omit statements about the AI from their syllabi or do not communicate their interpretations of the Honor Code clearly with their students, the matter is up to the discretion of the committee, which does not hold any explicit positions on ChatGPT. Rather, the committee applies the Honor Code to each individual case brought before it.
Several online platforms can be used to detect if a piece of writing was likely produced by AI. But since those detection platforms are newer and haven’t undergone a rigorous third-party review, according to Shaddock, the Honor Committee uses other methods of determining the likelihood that AI generated the written work brought before it.
“Better evidence that someone has used ChatGPT would be discrepancies,” Shaddock said. If a writing style in one student’s submission differs drastically from other students’ work, the committee may consider that to be stronger evidence that the student used ChatGPT than the conclusions produced by software that evaluates whether writing is AI-generated.
Since ChatGPT’s release in November 2022, the Honor Committee has heard one case in which the AI was involved, Shaddock said.
The case was raised before there was widespread awareness of ChatGPT among faculty members. A student used the software on a take-home exam and was found responsible for violating the Honor Code. However, this was determined to be a violation because the student breached the exam’s no-internet policy, not a policy that explicitly prohibited the use of ChatGPT.
“So in a way, we haven’t had a genuine ChatGPT case,” Shaddock added.
Cruz, who researches cognitive science and artificial intelligence, told the Record that he would instantly pursue a case with the Honor Committee if a student wrote a paper for his class using ChatGPT and turned it in as if it were the student’s own work — but the Honor Code violation, he said, would not be the most important issue at stake.
“I basically don’t care at all about the product — you know, the paper — that my students produce,” Cruz wrote in an email to the Record. “It’s a summary and a placeholder for the process that they went through in thinking through a topic, in wrestling with the details and the ambiguity, in finding their own coherence in the material and pursuing a creative, tenacious movement forward in their own aspiration to understanding.”
“If my students don’t go through that — the drafts, the debate, the rethinking, the wondering, the revisions — then they don’t engage with how human beings learn to soar,” he said.
What should be the role of ChatGPT in learning environments?
The recent rise of ChatGPT has also sparked conversations about its place in the classroom and what it means for the future of education at the College.
Cruz, for example, told the Record that after running his exam questions through ChatGPT with students, he tried to make his prompts more philosophical. ChatGPT, he said, had produced fairly credible answers that certainly would have passed his introduction to cognitive science course.
In an interview with the Record, Director of the Writing Center Julia Munemo agreed that the AI should prompt a reevaluation of academic assignments. “The arrival of ChatGPT gives us all a chance to think about why papers are assigned, what those assignments can achieve, and how they fit in with everything else the student is doing,” she said.
Beyond compelling professors to reconsider the process of student writing outside of the classroom, ChatGPT has also spurred an exploration of the program’s implications for the writing process as a whole. Munemo said that, to her, ChatGPT could deprive students of the chance to understand what writing truly is.
“Students have agency over how they become educated, and a student who chooses to use ChatGPT is robbing themselves of the opportunity to really learn about whatever they’ve been assigned to write about,” she said. “The act of writing helps you reinforce what you’ve learned. If you use ChatGPT, none of that happens.”
Munemo noted that she does not expect tutors at the Writing Center — who, she said, can only explain the Honor Code and not enforce it — to monitor students’ writing for hints that it was composed by ChatGPT. “[Tutors] can advise students of the dangers of making bad choices, and remind them that the student is responsible for their own education,” she said. “But when a student hands in a paper, only the student is responsible for the integrity of that paper.”
To some instructors, ChatGPT even has the potential to serve as an innovative teaching tool. “If students can use ChatGPT in such a way that the platform helps them to learn new concepts without doing the thinking for them, then AI could be a terrific learning tool,” Matt Carter, associate professor of biology and director of the Rice Center for Teaching, wrote in an email to the Record.
Carter added that he sometimes uses ChatGPT in his daily life, but he makes sure to review its suggestions in a conscious manner. “ChatGPT always gives me revisions to think about,” he wrote. “However — and this is key — it is me that is doing all of the thinking. The end result is a piece of writing that preserves my intellectual authenticity and reflects my own thoughts and ideas.”
What does the future of ChatGPT look like at the College?
Some believe that ChatGPT serves as an opportunity for the College community to reflect on the implications of a small liberal arts school education. “This technology challenges Williams faculty and students to reflect on not only what it means to learn and think critically in the modern world, but what it means to learn and think in a small, residential college such as Williams,” Carter wrote.
“Everyone is going to be using these tools,” he added. “The key question is how we embrace the presence of these tools in our lives, including academic spaces, [in a way] that enhances rather than diminishes thinking and learning.”
Barowy — a former faculty member of the Honor Committee — expressed concerns about AI tools’ roles in the classroom. Such technologies make mistakes similar to those a beginner programmer would make, he said, so discerning academic dishonesty can become increasingly difficult.
However, Barowy added that, despite the technology’s considerable implications for teaching and learning, “It’s an exciting time to be alive.”
“If you asked me two years ago if we would have a technology like this, I would have given you a definitive ‘No!’” he wrote. “I was wrong. I love it when I’m wrong! Computer science will probably continue to surprise us.”