After OpenAI released ChatGPT in November 2022, Assistant Professor of Geosciences Alice Bradley played around with it — “making ridiculous pictures to send to friends when there wasn’t a Bitmoji to do the job,” she said.
While she wasn’t teaching that year, she grew curious about the technology’s capabilities when she returned to the classroom. Bradley fed the software a couple of problem sets from her geoscience class, ones with definitive right and wrong answers. After grading the machine’s responses, she determined ChatGPT would earn a C. “It didn’t seem like that big of a threat at the time,” she told the Record.
While Bradley was relatively unfazed by the innovation, in the past two years, AI has steadily made its way into syllabi and classrooms throughout the College. In May 2023, Dean of the College Gretchen Long formed an ad hoc committee to study concerns about academic integrity following the rise of the software. Seven months later, a co-chair of the Honor and Discipline committee reported that between a fifth and a third of all cases the committee was hearing concerned suspected violations of AI policies. And in September, the Washington Post called AI an “existential threat to colleges.”
In interviews with the Record, professors across all three divisions shared vastly different opinions on the technology. Some decried the potential fraying of the College’s academic fabric, while others expressed deep excitement about the new kinds of study unlocked by AI. But all could agree that the software was changing the landscape of higher education in meaningful ways and that they would have to adapt in response.
In a Record survey of 102 professors teaching courses this semester, 78 percent said that their syllabi mentioned AI, and 63 percent said they dedicated class time to discussing the technology.
Assistant Professor of Computer Science Katie Keith, whose research concerns machine learning and natural language processing, called AI’s widespread release a “paradigm shift” in higher education, comparing it to how the COVID-19 pandemic radically reshaped classroom procedures.
Professor of Political Science Darel Paul didn’t make any adjustments to his courses immediately after ChatGPT launched in 2022. After receiving two final papers he strongly suspected were written by the software, however, he altered several of his assignments for the fall 2023 semester. In past iterations of “Introduction to International Relations,” for instance, Paul would assign a midterm paper, which he described as “a test to see how well they know the major theories and how well they can apply them.”
“I thought, ‘This is exactly what AI is made for,’” he said. “You wouldn’t have to spend any time writing that paper if you just plugged the prompt into ChatGPT.” So for the assignment, Paul had his students do just that: Prompt the software to generate a paper. The twist? Students submitted a critique of the AI output as their midterm.
Paul isn’t alone in redesigning assignments following the rise of AI. Owen Ozier, associate professor of economics, used to assess students only through open-book exams, encouraging them to use the vast array of resources at their disposal, just as he would if he were conducting his own scholarship.
“I had no problem with people using resources,” Ozier said. “But if the resources are going to allow you to not do any thinking, then it ceases being a meaningful question to ask.” He now offers exams in pairs: one in person with pen, paper, and calculator, and another at home, laptops open, where students can access whatever resources they desire, AI included.
Keith, however, has taken a somewhat different approach. She said that “Data Structures and Advanced Programming,” her introductory computer science course, is likely structured in similar ways at colleges across the world — and as such, there’s an especially large amount of information available that AI could employ to assist students in their assignments. That means she has to trust her students not to cheat on their homework, since it would be almost impossible to catch them.
“I’ve been trying to give the message to students that if you take shortcuts at this stage in your education, then once you get to harder things, you won’t have that muscle,” she said, noting that she believes many students enter her course with good intentions but may turn to AI shortcuts in moments of high stress. And, like Ozier, she has reverted to pen-and-paper tests, even though she has never been a fan of “high-stakes exams,” because she sees them as the only way to assess whether students have internalized the knowledge necessary to proceed to more advanced classes in the department.
But not all professors are trying to restrict the use of AI in their courses: Over half the respondents to the Record’s survey said that they permit students to use the technology, though the vast majority have instituted specific restrictions on how students can do so. The survey found that some professors allow their students to use AI to explain course content, summarize readings, create study guides from materials distributed in class, and assist with the writing process, though no single use of the technology received unanimous approval.
Several faculty members called AI a “tool,” likening the software to a variety of other classroom implements, such as graphing calculators, thesauruses, encyclopedias, and Google. Some of those tools have faded from relevance as more advanced technology has become available, but none destabilized the fundamentals of academia when it launched.
“The calculator didn’t make math go away,” Ozier said. “That’s my strongest argument and my most useful guidepost in thinking about what AI will do.”
Many faculty members noted that, as they have become more familiar with AI, they have incorporated it into their own teaching and research practices.
“I love using ChatGPT to code in R, but that’s because I already know how to code in R, and I can fix any bugs that ChatGPT spits out more quickly than I can write my own R file,” Liz Upton, assistant professor of statistics, wrote in an email to the Record.
Lecturer in English Ezra Feldman said that he recently used AI ahead of a class session for his course “Writing Gender in Sci-Fi and Speculative Fictions,” asking the Google chatbot Gemini to create five fictionalized characters for his students to use as the starting point for an in-class exercise about plot design.
Cassandra Cleghorn, senior lecturer in English and American studies, has also incorporated AI into her courses — but rather than encouraging students to use it to assist with their standard academic work, she designed a workshop to demonstrate its limitations. In her course “Expository Writing,” a 100-level English seminar in which many students write their first college papers, Cleghorn closely coaches her students through their first writing assignment.
“It was, truly, for many of them, one of the first times they had ever had an original thought — not saying what they thought the teacher wanted them to say or writing to the exam,” she said of the assignment, in which students submit an initial draft of the paper and then receive extensive feedback from her.
After students submitted a final version of the essay, Cleghorn held a class session where students posed their original research questions and the paper’s parameters as a prompt to Gemini and evaluated the alternate essay that the AI software produced.
“Sure enough, all of their papers come spitting out, and they start reading and giggling,” she said. Quotes were entirely fabricated, she added, and the reasoning was so circular it was close to nonsensical. “In a matter of 15 minutes, they could see that this kind of paper was not going to work for this kind of class.”
The class followed the AI experiment with a discussion in which Cleghorn acknowledged the software’s ability to summarize in grammatically flawless prose but emphasized that it failed to form any semblance of an original argument. “It’s a way of backhandedly showing [my students] how exciting it is to actually have an original idea,” she said.
As faculty have navigated the new academic landscape with AI, the Rice Center for Teaching, a College office that supports faculty in developing their pedagogy, has organized a series of lunchtime workshops for professors to hear from AI experts and share ideas about implementing AI policies in their courses.
Matthew Carter, the Rice Center’s faculty director, joked that he’s observed what he calls the “four stages of grief” when it comes to teaching with AI: denial of its ability, panic about the threat it might pose to teaching methods, acceptance of AI’s staying power, and a final embrace of the technology. “Talking with faculty, people are in different stages of their journeys thinking about this,” he said.
Last fall, the Rice Center held one of its best-attended AI workshops: a faculty dinner where over 100 professors sat with colleagues from various academic disciplines and shared their ideas and fears about the software. After the dinner, the Rice Center compiled notes from the conversations to create an AI guide for faculty.
The self-described “living document” includes example syllabus language regarding AI, ideas for AI-free assignments, ways to use AI as a “learning tool,” and a form where faculty can suggest additions. It also notes, however, that the College has no official AI policy.
“It’s not rules that anyone is bound to,” Carter said. “Actually, I don’t think there’s one faculty member who would follow everything on that document.”
Even though the Rice Center has taken an optimistic approach to AI and its potential benefits, that sentiment is not necessarily shared by all faculty members: 45 percent of respondents to the Record’s survey said that they forbid any AI usage in their classes this semester, and over half said they were “very concerned” or “extremely concerned” about the effects AI might have on higher education.
Professor of Art History Michael Lewis, for instance, said his response to AI usage in his courses was an “emphatic no.” In the first week of his “Modern Architecture” course this semester, he had his students critique three student-written reading responses — but one of them was actually written by AI. “The unspoken lesson was, ‘You can spot it, and so can your professor,’” he said.
Though he ran a lighthearted demonstration, Lewis said that the rise of AI fills him with “profound distress,” as it exacerbates a problem that has been growing since the launch of Google: Research is becoming merely “an exercise in information retrieval.” He added that the decline in the quality of writing spurred by AI is only worsened by mounting grade inflation at the College, which he said lowers the incentive for students to hand in strong writing in the first place.
“At a time of grade inflation when even an A- can bring tears, it is a lethal cocktail,” he said.
A large share of professors believe students are improperly using AI in their courses: 42 percent of survey respondents indicated that they knew or suspected a student had done so in their course this semester, and another 25 percent said they were unsure. Several added, however, that a hardline stance against the technology would be naive, both because enforcing such a policy would be impractical and because employers are increasingly looking for candidates skilled at using AI in their work.
“We have to adapt, and our students are going to be at a loss if we somehow keep them from it or stigmatize it,” Cleghorn said. “To me, it would feel really short-sighted to judge [students] for something that they have come to naturally, so I want to understand what makes them tick in this regard, and what they’re getting from it that’s clearly not all negative.”
Carter, an associate professor of biology in addition to his role at the Rice Center, said he was less concerned about AI’s nefarious potential than about a stark gap in his students’ familiarity with the technology, which might put those with less exposure to it at a disadvantage.
“Students in my classes have shown me how they’re using it in ways that I find incredible, and so when I hear these ways, I try to share them with all of the students in the class,” he said. “There’s a lot of quantitative support and writing support, but AI is totally new, and I don’t think that right now we have a structure for students to learn AI support yet.”