
Seventeen percent of students at the College say that they have used generative artificial intelligence (AI) in a way that they believe violates the Honor Code, a Record investigation found.
Last week, the Record sent a survey to all students at the College to learn why and how students are using generative AI. Students’ usage and perspectives varied, but one thing is certain: AI has arrived in full force at the College.
Seventy-seven percent of the survey’s 749 respondents reported using generative AI, with 67 percent of those using it at least once a week.
Division 3 majors used AI the most frequently, followed by Division 2 and Division 1.

“AI can be incredibly helpful in research,” Professor of Biology Luana Maroja said in an interview with the Record. “It finds typos in your code, it finds all kinds of things and it has accelerated analysis by a lot. There’s fancy graphics and stuff that would take literally weeks to figure out how to do and you can get it done in a day with AI.”
The most popular use of AI was to “explain concepts,” with 73 percent of AI users reporting having done so. Other notable uses included finding information (47 percent), summarizing written materials (40 percent), and proofreading (40 percent).
While AI may speed up statistical analysis, recent studies point toward negative neural and behavioral consequences of using large language models (LLMs), particularly for writing. Additionally, 17 percent of survey respondents admitted that they believe they have violated the Honor Code, which the College enforces to prevent plagiarism. Thirty-three percent did not know what the College’s AI policies were. While the College’s website addresses the emergence of AI, the Honor Code makes no explicit mention of navigating LLM programs that pull from opaque and unreliable sources.
“I had students saying, I don’t need to come to class, because I use a ChatGPT Plus feature, and it explains to me all I need to know about your course,” Maroja said. “It resulted in an Honor code [violation], because so many red flags came that I eventually realized that everything was ChatGPT. In one case, my own prompt had been corrected in a PDF by ChatGPT.”
Charlie Tharas ’27 noted the importance of accountability and communication in using AI. “You should be, in general, required to submit a statement that’s like: ‘Here’s what I use AI for, here’s why I think it helped, here’s what I’m concerned about and how it affected my learning,’” he said.
Division 3 majors reported using AI the most (81 percent), followed by students in Division 2 (80 percent) and Division 1 (71 percent).
Twenty-nine percent of varsity athletes reported using AI daily, while 12 percent of non-athletes said the same. Notably, 25 percent of Division 2 majors and 24 percent of Division 3 majors who filled out the survey are athletes, compared to only 14 percent of Division 1 majors.

In addition to academic dishonesty, students expressed concerns about the pedagogical consequences of AI use at the College. “You are here to get a degree,” Catherine Shutt ’26.5 said. “You’re taking these classes. You don’t have to do any of this. And simply taking the easy way out does not benefit you, and really, in the end, [it]… is not going to benefit society.”
Texas Lopez ’29 shared Shutt’s concerns. “I’ve hardly ever touched AI,” he said. “I think in 2023, when image generating was first coming up, I messed around about a bunch, but never for school. I think it’s unethical, and kind of lazy. At my sister’s college, a 30,000 person school, people get by with using AI, and the teacher never really catches it.”
Both Lopez and Shutt also noted concerns about the environmental impact of AI usage. “AI takes up a lot of energy, and energy infrastructure in the U.S. isn’t great,” Lopez said. According to an MIT study, data center energy usage nearly doubled between 2022 and 2023, and OpenAI’s GPT-3 training process emitted over 500 tons of CO2 and consumed 1,287 megawatt-hours of electricity, enough to power roughly 120 homes for a year.
Despite its drawbacks, many students said they see AI as a net positive for their education. Sixty-three percent of AI users said generative AI has increased their understanding of course material, 56 percent said it decreased the time they spent on assignments, and 46 percent said it improved the timeliness of their assignments. Seventy-nine percent of respondents said generative AI has made no change to their grades.
According to Professor of Economics Anand Swamy, the key to managing AI is to keep it in the passenger seat. “I consider AI a fantastically energetic, but also unhinged, research assistant,” he wrote in an email to the Record. “I use it a bit, but always double-check what it gives me against a reliable source. That’s become easier, because it often provides citations now. It’s had modest benefits for my own productivity, but I do see that many informed people expect it to be transformative.”
Tharas said he believes that the prevalence of AI is inevitable. “At the end of the day, once you get a job, you use AI, and you get yelled at if you don’t use AI, and you will do poorly if you don’t use AI,” he said. “So the full ‘anti-AI’ attitude is not gonna help people in the long run, as much as [AI] is very deeply scary.”
The future of AI is uncertain. Some believe it heralds a more advanced future, and some see its boom as another bubble. Regardless, its widespread adoption is undeniable across institutions, from defense contractors to universities. The truth is, no one is quite sure whether, when, or how a bubble might burst. “I am cautiously skeptical, time will tell,” Swamy said.
Response rates were roughly equal across class years, at 31, 22, 21, and 25 percent in order of decreasing seniority.
Of the respondents, 53 percent identified as female, 41 percent as male, 5 percent as nonbinary, and 1 percent as another gender.
Students identifying as white made up 58 percent of respondents, greater than the 47 percent of enrolled students reported in last year’s Common Data Set. Nine percent of respondents identified as Hispanic/Latinx, lower than the 14 percent of enrolled students last year. All other racial self-identifications roughly corresponded to the composition of the student body as a whole.