Across the board, USC students, faculty and staff say AI is undermining academic integrity

Debates arise around the effectiveness of OpenAI’s partnership for long-term student learning.

An open ChatGPT window next to the Common App window on a student's computer on May 1, 2023. (Photo by Ethan Huang)

USC released findings of its AI Survey on Thursday afternoon, noting that students and faculty are concerned about the potential negative impact of adopting AI at the university.

Since USC announced the AI partnership in early November, which provides faculty and students with free access to GPT-5 and more advanced AI models, there has not been a significant shift in attitudes or usage among students and faculty.

“I frame my discussion of the policy in my first class in opposition to the university’s policy and widespread endorsement and investment in generative AI,” said assistant English professor Harry McCarthy. “I think that that’s undermining the educational mission.”

In a statement from the USC Office of Academic Integrity, the University says that it is up to each instructor “if, and under what circumstances students may use [generative AI].”

“Students must appropriately cite materials or work obtained, edited or influenced by external resources, including ChatGPT or other Generative AI tools,” the statement read.

The USC AI survey found that faculty, students and staff expressed concerns about the potential negative ramifications of AI use in the classroom.

Across subjects, the survey found that many fear new AI models will foster harmful relationships with the tools, leading students to rely on them for information, minimizing critical thinking and compromising academic integrity.

“Once you get an actual job outside of college, you can’t just ChatGPT everything,” said Arleen Moran, a freshman majoring in human biology. “Especially for me, I can’t be with a patient one day and then just ask ChatGPT, ‘What should I do next?’”

Students like Moran fear that the University’s “promise” of advancing student learning with the new AI policy will instead impair students’ academic performance.

“A lot of the texts [in class] are very old, but I believe that the best way to overcome that difficulty is to practice close and attentive reading and to share uncertainties with one another,” said McCarthy. “These tools that the university seems to be applying more and more money into seem to be short-circuiting that or stripping those opportunities away.”

Yet, there are Trojans in favor of the new AI policy. Many believe that when used correctly, AI can be an effective tool for student learning.

“Well, honestly, I feel like regardless of an English class or a regular major class, like [Biomedical Engineering] for me, people can take advantage of AI in the same way,” said sophomore biomedical engineering major Aiden Benmoshe.

Angel Hwang, a researcher at the USC Center for AI in Society, shared her excitement for the new policy.

“I don’t think I can really speak for everyone, but I think the new policy is kind of pushing and encouraging people to have all these conversations about how AI should be used and to what extent in classrooms,” she said.

Although members of the USC community have voiced concerns that the recent investment could erode critical thinking, the university’s goal in introducing the tool was to enhance student learning.

The partnership aims to give students a space — as well as professional guidance — on how to effectively incorporate AI into their critical analysis of learning, said Leah Belsky, head of education at OpenAI, at the USC AI Summit in November.