USC senior Alex Wong expected to breeze through his final semester of Zoom University, but when classes began, he found himself faced with something he never expected from his STEM degree: open-note essay tests.
“I hate essay tests,” Wong said. “You can’t do essay questions in anatomy.”
Wong, who is on the pre-med track, said two of his biology professors have opened multiple lectures by complaining about USC’s discontinuation of Respondus Monitor, exam proctoring software that uses artificial intelligence to combat cheating on online tests. Without the ability to monitor students’ browsers, some professors are switching their tests to an open-note, short-answer format.
Wong said his anatomy professor was especially frustrated at the prospect of having to grade hundreds of tests by hand. “The sudden discontinuation [of multiple-choice tests] is really unfair to professors, especially those who have test banks they have been using for years,” Wong said.
USC Provost Charles Zukoski announced the change in a memo to the faculty on Jan. 26, two weeks into the spring semester. Zukoski cited “concerns about fairness and privacy” during the fall semester as the reason for the software’s discontinuation, though Wong said that when his professors asked for specifics, they “never got a straight answer.”
Wong’s two professors did not respond to Annenberg Media’s request for comment.
According to the memo, USC has reached out to Respondus regarding the university’s concerns about privacy, and the company is “working on modifying and enhancing their software to address these issues.” If those changes are made, the memo stated, the university may consider re-implementing Respondus at a later date.
Respondus Monitor works to prevent cheating by locking students’ browsers, tapping into their cameras and microphones, and recording them as they take their tests. If the program sees the student looking away from the exam or hears any noise, it “flags” the disruption so professors can review the footage later.
The software uses a combination of machine learning, AI and biometrics, including facial recognition, facial detection and eye tracking. Many systems like Respondus Monitor also require students to scan their entire surroundings to complete a “room scan.”
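To illustrate what that “flagging” looks like in practice, here is a minimal rule-based sketch of the general approach such tools describe. The field names, thresholds and `flag_events` function are hypothetical and written purely for illustration; they are not Respondus Monitor’s actual code.

```python
# Illustrative only: a generic rule-based flagging loop of the kind
# described above, NOT Respondus Monitor's actual implementation.
from dataclasses import dataclass

@dataclass
class FrameObservation:
    timestamp: float      # seconds into the exam
    face_detected: bool   # result of a hypothetical face-detection step
    gaze_on_screen: bool  # result of a hypothetical eye-tracking step
    audio_level: float    # microphone loudness, scaled 0.0-1.0

def flag_events(observations, audio_threshold=0.6):
    """Return (timestamp, reason) pairs a professor might be asked to review."""
    flags = []
    for obs in observations:
        if not obs.face_detected:
            flags.append((obs.timestamp, "no face in frame"))
        elif not obs.gaze_on_screen:
            flags.append((obs.timestamp, "looking away from exam"))
        if obs.audio_level > audio_threshold:
            flags.append((obs.timestamp, "noise detected"))
    return flags

# Example: a roommate talking at the 95-second mark gets flagged twice,
# once for the averted gaze and once for the noise.
sample = [
    FrameObservation(30.0, True, True, 0.1),
    FrameObservation(95.0, True, False, 0.8),
]
print(flag_events(sample))
```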
However, Respondus doesn’t always get it right. For some USC students – like business administration major Sydney Leet – the software’s imperfections became a source of anxiety.
“When I was living at home last semester, I would have to tell everybody in my house, like my parents, to be quiet because I didn’t want to be flagged,” said Leet, who used Respondus in one of her economics classes.
Human biology major Maddie Walker said Respondus created a similarly stressful environment for her.
“I’m usually more concerned and focused on the Respondus rather than my test,” Walker said. “I have five roommates, so I’m always noticing how loud they are because I’m worried that I’ll get flagged.”
The software has also caused students’ computers to crash mid-test, something that happened to Leet last semester. Business administration major Maya Miro said her professors did not use Respondus for this reason.
Others cite privacy issues as a reason not to use the software. A recent petition at the University of Guelph in Canada has gathered almost 5,000 signatures calling for the university to ban the “invasive spyware.”
Douglas Becker, an assistant professor of international relations, environmental studies and political science at USC, said he thinks these privacy concerns are “quite legit,” which is why he hasn’t used Respondus in his classes. He also cited several complaints that AI monitoring software like Respondus perpetuates racial and gender biases.
A major source of the perceived bias is the training data used to build the AI. The software is trained on more than 14 million labeled images from ImageNet, but certain groups are over-represented: the United States accounts for only 4% of the world’s population but 45% of ImageNet’s data, while China and India together account for just 3%, according to a study by Cornell University. That lack of diversity creates algorithmic bias that can lock students of color out of their exams because the AI fails to recognize their faces.
While he doesn’t disagree with the university’s decision, Becker is concerned for his colleagues who relied on the software to give exams. USC needs to make it a priority, he said, to come up with “a way to monitor without using this program that is clearly problematic.”
Some professors have beaten USC to that goal. Professor Leslie Berntsen of the Keck School of Medicine has seen the pandemic as a “chance to think bigger and more creatively when it comes to assessment.” Berntsen has designed take-home exams whose answers students can’t simply Google, because the questions require a deeper understanding of the material.
“I’ve always tried to make my classes more about the process of learning than a grade,” Berntsen said. “I want to ensure that my classes are as equitable and accessible as possible, especially in the midst of – insert exaggerated air quotes – our unprecedented times.”
But Wong’s professors haven’t yet figured out a solution of their own, making his final semester feel much more daunting.
“I was just starting to get into a rhythm with online classes and what to expect when it came to lectures and tests before it suddenly changed,” Wong said. “I just want to graduate in peace.”