Football games are a city-wide event at Clemson University. Every Saturday, rain or shine, students, alumni, faculty and residents flood campus to worship their top-ranked team. The scent of beer, southern barbecue and boiled peanuts wafts through a sea of vibrant orange and purple. Meanwhile, inside their 80,000-seat stadium, blaring marching band horns and intoxicated chants express Clemson’s passion for football and hometown spirit. Normally, this would be a paradise for fans.

But it was all a distant memory this fall—no fans, no beer, no barbecue. Birds singing from trees replaced the stereos blasting country music. Squirrels replaced families of tailgaters. Game day filled fans with an overwhelming sense that something was missing.

As the pandemic dragged on, the university doubled down on a controversial solution: working with two of the world’s largest corporations, Apple and Google, to link smartphones with personal health.

The idea seemed harmless—Apple and Google would use their widely distributed devices to help track who carries the virus, and encourage people interacting with those carriers to get tested. Clemson would act as a test site. If proven effective, this technology would likely expand to the rest of the country.

“We were very interested in tools that could help reopen the state,” said Dr. Leslie Lenert, the director of biomedical informatics for the Medical University of South Carolina, which is collaborating with Clemson on the project.

Professor Lenert had been working closely with Clemson to reduce COVID-19 infections. “One of the obvious ways to do that was to accelerate the reporting of [infected] contacts,” he said, sitting before a wall of meticulously arranged diplomas. “It’s very difficult for [infected people] to go back in time and tell everyone they were near they might have been infected with COVID-19.”

But, while Lenert saw it as a thoughtful, safe, clear-cut plan, others weren’t buying it.

“The understanding is ‘if you don’t have anything to hide, then you don’t have anything to be afraid of,’ which I believe is a fallacy,” said Clemson English professor Jan Holmevik. “Yes, while I think it is a great tool to use to help limit the spread of COVID-19, it has that backside that has pretty serious implications to privacy.”

When the pandemic banished Holmevik to his rustic home office, weeks turned into months. He began progressively upgrading his basic setup until it became a professional-grade, Zoom-ready workstation complete with a studio-grade condenser microphone and high-resolution external webcam.

The Clemson English professor and avid Apple fan began speaking against digital contact tracing and exposure notification. “What is one person’s privacy if you can save two other people? That is a false narrative,” he argued. “You don’t have to sacrifice privacy to save lives.”

Holmevik believes this is only the beginning. The app, in this case, is the foot in the door. Once people accept a seemingly harmless intrusion in the name of public safety, they may be more likely to accept solutions that further compromise privacy. As Holmevik puts it, “Squeeze the toothpaste out of the tube and try to get it back in.”

Reflecting on Clemson University’s involvement in this app, Holmevik scoffed in disappointment. “When we’re dealing with things like the pandemic and people dying all over the place, it’s very easy to resort to knee-jerk reactions,” Holmevik said as he leaned closer to his microphone. “And that’s when you sacrifice things like privacy for the greater good.”

The Norwegian-born Holmevik consistently follows international responses to the virus. He cites a Norwegian case in which the state is prosecuting Laila Bertheussen, partner of the country’s former justice minister, for arson and other alleged crimes. She was accused of setting her own car on fire to frame an anti-racist theater group. Using data from her health app, the prosecution claimed her step count was inconsistent with her defense.

“There are countries in the world today that use face-recognition technology and other personal tracing technologies to monitor their citizens and what they say,” Holmevik said. “Now we’re moving into very dangerous territory.”

Holmevik mentions China, which is widely known for its internet censorship. “That information is being collected to keep you subdued,” he said. “If you post something critical about the party on a forum or online, they can find you and they can come and arrest you.”

Holmevik points to Singapore as a cautionary tale. There the government used GPS, credit card records and surveillance cameras to build a public website listing the age, gender, occupation and up to two weeks of retraced movements of coronavirus carriers. And, one day after California issued state-wide stay-at-home orders, Singapore launched a contact tracing app called TraceTogether, which utilized technology similar to Apple/Google’s.

Critics like Holmevik fear the worst. Imagine being exposed to COVID-19 and having your age, gender and occupation broadcast to your neighborhood. Depending on where you live, it may not be too difficult for neighbors to narrow down whether or not you’re the carrier. What would they think about you? What judgment might follow? Is your privacy worth others’ safety?

Singapore’s TraceTogether and Clemson/MUSC’s app are incredibly similar. Despite this, Clemson/MUSC officials say they are not doing contact tracing.

Considering Apple/Google’s original intentions, this is especially confusing to critics, but not to Jim Pepin, Clemson University’s chief technology officer. In Clemson’s case, he said, unlike contact tracing, “you don’t know where the people are, you don’t know who they are.”

Pepin sat in front of a long, wooden bookcase filled with various knickknacks, staring intently at a screen filled with bar graphs illustrating how many COVID-19 tests the school has conducted. The graphs contained the number of positive tests, negative tests and total tests; there was no sign of individual names or identification codes. Politicians, he said, “didn’t understand the anonymization or that it was a notification system, not a contact tracing system. We’re now in the middle of a political [debate] just so we can use it,” he explained.

The app utilizes Bluetooth, the short-range wireless technology that connects your iPhone to your AirPods. Through it, the app exchanges anonymous signals with other phones within a certain proximity, logging each phone that comes near. Later, if someone reports a positive COVID-19 test via the app, phones that were previously within range of that carrier’s phone are notified. Since most smartphones have some form of Bluetooth, it’s the perfect connection to tie the entire system together.

Additionally, all the logged interactions are stored locally on individual phones. “The whole point of this type of technology is to create a system where there is no central database,” Lenert explained. He said this was meant to reduce or negate the potential for hacking and retrieving the list of interactions.
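The mechanism Lenert describes can be illustrated with a rough sketch in Python. This is a hypothetical simplification, not the actual Apple/Google protocol (which uses rotating cryptographic keys over Bluetooth Low Energy), but it captures the decentralized idea: phones log anonymous identifiers locally, a positive test publishes only the carrier’s own identifiers, and every match happens on-device with no central database.

```python
import os

class Phone:
    """Toy model of decentralized exposure notification (illustrative only)."""

    def __init__(self):
        # Each phone broadcasts a random identifier, not a name or location.
        self.my_id = os.urandom(16).hex()
        self.seen_ids = []  # log of nearby phones, stored only on this device

    def encounter(self, other: "Phone") -> None:
        # When two phones come within Bluetooth range, each logs the
        # other's anonymous identifier locally; nothing is uploaded.
        self.seen_ids.append(other.my_id)
        other.seen_ids.append(self.my_id)

    def check_exposure(self, reported_ids: set) -> bool:
        # A positive test publishes only the carrier's own identifiers.
        # Each phone compares that list against its private local log.
        return any(i in reported_ids for i in self.seen_ids)

# Usage: alice and bob cross paths; bob later reports a positive test.
alice, bob, carol = Phone(), Phone(), Phone()
alice.encounter(bob)
reported = {bob.my_id}                   # bob's app shares his identifiers only
print(alice.check_exposure(reported))    # alice is notified
print(carol.check_exposure(reported))    # carol never met bob, so she is not
```

Because the matching runs on each phone rather than on a server, no party ever holds the full list of who met whom, which is the property Lenert credits with reducing the payoff of a hack.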

However, these guarantees of confidentiality have been made before, and they’ve been broken time and time again. When it comes to trusting big tech, say critics like Holmevik, it can be difficult given its history. Two years ago, Facebook removed political data firm Cambridge Analytica from the site after it collected and distributed 50 million users’ data. And just last July, a 17-year-old hacked Twitter, gaining access to multiple accounts including those that belonged to Barack Obama, Jeff Bezos and Elon Musk.

Holmevik isn’t the only one against this app. When Dr. Tanju Karanfil, another Clemson professor, began asking his own students if they’d be willing to activate the app, not a single one answered yes. “Gaining their trust is really important and significant here, especially activating something on a phone because there’s already enough going on on our phones,” he said.

As another faculty member on board with the app, Karanfil explained that “if this is not [adopted] by 60% to 70% of people, there’s only so much we can do with it.”

COVID-19 exposure notification apps like Clemson’s are voluntary; this is one of their major draws. However, according to Karanfil, only 30% to 40% of the Clemson University community is willing to actually participate, meaning adoption would need to nearly double for the app to be effective.

Apple/Google made it difficult for Clemson and other participants to modify or repurpose their technology. They were aware of how dangerously powerful it could be in the wrong hands, said Lenert.

“The dark side of this, I fear, is not so much government privacy invasion. It’s much more that this kind of technology has the power to put precision advertising tools out of business,” he explained. Imagine seeing billboards that personalize advertisements depending on who is traveling past them.

In his office on the MUSC campus, Lenert occasionally paused our interview to take a call, then, phone still in hand, he leaned into his webcam, raising his eyebrows and speaking forcefully.

“They wanted to give public health the tools, and only public health the tools,” Lenert said, “but they didn’t want to give them too many tools. We would probably go a little bit further on the privacy revoking side.” If they could, the app would not end with notification, according to Lenert; it would probably be used as a means of proving test results. It would act more like a passport containing health records. However, despite understanding the app’s implications for privacy, Lenert still supports implementing it.

Though Clemson began working on its app near the beginning of the pandemic, the university was still arguing with the state government six months later. Seventeen U.S. states have since implemented Apple/Google’s exposure notification technology, with California the most recent to announce its participation.

In June, uncertain about the app’s privacy protections, South Carolina State Senator Thomas Alexander responded to Apple/Google’s technology. “We just want to make sure if and when that technology is used that people have confidence in the work that’s being done,” Alexander said, “and it’s vetted through the legislative process.”

Back at Clemson, just after dinner, Holmevik sat in his office wearing a gray hoodie. Leaning back, he let out a sigh as he remained unconvinced by the assurance of the app’s promoters. “It’s about waking up,” he said, leaning toward the camera, “and becoming aware of the fact that we gave [privacy] up, and we have to reclaim it.” Privacy, he insisted, cannot be a lost cause.