The increase in facial recognition technology on campuses will leave students of color vulnerable to greater scrutiny and discrimination. Credit: Shutterstock

Students, I have bad news. Yes, more bad news. In the midst of the global pandemic and the largest civil rights movement in American history, there is one more thing you should be worrying about. And it is downright Orwellian. 

Due to COVID-19, universities are facing an unprecedented dilemma. They have a massive financial incentive to hold an in-residence semester: Universities desperately need the revenue from full tuition as their athletics departments and health systems lose millions of dollars. College towns are suffering enormous economic damage. In Ann Arbor, many popular businesses have closed their doors for good. Meanwhile, students are sick of being home and miss their friends and social life. 

But universities have a moral and public health obligation to protect their communities, and holding in-residence instruction is all but guaranteed to further accelerate disease spread. COVID-19 cases and deaths are rising, with no vaccine or effective treatment in sight. And by definition, campus life is not socially distanced: Students live in group housing, eat in communal dining halls and study in the library. And to many, partying is a crucial element of college, so the temptation to socialize will be hard to resist for even the most public-health-conscious students.  

University leaders are trying to navigate this by implementing a variety of surveillance technologies when students come back to campus. Testing and contact-tracing technologies, the thinking goes, could help to trace and control disease outbreaks. Unfortunately, the side effect is aiding the growth of the surveillance state: that is, creating an infrastructure that tracks, records, and analyzes virtually all student movements. This doesn’t just eliminate your privacy. It has plenty of other hidden consequences too.  

For the past year, I have worked on the University of Michigan’s Technology Assessment Project. Using historical case studies, we aim to predict the potential consequences of new technologies before they occur so that negative outcomes can be mitigated or eliminated entirely. Our report found that the history of surveillance technologies gives us important insights into the current moment: specifically, they tend to magnify existing biases and further disadvantage already vulnerable populations, normalize surveillance, commodify our data, and erode our privacy.  

Consider facial recognition. In recent years, universities have installed cameras on campus to enhance school security, take attendance, and allow students to pay for food. Now, some of the companies providing this technology are getting into the COVID-19 game.

PopID, a California-based company, is already offering employers PopEntry+, which uses facial recognition and a thermal camera to determine whether users have fevers before granting (or denying) entry to work. Considering PopID’s many campus ties, it is easy to imagine that it could introduce facial recognition symptom monitoring tech at California universities soon.  
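
To make the mechanics concrete, here is a minimal sketch in Python of how such a face-plus-fever gate could work. This is an illustration, not PopID's actual implementation: the function names, the toy embedding matcher, and the 100.4°F cutoff (the CDC's conventional fever threshold) are all assumptions made for explanation.

from typing import Dict, List, Optional

FEVER_THRESHOLD_F = 100.4  # CDC's conventional fever cutoff; real products may differ

def match_face(probe: List[float], enrolled: Dict[str, List[float]],
               max_distance: float = 0.6) -> Optional[str]:
    """Toy matcher: nearest enrolled face embedding within a distance cutoff.
    Real systems use learned embeddings; this stands in for that step."""
    best_id, best_dist = None, max_distance
    for person_id, embedding in enrolled.items():
        dist = sum((a - b) ** 2 for a, b in zip(probe, embedding)) ** 0.5
        if dist < best_dist:
            best_id, best_dist = person_id, dist
    return best_id

def should_grant_entry(probe: List[float], thermal_reading_f: float,
                       enrolled: Dict[str, List[float]]) -> bool:
    """Grant entry only if the face is recognized and no fever is detected."""
    person_id = match_face(probe, enrolled)
    if person_id is None:
        return False  # unrecognized faces are simply locked out
    return thermal_reading_f < FEVER_THRESHOLD_F

Note the default: an unrecognized face is treated the same as a fever and denied entry.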

But PopEntry+ will likely disproportionately burden vulnerable populations. Facial recognition is consistently less accurate at identifying people of color, women, disabled people, and trans and gender-nonconforming people. As a result, these students may get locked out of class even if they are COVID-19-negative. Or, in the case of contact tracing, the technology would be less able to notify these students if they've been exposed to COVID-19. This could automate and exacerbate the already disproportionately high rates of the disease among minorities, as well as existing inequalities in academia. Furthermore, we know that data from surveillance technologies tends to be used to disproportionately surveil and criminalize these vulnerable populations. Unless universities adopt clear policies governing data use, the same is likely to happen here too.

You should also worry about other surveillance technologies universities are mobilizing for their COVID-19 response. The University of Alabama's and the University of Arizona's "exposure alert" apps, developed by COVID Watch, would use Bluetooth signals to alert users if one of their "close contacts" (within 6 feet for more than 15 minutes) later tested positive for the virus. The University of California, Irvine will use the campus's wireless network: By measuring how many devices are connected to particular WiFi hotspots at any given time, administrators can determine which parts of campus are not properly socially distanced and can alert people who crossed paths with someone who tested positive for COVID-19.
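
To see how simple the underlying logic is, here is a minimal sketch of the 6-feet/15-minutes rule in Python. It is an illustration under stated assumptions, not COVID Watch's actual code: real exposure-notification apps exchange rotating anonymous tokens and only estimate distance from Bluetooth signal strength, and every name below is hypothetical.

from dataclasses import dataclass
from typing import List, Set

CLOSE_CONTACT_FEET = 6.0      # distance threshold from the rule above
CLOSE_CONTACT_MINUTES = 15.0  # duration threshold from the rule above

@dataclass
class Encounter:
    other_token: str        # anonymous identifier broadcast by the other phone
    est_distance_ft: float  # estimated from Bluetooth signal strength
    duration_min: float     # how long the two phones stayed in range

def should_alert_user(my_encounters: List[Encounter],
                      positive_tokens: Set[str]) -> bool:
    """Alert the user if any logged encounter with a device that later
    reported a positive test meets the close-contact rule."""
    return any(
        e.other_token in positive_tokens
        and e.est_distance_ft <= CLOSE_CONTACT_FEET
        and e.duration_min >= CLOSE_CONTACT_MINUTES
        for e in my_encounters
    )

Even this toy version makes the privacy cost visible: to work at all, the phone must keep a running log of who you were near, roughly how close, and for how long.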

Many universities are stopping short of location tracking but will still require students, faculty, and staff to provide detailed health data daily. Both the University of Alabama and Duke will require all students and employees to answer daily health questionnaires on symptom-tracking apps before allowing them to attend school or work. The University of Kansas will take this a step further by linking responses to its symptom-checking app with building access.

This mandatory collection of sensitive data will teach students that it is normal and unremarkable to be constantly observed, and that not even their health information or location should be private. This is a dangerous precedent that could easily be abused and expanded further into daily life. The constant surveillance of students also produces negative psychological outcomes: We've found that increased surveillance leads students to report higher rates of anxiety and feelings of powerlessness and resentment toward their schools. When students don't feel safe or comfortable, their education suffers.

Some students may say, hey, I'm a law-abiding citizen and my day-to-day activities are pretty uninteresting. Who cares if my university knows where I'm going or what I'm doing? The problem isn't just what information is collected, but what can be done with it. Once surveillance infrastructure is in place, even if its original purpose was innocent, universities could use it to infringe on our civil liberties and privacy, to suppress political dissent (considering the current Black Lives Matter protests, this should concern you), and to criminalize everyday activities.

There are other costs to losing control of your personal data, too. The companies that make and maintain these surveillance systems are likely to sell our data without our explicit consent: data markets are highly lucrative, and the United States currently has no federal data security requirements governing what companies can do with our information. And your location history, faceprint, body temperature, and health and COVID-19 history, whether held by companies or by the universities themselves, are vulnerable to breach. Are you comfortable giving up all of this data when it might be bought and sold without your knowledge or consent, and not even kept safe?

Many may argue that desperate times call for desperate measures. COVID-19 is exploding across the United States, getting worse, not better, and we need to control this virus without sacrificing higher education institutions across the nation.

Sure, if we knew that contact-tracing surveillance methods, along with all of their consequences, would be rolled back as soon as life returned to normal, I might accept them as a temporary drastic measure. But more often than not, sexy new technologies like these become entrenched, regardless of their effectiveness. There is always a bias toward more technology rather than less, and there are often financial barriers to rolling back technology once it is implemented.

And even if universities were to dismantle surveillance systems after the pandemic, significant damage would already have been done: Students of color would have received a lower quality of contact tracing than their white counterparts while being subjected to greater scrutiny and discrimination. Sensitive health, location, and biometric data would have been collected from students and leaked or sold. A culture of surveillance would have expanded, setting a dangerous precedent against privacy that exacerbates student anxiety and degrades education. For these reasons, I urge you: Before embracing these sexy, quick-fix technological solutions, think about whether you're comfortable letting your college experience become a "Black Mirror" episode, complete with location tracking in the libraries, facial recognition on frat row, and data breaches in the dining halls. And if you aren't, fight back.

Claire Galligan is a recent graduate of University of Michigan’s Ford School of Public Policy. She was a researcher in the Technology Assessment Project, part of the university’s Science, Technology, and Public Policy Program.
