Students Accuse The University Of Miami Of Using Facial Recognition To Identify Student Protesters. The University Denies It.

Last month, Esteban Wood and eight other University of Miami students received an ominous email. The message only contained a Zoom link and a one-sentence explanation: Dean of Students Ryan Holmes wanted to discuss the “incident that happened on September 4, 2020 at the Whitten University Center.”

Wood and the others attended a peaceful protest that day against the university’s reopening plan, but no one knew why Holmes wanted to talk to them. The message was vague, but what was even more confusing was the choice of recipients. None of the students actually organized the protest. Three were student journalists who covered the demonstration. And only two were part of an activist student group, UMiami Employee Student Alliance, that participated.

“We got to thinking, how did they choose those nine students?” Wood, a member of UMESA who received the email, told Forbes.

The University of Miami said in a statement it “does not utilize facial recognition technology”—but students and digital rights non-profit Fight For the Future aren’t so sure. They cite the campus police chief’s resume, which says the university has an extensive camera system that uses “sophisticated algorithms” for “motion detection, facial recognition, object detection and much more.” They also point to an interview with a student magazine earlier this month in which Chief David Rivero said his department used Florida Department of Law Enforcement facial recognition software to catch a burglar at a fraternity house.

“It sure seems like the University of Miami is using facial recognition to target and intimidate students who are exercising their First Amendment rights,” said Lia Holland, a Fight For the Future organizer, in a Medium post. “If that’s not the case, then they need to tell their Chief of Police to stop claiming they use this technology and ban facial recognition from their campus entirely.”

In an interview with Forbes, Rivero said he was “misleading” on his resume and has since fixed it. He said he was listing the possible uses of the university’s security camera system, but it doesn’t have facial recognition capabilities or a database of photos to compare video footage to, despite the reference on his resume.

“In order to have the cameras be able to use facial recognition, they have to be positioned at the right angles. Our cameras are on top of buildings and in hallways,” he said. “They’re not positioned in order to be able to maximize facial recognition.”

Rivero said the students emailed by the dean about the protest were identified using video footage and “basic investigative techniques,” which he declined to detail.

Anh Nguyen, an assistant professor of computer science at Auburn University, told Forbes that even if the cameras were positioned at the “wrong” angles, that alone wouldn’t rule out the use of facial recognition.

Rivero added that campus police have submitted requests to use software from the Florida Department of Law Enforcement in the past, but that system only compares campus surveillance footage to a database of arrest photos.

“Why would I need to go to FDLE if we had facial recognition?” Rivero said.

Rivero said he isn’t a believer in facial recognition technology. The University of Miami has tested facial recognition software before, he said, but decided not to move forward with it.

“I know that facial recognition doesn’t work, so why would I spend money attaching a system to my cameras that does not work?” he said.  

In the meeting with the student protesters, Holmes admitted to asking campus police to compile a list of students who attended the protest. He told student newspaper The Miami Hurricane the meeting was only meant to be an “educational conversation after the fact” about proper event registration. The demonstration technically violated university guidelines because organizers didn’t formally reserve space for the protest through the Dean of Students office.

None of the students involved were disciplined, but Wood said he still feels his privacy has been violated. 

“Students are very fearful that they’re going to be called into the Dean of Students and be talked down to, lectured to and feel as if they’re an other of the community,” Wood said. “To be called out like that has damaging effects.”

Much of the attention surrounding facial recognition has focused on public entities, such as law enforcement. San Francisco, Oakland and Boston have banned city agencies and police departments from using the technology. Portland took it a step further and barred corporate use in public areas. Microsoft and IBM said they would no longer sell their facial recognition technology to police departments. Congress is weighing regulation for the federal government.

The extent to which businesses and private universities, which aren’t subject to public records laws, use facial recognition is more of a mystery.

The University of Southern California is one of the few schools to openly acknowledge using facial recognition, which it deploys to stop intruders from entering dorms. Meanwhile, more than 60 major institutions, including MIT, Harvard, UCLA and Vanderbilt, have pledged not to use the technology, according to Fight For the Future. Some campuses, including the University of Minnesota and Southern Methodist University, held trial accounts with controversial facial recognition startup Clearview AI but did not pursue full contracts, BuzzFeed News reported.

Colleges say facial recognition can be used to stop school shootings, catch criminals or offer conveniences, such as scanning your face to enter a dorm or pay for food. But critics argue the technology isn’t reliable enough, especially because studies show algorithms have a tougher time identifying people with darker skin. And even if it were 100% accurate, the potential for abuse is too high in the absence of federal policy governing its use, said Fight For the Future Campaign Director Caitlin Seeley George.

“That’s part of the problem with this technology,” she said. “You can say that you’re using it for one thing, but as soon as it’s put in place and as soon as it’s normalized, it can be used for a much broader range of concerning reasons.”
