
Tackling controversial technology face by face

University of Miami researchers are trying to improve how facial recognition software “sees” race and ethnicity.
Ahzin Bahraini, a graduate sociology student, sifts through surveys filled out by sociology student raters who are identifying how they perceive mug shots. Photo: TJ Lievonen/University of Miami

Last month, the city of San Francisco banned the use of facial recognition software by police and other agencies. This week federal lawmakers are questioning the FBI and its use of the controversial program. And in New York, some public school districts are expected to begin experimenting with the use of facial recognition to identify potential threats.

Across the country, serious questions abound about how and when the technology should be used, and whether the positives, such as identifying a suspect or preventing a terrorist attack, outweigh the negatives, such as the infringement on privacy rights.

“We have recognized that these technologies are going to be used in one way or another, so why not try to improve it and hold it to a higher standard if they’re going to be implemented anyway,” said Nicholas Petersen, assistant professor of sociology in the University of Miami College of Arts and Sciences.

Petersen and colleague Marisa Omori, assistant professor of sociology in the College of Arts and Sciences, have combined computer science with sociology and law in a research project, supported by the University of Miami Laboratory for Integrative Knowledge (U-LINK), that will help determine how physical characteristics and facial recognition software influence criminal justice outcomes.

The team is developing a machine learning model that can test whether the skin tone and other facial features of criminal suspects lead to unequal punishment outcomes in Miami-Dade County’s criminal justice system. As the researchers note, facial recognition algorithms are typically trained on data from white faces, increasing the likelihood that darker faces will be flagged as “suspicious.”
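As a rough illustration of what such a test could look like, the sketch below regresses a binary punishment outcome on a rater-assigned skin-tone score. The data, column names, and the choice of logistic regression are illustrative assumptions, not the team’s actual model.

```python
# Hypothetical sketch: does a rater-assigned skin-tone score predict a
# binary punishment outcome (here, pretrial detention), controlling for
# prior record? Data and column names are invented for illustration.
import pandas as pd
import statsmodels.api as sm

df = pd.DataFrame({
    "skin_tone":     [2, 7, 9, 3, 8, 1, 6, 9, 4, 7],  # 1 = lightest, 10 = darkest
    "prior_arrests": [0, 2, 0, 1, 3, 1, 0, 2, 0, 1],
    "detained":      [0, 1, 0, 0, 1, 1, 0, 1, 0, 1],  # 1 = held before trial
})

# Fit a logistic regression with an intercept; the sign and p-value of
# the skin_tone coefficient indicate whether darker-rated faces fare
# worse once prior arrests are held constant.
X = sm.add_constant(df[["skin_tone", "prior_arrests"]])
result = sm.Logit(df["detained"], X).fit(disp=False)
print(result.summary())
```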

“Recent studies have shown that darker and more feminine faces perform worse on facial recognition software,” said Rahul Dass, a computer science Ph.D. student who is working on the project. “Furthermore, if the benchmark datasets that face recognition software uses for training contain underrepresented demographics, then those racial groups will invariably be subjected to frequent targeting.”
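Dass’s point about skewed benchmarks can be made concrete by breaking a model’s error rates out by demographic group, a standard audit step. The sketch below uses made-up predictions and group labels.

```python
# Hypothetical audit: compare a face matcher's accuracy and false-positive
# rate across two demographic groups. All arrays are invented.
import numpy as np

y_true = np.array([1, 0, 1, 1, 0, 0, 1, 0, 1, 0])  # 1 = genuine match
y_pred = np.array([1, 0, 0, 1, 1, 0, 1, 1, 0, 0])  # model's decision
group  = np.array(["A", "A", "B", "B", "B", "A", "A", "B", "B", "A"])

for g in np.unique(group):
    mask = group == g
    acc = (y_pred[mask] == y_true[mask]).mean()
    # False-positive rate: non-matches the model wrongly flags as matches.
    negatives = mask & (y_true == 0)
    fpr = (y_pred[negatives] == 1).mean()
    print(f"group {g}: accuracy={acc:.2f}, false-positive rate={fpr:.2f}")
```

If one group’s false-positive rate is markedly higher, that is exactly the “frequent targeting” Dass describes.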

Researchers are currently training student raters to classify facial features from a sample of arrest mug shots of defendants drawn from a study that Petersen and Omori conducted last year for the American Civil Liberties Union of Florida. The study found that blacks, particularly black Hispanics, are over-represented relative to their share of the population at every stage of the criminal justice system, from arrest and pretrial detention to sentencing and incarceration.

Ahzin Bahraini, a graduate sociology student, has created a survey that is being filled out by sociology student raters who are identifying how they perceive each mug shot. Those ratings are then fed into the machine learning model as training data, teaching the software to evaluate faces along the same dimensions the human raters perceive.
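One plausible way to turn many raters’ answers into a single training label per photo is to aggregate them, averaging numeric scores and taking the most common categorical answer. The schema and values below are hypothetical.

```python
# Hypothetical sketch: collapse several raters' perceptions of each mug
# shot into one label per photo. Column names and values are invented.
import pandas as pd

ratings = pd.DataFrame({
    "mugshot_id":     [101, 101, 101, 102, 102, 102],
    "rater_id":       [1, 2, 3, 1, 2, 3],
    "skin_tone":      [7, 8, 7, 3, 2, 3],  # rated on a 1-10 scale
    "perceived_race": ["Black", "Black", "Black",
                       "Hispanic", "White", "Hispanic"],
})

# Average the numeric scores; take the modal categorical label.
labels = ratings.groupby("mugshot_id").agg(
    skin_tone=("skin_tone", "mean"),
    perceived_race=("perceived_race", lambda s: s.mode().iloc[0]),
)
print(labels)
```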

“What we’re doing is novel,” said Bahraini. “In terms of machine learning, nothing has looked at facial feature breakdown. A lot of it has been focused on skin color in the past. We are breaking it down by nose, lips, eyes, tattoos, everything. Until now, it has been one plus one equals two for machine learning. We are bringing multiplication into the mix. Now the machine learning has a whole other layer of depth.”
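One reading of Bahraini’s “multiplication” metaphor is an interaction term, in which the effect of one facial feature depends on another rather than simply adding to it. A minimal sketch, again with invented features and data:

```python
# Hypothetical sketch: "multiplying" features as an interaction term, so
# the effect of skin tone can differ for faces with and without tattoos.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.DataFrame({
    "skin_tone":  [7, 8, 3, 9, 7, 2, 9, 1, 6, 4],
    "has_tattoo": [1, 1, 1, 1, 1, 0, 0, 0, 0, 0],
    "detained":   [1, 0, 1, 1, 0, 0, 1, 1, 0, 0],
})

# "skin_tone * has_tattoo" expands to both main effects plus their
# product, instead of treating the features as purely additive.
model = smf.logit("detained ~ skin_tone * has_tattoo", data=df).fit(disp=False)
print(model.params)
```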

Petersen points out that machine learning is only as good as the data fed into it.

“A lot of people take for granted that these algorithms are going to come up with the correct answer. We want to make sure that our machine learning model will have enough input on each ethnicity so we don’t run into an error,” he said.
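A check like the one Petersen describes can be as simple as counting training examples per group before fitting anything; the table and threshold below are invented for illustration.

```python
# Hypothetical sanity check: flag demographic groups with too few
# training examples before the model is fit.
import pandas as pd

train = pd.DataFrame({
    "perceived_race": ["Black", "Black", "White", "Hispanic",
                       "White", "Black", "Hispanic", "White"],
})

MIN_PER_GROUP = 3  # illustrative floor, not a real guideline
counts = train["perceived_race"].value_counts()
print(counts)
for grp, n in counts.items():
    if n < MIN_PER_GROUP:
        print(f"Warning: only {n} training examples for {grp}")
```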

“On one hand, we know this software could be very useful. The reality is that there are people who do really awful things. If machines can help lead to an arrest or help prevent a threatening situation, that could be important,” he said. “On the other hand, it brings about privacy concerns and individual rights and liberties.”

In addition to Petersen and Omori, other faculty members who are part of the U-LINK project “Race and Facial Profiling” include Ubbo Visser, associate professor in the Department of Computer Science; Tamara Lave, professor in the School of Law; and Cameron Riopelle, data services librarian.

