
New facial recognition software ‘scrapes’ inventory from social media

University of Miami students and professors discuss the issues surrounding Clearview AI, a new software program used by police around the nation.
Lokesh Ramamoorthi, a software engineering and computer science lecturer, addresses his cybersecurity class. Photo: Evan Garcia/University of Miami

When University of Miami junior Hailie Berman needs to use her phone, she simply holds it in front of her face and the device recognizes her and unlocks its contents.

But Berman knows the technology is imperfect: her younger sister can open her phone as well.

This is one reason the health science major has concerns about a new facial recognition program gaining popularity among U.S. law enforcement agencies. The software program, called Clearview AI (for artificial intelligence), aims to help police track down criminals using its own facial recognition algorithm and a massive database of photos it collects from social media.

Clearview uses a process called data scraping to scour the internet for what it states are “public” photos to stockpile its database. This includes gathering images from Facebook, Instagram, Twitter, YouTube, LinkedIn, Venmo, and other social media applications.

To use Clearview AI, police upload a picture or a still from a video of a suspect into the program. The software then checks for matches within its database of more than 3 billion images, and counting. Since a New York Times investigation recently uncovered how Clearview’s software operates, it has been a topic of conversation in the technology industry, and students in the University’s cybersecurity class debated the ethics of the software last week. Clearview AI helped the Indiana State Police find a suspect within 20 minutes of scanning his photo, the Times reported. Clearview customers also include law enforcement agencies in Pinellas County, Florida, and the Atlanta police department, according to the Times.
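The matching step described here, uploading a photo and searching a large database for lookalikes, typically comes down to comparing numerical “embeddings” of faces. The sketch below is a minimal illustration of that idea in Python; the `embed_face` function, the database contents, and the similarity search are hypothetical stand-ins for the example, not Clearview’s actual algorithm.

```python
import numpy as np


def embed_face(image: np.ndarray) -> np.ndarray:
    """Hypothetical stand-in for a face-embedding model.

    A real system would run a neural network that maps a face crop to a
    fixed-length vector; here we simply flatten and normalize the pixels
    so the example runs without any external model.
    """
    vec = image.astype(np.float32).ravel()
    return vec / (np.linalg.norm(vec) + 1e-9)


def search(query_img: np.ndarray, db_embeddings: np.ndarray, top_k: int = 3):
    """Return indices and cosine similarities of the closest database faces."""
    q = embed_face(query_img)
    sims = db_embeddings @ q  # cosine similarity, since all rows are unit vectors
    best = np.argsort(sims)[::-1][:top_k]
    return [(int(i), float(sims[i])) for i in best]


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Pretend database: 1,000 "scraped" photos reduced to unit-length embeddings.
    db_images = rng.random((1000, 32, 32))
    db_embeddings = np.stack([embed_face(img) for img in db_images])
    # Query with a slightly altered copy of photo #42 to simulate a new
    # picture of the same person.
    query = db_images[42] + rng.normal(0, 0.01, (32, 32))
    print(search(query, db_embeddings))
```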

Yet the company’s data collection practice, which remains undisclosed on its website, is raising eyebrows.

“Clearview justifies data scraping by saying that it’s public information and implying that these photos are only being used to help law enforcement. But they can extend that argument for using people’s photos indiscriminately and selling a person’s image to any third party without that person’s consent,” said Petros Ogbamichael, a junior studying biochemistry. “There should be a way to contest this legally.”

Industry experts say the software represents an invasion of privacy because people never consented to Clearview using their photos. And because its algorithm recognizes faces even in the background of images, people can be identified even if they did not know a photo was being taken. If the software falls into the wrong hands, it can easily be abused.

“Think of this being used in countries where democracy is not the norm,” Lokesh Ramamoorthi, a software engineering and computer science lecturer, told his cybersecurity class last week. “Someone who speaks out against the government could be stalked or apprehended for something they did not do.”

Berman said she thinks facial recognition technology still needs to be perfected. Research indicates that facial recognition programs often misidentify minorities, a concern other students also raised last week.

“Sometimes it may cause confusion if there are look-alikes; or, it could lead to false convictions,” Berman said.

Clearview AI claims the First Amendment protects its right to take public photos from the internet. But while the United States does have a broader interpretation of freedom of speech than other nations, A. Michael Froomkin, a professor in the School of Law, cautioned that the First Amendment argument applies only to personal or corporate claims against government laws and regulations, not to claims between private companies.

However, Froomkin said Clearview may have violated a contract if it created fake accounts to gain access to more photos. Recently, Twitter, Facebook, Google, and YouTube have sent cease-and-desist letters to Clearview, demanding the company stop scraping images from their sites.

“If there’s a term of service [on Facebook or Twitter] which prohibits this activity, then it’s a contractual violation,” Froomkin added, even if the accounts were not fake.

Data scraping is not a new practice, said Ramamoorthi, who worked in the software engineering and cybersecurity industry for 12 years before he began teaching.

“Data scraping has been used for a long time by law enforcement agencies, but it’s being used at the mass level now because of the amount of data we generate today, which makes it easier for them to do it,” he said.

Yet the power of Clearview’s software underscores how vulnerable everyone is to being tracked today, Ramamoorthi added. He said everyone should proceed with caution before uploading photos to the internet and should make sure their profiles and sharing settings on social media sites are set to private. (If those settings are public now, Clearview likely has their images already.)

“Whenever you are submitting a photo, you need to understand how these pictures are going to be used,” he added. “And if your sharing permissions are public, you are more likely to be scraped.”

Junior Mark LaRocca, an economics major, said he recently stopped using Google after learning how the company sells his data to advertisers. He said learning about Clearview’s potential to use his images makes him glad that he recently got rid of his Instagram account.

“For my face to be used in any way by someone I didn’t give permission to, I do not agree with that,” he said.

So, what can people do to keep their identity private?

Although social media companies cannot do much to retrieve data that Clearview has already collected, Ramamoorthi said they could watermark photos to distort people’s faces, making it more difficult for data scraping algorithms to capture a clear image.
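As one illustration of the kind of countermeasure Ramamoorthi describes, the sketch below overlays a semi-transparent grid across a photo using the Pillow library. The file names, grid spacing, and opacity are assumptions for the example; a real platform would likely use subtler, model-aware perturbations rather than a visible pattern.

```python
from PIL import Image, ImageDraw


def watermark(in_path: str, out_path: str, spacing: int = 40, alpha: int = 96) -> None:
    """Overlay a semi-transparent grid across a photo.

    The goal is not to hide the picture from people, only to add noise
    that makes automated face-matching on scraped copies less reliable.
    """
    img = Image.open(in_path).convert("RGBA")
    overlay = Image.new("RGBA", img.size, (0, 0, 0, 0))
    draw = ImageDraw.Draw(overlay)
    width, height = img.size
    for x in range(0, width, spacing):
        draw.line([(x, 0), (x, height)], fill=(255, 255, 255, alpha), width=2)
    for y in range(0, height, spacing):
        draw.line([(0, y), (width, y)], fill=(255, 255, 255, alpha), width=2)
    Image.alpha_composite(img, overlay).convert("RGB").save(out_path)


if __name__ == "__main__":
    # Hypothetical file names for illustration.
    watermark("profile_photo.jpg", "profile_photo_watermarked.jpg")
```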

In Europe, companies like Clearview AI could have a harder time thriving because of the General Data Protection Regulation (GDPR), the European Union law that took effect in 2018. It requires that people know about and consent to any new uses of the data collected about them, Froomkin said. Yet the United States has shied away from a similar federal privacy law.

“In general, in the U.S., we have not created rights to privacy in public,” he said. “If you walk through [most cities], local law enforcement officials can identify your cell phone location and videotape you.”

Some states and cities are trying to stem the tide. San Francisco and Oakland, California, as well as some Massachusetts cities, have banned facial recognition technology. And California and Illinois have adopted biometric privacy laws that require a person or business to inform someone and obtain a written release before collecting his or her fingerprint, voiceprint, eye scan, or facial geometry (the measurements used by facial recognition programs). In Illinois, several cases are already challenging Clearview under the state’s biometric privacy act.

Students do not seem overly concerned about the technology, although some said they may be deleting their social media accounts soon.

Halle Miller, a sophomore health science major, said she uses Instagram, Snapchat, and sometimes Facebook, but she does not post many photos, so she was not overly concerned about Clearview eroding her privacy.

“I’m not posting anything I wouldn’t want people to see, but I do feel it is creepy to know that if anything was to happen and I did get in trouble with the law it wouldn’t be hard for them to find me,” she said.

Kristine Padgett, a senior physical therapy major in the cybersecurity course, said she is planning to delete her social media accounts after learning about yet another way her digital identity can be compromised.

“If people can hack into the National Security [Agency], they could also use this information in a bad way, so I find [Clearview] really terrifying,” she said.

