Your face is all over the internet

On the subway, you see someone out of the corner of your eye. Do you recognize them? A former classmate? A coworker from three jobs ago? Maybe a short-lived fling? The question nags at you: Who are they?

AI has an answer: You covertly snap a photo when they’re not looking and upload it to a facial recognition service that searches millions of webpages for that same unique face. Ping! That face pops up in the background of a photo at Walt Disney World, and there they are at a protest, and there they are on someone’s old Flickr page. Oh, but actually one links to a wedding album. They were in the bridal party. The website is still active. A face. A name. Identity unlocked. You’ve finally figured out who they are; the mystery is solved.


That’s perhaps the most harmless, best-case scenario — and even that’s more than a little bit creepy. But that reality is already here.

Facial recognition services like PimEyes and Clearview AI do just this, using machine learning to sift through enormous troves of faces with startling accuracy. They’re essentially reverse search engines that make your face all that a stranger — or the government — needs to gather your personal information.

I uploaded my face to PimEyes to test it out. The company brags about its creepiness: “For $29.99 a month, PimEyes offers a potentially dangerous superpower from the world of science fiction,” reads a New York Times quote featured prominently on its homepage.

For $300 you get “deep searches” and unlimited access to the software. GZERO ain’t buying it, but a highly motivated individual could pay full price to find someone: to stalk them, uncover their identity and whereabouts, and place them at a specific time and location.

Most of the results were pictures I had uploaded: profile pictures for various websites, mainly, as well as photos from my own wedding on our photographer’s website. But there was also a slew of photos of me in the background at a press conference. In late 2018, I covered CNN reporter Jim Acosta’s court battle to get his White House press pass back, and PimEyes surfaced multiple photos of me in the background of Acosta’s interviews. The $30 version of PimEyes didn’t shock me, but it was jarring to see my previously unlabeled face from a press conference pop up in less than a minute.

Meanwhile, Clearview AI doesn’t sell directly to the public, instead opting for the lucrative business of selling to law enforcement, government, and public defender offices, according to its website. It’s being used in war right now: Time Magazine wrote that Clearview AI is Ukraine’s “secret weapon” in its conflict with Russia, using the technology to identify Russian soldiers and search for hostages taken across the border.

New York Times reporter Kashmir Hill has written about both companies. She told The Verge last year that she has viewed Clearview AI searches of herself, conducted by the company’s co-founder, and that the results were far more extensive than PimEyes’, surfacing 160 photos of her, “from professional headshots that I knew about to photos I didn’t realize were online.”

In 2011, then-Google executive chairman Eric Schmidt said facial recognition was the only technology his company had built and then decided to shelve for ethical reasons. “I’m very concerned personally about the union of mobile tracking and face recognition,” he said, noting that dictators could weaponize it against their own people.

There are positive uses: Prosecutors could use facial recognition to destroy an alibi, or police could use it to find a missing person and their kidnapper. Journalists can find out who was on the scene of key events and track down leads, or quickly put names to faces in the field. But it’s easy to see Schmidt’s fears come to life with an expansive surveillance state that’s always watching.

There are currently no federal facial recognition laws on the books in the US, but Illinois, Texas, and Washington have biometric privacy laws that may limit the ways people’s faces can be used online.

Democratic senators asked the Justice Department earlier this year to examine whether police departments are using facial recognition in ways that curtail civil rights. And the Federal Trade Commission banned Rite Aid from using facial recognition for five years after its systems repeatedly and falsely identified women and people of color as shoplifters.

Xiaomeng Lu, director of Eurasia Group’s geo-technology practice, said there are clear benefits to facial recognition technology, such as face-scanning at airports to verify passengers’ identities. But “misuse of such tools can violate [individual] privacy,” she said, pointing to regulations such as the European Union’s data privacy law, which deems facial recognition data sensitive. Ground rules in the US would help address the technology’s risks, Lu added.

The rise of facial recognition technology is quite possibly a step too far in the artificial intelligence boom, something that makes citizens, advocates, and some regulators shudder at its potential for abuse. It also augurs the end of anonymity, a world in which stepping out in public could create another entry in a vast database that seemingly anyone can access for a small sum.
