
Here's the File Clearview AI Has Been Keeping on Me, and Probably on You Too

• Vice.com

After a recent, extensive, and rather withering bout of bad press, the facial recognition company Clearview AI has changed its homepage, which now touts all the things it says its technology can do, and a few things it can't. Clearview's system, the company says, is "an after-the-fact research tool. Clearview is not a surveillance system and is not built like one. For example, analysts upload images from crime scenes and compare them to publicly available images." In doing so, it says, it has the power to help its clients—which include police departments, ICE, Macy's, Walmart, and the FBI, according to a recent BuzzFeed report—stop criminals: "Clearview helps to identify child molesters, murderers, suspected terrorists, and other dangerous people quickly, accurately, and reliably to keep our families and communities safe."

What goes unsaid here is that Clearview claims to do these things by building an extremely large database of photos of ordinary U.S. citizens, who are not accused of any wrongdoing, and making that database searchable for the thousands of clients to whom it has already sold the technology. I am in that database, and you probably are too.

If you live in California, under the rules of the newly enacted California Consumer Privacy Act, you can see what Clearview has gathered on you and request that the company stop collecting your data.

Do you work at Clearview or one of its clients? We'd love to talk to you. From a non-work device, contact Anna Merlan at anna.merlan@vice.com, or reach Joseph Cox securely on Signal at +44 20 8133 5190, on Wickr at josephcox, via OTR chat at jfcox@jabber.ccc.de, or by email at joseph.cox@vice.com.

I recently did just that. In mid-January, I emailed privacy-requests@clearview.ai and requested information on any of my personal data that Clearview obtained, the method by which they obtained it, and how it was used. (You can read the guidelines they claim to follow under the CCPA here.) I also asked that all said data be deleted after it was given to me and opted out of Clearview's data collection systems in the future. In response, 11 days later, Clearview emailed me back asking for "a clear photo" of myself and a government-issued ID.

"Clearview does not maintain any sort of information other than photos," the company wrote. "To find your information, we cannot search by name or any method other than image. Additionally, we need to confirm your identity to guard against fraudulent access requests. Finally, we need your name to maintain a record of removal requests as required by law."

After a moment of irritation and a passing desire not to give these people any more of my information, I emailed Clearview a photo of my work ID badge and a redacted copy of my passport. About a month went by, and then I got a PDF, containing an extremely curious collection of images and an explanation that my request for data deletion and opt-out had been processed. "Images of you, to the extent the [sic] we are able to identify them using the image that you have shared to facilitate your request, will no longer appear in Clearview search results," the "Clearview Privacy Team" wrote.

The images themselves are indeed all photos of me, ones that I or friends have put on social media, and they are exceedingly odd. (The source of them is odd, not my face, although, that too.)

The images seen here range from around 2004 to 2019; some are from my MySpace profile (RIP) and some from Instagram, Twitter, and Facebook. What's curious is that, according to Clearview, many of them weren't scraped from social media directly, but from a collection of utterly bizarre and seemingly random websites.

"You may have forgotten about the photos you uploaded to a then-popular social media site ten or fifteen years ago... but Clearview hasn't," Riana Pfefferkorn, associate director of surveillance and cybersecurity at the Stanford Center for Internet and Society, wrote in an email. "A lot of data about individuals can quickly become 'stale' and thus low-value by those seeking to monetize it. Jobs, salaries, addresses, phone numbers, those all change. But photos are different: your face doesn't go stale."
