Artist's campaign targets biometric surveillance

Artist and researcher Adam Harvey has set out to raise awareness about the increasing pervasiveness of biometric tracking on the web and in everyday life.

Posters included in Adam Harvey’s “Think Privacy” series.

Courtesy of Adam Harvey

June 10, 2016

"Today's selfie is tomorrow's biometric portrait," read the bright red text of a poster plastered to the New Museum’s window in New York City’s Bowery neighborhood.

It's a simple, arresting message that artist, researcher, and privacy activist Adam Harvey hopes will compel anyone who sees it to reconsider how they use technology, especially as internet companies and government agencies deploy increasingly sophisticated biometric software to track, surveil, or identify individuals.

"If people knew that a company was quantifying and commercializing their age, race, gender, mood, clothing, the amount of lipstick they wear, the amount their midriff is exposed, and the expression on their face, would they act differently? I think so," says Mr. Harvey. "The problem is we can't see how we're being seen. And as a result we can't know how to act accordingly."


Based in Berlin, Harvey is among a growing number of artists who are gaining attention for their focus on digital issues in the wake of the Edward Snowden revelations of widespread National Security Agency surveillance.

While artists such as Laura Poitras, who directed the Snowden documentary "Citizenfour," and British graffiti artist Banksy have focused on mass surveillance and US drone policies, Harvey’s work attempts to raise awareness about the use of biometric technologies to record and track individuals’ unique characteristics such as facial features, speech patterns, and even their manner of walking.

Pieces in Harvey’s "Think Privacy" series, which includes the selfie poster, have appeared online as well as in galleries from Hong Kong to Berlin. And as he expands the collection this month, biometric surveillance is advancing rapidly.

In fact, computers are now more than 98 percent accurate at recognizing faces, according to a Chinese University of Hong Kong research paper. And machines appear to be getting better at identifying specific people thanks to the massive number of photos uploaded to the web.

But even though retailers, law enforcement, and intelligence agencies justify using this technology to detect shoplifters or to suss out criminals and terrorists, Harvey says that most people remain uncomfortable with biometric software. He points to a survey from research firm First Insight that found 75 percent of respondents wouldn't shop in a store that used facial recognition technology.

Some consumers have even started pushing back against companies that use biometrics. Google and Facebook are each facing lawsuits over the collection and storage of their users’ facial images, with plaintiffs claiming the tech giants unlawfully gathered the information in violation of Illinois’s Biometric Information Privacy Act. The outcome of those cases could set a precedent for how the technology evolves in the US. In Europe, regulators already have strict rules that limit facial tagging on the internet.

Many tech advocacy groups in the US also want more transparency around the use of facial recognition technology. In a letter to the Justice Department, a coalition of 45 civil liberties and privacy organizations opposed an FBI plan to exempt its biometric database from Privacy Act provisions.

But biometric technology is hardly new. The first tools emerged in the 1960s, when a Japanese team created software to detect the outline of a person’s head. By the 1970s, computers could pinpoint eyes, noses, and mouths. In the 1990s, the Department of Defense created its first large database intended to spot combatants at a distance, and by 2001 the standard "Viola-Jones" detector, which is still in use today, had been created.
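That Viola-Jones approach is now baked into widely available open-source tools, which is part of why face detection has become so commonplace. As a rough illustration, and not something drawn from Harvey's own work, the minimal sketch below assumes the opencv-python package and the Haar-cascade face detector that ships with the OpenCV library; the image filename is a placeholder.

```python
# A minimal sketch, assuming the opencv-python package is installed.
# It runs the Haar-cascade (Viola-Jones style) frontal-face detector that
# ships with OpenCV. "group_photo.jpg" is a placeholder filename.
import cv2

# Load the pretrained frontal-face cascade bundled with OpenCV.
cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

image = cv2.imread("group_photo.jpg")  # returns None if the file is missing
if image is None:
    raise SystemExit("Could not read the input image")

gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)  # the detector expects grayscale

# Each detection comes back as an (x, y, width, height) bounding box.
faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
print(f"Detected {len(faces)} face(s)")
```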

Yet it wasn't until around 2010 that biometrics entered the mainstream, when these formerly military technologies were applied for the first time to a slew of celebrity look-alike websites, says Harvey. Many internet users freely handed over their photos to be paired with their star doppelganger on Facebook.

"We were so busy having fun that we didn’t realize we were training the largest facial recognition device in the world," says Harvey.

Since then, researchers have created datasets such as MegaFace that include vast archives of photos gleaned from the internet. While this data is largely collected for commercial purposes, it's also gathered for governmental ones, says Harvey. Starting in 2012, programs such as Janus from the government's Intelligence Advanced Research Projects Activity (IARPA) have turned to social media to gather biometric data.

"The IARPA Janus program was developed to remotely identify an enemy of the state while Facebook's goal is to know everything about you in order to sell ‘advertisements,' " says Harvey. "But is it still an 'advertisement' when it's also a psychological taunt driven by statistical models of your consumer behavior? I'd argue that primary use in both cases is social control."

Such information is also being "power-grabbed" from the internet without regard to ethics or user consent, says Harvey, and used for academic purposes, such as one recent Twitter facial analysis that revealed the demographics of presidential campaign followers. Google recently backed a research paper on how to use clothing, studied from public but copyrighted online images, as metadata.

Harvey is optimistic that as the public becomes aware of the pervasiveness of biometric tracking, people will advocate for change. For instance, he says, a decade ago it was difficult to find easy-to-use software that could encrypt sensitive communications. Now, that technology is on every iPhone.

People need to have a better way of avoiding biometric detection than just “opting out of dubious tracking technologies every day,” he said. “You can’t make a good decision if you are only told the benefits and the costs are hidden.”