Amid revelations of the NYPD’s biometric surveillance programs, photojournalists are forced to reconsider photography in public space.

On June 2 hundreds of demonstrators convened in Manhattan’s Washington Square Park as part of one of the largest waves of civil rights demonstrations in American history. There they observed a moment of silence for George Floyd, Breonna Taylor, and the countless other African Americans slain by police. The solemnity was punctuated by an irksome din from overhead.

“Cover your tattoos and your faces!” cried out one demonstrator before a fusillade of water bottles and projectiles was flung from the crowd in an attempt to bring down two NYPD drones hovering overhead.

The overhead drones (flying with the subtlety of a mosquito in your ear) were deployed by the Technical Assistance Response Unit (TARU), one of the most technologically advanced and opaque units in the NYPD’s growing surveillance arsenal. TARU’s responsibility, in the Department’s own words, is to “enhance investigations through the recovery of surveillance video footage; record police action at large-scale demonstrations and arrest situations; and provide crucial live video to incident commanders during ongoing emergency situations.”

TARU and its Unmanned Aerial Vehicles (UAVs) [read: drones] do not use facial recognition software, according to a Department press release. However, that language is disingenuous. The drones themselves are not equipped with facial recognition software, but according to one NYPD source, these UAVs carry high-definition cameras whose footage can later be analyzed by the Department’s specially trained Facial Recognition Unit.

The growing penumbra of state surveillance powers throws into stark relief a tableau of stakeholders—activists, photojournalists, lawmakers, and experts in the tech community—who are increasingly alarmed by biometric surveillance and civil liberties violations. Those stakeholders are also rethinking the relationship between public space and the visual image—as well as their relationships with each other. And as state surveillance powers expand, the once symbiotic relationship between photojournalists and activists is beginning to fray, making way for a new paradox: two vital freedoms are now pitted against one another—freedom of assembly and freedom of speech. A division sown, ironically, by state surveillance.

“I’m afraid of retribution,” one activist leader told me. “I’m scared of being personally attacked,” another activist said, adding, “I have friends at One Police Plaza and the District Attorney’s office and I know, for a fact, they are using facial recognition to track us.”

“This isn’t Covid,” says another protester, pointing to his surgical mask. “Not showing my face is the only reason for wearing this. Please blur my face if you take my photograph.”

Face blurring is one of the latest trends to emerge from the online activist left. While facial blurring technologies are meant to protect the identities of demonstrators and leaders against retribution from law enforcement, photographers and editors alike bristle at the prospect of this censorship.
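Face-blurring tools vary, but most rest on the same principle: destroy detail in a detected face region rather than merely soften it. The sketch below is a minimal, hypothetical illustration in Python (not any outlet’s or app’s actual implementation); it pixelates a hand-specified region of a grayscale image by averaging small tiles.

```python
def pixelate_region(img, top, left, height, width, block=4):
    """Irreversibly pixelate a rectangular region of a grayscale image
    (a list of rows of pixel values) by averaging block x block tiles.

    In a real tool the region would come from a face detector; here the
    coordinates are supplied by hand.
    """
    out = [row[:] for row in img]  # leave the input image untouched
    for by in range(top, top + height, block):
        for bx in range(left, left + width, block):
            # Gather the tile's pixels, clipped to the region boundary.
            tile = [img[y][x]
                    for y in range(by, min(by + block, top + height))
                    for x in range(bx, min(bx + block, left + width))]
            avg = sum(tile) // len(tile)
            # Overwrite the whole tile with its average value.
            for y in range(by, min(by + block, top + height)):
                for x in range(bx, min(bx + block, left + width)):
                    out[y][x] = avg
    return out
```

Averaging throws pixel detail away outright, which is why redaction guides tend to recommend pixelation or solid fills over light blurs, which can sometimes be partially reversed.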

An officer from the NYPD’s Technical Assistance Response Unit records a crowd of Close the Camps protestors in New York City. August, 2019. Photo by Nina Berman/NOOR Images

“If a conversation around facial blurring or blurring faces came up in our newsroom, I would flip over a virtual desk,” says Brent Lewis, a photo editor at the New York Times, who helped break a major Times investigation around face recognition technology. (Meaghan Looram, photo director at the New York Times, declined, through a spokesperson, to comment when asked if the Times has an official policy on facial redaction.)

“These are not conversations in our newsroom,” continues Lewis. “[Face blurring] seems to be coming from a subset of people on the internet or social media—people who don’t really understand journalistic ethics—and from those who don’t understand the care with which photojournalists and editors approach touchy subjects like these protests. 

“I got into an Instagram argument with a woman who photographs cakes. She said she heard that she should blur photos and was pushing really hard on me, when I was saying we should not [blur faces]. So that should let you know that this isn’t coming from the photojournalism side of things.”

Some photo editors and photojournalists consider the ethics within the larger trajectory of American protests and civil rights movements.

“I think many of us feel that blurring faces or censoring faces would be a redaction of really important historical documents,” says one photo editor, who works at a major New York City-based magazine. “But at the same time, I would, of course, never forgive myself if we ran a photograph that later endangered someone’s life.” 

Those sentiments exemplify the moral ambiguity arising in the age of the wild, wild web. And for photo editors and journalists, the already interpretive bounds of journalistic ethics are further muddied by misinformation—largely because the NYPD’s surveillance capabilities have never been fully disclosed to the general public.

“I worry that this sort of photojournalism, if used without safeguards, can endanger the very activists that journalists are seeking to document.”

“We definitely know that facial recognition has been used to monitor political activity in the past and we think, even if people aren’t arrested in the short-term or tracked in the short-term, that this creates a repository of information that the NYPD can always revisit as facial recognition becomes more powerful and prolific,” says Albert Fox Cahn, who is the executive director of the Surveillance and Technology Oversight Project (STOP) at the Urban Justice Center. “The NYPD ran approximately 8,000 facial recognition searches last year and that number will only grow,” Cahn continues. “We’re quite concerned about how facial recognition will be used to monitor political protest and further expand the NYPD’s surveillance of communities of color.

“Given how heavily surveilled public spaces are right now, it’s fundamentally changed the notion of what it means to be outdoors in a city. I worry that this sort of photojournalism, if used without safeguards, can endanger the very activists that journalists are seeking to document.”

On the technical side, the workings of the NYPD’s facial recognition software remain murky.

The NYPD currently uses face recognition software known as DataWorks Plus, a system integrator that employs facial recognition algorithms manufactured by two separate companies, NEC and Iridium. (DataWorks Plus, NEC, and Iridium all ignored multiple requests for comment on this article.)

“If a photo exists, they can run a search against it.”

The NYPD compares two sets of data in facial recognition searches: comparison photos and probe images. Comparison photos are drawn from official repositories—such as driver’s license databases—against which individuals can be matched. A probe image is an image of an individual photographed or videotaped near a crime scene or a protest. For that sort of image the NYPD can draw from any video or photographic source—CCTV, drones, privately owned cameras used in partnership with the NYPD, and even Google Images.
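The probe-versus-repository comparison can be sketched in miniature: modern systems reduce each face to a numeric embedding and rank repository entries by similarity. The embeddings, names, and threshold below are invented for illustration; this is a conceptual sketch, not the NYPD’s or DataWorks Plus’s actual pipeline.

```python
import math

def cosine_similarity(a, b):
    """Similarity between two embedding vectors (1.0 = identical direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def best_match(probe, repository, threshold=0.9):
    """Return (name, score) for the repository entry most similar to the
    probe embedding, or None if no entry clears the threshold."""
    name, score = max(
        ((n, cosine_similarity(probe, emb)) for n, emb in repository.items()),
        key=lambda pair: pair[1],
    )
    return (name, score) if score >= threshold else None

# Toy repository standing in for a license-photo or mug-shot database.
repository = {"person_a": [1.0, 0.0, 0.0], "person_b": [0.0, 1.0, 0.0]}
```

The threshold is the operationally decisive knob: set low, it floods investigators with candidate matches; set high, it misses genuine ones.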

“If a photo exists, they can run a search against it,” says Cahn. 

And as new revelations of law enforcement’s advancing face recognition programs surface, many photographers and photojournalists weighing the morality and ethics of redaction have looked back to the origins of the facial recognition debate.

Several photographers and editors pointed me to the deaths of six Black Lives Matter activists in Ferguson, MO as the impetus for anxiety around facial recognition.

On August 13, 2014, Ed Crawford Jr. picked up a tear gas canister. With a bag of potato chips in his left hand, and wearing an American flag tank top, Crawford hurled the canister back towards Ferguson riot police. The moment was captured by St. Louis Post-Dispatch staff photographer, Robert Cohen. The Post-Dispatch was awarded the 2015 Breaking News Photography Prize for its coverage in Ferguson. The photograph turned Crawford into a local hero and lionized him globally. 

Then, on May 5, 2017, Crawford died of a self-inflicted gunshot wound. Crawford’s uncle, Lester Davis, told CNN a day later, “I don’t want people to think it’s some conspiracy theory. I don’t believe my nephew killed himself either—maybe it was an accident.”

“The redacting-of-faces issue is new to the protests. Not brand new, as we’ve seen it in previous years, but definitely ramped up,” says Cohen, who snapped the photo of Crawford. “For protesters that ask me not to photograph them, I always remind them that they are talking to one photographer and they need to realize there are often dozens on the scene, not to mention hundreds of cellphones and police surveillance cameras and drones.”

In 2015, Baltimore law enforcement used comparative face recognition technology to track and arrest protesters following the death of Freddie Gray. It also emerged that, in addition to Maryland, several states have allowed face recognition technologies access to official driver’s license photographs to build repositories for law enforcement.

Black Lives Matter protestors in New York City. June, 2020. Photo by Nina Berman/NOOR Images

As law enforcement departments across the nation build repositories of images and surveil the public through available footage, should photographers be concerned about their role in aiding law enforcement?

Chris Facey, a thirty-year-old African American photographer from Brooklyn, has emerged as one of the top photographers documenting the New York City protests. His photographs have appeared in the New Yorker and New York Magazine.

“For me, I would never publish images of anyone in a compromising position that can land them in any trouble or worse,” Facey says. “The editors that I have been in talks with are making sure they are aware that they aren’t putting individuals in those dangerous situations.

“No [demonstrators have] really had an issue with me photographing them. If I feel like it’s an issue, I’ll ask [their permission]. I will never apologize for my work but I will defend it. So I just do that.” 

Privately, some editors expressed concern that facial redaction can add another layer of bias. “If some protesters are looting, but the overwhelming majority are protesting peacefully, and everyone’s faces are blurred out or censored, does it imply that everyone is doing something illegal?” one editor asked.

Many photographers and editors who have come under fire for not blurring faces have pointed out that protesters are wearing masks. However, it’s unclear how effectively masks can help evade facial recognition software.

“Face recognition performance has evolved quite substantially over the last five, six, seven years. There’s been a lot of machine learning improvements,” says Ralph Gross, Postdoctoral Fellow at Carnegie Mellon University, who has been studying facial recognition for almost two decades. 

“In general, face recognition algorithms tend to put more emphasis on the eye region, because it deforms less with facial expression than, perhaps say, the mouth region. There’s certain advantages in exercising that area of the face over other areas,” Gross says.

“There are legitimate uses for the technology, but [there is also] this veil of secrecy around it, so nobody really knows how the technology is being used on a day-to-day basis. That’s certainly concerning.”

Two weeks ago Brendan Klare, co-founder and CEO of Denver-based Rank One Computing, announced that his facial recognition technology can be deployed to circumvent mask coverings, though law enforcement officials have expressed skepticism about the product’s efficacy.

Such technological advancements are pushing biometric surveillance into public discourse.

“Part of the reason why this debate is intensifying now is that the performance of facial recognition technology has gotten a lot better. Ten years ago you couldn’t have these algorithms,” Gross says. “There are legitimate uses for the technology, but [there is also] this veil of secrecy around it, so nobody really knows how the technology is being used on a day-to-day basis. That’s certainly concerning.”

Under the most intense and confusing conditions—the pendulum swing from pandemic-mandated isolation to the largest mass civil rights protests in recent US history—New York City has become a testing ground for Orwellian surveillance policing. Gross notes that there is a plethora of publicly known facial recognition programs, and that it is hard to generalize about or rank the efficacy of each.

NEC, the company that manufactures the algorithm in the NYPD’s facial recognition software, was put to the test in 2017, when the South Wales Police used the technology in an effort to identify criminals at the UEFA Champions League final in Wales. Scanning a crowd of more than 60,000 soccer-lovers, the facial recognition AI flagged 2,000 people as known criminals. It was later widely reported that 93 percent of those matches were false positives.
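The Wales result reflects a base-rate effect: even a fairly accurate matcher produces mostly false alarms when it scans a very large crowd for a comparatively small watchlist. The rates below are assumed for illustration, not the South Wales Police’s actual figures.

```python
def alert_precision(crowd_size, watchlist_present, tpr, fpr):
    """Expected fraction of alerts that are true matches when scanning a crowd.

    tpr: chance a watchlisted person triggers an alert (true positive rate)
    fpr: chance an ordinary attendee triggers an alert (false positive rate)
    """
    true_alerts = watchlist_present * tpr
    false_alerts = (crowd_size - watchlist_present) * fpr
    return true_alerts / (true_alerts + false_alerts)

# Hypothetical: 60,000 fans, 100 watchlisted people present, 90% TPR, 1% FPR.
precision = alert_precision(60_000, 100, 0.90, 0.01)
# Roughly 13% of alerts are genuine; about 87% are false positives.
```

With these assumed numbers the system is right nine times out of ten about any individual face, yet nearly nine out of ten alerts are still wrong, because ordinary fans vastly outnumber watchlisted ones.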

An NYPD official recently told me that detectives are using face recognition and comparison technology to apprehend and arrest suspects who engaged in criminal activity (including arson and looting) throughout the 2020 protests. Watchdogs have further pointed out that the NYPD has paired up with the FBI’s High Intensity Drug Trafficking Area (HIDTA) program during the current wave of protests, merging the NYPD’s biometric repository with the FBI’s. The two agencies have also sought protest footage from the public in an effort to identify and arrest criminals.

As this technology is being piloted and perfected, legislation to curb potential abuses cannot keep pace.

“The law is decades behind!” Albert Fox Cahn, the executive director of the Urban Justice Center’s S.T.O.P. project, told me in a phone call. He points to Carpenter v. United States, a 2018 Supreme Court decision that finally ruled it unconstitutional for law enforcement to track an individual’s location for an extended period through their cell phone records without a warrant. “If you’re tracking my face for an extended period of time, you should need a warrant or judicial oversight,” Cahn says.

In January, the New York Times revealed that Clearview AI, a shadowy facial recognition company, had violated the terms of service of a number of social media platforms—Facebook, Twitter, and LinkedIn, among others—to scrape together some three billion images, creating the largest-ever repository of images from which law enforcement could draw comparisons.

(The American Civil Liberties Union is currently suing Clearview in Illinois, one of the few states with explicit biometric privacy laws.)

“I’m guessing most protesters don’t realize that there’s currently nothing preventing the NYPD from identifying every protester by using a facial recognition database.”

Many feel that the state, with its full panoply of power, should not widen the chasm between freedom of assembly and freedom of speech. While there are currently no federal laws addressing facial recognition (or even broader biometrics), many state-level lawmakers have called for new legislation to curtail such software and practices.

New York State Sen. Brad Hoylman has been one legislator loudly sounding the alarm. In January, Sen. Hoylman, along with Assemblywoman Deborah Glick, introduced legislation to place a moratorium on the use of face recognition software by law enforcement while a legislative task force studies potential rights violations. (The bill is currently in committee in the New York State Legislature.)

“Tens of thousands of New Yorkers have taken to the streets over the past two weeks to demand an end to the horrific and racist ways law enforcement has treated Black and Brown people for generations,” Sen. Hoylman told Document. “I’m guessing most protesters don’t realize that there’s currently nothing preventing the NYPD from identifying every protester by using a facial recognition database.”

Black Lives Matter protestors in New York City. June, 2020. Photo by Nina Berman/NOOR Images

As protests bring to light a morass of technological and legalistic uncertainty, many have turned to privacy technologies as a stopgap to combat surveillance abuses.

Two weeks ago, Signal, the widely used end-to-end encrypted messaging app, released an in-app face-blurring feature to allay the fears of those who want to disseminate protest images without revealing individuals’ faces.

“The world is becoming more and more aware of the hidden costs that are built into tech from Silicon Valley companies,” Moxie Marlinspike, the renowned cryptographer and CEO and founder of Signal, told Document (over Signal). “We’re excited about developing a different model that demonstrates how it’s possible to build tech better. It shouldn’t be necessary for everyone to sacrifice privacy for utility.”

Some demonstrators have turned to Mr. Checkpoint, a user-generated content app that helps protesters evade police by mapping police presence in real time.

While these technological stopgaps offer provisional protection, none of them address the moral and ethical issues raised among image-makers.

“If I start removing faces from this group, what’s next? Trump tells me not to photograph his face and I say, ‘Yes, master?’”

The legendary documentary photographer Nina Berman has been covering protests for decades. This is the first time protesters in public spaces have asked her to either blur or avoid photographing faces out of fear of retribution.

“My question to those who don’t want any faces in pictures is: what are you suggesting instead? And, do you really think that [obscuring demonstrators’ identities] is a good thing for democracy, for social movements, for a desire to communicate beyond your own peers?” Berman says.

Berman, like many photographers, is worried about what censorship might lead to—very much viewing this as a First Amendment concern. “If I start removing faces from this group, what’s next? Trump tells me not to photograph his face and I say, ‘Yes, master?’”

“Nonviolent protest is a deliberately and proudly public act—people stepping out before their neighbors to take a stand. While there are plenty of abuses, being photographed at a rally is not in itself normally a danger to a protester.”

Bruce Shapiro, a professor at the Columbia University Graduate School of Journalism and Executive Director of The Dart Center for Journalism and Trauma, is also weighing the ethics of photojournalism and facial recognition.

“Protecting the safety of sources and subjects is an important journalistic principle, whether that involves images or important identifying information in stories,” Shapiro told me by email. “But how to interpret safety—particularly around surveillance and the dangers of protesters being targeted for retaliation—varies dramatically from place to place. Under a repressive regime with an aggressive political surveillance system—say, for instance, Iran—there is a strong argument for caution in depicting activists who may be endangered. 

“However, in the US and many other democracies, there’s less of a direct threat. Nonviolent protest is a deliberately and proudly public act—people stepping out before their neighbors to take a stand. While there are plenty of abuses, being photographed at a rally is not in itself normally a danger to a protester.” 

Although the US has not seen abuses as brazen as those of other nations in its treatment of journalists and protesters, not everyone in the photojournalism community shares Shapiro’s sentiments.

“A better solution is to change the stories being told about the protests, going beyond just showing them happening and connecting more deeply with a narrative that engages protesters more completely.”

“People fall on either end of a spectrum,” says Mike Davis, Chair for Documentary Photography at the S.I. Newhouse School of Public Communications at Syracuse University. “[One end] follows traditional journalism dictums, which say that you can photograph anyone doing anything in public without qualms, and if you can’t get their names so be it—by being in the public, people suffer the consequences of their actions. But, at the other end of the spectrum, some believe we should follow humanistic practices that more completely consider the consequences of putting images into the publishing and social media ether.”

The rules of journalism are different from the rules of activism, but the two camps are intertwined in American life. “Actions in one setting—demonstrating—have different consequences in another—publishing. So how do we resolve the differences in the effort to avoid causing people undeserved harm?” Davis asks. “Blurring faces is one way but really grinds against those dictums. Another is to engage with people you’re photographing, which isn’t always possible [when] using a long lens from a distance. [Shooting at a distance] precludes getting names and that breaks another journalism dictum.

“A better solution is to change the stories being told about the protests, going beyond just showing them happening and connecting more deeply with a narrative that engages protesters more completely. Putting those stories on top of the journalism pyramid would engage readers more completely and treat demonstrators fairly. Win-win.”

“Journalists have long seen themselves as being outside of the story. But they are not outside of the story.”

Last week, a young African American activist spoke in Washington Square Park about her experience with police officers in her own neighborhood. “We never have good interactions with them where I’m from,” she said. “They come here when there’s trouble or they make trouble.”

Might she also have been talking about photojournalists?

“The industry needs to be diversified. Desperately. For so many reasons,” says Berman, who is also a professor at the Columbia University Graduate School of Journalism. “Journalists have long seen themselves as being outside of the story. But they are not outside of the story.

“If you study the history of surveillance of social movements in the United States—and it is a big, rich history—you understand that it’s police infiltrators and FBI that protesters should be more worried about [rather] than someone taking a great picture at Ferguson.

“The other side of me says, ‘Am I just living in my head with some outdated understanding of what photography is and what it’s been, and does it need to be reimagined in this time of facial recognition?’ I don’t know what that would look like reimagined. But I do know what would be lost. I think that you can’t underestimate the power of images as drivers of social change.”

“I want to see my people in the most accurate light. I don’t want to open up a magazine, a newspaper, a website and not be able to recognize my own people.”

And yet, there are profound questions that the world of photojournalism ought to ask of itself—beyond surveillance, beyond all the noise of legislation and technology. 

“Some of that has to do with the language photography has used—shooting, capturing, taking,” Berman says. “[Some of it is because] our profession has largely looked like the police profession—white men, big lenses. So, I get it.” 

Lewis, the New York Times photo editor who sparred with a cake photographer on Instagram, has spent years calling for diversity in the photojournalism world. He likens these times to his upbringing in the Southside of Chicago, where he first started taking pictures of his own community. “I have skin in the game,” Lewis says. “So I want to see my people in the most accurate light. I don’t want to open up a magazine, a newspaper, a website and not be able to recognize my own people.” In 2017, Lewis co-founded Diversify Photo, a movement and resource that promotes photographers of color.

At the heart of these protests is a consternation from communities that are tired of abuses of power; this, for photojournalists, is a learning moment. Photographers are now tasked with creating new lines of dialogue with the communities they chronicle, and with reimagining how these communities are represented in images. When law enforcement abdicates its duty, it is even more essential that journalists protect and serve communities.

We all fall somewhere in the photojournalism-surveillance morality matrix. The more that is revealed about biometric surveillance and its abuses, the clearer and more sharply demarcated those fuzzy ethical guidelines become. Only one thing is for sure, Robert Cohen told me after a day of photographing demonstrations: “There is no privacy in 2020.”
