


Inside the Baltimore Police Spy Plane

Photo source: Matthew Binebrink – CC BY-SA 4.0

In 2020, the Baltimore Police Department had an aerial surveillance plane that could photograph and track every person in public view. Benjamin Snyder’s Spy Plane reveals what happened to this controversial police experiment. Drawing on remarkable access and first-hand observations inside the for-profit tech startup that ran the Baltimore program, sociologist Benjamin H. Snyder recounts real criminal cases as they were worked by police using this untested tool.

Deploying powerful camera planes built by a small company called Persistent Surveillance Systems, the spy plane program promised to help police “solve otherwise unsolvable crimes” by tracking the whereabouts of suspects in violent crime cases. Created for the battlefields of Iraq, the technology had never been deployed at such a large scale in a US city. This gripping book offers an unprecedented look into the shadowy world of for-profit law enforcement technology experimentation, explaining why police and community leaders put so much faith in unproven technology to solve urban violence, yet keep coming up short.

What motivated you to write Spy Plane?

I have a long-standing interest in how people ascribe almost magical qualities to technology, particularly the idea that it can “solve” social problems. In 2017, I heard an episode of the podcast Radiolab that covered the story of the first spy plane test in Baltimore in 2016. It made a lot of claims that I found exaggerated. My wife’s family is also from Baltimore, so I care deeply about the city. I thought there might be a more complex story behind the plane, and I was well placed to investigate, so I started looking.

How did you gain access to this data and the dark world of law enforcement technology? What did this insider perspective allow you to see that others did not?

I was able to gain access thanks to the unusual openness of Ross McNutt, one of the inventors of spy plane technology and CEO of Persistent Surveillance Systems. In 2017, I asked him if I could study his company and he said yes. Basically it was as simple as that.

Why did he give me access? McNutt is what technologists call a “techno-solutionist.” He was confident that if the public could see how the technology worked, they would accept aerial surveillance as a relatively cheap and effective “fix” for crime in a city otherwise ill-equipped to address the root causes. So he wanted me to provide “total transparency,” as he called it. He let me shadow the analysts, even giving me card access to the company’s data terminals, where I could look at the raw footage from the plane. I was also able to build relationships with the detectives, who would come in and out of the operations center to get help on their cases.

This access allowed me to see beyond the spy plane hype. In the media, the program was often described as either a cutting-edge piece of military hardware – an all-seeing eye in the sky – or the epitome of Orwellian dystopia – Big Brother is watching. Seeing how the tool was implemented in a live situation revealed something far more complex, though no less troubling. Much of this complexity would have been impossible to see if I had had to rely on the publicly available parts of the program, such as interim reports from police and auditors or PSS marketing materials.

While the book focuses on this specific case in Baltimore, what larger story does it tell us about surveillance, technology, and law enforcement in the US?

The main thing I found is that the spy plane was flawed, unreliable, and unleashed further harm on the public. For example, it turned out to be relatively easy for spy plane analysts to accidentally follow the wrong person at a homicide scene and not realize it – what’s called a “false positive.” At least once, this resulted in the arrest of a completely innocent person. Fortunately, the mistake was caught in time, but the fact that it could even occur was not revealed to the public before the program launched. Importantly, the risk of false positives was concentrated in already economically disadvantaged, predominantly Black neighborhoods, because those neighborhoods were identified as good “test sites” for the experiment. This is really common for law enforcement technologies. If you look into the research, you’ll find that things like facial recognition cameras, gunshot detection systems, police body cameras, and even CCTV have a long history of being first installed in Black neighborhoods as a field test, often without oversight. These tests often cause unforeseen harms that the general public rarely hears about.

Why don’t these issues get more attention? Public debate about these somewhat more mundane, though certainly harmful, risks is sidelined by what I call the boomer-doomer hype cycle. First, the boomers (or boosters) promote a technology as an innovative silver bullet for stopping crime. Then the doomers warn that the technology is a massive threat to humanity, hurtling us toward a dystopian future where the state can “watch us all.” We are seeing this with so-called “AI” right now.

What the spy plane case taught me is that both ideas are forms of hype. When they feed off each other, it makes it difficult to have a clear debate about the basics: How does the technology actually work in a live deployment? Does anyone know if it works, or is it an experiment? If it hasn’t been tested, do we know what kinds of bugs and failures are possible? Could it trigger new risks that are difficult to foresee? Who is most likely to be hurt by these? Who can say when to pull the plug if problems arise? These are questions about implementation, not some distant (dys)utopian future. If communities were better prepared with this practical language, they might be able to avoid some of the immediate dangers before technologies ever become something as dystopian as Big Brother.

What was a surprising moment, insight or story from the process of writing the book?

A few months into my fieldwork in the operations center, I realized that I could be subpoenaed in a criminal or civil lawsuit. My field notes contained some pretty sensitive information about the messy reality of spy plane investigations that various parties in the justice system might find useful to their side. However, I had promised confidentiality to all respondents in my study (except McNutt). If my field notes were handed over to a judge, it would be an ethics violation. It would also sour future relationships between startups and ethnographers. I went to the sociological literature on subpoenaed field notes and found that others had been in the same position. Some who refused to comply even went to prison! I wish I had thought about this more before going in. In the end, I was never asked for my notes, thank goodness. In the book’s methodological appendix, I offer future researchers some tips for approaching these kinds of situations more carefully than I did.

What is one key message you hope readers will take away from the book?

We live in a time when international conflicts (in Gaza, Ukraine, Kashmir, etc.) are used as testing grounds for the development of new, sometimes horrific, surveillance technologies. These tools, many made by for-profit startups, will most likely one day find their way into American police forces, just as the spy plane made its way from Iraq to Baltimore. We in the US must anticipate the next iteration of the war-police pipeline by paying close attention to these experiments abroad and resisting them now. We also need to build resistance strategies at home before these experiments continue.

This post was originally published on the University of California Press Blog and is reprinted here with permission.