
Surveillance as a tool for racism

TechCrunch · 25/04/2016 · Angelique Carson

I have to be honest with you. I’ve been working in the privacy space for the last six years, and privacy never really mattered that much to me.

When I first understood, via Edward Snowden, that the U.S. government was collecting data on all of us, I didn’t have a knee-jerk freakout. I can’t say I was ever in the “if-you’re-not-doing-anything-wrong, you’ve-got-nothing-to-hide” camp, but I also wasn’t arms-up-in-the-air outraged that some corporation potentially handed my data over to the police. No, I didn’t love the idea that the feds maybe knew I’d once Googled “is a yam different from a sweet potato?” but it didn’t keep me up at night, either.

Then I attended an event called The Color of Surveillance at Georgetown Law and the hair on my arms stood up straight.

I’d missed it completely.

I’d missed the entire reason privacy isn’t just a concern for those who logged into Ashley Madison or researched something more nefarious than the difference between starches. I missed that it should matter to me because there are people for whom it has to matter, by virtue of their socioeconomic or racial status. And while I have the luxury, by virtue of my own socioeconomic status and race, of ignoring reality and letting this not be my problem, that’s not how wrongs are righted.

I finally saw surveillance not as something mildly offensive to my own sense of civil liberties, but as a tool of institutional racism. It suddenly became clear to me — and I’m so embarrassed it didn’t prior — that the people most stripped of their privacy rights in this surveillance age are the people who are already vulnerable.

But the powerful surveilling the powerless, and I’m specifically talking about race here, is nothing new. It existed even in the earliest days of slavery. Surveillance and power have long been closely linked to institutional racism, from slave owners branding their slaves so they couldn’t move freely and privately, to plantation owners building homes tall enough to surveil the entire plantation. Slavery may have been abolished, but now we see racism and oppression in a new power structure in which the powerful hold the data on the less powerful.

Being watched changes how you move, how you think.

Here’s one example. Khiara Bridges is a professor at Boston University who studied pregnant women applying to Medicaid. All of them poor, most of them of color. Her research found a system “fundamentally flawed by design,” in which women relying on government assistance to have a child were required, before ever seeing a health practitioner, to be “informationally canvassed” via coerced consultations that ask the kinds of degrading questions that white, privately insured women would never be asked at a healthcare facility.

These women are routinely drilled on whether they’ve missed prenatal care appointments, whether the pregnancy was planned or accidental, whether they’ve ever abused controlled substances, been domestically abused, been homeless. And if they say yes, more information is gathered. That information is then funneled to other state bureaucracies, including immigration, customs enforcement or even criminal justice. In order to continue their Medicaid care, the women are then tracked and surveilled in demeaning ways, which Bridges calls the “poverty of privacy rights.”

That’s the trade-off. You want our help? We’ll use your data how we see fit.

Or, for further evidence, we can look to black men. According to research presented by Hamid Khan of the Stop LAPD Spying Coalition, 30 percent of suspicious activity reports in Los Angeles are written on black people, even though they make up less than 10 percent of the city’s total population.

Besides arrests, Khan reported, more covert surveillance methods are employed: housing authorities forge partnerships with police to track residents’ movements, and government-subsidized cell phones are distributed with GPS chips that can be used to track their users.

Think about walking around your own neighborhood and having policemen watching your every move suspiciously, waiting for a reason to pull you aside for a “chat.”

Being watched changes how you move, how you think. Forget about locking up black men. The way they are harassed and stalked creates the prison of the mind, as Foucault described. And that kind of policing is something black men in many neighborhoods, and increasingly Muslims, experience every day.

And technology is only making surveillance easier. Between facial-recognition technology, algorithms used for “predictive policing” and, most recently, Stingrays, the watching that historically required a human eye will be done faster, cheaper and en masse.

Georgetown’s Alvaro Bedoya said that, to him, “privacy is black kids being able to make mistakes without the law watching their every move.”

I like that. That’s the way I grew up. I made mistakes in my neighborhood all the time. In public. But in my public, there was even still some privacy, or at least obscurity. And I was allowed to get smarter, more mature from those mistakes rather than be forever marked by them.

As always, we’re keeping the vulnerable vulnerable. Asserting power just because we can. Because he who owns the data holds the power.

And that’s why privacy professionals, in this context, are so important. There’s been a lot of talk lately about the need to continue making ethics-based decisions and not just compliance-based decisions, and I can’t think of a context more worthy than the restoration of human dignity — for everyone.

Privacy professionals get to make decisions that have real impacts on people’s lives, though it may sometimes seem more abstract than that. Decisions like, should we even collect this data to begin with? If we do, who might own it someday? And it may seem absurd that these questions are related to social injustice and an imbalance of power, but guess what, they really are.

Maybe privacy matters more to me than I thought.
