Seattle’s New Crime Analytics Program Threatens to Perpetuate Racism in Policing

Tuesday, October 20, 2015
The Seattle Police Department (SPD) has announced a new “Real Time Crime Center” that would use historical crime data in decisions about deploying police officers. Although this may sound like a smart move to incorporate analytics technology in law enforcement, in practice it would perpetuate existing institutional racism in policing. Identifying crime “hotspots” based on historical data threatens to reinforce police department practices that already harm communities of color.

The core problem with this approach is that it treats historical crime data as neutral, and treats the results of processing that data with algorithms as “objective” factors that will enable police to better respond to and prevent crime.

But these data sets are actually highly skewed, and so the outputs are not objective.

That’s because not all criminal activity is reported and not all laws are applied equally. Police departments already disproportionately police and surveil poor communities and communities of color.

Certain communities, often white and middle-class, that place greater trust in police departments are more likely to report crimes, while Black communities are more likely to be reported as suspected of committing crimes. Using historical data therefore places greater emphasis on communities of color and results in recommendations for increased police presence in their neighborhoods. In effect, relying on analytics built on this data can create a feedback loop of disproportionate policing.
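The feedback loop can be made concrete with a toy simulation. All numbers below are invented for illustration and do not model any real department's system: two neighborhoods have identical underlying offense rates, but one starts with twice as many recorded incidents. If patrols are allocated in proportion to past records, and new incidents are recorded only where patrols are sent, the initial skew never corrects itself.

```python
# Toy model of the feedback loop: historical records drive patrol
# allocation, and patrol allocation drives new records.
# All numbers are invented for illustration only.

TRUE_OFFENSE_RATE = 0.10   # identical in both neighborhoods
PATROLS_PER_DAY = 20

# Historical records are skewed 2:1, even though the underlying
# offense rates are the same everywhere.
recorded = {"A": 100.0, "B": 200.0}

for day in range(365):
    total = sum(recorded.values())
    for hood in recorded:
        # The "analytics" step: send patrols where past records point.
        patrols = PATROLS_PER_DAY * recorded[hood] / total
        # Incidents are recorded only where officers are looking.
        recorded[hood] += patrols * TRUE_OFFENSE_RATE

ratio = recorded["B"] / recorded["A"]
print(f"A: {recorded['A']:.0f}, B: {recorded['B']:.0f}, ratio B/A: {ratio:.2f}")
# After a simulated year the ratio is still exactly 2.00: the system
# faithfully perpetuates the historical skew it was trained on.
```

Because each neighborhood's records grow by the same relative factor each day, the 2:1 disparity is preserved indefinitely. The system reports nothing false, yet it entrenches a bias that was present before it ever ran.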

As an example, the failed war on drugs has meant greatly disproportionate arrest levels for Black and Latino communities, despite the fact that drug laws are routinely broken by white communities as well. The result, in this context and many others, is racially skewed data, and we shouldn’t be using that data to determine how to deploy our police forces. Doing so will simply perpetuate the racial disproportion in law enforcement.

SPD has gone to great lengths to distance its implementation from what is commonly referred to as “predictive policing” by using terms like “forecasting” or “crime analytics.” Regardless of the terminology, there must be a careful and thorough public discussion of the harms that can be caused by relying on historical department data to make decisions about where to apply police power.

Far too often, organizations rapidly apply new technologies to complex and difficult problems. The temptation to view these technologies as solutions in and of themselves is powerful. And governments are too quick to proclaim direct relationships between using new technologies and apparent improvements in performance – without any rigorous evaluation.

Failing to think critically about Seattle’s new predictive policing program could result in more biased policing for communities that historically have suffered from it. City policy-makers should rethink how they are collecting data to be used for this new program and be transparent about how the underlying algorithms function and interpret this data. They should also seek input from people in the communities it will affect.