Data-Driven Policing: Holy Grail or Death Knell of Policing by Consent?

Posted in: Data, politics and policy, Evidence and policymaking, Law, law enforcement and crime, Public services

Dr Ian Kearns is IPR Visiting Fellow and Senior Associate Fellow at the Police Foundation. He is a former Deputy Director and Acting Director of the Institute for Public Policy Research (IPPR) where he provided strategic direction on digital government, the new digital economy, national security and crime.

The evidence is beginning to mount that data-driven policing can deliver real benefits to police and public alike. These benefits are wide-ranging, covering better crime prevention and detection and improved public safety, to name just a few. Pursuing a data-driven approach is therefore essential for any police force wishing to maximise the impact of limited resources while retaining the trust and confidence of the public. But caution is also advised. As more and more data becomes available, and as the police become better at joining it up and extracting real intelligence from it, public anxiety about the methods being used is likely to increase. If the police get it wrong, the consequences for a model of consent-based policing could be severe.

That a data-driven approach is worth pursuing seems clear from the many examples that have already delivered results. In Vancouver, for example, the Police Department has implemented a city-wide predictive policing tool to target property crime. The system uses machine learning and both historic and current crime data to predict where break-ins are likely to occur, and pushes that information to the onboard computers of patrol vehicles at two-hourly intervals so officers can adjust their patrol locations with a view to preventing those break-ins. In a six-month pilot project in 2016, property crime fell by as much as 27 per cent in the areas where the system was tested, compared with the previous four years of data.
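To give a flavour of how this kind of grid-based hotspot prediction can work in principle, the sketch below divides a city into cells, uses recent break-in counts per cell as features for a learned classifier, and re-scores every cell each patrol cycle. This is a minimal illustration only, not the Vancouver system: the grid size, time windows, features, model and all names are hypothetical assumptions made for the example.

```python
# Illustrative sketch of grid-based crime hotspot prediction (hypothetical, not the Vancouver system).
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

N_CELLS = 400      # assumed 20x20 grid of city cells
N_WINDOWS = 500    # assumed history of two-hour time windows

# Synthetic history: 1 if a break-in occurred in a cell during a window.
history = (rng.random((N_WINDOWS, N_CELLS)) < 0.03).astype(int)

def make_features(history, t):
    """Per-cell counts of break-ins over the last day, week and month
    of two-hour windows (12, 84 and 360 windows) before window t."""
    return np.stack([
        history[max(0, t - 12):t].sum(axis=0),
        history[max(0, t - 84):t].sum(axis=0),
        history[max(0, t - 360):t].sum(axis=0),
    ], axis=1)

# Training set: features at each window, label = break-in in the next window.
X = np.vstack([make_features(history, t) for t in range(360, N_WINDOWS - 1)])
y = np.concatenate([history[t + 1] for t in range(360, N_WINDOWS - 1)])

model = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)

# Every two hours: score all cells and flag the highest-risk ones for patrol.
latest = make_features(history, N_WINDOWS - 1)
risk = model.predict_proba(latest)[:, 1]
hotspots = np.argsort(risk)[::-1][:10]
print("Cells to prioritise this patrol cycle:", hotspots)
```

Real systems of this type add far richer features (time of day, weather, land use, recent calls for service) and careful evaluation, but the basic loop of score, dispatch and re-score is the same.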

In Chicago, civilian analysts and police officers have been working together in Strategic Decision Support Centres (SDSCs), deployed, as of May 2018, in 13 of the city's 22 police districts. These centres bring together data from surveillance cameras and gunshot detection systems with analysis of previous crime patterns to identify the places where violent crime is most likely to occur. The evidence suggests they are already having a significant effect. In 2017, in the district of Englewood, a poor, mainly black neighbourhood, shootings fell by a massive 67 per cent and murders by 44 per cent. While full causality cannot be demonstrated, there is a direct correlation between where the new approach has been deployed and the steepest falls in violent crime in the city.

In the state of New York, the Department of Motor Vehicles has used an upgraded facial recognition system in its licence checking and issuing process to identify over 7,000 cases of possible identity theft and fraud. Some 4,000 arrests have been made in what has been a major technology- and data-driven crackdown on individuals seeking to use multiple identities. Many people who would otherwise have engaged in behaviour threatening public safety have been taken off the roads as a result.

In the UK, West Midlands Police has linked up its internal databases, hired its own data analysts, and is now using improved data analytics and the data-driven mapping of networks of criminal actors to get a better handle on crime in its area. This should soon facilitate better and more proactive interventions against those networks.

These are just a few examples drawn from a vast and growing global landscape of policing innovation, in which better acquisition and interrogation of data is being used to improve both processes and decision-making. A report from our project on Public Value and Data-Driven Policing, to be released in January 2019, will feature many more.

While the benefits of data-driven policing are all to be welcomed, some are already pointing to worrying potential consequences for the relationship between citizens and the police. In particular, there is a concern that the right to privacy will effectively disappear as the police gain access to many more data sources, whether Automatic Number Plate Recognition Systems (ANPRS), historic crime databases, facial recognition systems or mobile phone data, and as these data sets are joined up not only with each other but with data drawn from health, employment and other records.

It is notable too that some systems, such as ANPRS in the UK, have grown rapidly with no effective parliamentary scrutiny. And as machine learning algorithms take on decision-support roles, there is public unease both about bias in the data on which such systems are trained and about whether such a development is appropriate at all.

What this all points to is a set of transformative technological developments that cannot be ignored and, in many ways, need to be embraced, but only in the context of an ongoing process of engagement with both parliamentarians and the public about how we want our digital society to be policed. We need new forms of police-public interaction on these issues, and improved mechanisms for monitoring what the police are doing.

When it comes to adopting data-driven approaches, police forces would be wise to seek permission first, not forgiveness after such systems have been introduced. If they do not, they could quickly find that the legitimacy of what they are doing is called into question, that the public begins to withdraw its consent, and that negative consequences follow for police-community relations and for the public's willingness to help the police prevent and detect crime.

This post was originally published by the Police Foundation, 8 November 2018.

Interested in this research? Sign up now for the IPR Research Seminar, ‘Public Value and Data Driven Policing’, given by Dr Ian Kearns on 28 November.
