14 British police forces are using ‘discriminatory’ algorithms to predict crime

Fourteen police forces across the UK are using, or planning to use, algorithms and other predictive software to forecast crime, according to the human rights organisation Liberty.

The human rights group said it had sent out a total of 90 Freedom of Information requests last year to determine which forces in the UK used the technology.

The report by Liberty, entitled Policing by Machine, warned that these tools can lead to biased decisions and inaccurate predictions.

Instead of providing new ways to reduce crime, the report states, the software will lead to more unfair targeting of ethnic minorities and lower-income communities.

The campaign group urged the police to stop using predictive software, saying that the technology relies on “problematic” historical arrest data, while individual risk assessment programmes “encourage discriminatory profiling”.

Police forces using or trialling predictive mapping programmes are Avon and Somerset Constabulary, Kent Police, Cheshire Constabulary, Dyfed-Powys Police, Greater Manchester Police, Lancashire Constabulary, Merseyside Police, the Metropolitan Police, Northamptonshire Police, Warwickshire Police and West Mercia Police, West Midlands Police, and West Yorkshire Police.

In addition, three forces – Avon and Somerset, Durham and West Midlands – are using or trialling individual risk assessment programmes.


Norfolk police, for instance, is testing a system that calculates whether a burglary case is worth investigating; Durham Constabulary’s Harm Assessment Risk Tool (HART) provides guidance to custody officers; and West Midlands Police uses hotspot mapping to target crime and anti-social behaviour.

Liberty said that, at the very least, police forces should be more transparent about their use of algorithms in policing technology.

A strategic adviser to the West Midlands police and crime commissioner, Tom McNeil, said: “We are determined to ensure that any data science work carried out by West Midlands Police has ethics at its heart. These projects must be about supporting communities with a compassionate public health approach.”

Durham Constabulary has been developing its HART tool for more than five years. It uses a machine-learning method called “random forests” to examine vast numbers of combinations of “predictor values”, most of which concentrate on the suspect’s offending history, along with age, gender, and geographical location.
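To illustrate the general idea behind the “random forests” method the report describes, here is a minimal, self-contained sketch: many simple rules are each trained on a random resample of the data, then combined by majority vote. For brevity it uses one-split “decision stumps” rather than full decision trees, and the feature names and records below are entirely hypothetical toy data – they are not drawn from HART or any police dataset.

```python
# Toy "random forest": majority vote over decision stumps, each fit on a
# bootstrap resample of the training data. Purely illustrative; the
# features (age, prior_offences) are hypothetical stand-ins for the
# "predictor values" the article mentions.
import random

FEATURES = ("age", "prior_offences")

# Each record: (features, label). Label 1 means "re-offended" in this toy data.
TRAIN = [
    ({"age": 19, "prior_offences": 4}, 1),
    ({"age": 45, "prior_offences": 0}, 0),
    ({"age": 22, "prior_offences": 2}, 1),
    ({"age": 60, "prior_offences": 1}, 0),
    ({"age": 30, "prior_offences": 5}, 1),
    ({"age": 50, "prior_offences": 0}, 0),
]

def fit_stump(sample):
    """Exhaustively pick the (feature, threshold, direction) with fewest errors."""
    best = None
    for feat in FEATURES:
        for rec, _ in sample:
            thr = rec[feat]
            for invert in (False, True):
                # Rule: predict 1 when feature >= threshold (or the inverse).
                errs = sum(
                    ((r[feat] >= thr) != invert) != bool(y) for r, y in sample
                )
                if best is None or errs < best[0]:
                    best = (errs, feat, thr, invert)
    return best[1], best[2], best[3]

def fit_forest(data, n_trees=25, seed=0):
    """Fit each stump on a bootstrap resample - the 'random' in random forest."""
    rng = random.Random(seed)
    return [fit_stump([rng.choice(data) for _ in data]) for _ in range(n_trees)]

def predict(forest, record):
    """Majority vote across all stumps: 1 = high predicted risk, 0 = low."""
    votes = sum((record[feat] >= thr) != invert for feat, thr, invert in forest)
    return int(votes > len(forest) / 2)

forest = fit_forest(TRAIN)
risk = predict(forest, {"age": 21, "prior_offences": 3})
```

The sketch also makes the report’s criticism concrete: the model can only reproduce patterns present in its (historical) training data, which is precisely why Liberty argues that biased arrest records produce biased predictions.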

A spokesperson for Durham Constabulary said: “We are proud of HART, which is part of our intervention programme to help repeat offenders turn their lives around, break away from the revolving door of prison and reduce crime.

“All decisions are ultimately made by an experienced custody officer, but the HART advisory tool gives them a clear indication as to who might be more at risk of re-offending – not so they are stereotyped, but so we can give them more support to turn away from crime.”

Assistant Chief Constable Jon Drake, the National Police Chiefs’ Council lead for intelligence, said: “Policing in the UK is underpinned by a strong set of values and ethical standards, as well as a significant amount of legislation.

“At all times we seek to balance keeping people safe with people’s rights. This includes the way in which we police crime hotspots.”

“Innovative” technology

“For many years police forces have looked to be innovative in their use of technology to protect the public and prevent harm, and we continue to develop new approaches to achieve these aims.”

But Hannah Couchman, author of the Liberty report, said predictive mapping bolsters the notion of “pre-criminality” and puts a “glossy sheen” of technology on existing biases.

“And it fails us because it focuses on technology + big data as the solution to policing problems which are deeper, systemic issues requiring a much more considered, radical + compassionate response,” she said on Twitter.

Liberty also said that the Metropolitan Police should conduct a review of its Gangs Matrix database and the Prevent programme.
