Police Seek ‘Balance’ In Use Of AI To Predict Crime

Police have said they are seeking “balance” in the use of artificial intelligence to predict crimes, after freedom of information requests revealed that 14 UK police forces were deploying, testing or investigating predictive AI techniques.

The report by Liberty, “Policing by Machine”, warned that the tools risk entrenching existing biases and delivering inaccurate predictions.

The civil liberties group urged police to end the use of predictive AI, saying mapping techniques rely on “problematic” historical arrest data, while individual risk assessment programmes “encourage discriminatory profiling”.

The forces using or trialling predictive mapping programmes are Avon and Somerset Constabulary, Cheshire Constabulary, Dyfed-Powys Police, Greater Manchester Police, Kent Police, Lancashire Police, Merseyside Police, the Metropolitan Police Service, Norfolk Constabulary, Northamptonshire Police, Warwickshire Police, West Mercia Police, West Midlands Police and West Yorkshire Police. Three forces – Avon and Somerset, Durham and West Midlands – are also using or trialling individual risk-assessment programmes.


Risk assessment

Norfolk Police, for instance, is trialling a system to identify whether burglaries should be investigated. Durham Constabulary’s Harm Assessment Risk Tool (Hart) advises custody officers on individuals’ risk of re-offending, while West Midlands Police uses hotspot mapping and a data-driven analysis project.

Liberty said that at a minimum, police should be more transparent about their use of algorithms in policing.

A West Midlands Police spokesperson said the force was determined to ensure that its data science work had “ethics at its heart”.

A Durham Constabulary spokesperson said the force was “proud” of Hart, which it said advised officers as part of an intervention programme to help offenders “turn their lives around”.

“All decisions are ultimately made by an experienced custody officer, but the Hart advisory tool gives them a clear indication as to who might be more at risk of re-offending – not so they are stereotyped, but so we can give them more support to turn away from crime,” the spokesperson said.

Innovation

The National Police Chiefs’ Council said such programmes were part of the ongoing “innovative” use of technology and were deployed against a background of “strong” ethical standards and legislation.

“Policing is underpinned in the UK by a strong set of values and ethical standards, as well as a significant amount of legislation,” said assistant chief constable Jon Drake, the group’s lead for intelligence.

“At all times we seek to balance keeping people safe with people’s rights. This includes the way in which we police hot-spots for crime.

“For many years police forces have looked to be innovative in their use of technology to protect the public and prevent harm and we continue to develop new approaches to achieve these aims.”

But Hannah Couchman, author of the Liberty report, said predictive policing reinforces the notion of “pre-criminality” and puts a “glossy sheen” of technology on existing biases.

“It fails us because it focuses on technology and big data as the solution to policing problems which are deeper, systemic issues requiring a much more considered, radical and compassionate response,” Couchman said on Twitter.

Liberty also called on the Metropolitan Police to review its Gangs Matrix database, which the Information Commissioner’s Office recently described as “unjustifiably excessive”.

Budget cuts

The use of AI comes amid ongoing budget cuts that have significantly reduced police staffing.

Durham chief constable Michael Barton said he had had to cut staff every year since taking up the role.

“I have now got a third less money than I had in 2010,” he told the Financial Times. “We have had the fifth biggest cut, pro rata, of any police force in the UK.”

Barton said the force was testing the use of AI because it had the potential to be free of the “inherent biases” that affect the judgements of human custody sergeants.

Matthew Broersma

Matt Broersma is a long-standing technology freelancer who has worked for Ziff-Davis, ZDNet and other leading publications.
