An algorithm is predicting how solvable crime cases are. Andrew Medina/Getty
A police force in the UK is using an algorithm to help decide which crimes are solvable and should be investigated by officers. As a result, the force trialling it now investigates roughly half as many reported assaults and public order offences.
This saves time and money, but some have raised concerns that the algorithm could bake in human biases and lead to some solvable cases being ignored. The tool is currently only used for assessing assaults and public order offences, but may be extended to other crimes in the future.
When a crime is reported to police, an officer is normally sent to the scene to find out basic facts. An arrest can be made straight away, but in the majority of cases police officers use their experience to decide whether a case is investigated further. However, due to changes in the way crimes are recorded over the past few years, police are dealing with significantly more cases.
The Evidence Based Investigation Tool (EBIT) instead uses an algorithm to produce a probability score of a crime’s solvability. Kent Police, which has previously experimented with using algorithms (see “Predicting crime”, below), has been using EBIT for a year to assess the solvability of assaults and public order offences, such as threatening someone in the street. These types of offences account for around a third of all crime in the area.
Before the force began using EBIT, officers decided to pursue around 75 per cent of cases. This has now dropped to 40 per cent as a result of the algorithm, while the number of charges and cautions has remained the same, according to Kent Police.
“Police officers naturally want to investigate everything to catch offenders. But if the solvability analysis suggests there is no chance of a successful investigation, the resources might be better used on other investigations,” says Ben Linton at the Metropolitan Police, who isn’t involved with the project.
Blind tests
Kent McFadzien, at the University of Cambridge, created EBIT by training the algorithm on thousands of assaults and public order cases. It identified eight factors that seemed to affect whether a case was solvable, including whether there were witnesses, CCTV footage or a named suspect.
As these factors could change over time, EBIT always recommends one or two crimes with low solvability scores for investigation each day. The officers involved aren’t aware of this score, so this is a blind test of the algorithm’s effectiveness. “It’s a permanently ongoing trial,” says McFadzien.
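The approach described above can be sketched in a few lines. This is a minimal illustration, not EBIT itself: the article names only three of the eight factors (witnesses, CCTV, a named suspect), and the weights, threshold and logistic form here are assumptions standing in for whatever the real model learned from past cases.

```python
import math
import random

# Hypothetical factors: the article mentions witnesses, CCTV footage and
# a named suspect among EBIT's eight; the rest are not public.
FACTORS = ["witnesses", "cctv", "named_suspect"]

# Illustrative weights and bias; the real learned parameters are unknown.
WEIGHTS = {"witnesses": 1.2, "cctv": 1.5, "named_suspect": 2.0}
BIAS = -2.5

def solvability(case):
    """Return a 0-1 solvability score from a simple logistic model,
    where `case` maps factor names to 0/1 flags."""
    z = BIAS + sum(WEIGHTS[f] * case.get(f, 0) for f in FACTORS)
    return 1.0 / (1.0 + math.exp(-z))

def triage(cases, threshold=0.5, blind_tests=2):
    """Recommend cases scoring above the threshold, plus one or two
    low scorers each day as blind tests of the model's cut-off."""
    pursue = [c for c in cases if solvability(c) >= threshold]
    low = [c for c in cases if solvability(c) < threshold]
    pursue += random.sample(low, min(blind_tests, len(low)))
    return pursue
```

The blind-test step is the notable design choice: by routinely investigating a few cases the model would have dropped, the force can check whether the factors driving the score are still predictive, rather than trusting a model trained on conditions that may have changed.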
However, because the technology bases its predictions on past investigations, any biases contained in those decisions may be reinforced by the algorithm. For example, if there are areas that don’t have CCTV and police frequently decided not to pursue cases there, people in those places could be disadvantaged.
“When we train algorithms on the data on historical arrests or reports of crime, any biases in that data will go into the algorithm and it will learn those biases and then reinforce them,” says Joshua Loftus at New York University.
McFadzien’s blind tests are a good way to help tackle that issue, but there is a separate problem with police algorithms that can’t be so easily remedied, says Loftus.
Police forces only ever know about crimes they detect or have reported to them, but plenty of crime goes unreported, especially in communities that have less trust in the police.
This means the algorithms are making predictions based on a partial picture. While this sort of bias is hard to avoid, baking it into an algorithm may make its decisions harder to hold to account compared with an officer’s. John Phillips, superintendent at Kent Police, says that for the types of crimes that EBIT is being used for, under-reporting isn’t an issue and so shouldn’t affect the tool’s effectiveness.
Predicting Crime
Kent was the first police force in the UK to experiment with predictive policing, a technology used to suggest areas where crime is likely to occur. It used a proprietary machine-learning algorithm, provided by US firm PredPol, to predict potential crime hotspots over a five-year period.
The force spent £150,000 a year on the contract with PredPol, but stopped using the tech last year.
An internal review, published in 2014 and obtained through a freedom of information request by New Scientist, reveals that officers struggled to make use of the system’s predictions due to time constraints. The algorithm suggested some 520 hotspot boxes per day, but police only visited 86, on average. “Officers are not getting to enough of the boxes to make a significant impact on crime,” the review said.
Article amended on 14 January 2019