It has been interesting to read that police in Durham are preparing to introduce an artificial intelligence (AI) system designed to help officers decide whether or not a suspect should be kept in custody.
The Harm Assessment Risk Tool (Hart) classifies suspects as being at low, medium or high risk of offending.
It was tested in 2013, using police records of offending history from 2008 to 2012, to see how accurately it predicted whether or not suspects would offend over the following two years.
Sheena Urwin, head of criminal justice at Durham Constabulary, has said the system is likely to be used by officers at the force in the ‘next two to three months’.
It could therefore be used when deciding whether to keep a suspect in custody for a few more hours, whether to release them on bail before a charge, or, after a charge has been made, whether to remand them in custody.
System has proved accurate – but there are concerns over potential to wrongly detain suspects
There can be no denying that the system proved highly accurate during the testing period: forecasts that a suspect was low risk turned out to be correct 98 per cent of the time, while forecasts that a suspect was high risk were correct 88 per cent of the time.
However, although these stats look impressive, given my role in supporting victims of wrongful arrest and unlawful detention at Hudgell Solicitors, I have concerns about the 12 per cent of suspects assessed as ‘high risk’ who did not go on to offend.
If decisions were made using the Hart score alone, there would certainly be a danger of holding a suspect in custody who was, in fact, completely innocent and posed no risk of committing further crime.
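To put those percentages in concrete terms, here is a minimal sketch of the arithmetic. The 98 and 88 per cent accuracy rates are the figures reported from the tests; the cohort size of 1,000 suspects per risk band is a hypothetical assumption purely for illustration.

```python
# Illustrative arithmetic only: the 0.98 and 0.88 accuracy rates are the
# figures reported from the Hart trial; the cohort sizes are hypothetical.
low_risk_forecasts = 1000    # suspects forecast as low risk (assumed number)
high_risk_forecasts = 1000   # suspects forecast as high risk (assumed number)

low_risk_accuracy = 0.98     # reported: low-risk forecasts correct 98% of the time
high_risk_accuracy = 0.88    # reported: high-risk forecasts correct 88% of the time

# Suspects wrongly flagged as high risk (the group at risk of wrongful detention):
wrongly_flagged = high_risk_forecasts * (1 - high_risk_accuracy)

# Offenders missed because they were forecast as low risk:
missed_offenders = low_risk_forecasts * (1 - low_risk_accuracy)

print(f"Wrongly flagged as high risk: {wrongly_flagged:.0f} per 1,000 forecasts")
print(f"Offenders missed among low risk: {missed_offenders:.0f} per 1,000 forecasts")
```

At that scale, roughly 120 of every 1,000 suspects flagged as high risk would be false alarms, and it is precisely this group that could be wrongly detained.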
It is also being reported that the software has been designed to be more likely to classify someone as medium or high risk, therefore erring on the side of caution to avoid releasing suspects who may commit a crime.
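It is worth being clear about what such a deliberate tilt means in practice. The sketch below is a generic illustration of a classifier tuned to favour the medium and high bands; the probability cut-offs are hypothetical and nothing here is drawn from Hart's actual design.

```python
# A generic sketch (not Hart's actual design) of how a risk classifier can be
# tuned to err toward the higher-risk bands. All cut-offs are hypothetical.
def classify(offending_probability: float) -> str:
    """Map an estimated probability of reoffending to a risk band.

    Lowering the cut-offs pushes borderline suspects into the medium and
    high bands: 'erring on the side of caution' against release, at the
    cost of more suspects being wrongly flagged.
    """
    if offending_probability >= 0.40:   # a neutral cut-off might be nearer 0.70
        return "high"
    if offending_probability >= 0.15:   # a neutral cut-off might be nearer 0.30
        return "medium"
    return "low"

print(classify(0.45))  # prints "high", though offending is less likely than not
```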
In my opinion, it is a dangerous move to introduce a system which is influenced in any way to reach a certain conclusion. Whatever happened to innocent until proven guilty?
Police forces must have lawful grounds for arresting a person and then detaining them in custody, and they must follow set procedures to stay within the law themselves.
If people are held for longer than is appropriate, it can amount to unlawful detention, so this technology, particularly given the built-in bias described above, must be used with caution.
Recommendations must be carefully considered, and the system's impact must be scrutinised
The introduction of Hart as a system for officers to use is being described as a ‘live experiment’, and officers will only use the system in a random selection of cases, so that its impact can be compared to what happens when it is not used.
Its developers have been keen to stress it will be used in an “advisory” role, and should not remove discretion from the police officer using it.
That will be key in ensuring mistakes are not made, as no matter what volume of stats and data is available to a piece of software – no matter how advanced – it will never match the subjective, informed and knowledgeable view of experienced police officers.
Ms Urwin has said that suspects with no offending history would be less likely to be classed as high risk by Hart, though if they were arrested on suspicion of a very serious crime such as murder, it would have an “impact” on the output.
In such a case, computer software would surely be at a huge disadvantage compared to the officers investigating the offence, who benefit from continually updated evidence and from speaking to witnesses directly.
Helen Ryan, head of law at the University of Winchester, has described Hart as “incredibly interesting” in principle, saying that “potentially, machines can be far more accurate – given the right data – than humans”.
That is something yet to be proved, and until it is, Hart should be used with caution, and as only one factor influencing decisions over the detention of suspects.