It was interesting to learn that Humberside Police have become the first force in the UK to trial the use of new facial recognition technology in a port, using it to identify wanted people entering and leaving the country.
The force has revealed that a two-day trial, run in conjunction with the Metropolitan Police, took place at King George Docks, covering the arrival of an inbound ferry on Wednesday, June 13 and the departure of the morning ferry the following day.
Working with the existing CCTV system at the port, the technology was used to scan the faces of passengers against a database of more than 100 people from East Yorkshire and Lincolnshire who are currently wanted.
The force believes the new technology can play a big role in tackling human trafficking, drugs and cash being taken in and out of the UK, as well as helping to identify criminals travelling under false identities.
The experiment is being hailed a success by the force, which says the system correctly identified a number of people passing through the port who were on its watch list. It has also stressed that no passengers whose pictures were not on the list were picked out by the software.
Police use of technology must be ‘reasonable and proportionate’
Whilst I welcome new technology which can bring offenders to justice, it is imperative that the use of facial recognition be reasonable and proportionate, and not used in an indiscriminate way which unnecessarily impacts on people’s lives.
It appears this initial trial has been a success, and it is certainly reassuring to hear that no innocent passengers were picked out, but I do have concerns that there is not yet any clear or accepted data about the overall accuracy of facial recognition technology. Much more needs to be known before it becomes widely used.
I also feel clear policies in respect of its use must be agreed, nationwide for all forces, before its use becomes commonplace.
This is an area we already see inconsistencies in with regards to body worn cameras in police forces – a great weapon in tackling crime and gathering evidence, but a use of technology which is currently inconsistent not only across UK forces, but within individual forces themselves.
Such a lack of clarity over usage hinders officers, who are left unsure when they can or cannot use the technology. It can also undermine the integrity and reliability of investigations if there are inconsistencies in how evidence was gathered.
It can’t become a resource used ad hoc. There have to be set parameters.
Police forces’ management of pictures of innocent people must be error-proof
Perhaps the most worrying aspect of this, particularly given recent Information Commissioner’s Office (ICO) findings against forces for data breaches, one of which involves Humberside Police and in which we are instructed, is how this significant amount of collected data will be managed and processed, so as not to infringe on people’s rights.
Arrivals and departure lounges in ports and airports are busy places, with potentially hundreds of images of innocent people captured each and every time they are used. The more images of innocent people collected, the greater the risk of innocent people being arrested or linked with a crime in error. Systems must be error-proof.
Assistant Chief Constable Chris Noble has said the force will now carry out consultation with communities, other UK forces, civil liberties groups and other key stakeholders, before deciding on the future use of this approach.
That is something I certainly welcome and feel is needed. If innocent, law-abiding people believe they are being put under surveillance for no good reason, they may fear being incorrectly targeted by police, and that would damage public trust.
That trust is something our police forces cannot afford to lose.