The Guardian view on facial recognition technology: mistaken identities are a political issue | Editorial

It is a familiar story. Extravagant claims are made on behalf of novel computerised tools. The public are told that this or that digital application or system is going to change the world for the better. Efficiencies will be unlocked and problems solved as human limitations are overcome by networked devices plugged into vast stores of data. Anyone who questions the narrative is a pessimist or, perhaps, a criminal.

This appears to be the logic behind arguments put forward on behalf of one such tool – live facial recognition technology. Law-abiding citizens have “nothing to fear” from the police’s increased reliance on mounted cameras, said the Home Office minister, Sarah Jones, last month, after a high court challenge brought on human rights and privacy grounds failed. The use of AI-powered identification software, made by the Japanese company NEC, “only locates specifically wanted people”, she added. Last year, Ms Jones described the technology as “the biggest breakthrough for catching criminals since DNA”.

Sir Mark Rowley, the Metropolitan police commissioner, is equally enthusiastic, and London’s mayor, Sir Sadiq Khan, gave his blessing to a pilot scheme. There is no doubt that policing is under pressure, despite sharp falls in homicides and knife crime. Shoplifting has recently risen across England and Wales, as have religious and racial hate crimes. It is not hard to see why, from the police’s point of view, the ability to match the faces of passersby with those stored on a database of suspects is very handy.

The warnings carried in last weekend’s Guardian exclusive regarding weak oversight and misuse of these systems are a reminder of other priorities. The biometrics watchdog for England and Wales, Prof William Webster, and his equivalent in Scotland, Dr Brian Plastow, both believe that the Information Commissioner’s Office is not up to the job of monitoring this kind of data use, and that a new regulator and rules are needed. An audit of the Met’s use of facial recognition was postponed and has not been rescheduled.

The UK government is reviewing the legal framework, so some updating is expected. The Home Office has acknowledged issues with racial bias after tests showed higher numbers of false positive identifications of black and Asian faces. But with this technology in widespread use by retailers, as well as police forces, once again politicians are playing catch-up, looking for ways to right wrongs that have already happened.

One thing that is needed urgently is an improved system of redress for people who have been misidentified, whether by police or private security guards. Ministers must also directly address the claims made by a whistleblower who said he knew of up to 15 instances of innocent people being added to watchlists maliciously by security employees with scores to settle. Race bias in the software must also be shown to have been eliminated.

But practicalities aside, surveillance tools raise political questions about civil liberties, privacy and state and corporate overreach. The rollout of these tools is a choice, not an inevitability, and there are alternatives. The policing minister may honestly believe that most people should not fear databases containing biometric data by which individuals can be identified. That doesn’t make it true. The pattern whereby tech outpaces attempts to keep track of its impact, defying democratic checks and balances, needs to be broken.
