They used to talk about the arm of the law. Now, in the halls of power, they refer to “the algorithms of the law”. That was the phrase Lord Justice Haddon-Cave used as he began his statement in the world’s first ever court case on facial recognition, which concluded on Wednesday. “The algorithms of the law,” he intoned, “must keep pace with new and emerging technologies.”
As soon as the words were out, you could tell it was going to be a tough day for those fighting against police use of facial recognition.
According to the court listing, this was the case of Edward Bridges versus South Wales Police. In reality, it was far bigger. Mr Bridges, who’d had his face scanned twice in Cardiff during police trials of facial recognition, accused South Wales Police of infringing his privacy and data protection rights.
But, as the judges acknowledged during the hearing in May, the trial’s real subject was not the police, but the law itself.
Did existing legislation give the police the right to scan hundreds of thousands of faces in order to catch a single criminal? Did it offer sufficient safeguards to protect innocent citizens such as Mr Bridges?
Lord Justice Haddon-Cave, sitting with Mr Justice Swift, concluded that it did.
The decision was unexpectedly emphatic. As I made my way to Cardiff on Wednesday morning, a friend texted: “99% sure the high court decision is going to be summarised as: ‘It’s complicated, a higher court will have to decide’.” That, we agreed, was “just always what happens”.
Not this time. The judges dismissed all three arguments made by Mr Bridges, who was represented in court by lawyers for the human rights organisation Liberty. “The current legal regime is adequate,” they said, declaring South Wales Police’s use of the technology “consistent with the requirements of the Human Rights Act, and the data protection legislation”.
In other words: everything is fine. South Wales Police, you go right ahead.
Liberty and Mr Bridges were naturally disappointed, and declared their intention to fight on. South Wales Police, by contrast, were quietly delighted. After the judgement had been handed down, I went to the force’s headquarters to meet Deputy Chief Constable Richard Lewis.
“We’re feeling good,” he told me. “We are the police, after all, and we want to be using things lawfully. I think it’s good that the judges have handed down a judgement that we used the technology within a legal framework, which we tried very hard to do and I’m glad that’s found to be the case.”
The judgement isn’t a complete green light. By confirming that existing law applied to facial recognition, it reaffirmed the importance of regulations such as the Data Protection Act. The Information Commissioner’s Office was quick to point that out, saying it was “reviewing the judgement carefully” and warning police forces that “data protection law and guidance still apply”.
For this reason, perhaps, the Deputy Chief Constable didn’t want to be drawn into gung-ho statements about the potential of facial recognition. Yes, South Wales Police was using it to counter County Lines drug gangs. Yes, drugs were carried in from neighbouring regions. No, he didn’t want to suggest the police forces in those areas should necessarily adopt the technology.
The Home Office didn’t show the same caution. It responded to the judgement with a fact sheet, linking – via some dubious “facts” about the accuracy of facial recognition – the South Wales decision to the Metropolitan Police’s use of facial recognition in London. In recent months, the Home Office has been pushing police forces to start using the technology, only to be thwarted by local reluctance. That barrier has just been severely weakened.
It’s easy to see where this is headed, because we’ve been here before. In the 1990s and 2000s, the government pushed CCTV cameras; now they pockmark every town centre in the country. In the 2010s, the government embraced automatic number plate recognition (ANPR) cameras; now they make 40 million data “reads” a day (including up to a million errors).
Is this the future for facial recognition? It’s difficult to see what can prevent it. Perhaps that’s okay; after all, it’s hard to argue that we should make it harder for the police to keep people safe – and when you see the professional, open operation in South Wales, you feel it must be possible to deploy the technology in a way that’s proportionate and respectful of individual liberty.
But I wonder. If CCTV and ANPR don’t stop very much crime, how will facial recognition be any better? Can South Wales Police’s standard of conduct be maintained by police forces across the country once the Home Office starts to put them under pressure to use facial recognition to bring down crime rates?
These are practical worries. But my doubts are mainly to do with the nature of the transformation we are experiencing.
Facial recognition isn’t an isolated technology; it’s a step along the path by which digital technologies suffuse every aspect of life.
Some of this can be handled by compromise and muddling through. But when surveillance technology fundamentally alters the balance between state and society, we also need clear rights and rules beyond the “common law” favoured by the judges.
Simply saying we “must keep pace with new and emerging technologies” isn’t good enough. As Silicon Valley developers are discovering, the question isn’t what we can do, but whether we should do what we can.
The judges almost had it right, but in the future we won’t talk about the algorithms of the law. Increasingly, the algorithms are the law. Rusty old democratic legislation needs to catch up, and quick.