Updated: Apr 13, 2020
Elizabeth Denham Posts on the ICO Blog about Live Facial Recognition
Elizabeth Denham (Information Commissioner) has published an insightful new post on the ICO blog. The post can be found here.
Here’s a copy of the post to digest if you missed it:
“As far back as Sir Robert Peel, the powers of the police have always been seen as dependent on public support of their actions. It’s an ideal starting point as we consider uses of technology like live facial recognition (LFR). How far should we, as a society, consent to police forces reducing our privacy in order to keep us safe?
That was the starting point to my office’s investigation into the trials of LFR by the Metropolitan Police Service (MPS) and South Wales Police (SWP). LFR is a step change in policing techniques; never before have we seen technologies with the potential for such widespread invasiveness. The results of that investigation raise serious concerns about the use of a technology that relies on huge amounts of sensitive personal information.
We found that the current combination of laws, codes and practices relating to LFR will not drive the ethical and legal approach that’s needed to truly manage the risk that this technology presents.
The absence of a statutory code that speaks to the specific challenges posed by LFR will increase the likelihood of legal failures and undermine public confidence in its use. As a result, the key recommendation arising from the ICO’s investigation is to call for government to introduce a statutory and binding code of practice on the deployment of LFR.
This is necessary in order to give the police and the public enough knowledge as to when and how the police can use LFR systems in public spaces. We will therefore be liaising with the Home Office, the Investigatory Powers Commissioner, the Biometrics Commissioner, the Surveillance Camera Commissioner and policing bodies on how to progress our recommendation for a statutory code of practice.
We also recommend that more work should be done by a range of agencies and organisations including the police, government and developers of LFR technology to eliminate bias in the algorithms, particularly that associated with ethnicity. This will help maintain public confidence and cross-community support.
Taken together, the recommendations from our investigation have such far reaching applications for law enforcement in the UK that I have taken the step of issuing the first Commissioner’s Opinion under our data protection laws.
That Opinion makes clear that there are well-defined data protection rules which police forces need to follow before and during deployment of LFR. The Opinion recognises the high statutory threshold that must be met to justify the use of LFR, and demonstrate accountability, under the UK’s data protection law. That threshold is appropriate considering the potential invasiveness of this technology. My Opinion also sets out the practical steps police forces must take to demonstrate legal compliance.
This Opinion is significant because it brings together the findings in our investigation, the current landscape in which the police operate, and the recent judgment from the High Court in R(Bridges) v The Chief Constable of South Wales, in which a member of the public had concerns that his image may have been captured on LFR from a police van while he was out shopping in Cardiff city centre. He brought the case to ask the courts to decide whether the use of facial recognition in this way by SWP was lawful. The High Court judged that in these instances, SWP used LFR lawfully.
However, the SWP case was a judgment on specific examples of LFR deployment. It is my view that this High Court judgment should not be seen as a blanket authorisation for police forces to use LFR systems in all circumstances. When LFR is used, my Opinion should be followed. My Opinion recognises there is a balance to be struck between the privacy that people rightly expect when going about their daily lives and the surveillance technology that the police need to carry out their role effectively. It therefore makes clear that police forces must provide demonstrably sound evidence to show that LFR technology is strictly necessary, balanced and effective in each specific context in which it is deployed.
My office’s investigation has concluded, but our work in this area is far from over. We have undertaken our own research to understand the public’s thoughts on the subject.
Public support for the police using facial recognition to catch criminals is high, but less so when it comes to the private sector operating the technology in a quasi-law enforcement capacity. We are separately investigating this use of LFR in the private sector, including where LFR is used in partnership with law enforcement. We will be reporting on those findings in due course.
From LFR to the development of artificial intelligence systems that analyse gait and predict emotions based on facial expressions, technology moves quickly. It is right that our police forces should explore how new techniques can help keep us safe. But from a regulator’s perspective, I must ensure that everyone working in this developing area stops to take a breath and works to satisfy the full rigour of UK data protection law. Moving too quickly to deploy technologies that can be overly invasive in people’s lawful daily lives risks damaging trust not only in the technology, but in the fundamental model of policing by consent. We must all work together to protect and enhance that consensus.”