The UK information commissioner is “deeply concerned” about the inappropriate and reckless use of live facial recognition (LFR) technologies in public spaces, noting that none of the organizations investigated by her office could fully justify their use of the technology. In a blog post published on 18 June 2021, information commissioner Elizabeth Denham said that although LFR technologies “can make aspects of our lives easier, more efficient and more secure”, the risks to privacy increase when they are used to scan people’s faces in real time and in more public contexts.
“When sensitive personal data is collected on a mass scale without people’s knowledge, choice or control, the impacts could be significant,” Denham wrote, adding that although “it is not my role to endorse or ban a technology”, there is an opportunity to ensure its use does not expand without due regard for the law.
“Unlike CCTV, LFR and its algorithms can automatically identify who you are and infer sensitive details about you,” she said. “It can be used to instantly profile you to serve up personalized adverts or match your image against known shoplifters as you do your weekly grocery shop.
“It is telling that none of the organizations involved in our completed investigations could fully justify the processing and, of those systems that went live, none were fully compliant with data protection law requirements. All of the organizations chose to stop, or not proceed with, the use of LFR.”
Informed by her interpretation of data protection law and six separate investigations into LFR by the Information Commissioner’s Office (ICO), Denham has also published an official “Commissioner’s Opinion” to guide companies and public organizations looking to deploy biometric technologies.
“Today’s Opinion sets out the rules of engagement,” she wrote in the blog. “It builds on our Opinion on the use of LFR by police forces and sets a high threshold for its use.
“Organizations will need to demonstrate high standards of governance and accountability from the outset, including being able to justify that using LFR is fair, necessary, and proportionate in each specific context in which it is deployed. They need to demonstrate that less invasive techniques won’t work.”
In the Opinion, Denham noted that any organization considering deploying LFR in a public place must also carry out a data protection impact assessment (DPIA) to decide whether to proceed.
“This is because it is a type of processing which involves the use of new technologies, and typically the large-scale processing of biometric data and systematic monitoring of public areas,” she wrote. “Even smaller-scale uses of LFR in public places are a type of processing which is likely to hit the other triggers for a DPIA set out in ICO guidance.
“The DPIA should begin early in the project’s life before any decisions are taken on the actual deployment of the LFR. It should run alongside the planning and development process. It must be completed before the processing, with appropriate reviews before each deployment.”
On 7 June 2021, Access Now and more than 200 other civil society organizations, activists, researchers, and technologists from 55 countries signed an open letter calling for legal prohibitions on using biometric technologies in public spaces, whether by governments, law enforcement, or private actors.
“Facial recognition and related biometric recognition technologies have no place in public,” said Daniel Leufer, Europe policy analyst at Access Now. “These technologies track and profile people as they go about their daily lives, treating them as suspects and creating dangerous incentives for overuse and discrimination. They need to be banned here and now.”
On top of a complete ban on using these technologies in publicly accessible spaces, the civil society coalition is calling on governments worldwide to stop all public investment in biometric technologies that enable mass surveillance and discriminatory targeted surveillance.
“Amazon, Microsoft, and IBM have backed away from selling facial recognition technologies to police,” said Isedua Oribhabor, US policy analyst at Access Now. “Investors are calling for limitations on how this technology is used. This shows that the private sector is aware of the dangers biometric surveillance poses to human rights.
“But being aware of the problem is not enough – it is time to act. The private sector should fully address the impacts of biometric surveillance by ceasing to create or develop this technology in the first place.”
The European Data Protection Supervisor has also been critical of biometric identification technologies, previously calling for a moratorium on their use and advocating for them to be banned from public spaces.
Speaking at CogX 2021 about the regulation of biometrics, Matthew Ryder QC of Matrix Chambers said that although governments and companies will often say they only deploy the technologies in limited, tightly controlled circumstances without retaining or repurposing the data, legislation will often build in a range of exceptions that allow exactly that to happen.
“The solution to that can be many harder-edged rules than we would normally expect to see in a regulatory environment, because both governments and companies are so adept at gaming the rules,” said Ryder, adding that although it may not be a malicious exercise, their constant “stress testing” of the regulatory system can lead to use cases that, “on the face of it, you normally wouldn’t be allowed to do”.
He added that regulators and legislators must get comfortable setting “hard lines” for tech companies looking to develop or deploy such technologies. “I would err on the side of harder regulations, which then get softer, rather than allowing a relatively permissive regulatory view with many exceptions,” he said.