Watchdog Warns Police Over Use Of Facial Recognition
Police warned about the data protection implications of using facial recognition technology
The Information Commissioner’s Office (ICO) has warned that any organisation using facial recognition technology to scan large databases of people and check for a match is processing personal data.
And the Information Commissioner had a stark warning for police forces using the technology to identify individuals, saying it represents a “potential threat to privacy that should concern us all.”
Her warning came after a study last week found that 81 percent of ‘suspects’ flagged by the Metropolitan Police’s facial recognition technology are innocent, and that the overwhelming majority of people identified are not on police wanted lists.
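To see why so many flags can be wrong, it helps to understand the basic mechanics of watchlist matching. The sketch below is a simplified illustration, not a description of any police force’s actual software: each scanned face is reduced to a numeric embedding and compared against every record on a watchlist, so the biometric data of everyone who walks past the camera is processed, match or no match. The function names, the cosine similarity measure and the 0.6 threshold are illustrative assumptions; lowering the threshold catches more genuine suspects but also inflates the false positive rate the study highlighted.

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    # Similarity between two face embeddings, in the range [-1, 1].
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def match_against_watchlist(probe, watchlist, threshold=0.6):
    # 'probe' is the embedding of a face captured by the live camera;
    # 'watchlist' maps person IDs to stored reference embeddings.
    # Every probe is compared against every record: the system
    # processes biometric data for all passers-by, not just those
    # who end up flagged.
    return [person_id for person_id, reference in watchlist.items()
            if cosine_similarity(probe, reference) >= threshold]

# Illustrative use, with random vectors standing in for real embeddings.
rng = np.random.default_rng(0)
watchlist = {f"suspect-{i}": rng.normal(size=128) for i in range(1000)}
probe = rng.normal(size=128)
print(match_against_watchlist(probe, watchlist))
```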
Facial recognition
Information Commissioner Elizabeth Denham pointed in a blog post to the use of live facial recognition (LFR) technology by South Wales Police, which in 2017 used facial recognition software at the Champions League Final in Cardiff to scan the face of every fan attending the game.
The use of facial recognition systems by South Wales Police in shopping centres is also currently under judicial review, and Denham has previously criticised “a lack of transparency about its use”.
Now she has expanded on her concerns, warning that the police are processing personal data when they use the technology.
“We understand the purpose is to catch criminals,” Denham wrote. “But these trials also represent the widespread processing of biometric data of thousands of people as they go about their daily lives. And that is a potential threat to privacy that should concern us all.”
“LFR is a high priority area for the ICO,” she added. “My office has been conducting an investigation, monitoring the trials carried out by the police. The relevant forces piloting this technology have cooperated with our investigation and the ICO has learned a lot from our deep dive in examining how it works in practice. Legitimate aims have been identified for the use of LFR. But there remain significant privacy and data protection issues that must be addressed, and I remain deeply concerned about the rollout of this technology.”
Denham pointed out that the key concern from a legal standpoint relates to the need for a detailed framework of safeguards, both prior to any decision to implement LFR systems and governing their use at all stages.
Police guidance
Denham offered the following guidance for police forces considering LFR.
“Carry out a data protection impact assessment and update this for each deployment – because of the sensitive nature of the processing involved in LFR, the volume of people affected, and the intrusion that can arise,” she warned.
“Produce a bespoke ‘appropriate policy document’ to cover the deployments – it should set out why, where, when and how the technology is being used,” she added. “And ensure the algorithms within the software do not treat the race or sex of individuals unfairly.”
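On the fairness point, one concrete check a force or vendor could run is comparing false match rates across demographic groups in trial data. The snippet below is a minimal sketch of such an audit; the trial log, field names and groups are hypothetical, and a real assessment would need far more data and statistical care.

```python
from collections import defaultdict

# Hypothetical trial log: each record notes the demographic group
# recorded for the person scanned, whether the system flagged a
# match, and whether that flag was later confirmed as correct.
trial_log = [
    {"group": "A", "flagged": True,  "confirmed": False},
    {"group": "A", "flagged": True,  "confirmed": True},
    {"group": "B", "flagged": True,  "confirmed": False},
    # ... thousands more records in a real deployment
]

def false_match_rate_by_group(log):
    # Share of flags that turned out to be wrong, per group.
    # Large gaps between groups would suggest the algorithm treats
    # some demographics unfairly.
    flags, errors = defaultdict(int), defaultdict(int)
    for record in log:
        if record["flagged"]:
            flags[record["group"]] += 1
            if not record["confirmed"]:
                errors[record["group"]] += 1
    return {group: errors[group] / flags[group] for group in flags}

print(false_match_rate_by_group(trial_log))
```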
Denham also advised police forces to familiarise themselves with the ICO’s Guide to Law Enforcement Processing covering Part 3 of the Data Protection Act 2018.
And it is not just the police being warned by the data protection watchdog.
“In recent months we have widened our focus to consider the use of LFR in public spaces by private sector organisations, including where they are partnering with police forces,” Denham wrote. “We’ll consider taking regulatory action where we find non-compliance with the law.”
Legal advice
The cautionary note from the Information Commissioner was echoed by legal experts.
“There’s a lot of excitement around the use of face recognition systems,” explained Tamara Quinn, Partner at international law firm Osborne Clarke.
“While the benefits are endless, businesses must also consider the risks that arise from deploying face recognition systems as they need to take appropriate steps to comply with the law,” said Quinn. “Facial recognition and video surveillance are covered by a complex web of regulations which isn’t easy to navigate, plus there is reputational risk if companies aren’t seen to be taking privacy seriously.”
“Under the GDPR, use of biometrics, such as facial recognition systems, is covered by stricter safeguards than ordinary personal data,” she added. “For many companies, this means that they may need to get consent from every person scanned and prove that these individuals were fully informed and have given consent freely, without pressure or being penalised for not participating.”
“With the ICO promising to pay closer attention to private organisations that use facial recognition systems that cover public areas, businesses should act now to ensure that their software doesn’t break the law,” Quinn warned.
“And this can include reassessing the use of external cameras overlooking the street, public parking or other communal spaces,” she added. “As well as making sure that their systems comply with strict legal requirements, companies should be looking at their contracts with external suppliers of these systems, to make sure that they have strong legal protections in place.”
Facial concerns
There are also doubts as to the effectiveness of such facial recognition systems.
These systems have previously been criticised in the US, after research by the Government Accountability Office found that FBI algorithms were inaccurate 14 percent of the time, as well as being more likely to misidentify black people.
Microsoft, for example, recently refused to install facial recognition technology for a US police force, due to concerns about artificial intelligence (AI) bias.
And Redmond reportedly deleted a large facial recognition database said to have contained 10 million images used to train facial recognition systems.
San Francisco, meanwhile, has banned the use of facial recognition technology, meaning that city agencies, including the local police force and the transport authority, will not be able to use it in any of their systems.
And facial recognition can also be fooled. In 2017 a Vietnamese cybersecurity firm said it had tricked the facial recognition feature on the iPhone X using a 3D-printed mask.