Concern Mounts Over Facial Recognition At King’s Cross District

King’s Cross Central Limited Partnership

UK’s biometrics commissioner urges government to update laws surrounding use of facial recognition

There is growing concern after it emerged that a private company that manages the King’s Cross area of London is using facial recognition to track individuals.

The UK’s biometrics commissioner has said that the government needs to update the laws surrounding the technology, the BBC reported.

It comes after it was revealed earlier in the week that the 67-acre King’s Cross area, which is developed and managed by the King’s Cross Central Limited Partnership (KCCLP) and Argent, is using the technology for unknown purposes. The area is the site of Google’s UK headquarters and other businesses, as well as Central Saint Martins college, schools and retailers.

“These cameras use a number of detection and tracking methods, including facial recognition, but also have sophisticated systems in place to protect the privacy of the general public,” a KCCLP spokesperson had told the Financial Times.

But the BBC has now reported that developer Argent confirmed it uses the technology to “ensure public safety”.

The UK biometrics commissioner, Professor Paul Wiles, has described the deployment as “alarming”, and has called for the government to take action over the use of facial recognition technology by the private sector as well as by law enforcement.

At present, facial recognition does not fall under his remit as current legislation only recognises DNA and fingerprints as biometrics.

“I have no idea what they’re trying to do in King’s Cross,” Prof Wiles told the BBC. “There’s no point in having facial-matching tech unless you are matching it against some kind of database – now what is that database?”

“It’s alarming whether they have constructed their own database or got it from somewhere else,” he added. “There is a police database which I very much hope they don’t have access to.”

“Historically an area like that would have been public space governed by public control and legislation,” Prof Wiles added. “Now a lot of this space is defined as private but to which the public has access.”

Official investigations

Last month the Information Commissioner’s Office (ICO) said it was carrying out an investigation into trials of facial recognition by police forces, including those in London and South Wales.

Last December the Metropolitan Police carried out voluntary trials of the technology in central London, but said that those who preferred not to be scanned would not necessarily be viewed with suspicion.

In July the House of Commons Science and Technology Committee argued that police trials should be halted until the relevant regulations were in place.

These systems have previously been criticised in the US, after research by the Government Accountability Office found that FBI algorithms were inaccurate 14 percent of the time and were more likely to misidentify black people.

Microsoft, for example, recently refused to install facial recognition technology for a US police force, due to concerns about artificial intelligence (AI) bias.

It also reportedly deleted a large facial recognition database said to have contained 10 million images used to train facial recognition systems.

San Francisco, meanwhile, has banned the use of facial recognition technology, meaning that city agencies, including the police force and transport authorities, cannot use it in any of their systems.

Can you protect your privacy online? Take our quiz!