King’s Cross Facial Recognition Cameras Shut Down

The private firm that developed and manages the 67-acre King’s Cross area has confirmed that it is no longer using facial recognition cameras.

The confirmation comes after it was revealed last month that King’s Cross Central Limited Partnership (KCCLP) had been using facial recognition cameras for unknown purposes.

The discovery prompted questions from London’s mayor, Sadiq Khan, as well as an investigation by the UK’s data protection watchdog, the Information Commissioner’s Office (ICO), which said it was “deeply concerned” about the use of the technology.

Facial recognition

After weeks of silence, KCCLP has now issued a statement on the matter.

“There has been considerable media interest recently regarding the use of facial recognition technology (FRT) at the King’s Cross Estate, located next to St Pancras International and King’s Cross stations,” it stated.

It said it was co-operating with the ICO investigation and would not comment while the probe is ongoing, except to clear up a few matters for the public record.

It confirmed that the King’s Cross Estate does not currently use FRT. It said two FRT cameras, covering a single location at King’s Boulevard, were operational between May 2016 and March 2018.

It said that during that period, all FRT data was “regularly deleted, with the final deletion taking place in March 2018.”

It said that the King’s Cross Estate team has since undertaken work on the potential introduction of new FRT, but that this work has now stopped.

“There has been no operational FRT system at the King’s Cross Estate since March 2018,” it insisted, and said the tech was never used for marketing or other commercial purposes.

“The system was used only to help the Metropolitan Police and British Transport Police prevent and detect crime in the neighbourhood and ultimately to help ensure public safety,” it stated. “In the meantime, KCCLP has no plans to reintroduce any form of FRT at the King’s Cross Estate.”

Facial concerns

The use of FRT remains hugely controversial due to concerns about its accuracy and racial bias.

Last month a facial recognition system in the United States wrongly identified 26 Californian lawmakers as criminals.

Most of those misidentified were people of colour.

In July the ICO warned the police about the use of the technology, after a study found that 81 percent of ‘suspects’ flagged by the Metropolitan Police’s facial recognition technology were innocent, and that the overwhelming majority of people identified were not on police wanted lists.

That same month the House of Commons Science and Technology Committee argued that police trials of the technology should be halted until the relevant regulations were in place.


Tom Jowitt

Tom Jowitt is a leading British tech freelancer and long standing contributor to Silicon UK. He is also a bit of a Lord of the Rings nut...
