Cardiff Police Will Use Facial Recognition At UEFA Champions League Final
Fans are against the use of such technology and there are doubts as to the effectiveness of such systems
South Wales Police will use controversial facial recognition software at next month’s Champions League Final in Cardiff to scan the face of every fan attending the game.
The software is being piloted at The Principality Stadium for Europe’s biggest football event, which will see around 70,000 fans enter the stadium possibly unaware that their faces will already have been scanned when they entered the city.
Cameras will be positioned in Cardiff’s main train station and at other locations around the city to scan fans’ faces and compare them to a police database of around 500,000 “persons of interest”.
Motherboard spotted a government tender issued by the South Wales Police for an ‘automated facial recognition solution.’ The tender involves a £177,000, two-year contract, running until 25 April 2019.
Fan scan
Around 170,000 people are expected to visit Cardiff on the day of the match, with the majority probably unaware that their images are being captured and compared in real time to images stored in the police information and records management system.
According to the tender, the pilot programme aims to “effectively utilise the advancements in automated facial recognition (AFR) technology.” The strategy will consist of ‘real time facial recognition’ and ‘slow time static face search’, with “real time facial recognition being specifically linked to a live pilot on Saturday 3rd June 2017 in and around the principality stadium and Cardiff central train station”.
Recent incidents, such as the attack on the Borussia Dortmund team bus before a Champions League game against Monaco earlier this month, have increased interest in these types of technologies among law enforcement agencies, yet their use has been criticised by fans for violating privacy rights.
“Police forces have a big task protecting the public, and therefore it makes sense they are relying more on technology to assist in finding suspects,” said Javvad Malik, Security Advocate at AlienVault. “However, security and privacy need to be addressed in terms of how the data is collected, processed, stored, and deleted.
“Without doing so, it is possible the data could end up in the wrong hands. Secondly, as with any new technology such as facial recognition, it hasn’t reached peak maturity, so inevitably there are false positives and negatives that need to be taken into account, and the automated system should not be relied on solely.”
There are also doubts as to the effectiveness of such systems. Facial recognition was recently criticised in the US after research by the Government Accountability Office found that FBI algorithms were inaccurate 14 percent of the time, and more likely to misidentify black people.
The issue of “non-cooperative” subjects is also a concern, particularly in busy areas such as a football stadium or city centre. Reports have suggested that accurate facial recognition can only be achieved in controlled environments, as faces are easily obscured in crowds.
Speaking to Motherboard, Rachel Robinson, policy director at human rights and civil liberties group Liberty, said: “The chasm between the increasingly advanced surveillance technology rolled out by police and the lack of legal safeguards for the public is growing wider and more alarming all the time.”
“Instantaneous facial recognition technology with the potential to identify anyone in a crowd of thousands, alongside ongoing police storage of huge numbers of innocent people’s photographs, is a seriously intrusive combination.”