Accuracy and racial bias concerns about facial recognition technology continue with the news of a lawsuit filed by a New Jersey man, Nijeer Parks, against local police, the prosecutor and the City of Woodbridge in New Jersey.
According to the New York Times, Nijeer Parks is the third person known to have been falsely arrested on the basis of a bad facial recognition match. The other two were Robert Williams and Michael Oliver. All three men are reportedly black.
This particular case began on a Saturday in January 2019, when two police officers showed up at the Hampton Inn in Woodbridge (New Jersey) after receiving a report about a man stealing snacks from the gift shop.
The alleged shoplifter was a black man, nearly 6 feet tall, wearing a black jacket, who was reportedly visiting a Hertz office in the hotel lobby, trying to get the rental agreement for a gray Dodge Challenger extended.
The officers confronted him, and he apologised, according to the police report. According to the New York Times, the suspect said he would pay for the snacks and gave the officers a Tennessee driver’s license.
When the officers checked the license, they discovered it was fake. This, coupled with a bag of suspected marijuana found in the man's jacket, led the officers to attempt an arrest. But the suspect ran away and drove off in his rental car, hitting a parked police car in the process, as well as a column in front of the hotel.
One of the police officers had to reportedly jump out of the way of the vehicle to avoid being hit.
The rental car was later found abandoned in a parking lot a mile away.
A detective in the Woodbridge Police Department then sent the photo from the fake driver's license to state agencies with access to facial recognition technology.
The next day, state investigators said they had a facial recognition match: Nijeer Parks, who lived in Paterson, N.J., 30 miles away, and worked at a grocery store.
The detective compared Parks’s New Jersey state ID with the fake Tennessee driver’s license and agreed it was the same person. After a Hertz employee confirmed that the license photo was of the shoplifter, the police issued a warrant for Parks’s arrest.
“I don’t think he looks like me,” Parks was quoted as saying. “The only thing we have in common is the beard.”
Parks, it should be noted, has previous criminal convictions for selling drugs.
The problem for the police was that Parks was 30 miles away at the time of the incident. That did not stop him being arrested by local police, spending ten days in jail, and paying around $5,000 to defend himself.
Parks was able to obtain proof from Western Union that he had been sending money at a pharmacy in Haledon, New Jersey, when the incident happened.
Parks told the judge he was willing to go to trial to defend himself. But a few months later, in November 2019, his case was dismissed for lack of evidence.
Parks is now reportedly suing the police, the prosecutor and the City of Woodbridge for false arrest, false imprisonment and violation of his civil rights.
“I was locked up for no reason,” Parks reportedly said. “I’ve seen it happen to other people. I’ve seen it on the news. I just never thought it would happen to me. It was a very scary ordeal.”
The case drew the attention of the American Civil Liberties Union (ACLU).
“Multiple people have now come forward about being wrongfully arrested because of this flawed and privacy-invading surveillance technology,” Nathan Freed Wessler, senior staff attorney for the ACLU’s Speech, Privacy, and Technology Project told Silicon UK.
“There are likely many more wrongful interrogations, arrests, and possibly even convictions because of this technology that we still do not know about,” said Freed Wessler. “Unsurprisingly, all three false arrests that we know about have been of Black men, further demonstrating how this technology disproportionately harms the Black community. Law enforcement use of face recognition technology must be stopped immediately.”
On this side of the pond, the use of facial recognition, especially by authorities, has also proven to be controversial.
In August the Court of Appeal ruled that the use of automatic facial recognition (AFR) by South Wales Police had breached privacy rights, data protection laws and equality legislation.
And in 2019 the Information Commissioner’s Office (ICO) warned that any organisation using facial recognition technology, and which then scans large databases of people to check for a match, is processing personal data, and that “the potential threat to privacy should concern us all.”
Indeed, in 2019 an academic study found that 81 percent of 'suspects' flagged by the Met Police's facial recognition technology were innocent, and that the overwhelming majority of people identified were not on police wanted lists.
And in August 2019, the ACLU civil rights campaign group in the United States ran a demonstration to show how inaccurate facial recognition systems can be.
It ran a picture of every California state legislator through a facial-recognition program that matches facial pictures to a database of 25,000 criminal mugshots.
That test saw the facial recognition program falsely flag 26 legislators as criminals.
Despite that, in July 2019 the then Home Secretary Sajid Javid gave his backing to police forces using facial recognition systems, in the face of growing concern about the technology.
Facial recognition systems have also been previously criticised in the US after research by the Government Accountability Office found that FBI algorithms were inaccurate 14 percent of the time, as well as being more likely to misidentify black people.
And some tech firms have stopped supplying the technology to police forces.
Microsoft first refused to install facial recognition technology for a US police force a year or so ago, due to concerns about artificial intelligence (AI) bias.
This boycott was subsequently joined by Amazon and IBM, among others.
Microsoft has also deleted a large facial recognition database, which was said to have contained 10 million images used to train facial recognition systems.
San Francisco has banned the use of facial recognition technology, meaning that local agencies, including the police force and city departments such as transportation, cannot utilise the technology in any of their systems.
But the police remain in favour of its use.
In February this year, the UK’s most senior police officer, Metropolitan Police Commissioner Cressida Dick, said criticism of the tech was “highly inaccurate or highly ill informed.”
She also said facial recognition was less concerning to many than a knife in the chest.