Protesters in Detroit have been calling for police reform and an end to the city’s facial recognition contract.


Seth Herald/AFP via Getty Images

Activists in Detroit have been waiting a long time for July 24. Since the city’s contract with DataWorks began in 2017, community members have been pushing to stop the software company’s facial recognition services from expanding in their neighborhoods.

On that day, Detroit’s $1.2 million contract with DataWorks is set to expire — unless the City Council votes to renew the deal for another two years for an additional $219,934.50. 

After years of privacy concerns, issues with the technology’s racial bias, evidence that surveillance doesn’t reduce crime in Detroit and a damning wrongful arrest caused by a facial recognition mismatch, activists want to make sure the technology is kicked out of the city for good. 

When the renewal came up for a vote on June 16, the City Council was expected to vote in favor of extending the contract. The council has since delayed the vote because of public outcry, and community organizers say they’re doing everything they can to swing it against facial recognition. 

“Even the manufacturers of this technology are pulling back and rethinking things, and we’re renewing. It makes absolutely no sense,” said Tawana Petty, director of the Data Justice Program for the Detroit Community Technology Project. “In the Blackest city in America, while the rest of the world is crying racial injustice and asking for overhaul, we’re doubling down.”

A weekly public report from the Detroit Police Department showed that the department has used facial recognition primarily on Black people, in 68 of 70 cases this year. 

The vote in Detroit, which hasn’t been scheduled yet but is expected to happen before July 24, is part of a growing national debate over facial recognition and the technology’s inherent flaws in identifying people of different ethnicities and genders. 


Video: From Jim Crow to ‘Jim Code’: How tech exacerbated racial… (16:42)

Calls for police reform after the May 25 police killing of George Floyd have sparked demands to end facial recognition surveillance used by law enforcement. Boston outlawed the technology, and members of Congress proposed legislation that would indefinitely ban facial recognition from police use. Companies like Amazon, IBM and Microsoft have stopped providing facial recognition services to police, calling for national regulations on the technology. 

Cities like San Francisco and Somerville, Massachusetts, have also banned the tech, and Petty understands the significance of ending facial recognition in her city. 

“Detroit is 80% Black, and we have high levels of quality-of-life crime,” Petty said. “Our City Council — their main argument is that community members want to feel safe. My argument is that I live in Detroit too, and I want to feel safe too.”  

Facial recognition is known to have racial and gender bias: multiple studies have found that the technology is more likely to misidentify a person of color than a white man. 

For Robert Williams, a Black resident, that bias wasn’t just a data point in a study. Thanks to an error in Detroit’s facial recognition system, he was arrested by city police and in custody for nearly 30 hours — for a crime he didn’t commit.

False alarms 

Williams was arrested in January, in front of his family, and accused of stealing about $3,800 worth of watches from the Shinola store in the city. Surveillance footage captured a photo of the thief’s face, and DataWorks’ facial recognition tech matched the image to Williams’ driver’s license photo, according to court documents.  


Robert Williams was wrongfully arrested by Detroit police after facial recognition mismatched him as a potential suspect in a robbery.


ACLU

Williams’ case is the first known US occurrence of a facial recognition mismatch leading to a wrongful arrest. But if police continue using the technology, it won’t be the last, said the American Civil Liberties Union of Michigan. 

“This technology is particularly inaccurate when it comes to people of color, older people, younger people and especially women of color,” said Phil Mayor, the ACLU of Michigan’s senior staff attorney. “It’s just not plausible with using this inaccurate and racist technology that there haven’t been other false arrests, false convictions and false plea bargains.”

Todd Pastorini, DataWorks’ executive vice president, said the company wasn’t at fault in Williams’ arrest, pointing out that the company only implements facial recognition software from providers like NEC and Rank One. 

He said the wrongful arrest stemmed from the grainy quality of the surveillance footage, as well as from a Shinola store security guard who selected Williams’ photo from a lineup of pictures. 

“When a probe from a surveillance video like the one in question is given to the system, our product results back to the users about 250 candidates,” Pastorini said. “A human is always involved in selecting the most appropriate results.” 

However, Williams’ photo wouldn’t have been in the lineup if the DataWorks software hadn’t flagged him as a potential suspect, according to court documents. 

And Williams wouldn’t have known that faulty facial recognition landed him in police custody if an officer hadn’t remarked that “the computer must have gotten it wrong” before releasing him, according to Mayor. 

Detroit limits facial recognition to cases involving violent crimes. The crime Williams was falsely accused of didn’t warrant using the technology, and that police slipup tipped him off, his attorney said. 

Detroit’s police chief, James Craig, apologized for the wrongful arrest, blaming the mistake on “shoddy investigative work.” At a City Council hearing on June 29, Craig stressed that the department didn’t rely on facial recognition alone, and that police need analysts and detectives to interpret the results. 

“If we were to just use the technology by itself to identify someone, I would say 96% of the time, it would misidentify,” Craig said at the hearing. “When our analysts are doing their work, they’re methodically going through each of the photos, and many times they’ve found the right pick is down the list. We know the technology is not perfect, it’s an aid only.”

Civil rights advocates say facial recognition plays a big part in causing investigative mistakes.  

“I think it creates [that shoddiness],” Mayor said. “We’re taught to trust the newest technology, and it’s a natural and unsurprising instinct that when a computer tells a police officer ‘that’s the guy’ they then undertake an investigation assuming it’s him.” 

Facial recognition’s racial bias creates a real danger in Detroit, a majority Black city that once had the highest rate of fatal shootings by police, activists said. A federal investigation in 2000 prompted Detroit to undergo major police reform on its use of force, but citizens are still concerned about racial profiling, and facial recognition’s flaws could make it worse. 

“Depending on the crime, there could be a false positive in which the police are approaching someone who they believe is a violent criminal, and they’re going to be on edge and may approach innocent citizens as potentially violent criminals,” said Dawud Walid, the executive director of the Michigan chapter of the Council on American Islamic Relations. “That’s a very dangerous scenario, especially in somewhere like Detroit.” 

Calls for reform

Over the last three years, the fight against surveillance has been an uphill battle in Detroit. 

The city’s surveillance program, Project Green Light, has grown from eight cameras to 700 since 2017. All those cameras can use DataWorks’ Face Watch Plus real-time facial recognition technology. Project Green Light cameras are the third most popular source of images for facial recognition searches, said Aric Tosqui, a captain with the Detroit police. 


A map showing Project Green Light’s expansion in Detroit from 2016 to 2019.


Detroit Community Technology Project/Cyrus Peñarroyo

Detroit’s City Council members backed facial recognition in 2017, believing the technology would help reduce crime in their communities. Three years later, there’s been no evidence to support that notion, activists said. 

Crime trends tend to fluctuate, and facial recognition hasn’t been shown to have a direct impact in Detroit. Though violent crime dropped overall in the city from 2018 to 2019, both homicides and nonfatal shootings rose, according to statistics released in January. 

Craig said Detroit’s facial recognition investments have paid off, helping solve homicide and theft investigations in the past. The majority of facial recognition searches haven’t led to positive matches, Craig said, with 75% of searches leading to no arrests from May 29 to June 18. 

Activists argue that the money devoted to facial recognition, including the $1.2 million already spent and the nearly $220,000 to renew the DataWorks contract, would be better used to reduce crime by investing in community resources such as after-school programs and job training. 

That’s an argument many supporters of the “Defund the Police” movement have backed.

“Communities that are better resourced are those same communities with lower crime rates,” said Rodd Monts, the ACLU of Michigan’s campaign outreach coordinator. “You’re going to have less crime and less need for police.” 

Detroit activists have found a growing voice against facial recognition since the tech first came to town. Petty was among just a handful of people arguing against the technology in 2017; now hundreds of critics show up at City Council meetings. 

But the most important change still needs to come from local lawmakers.

Detroit’s City Council members were silent after Williams’ case made national news, and activists are concerned that they’ll vote to renew the city’s facial recognition contract. When the renewal came up for a vote on June 16, only two members were expected to vote against it; the contract needs five no votes to expire. 

Letting the facial recognition contract expire won’t be the end of surveillance in Detroit, however. 

Activists are also hoping to pass the Community Input Over Government Surveillance ordinance, which would allow for public input into any surveillance technology purchased by the city. 

The ordinance was introduced by Detroit City Council Member Mary Sheffield, who also plans to vote against renewing the facial recognition contract.

“The importance is having community input and strong oversight and transparency for any type of surveillance technology, especially facial recognition,” Sheffield said. 

If the contract expires, Detroit would no longer have a facial recognition subscription, but it could still reach out to partners like the Michigan State Police to run searches with the technology. 

Activists are hoping to convince the city’s Board of Police Commissioners to issue policies preventing facial recognition from being used. That could also lend momentum to an eventual statewide ban on the technology, starting from Detroit. 

“It would be a great statement for one of the Blackest cities in America,” the ACLU of Michigan’s Mayor said, “to be a leader in opposing the use of this racist technology instead of a leader for using it to surveil its own residents.” 
