
A Detroit woman claims she was wrongfully detained while eight months pregnant.

According to a complaint filed against the Detroit Police, officers wrongfully detained a mother of three based on a false match produced by facial recognition software.

Civil rights groups say the case underscores concerns about bias in the use of such technology in police work; in testing, facial recognition systems have been found to perform poorly at matching photographs of non-white people.

Six Detroit police officers appeared at the door of Porcha Woodruff, 32, early on February 16th.

Ms Woodruff, who was eight months pregnant at the time, was getting two of her children ready for school.

The mother, who is Black, was arrested in connection with a carjacking and robbery that occurred in January.

“Are you serious about a carjacking? Do you see how pregnant I am?” she asked the officers, according to a federal lawsuit filed last week.

She later learned that police had identified her as a suspect by running surveillance video through the department’s facial recognition software, then placing a 2015 mugshot from a previous arrest into a photo lineup, from which the carjacking victim picked out Ms Woodruff as her assailant.

Though the case was eventually dismissed, Ms Woodruff claims in her lawsuit that the ordeal, which she says caused her dehydration and stress-induced contractions, illustrates the hazards of biased facial recognition software.

“Despite its potential, law enforcement’s reliance on facial recognition has led to wrongful arrests, causing humiliation, embarrassment, and physical injury, as evident in this particular incident,” the lawsuit claims.

The Detroit Police Department (DPD) said it is looking into her allegations.

“We are taking this matter very seriously, but we cannot comment further at this time due to the need for additional investigation. We will provide additional information once we have more facts and a better understanding of the circumstances.”

According to the American Civil Liberties Union, this is at least the sixth similar instance in the United States.

According to the organization, three of those incidents involved the DPD, which began using facial recognition technology in 2018, and all six of the people wrongfully detained were Black.

“It’s deeply concerning that the Detroit Police Department is aware of the devastating consequences of using flawed facial recognition technology as the basis for someone’s arrest and still relies on it,” Phil Mayor, senior staff attorney at the ACLU of Michigan, said in a statement.

“As Ms Woodruff’s terrifying experience shows, the Department’s use of this technology must come to an end. Furthermore, the DPD continues to conceal its misuse of this technology, compelling victims of abuse to disclose its wrongdoing case by case. The DPD should not be allowed to evade openness and conceal its own wrongdoing from the public while continuing to expose Detroiters to dragnet monitoring.”

In 2019, a 25-year-old Black man named Michael Oliver was wrongfully charged with a felony for allegedly stealing and breaking a teacher’s cell phone.

Even though he had prominent tattoos and a different body type, skin tone, and hairstyle than the suspected thief, a facial recognition match prompted police to include his photo in a lineup.

Robert Julian-Borchak Williams was arrested in front of his family the following year for allegedly stealing from a high-end Detroit Shinola store.

He was identified after a security contractor supplied surveillance footage to the DPD, which passed it to Michigan State authorities, whose facial recognition system returned Mr Williams’s name as a match.

The case was eventually dropped.

“This is not me,” Mr Williams told police following his arrest. “You think all Black men look alike?”

Facial recognition software can be affected by algorithmic bias: skews built into purportedly neutral technology by the biased choices of the people who design and train it.

Facial identification systems, for example, are typically trained on large image data sets, and training on a data set that under-represents people of color can produce technology that fails to reliably identify them.

Written by Anthony Peters