Police departments are turning to AI-powered facial recognition technology – leading to a rise in WRONGFUL ARRESTS
Police departments are increasingly turning to AI-powered facial recognition technology to identify and arrest suspects.
Many departments are making these arrests without doing the basic police work needed to verify the evidence, leading to a rise in wrongful arrests.
There have been at least eight documented cases of wrongful arrests due to facial recognition errors.
Many police departments are not transparent about their use of facial recognition tech, strongly suggesting that the true number of wrongful arrests linked to the technology is much higher.
AI facial recognition algorithms struggle with low-quality images, poor lighting and unusual angles, and exhibit significant biases, particularly against women and people of color with darker skin tones.
More and more police departments are turning to artificial intelligence-powered facial recognition software – leading to a rise in false arrests.
An investigation found that police departments nationwide are relying on facial recognition software to identify and arrest suspects, often without corroborating evidence. Shockingly, most departments are not required to disclose or document their use of this technology.
Among 23 departments with available records, 15 across 12 states made arrests based solely on AI matches, frequently violating their own internal policies that mandate additional evidence before taking such drastic action.
From the information gathered, there have been eight documented cases of wrongful arrests in the U.S. based on AI facial recognition matches, two of which were previously unreported. All the charges against the individuals were eventually dismissed. The investigation notes that basic police work, such as verifying alibis and comparing physical evidence, could have easily prevented these wrongful arrests.
The report further warns that data on wrongful arrests tied to facial recognition technology is sparse and the true scale of the problem remains unknown, as most departments are not transparent about, and often refuse to disclose, their use of the technology.
AI facial recognition algorithms are misidentifying innocent people instead of actual criminals
Facial recognition software uses algorithms to analyze and compare facial features in photographs or video footage. The technology maps unique characteristics – such as the distance between the eyes, the shape of the jawline and the contours of the nose – to create a digital “faceprint.” This faceprint is then compared against a database of images to find potential matches.
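To make the matching step concrete, here is a minimal sketch of that one-to-many comparison using the open-source face_recognition library (built on dlib). The image paths and the tiny two-person "database" are hypothetical stand-ins for a real mugshot gallery, and the 0.6 distance cutoff is simply the library's default tolerance, not a forensic standard.

```python
# Sketch of one-to-many faceprint matching with the face_recognition library.
# File paths and names are hypothetical placeholders.
import face_recognition

# Build a small "database" of faceprints: each face is reduced to a
# 128-dimensional encoding (the digital "faceprint").
known_images = {
    "person_a": face_recognition.load_image_file("db/person_a.jpg"),
    "person_b": face_recognition.load_image_file("db/person_b.jpg"),
}
known_encodings = {
    name: face_recognition.face_encodings(img)[0]
    for name, img in known_images.items()
}

# Encode the probe image (e.g., a still pulled from surveillance footage).
probe = face_recognition.load_image_file("probe/suspect.jpg")
probe_encodings = face_recognition.face_encodings(probe)
if not probe_encodings:
    raise ValueError("No face detected in probe image")
probe_encoding = probe_encodings[0]

# Compare the probe faceprint against every database entry. The output is
# a similarity score judged against an arbitrary threshold, not a
# definitive identification.
names = list(known_encodings)
distances = face_recognition.face_distance(
    [known_encodings[n] for n in names], probe_encoding
)
for name, dist in zip(names, distances):
    # Lower distance means more similar; 0.6 is the library's default cutoff.
    verdict = "possible match" if dist < 0.6 else "no match"
    print(f"{name}: distance={dist:.3f} -> {verdict}")
```

The key point the sketch illustrates is that the system never says "this is the suspect"; it only ranks candidates by a distance score against a tunable threshold, which is why corroborating evidence is supposed to come before an arrest.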
While this might sound like a foolproof system, the reality is far more complicated. Facial recognition algorithms are far from perfect. They can struggle with low-quality images, poor lighting or unusual angles.
More troublingly, these systems often exhibit racial and gender biases. Studies have shown that facial recognition software is significantly more likely to misidentify women and people of color, particularly those with darker skin tones.
Consider Robert Williams, the first documented victim of a wrongful arrest due to facial recognition, who was accused of stealing thousands of dollars worth of watches even though he was driving home at the time of the alleged crime. (Related: SURVEILLANCE? U.S. expands biometric technology in airports.)
Or Porcha Woodruff, a visibly pregnant woman who was arrested for carjacking despite being physically incapable of committing such an act.
At the heart of this issue lies a deeply human problem: confirmation bias. When facial recognition software identifies a suspect, law enforcement officers are often quick to accept the result as definitive, ignoring contradictory evidence.
As Mitha Nandagopalan, a staff attorney with the Innocence Project, explains, “When police showed up to Porcha’s house, she was visibly eight months pregnant, yet there was nothing in the victim’s description of the suspect that mentioned pregnancy. The circumstances described would be very difficult for someone near the end of a pregnancy to carry out, and yet they went forward with the arrest.”