We all knew facial-recognition technology was flawed, just perhaps not this flawed.
A new study from the National Institute of Standards and Technology, published on Dec. 19, lays out in painstaking detail how facial-recognition tech misidentifies the elderly, the young, women, and people of color at higher rates than white men. In other words, the most at-risk populations are also the ones most likely to suffer false matches and any legal troubles that follow.
Just how bad is it? Let's let the NIST study authors explain.
"We found false positives to be higher in women than men, and this is consistent across algorithms and datasets," they wrote. "We found elevated false positives in the elderly and in children; the effects were larger in the oldest and youngest, and smallest in middle-aged adults."
And that's not all. "With mugshot images," the authors continued, "the highest false positives are in American Indians, with elevated rates in African American and Asian populations."
Why does this matter? Well, law enforcement uses the technology, and as such, false positives can lead directly to mistaken arrests and harassment.
This study, which claims "empirical evidence" for its findings, is sure to add support to lawmakers' calls to ban the controversial tech.
"We have started to sound the alarm on the way facial recognition technology is expanding in concerning [ways]," wrote congresswoman Alexandria Ocasio-Cortez in July. "From the FBI to ICE to Amazon, the bar for consent and civil liberties protection is repeatedly violated, and on top of it all has a disproportionate racial impact, too."
She now has additional evidence to back up that latter claim.
Importantly, the congresswoman isn't alone in her concern. In a statement published by the Washington Post, Senator Ron Wyden reacted to the NIST findings by stating that "algorithms often carry all the biases and failures of human employees, but with even less judgment."
A growing number of cities, including San Francisco and Berkeley, recently moved to ban some government use of the tech. Perhaps this study will encourage others to follow suit.