Axon body-camera supplier will not use facial recognition in its products — for now
Madeline Purdue, USA TODAY Published 5:04 a.m. ET July 1, 2019 | Updated 2:17 p.m. ET July 1, 2019
LOS ANGELES — A major supplier of body cameras to law-enforcement agencies across the country has decided to forgo selling facial-recognition technology with its products.
Axon, which supplies 48 police departments in major cities with body cameras, made the decision after the company’s ethics board concluded the technology was not accurate enough to be implemented in the field and could potentially cause major trust issues between law-enforcement officials and their communities.
In a 42-page report, the ethics board detailed concerns about the inaccuracy of the software, saying results showed it was less accurate when identifying women and younger people, and that performance “worsens when trying to identify people of color compared to white people, a troubling disparity that would only perpetuate or exacerbate the racial inequities that cut across the criminal justice system.”
The board is composed of experts from different professions, including law enforcement, robotics and policy, and advises Axon on the potential effects its technology could have on society.
“Some police departments are sophisticated and would understand the limitations on (facial recognition) and would put their own guardrails in place around the use of that technology by their officers. But there are some police departments that would not take those same precautions and may not appreciate the implications of using a technology that’s not ready yet for prime-time,” said board member Jim Bueermann, who served as a police officer for over 30 years and is now president of the National Police Foundation.
“There are companies out there actively promoting using facial recognition on body cameras,” said Axon CEO Rick Smith. “From our perspective, we believe taking the time to do the ethical analysis up front in the product design process will have much better outcomes in the long haul.”
Smith says the company might reconsider incorporating the technology in its products once the inaccuracies are resolved, but it must balance that with privacy and safety and put in “safeguards” to avoid misuse of facial recognition.
“We don’t believe that the right answer is that the police should deploy face recognition with no controls and go wild with it, but similarly we don’t believe that it makes sense to say police should never use face-recognition technology because there’s many cases where it’s pretty universally known that it will be a good thing,” Smith said.
Facial recognition has been a hot-button topic around the country as concerns about privacy and accuracy have made some cities skeptical of the technology, leading San Francisco to ban it altogether, with cities such as Oakland and Berkeley considering following suit. California may enact a statewide ban if Gov. Gavin Newsom signs the Body Camera Accountability Act this summer. The Somerville City Council in Massachusetts banned facial recognition just this week.
While these cities are turning their backs on the technology, others are welcoming it. The Orlando Police Department is testing Amazon’s ‘Rekognition’ software, although not in public or for investigative use. According to the New York Times, Detroit signed a $1 million deal to set up facial recognition in the city’s surveillance cameras, with broad rules on how law enforcement can use the footage.
Bill Johnson, executive director of the National Association of Police Organizations, compares facial-recognition software to when law enforcement started using DNA testing as an investigative tool.
“There was a lot of concern then as well about ‘the government is going to have my DNA, they’re going to know all about me, how is this going to be used?’ and I think today those fears have generally gone by the wayside,” Johnson said.
Johnson says facial recognition can be utilized in the same way DNA testing is — to confirm identities, solve crimes and exonerate the innocent.
“I don’t think it’s providing the government any more identification (information) than they already have,” said Johnson. “I think if people are concerned about surveillance, they’ve kind of missed the boat.”
Facial recognition is being used by more than law enforcement. A New York school district will be the first in the country to deploy the software for added security, including to identify potential school shooters and sex offenders.
“AI is so powerful, it is going to be either the greatest or the worst thing we as a human species has ever developed, and using such a powerful tool, I think the developers of that should have a mechanism of objective, interested people they can rely on to say, ‘Just because we can do this, should we?’” Bueermann said.