Facial recognition technology is used in a variety of public settings, often without citizens' knowledge, but some U.S. lawmakers say that the technology should not be deployed freely until security, privacy, and accuracy concerns can be mitigated and civil liberties guaranteed.
The House Committee on Oversight and Reform held a hearing on the use of facial recognition (FR) technology on Wednesday, the third in a three-part series. The hearings are an effort to understand how private companies and public agencies are using the technology so they can be held accountable to ethical standards.
The use of facial recognition technology is increasing. It can be found in home security systems, on social media sites, in sports arenas, and elsewhere, where it is used for advertising, security, access control, photo and video identification, and accessibility.
The National Institute of Standards and Technology (NIST) issued a report in December analyzing facial recognition systems from private companies. The report found that "across demographics, false positive rates often vary by factors of 10 to beyond 100 times," and that people of African and Asian descent were more often misidentified.
Rep. Eleanor Holmes Norton (D-D.C.) expressed concern that consumers are unaware of the security issues that facial recognition on their cellphones poses. She asked the panel whether there were any means by which consumers could confirm whether cellphone manufacturers are storing their biometric or other data on company servers.
Meredith Whittaker, co-founder of the AI Now Institute at New York University, said that this technology "is hidden behind trade secrecy." She added: "This is a corporate technology that is not open for scrutiny and auditing by external experts. I think it's notable that while NIST reviewed 189 algorithms for their latest report, Amazon refused to submit their Rekognition algorithm to NIST, and they claimed they couldn't modify it to meet NIST standards."
Whittaker expressed suspicion about the multibillion-dollar company's non-compliance with the NIST research and pointed to its global reach and innovations. She said whatever the reason for not disclosing information about its facial recognition technology, "we have to trust these companies, but we don't have many options to say no or to scrutinize the claims they make."
Rep. Brenda Lawrence (D-Mich.) introduced H.R. 153, which addresses the need to develop guidelines for the ethical and transparent development of AI systems, and the implications of those systems. Lawrence said that currently there are no checks on how and when the technology is used, and what companies are doing with the data.
“Right now, we have the wild, wild west when it comes to AI,” she said.
Lawrence's bill reflects the fact that artificial intelligence isn't the only emerging technology that requires ethical guidelines; the same concerns exist for facial recognition technology.
The congresswoman represents a district in Michigan in which 67 percent of residents come from minority ethnic backgrounds, and she is concerned about the findings of the NIST report, which confirmed that people of African and Asian descent are more often misidentified by facial recognition algorithms.
Lawrence said: “We in America have the right to know if we’re under surveillance and what are you doing with it. Any release of data that you [are] gathering should be required to go through some type of process for the release of that.”
Other lawmakers echoed this concern and asked the experts what could or should be done to regulate the industry and ensure citizens’ civil liberties.
“I think we need to pause the technology and let the rest of it catch up so that we don’t allow corporate interests and corporate technology to race ahead to be built into our core infrastructure without having put the safeguards in place,” said Whittaker.
Congressman Jim Jordan (R-Ohio) said this technology, left unchecked, could be a threat to Americans' fundamental civil rights. "You said this facial recognition poses an existential threat to democracy and liberty. My main concern is how [a] government may use this to harm our First Amendment and Fourth Amendment liberty."
Rep. Rashida Tlaib (D-Mich.) said she is disturbed that in her district facial recognition is being used in low-income government housing facilities.
“I don’t think being poor or being working class means somehow that you deserve less civil liberties or less privacy,” Tlaib said.
The hearing made clear that the public has little substantive knowledge about the use of this technology or its accuracy, and that the NIST report is only one piece of the puzzle.
“We don’t have a way to audit whether NIST’s results in the laboratory represent the performance in different contexts, like amusement parks or stadiums, or wherever else so there’s a big gap in the auditing standards,” said Whittaker. “Although the audits we have right now have shown extremely concerning results.”
Lawmakers and experts agreed that communities should be told when this technology is being used, educated about the harm it can do, and given a say in where it is deployed.