
Skin Color, Gender Confound Face Recognition Software, Adding to Controversy Over Federal Use

As the Transportation Security Administration and Customs and Border Protection broaden their use of facial recognition to identify travelers leaving the United States through airports and border crossings, new research indicates the technology is less reliable at recognizing women and people with darker skin than light-skinned men, Wired reported Feb. 6.

Study results reported in September 2017 by Joy Buolamwini, a researcher at the MIT Media Lab, showed that three commercial facial recognition systems, one each from IBM, Microsoft and Face++, were all more likely to misidentify the gender of darker-skinned women than of lighter-skinned men. Microsoft’s error rate for darker-skinned women was 21 percent, while the rates for IBM and Face++ were almost 35 percent, The New York Times reported Feb. 9. For light-skinned men, all three had error rates below 1 percent. All three performed better for lighter-skinned than darker-skinned people.
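The comparison in the study hinges on reporting error rates separately for each demographic subgroup rather than a single aggregate accuracy figure. The sketch below is a hypothetical illustration, not the study's actual code: the group names and record fields are assumptions made for the example.

```python
# Illustrative only: compute gender-classification error rates broken out by
# demographic subgroup, the kind of disaggregated reporting the study calls for.
from collections import defaultdict

def disaggregated_error_rates(records):
    """records: iterable of dicts with 'group', 'true_gender', 'predicted_gender' keys."""
    errors = defaultdict(int)
    totals = defaultdict(int)
    for r in records:
        totals[r["group"]] += 1
        if r["predicted_gender"] != r["true_gender"]:
            errors[r["group"]] += 1
    # Per-group error rate = misclassified images / total images in that group.
    return {group: errors[group] / totals[group] for group in totals}

# Example with made-up predictions (not data from the study):
sample = [
    {"group": "darker_female", "true_gender": "F", "predicted_gender": "M"},
    {"group": "darker_female", "true_gender": "F", "predicted_gender": "F"},
    {"group": "lighter_male",  "true_gender": "M", "predicted_gender": "M"},
    {"group": "lighter_male",  "true_gender": "M", "predicted_gender": "M"},
]
print(disaggregated_error_rates(sample))
# {'darker_female': 0.5, 'lighter_male': 0.0}
```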

Given the rapid expansion in the use of face recognition, Buolamwini and co-author Timnit Gebru, in a recent paper based on the study, recommended rigorous reporting of performance metrics for face recognition and increased transparency and accountability in artificial intelligence.

At least 117 million Americans are included in law enforcement face recognition networks, according to an October 2016 study by the Center on Privacy & Technology at Georgetown Law. A year of research covering 100 police departments found that African-Americans are more likely than people of other ethnicities to be stopped by law enforcement and to be subjected to face recognition searches.

Homeland Security Department agencies have been testing face-scanning technology on departing passengers at airports in Boston, Atlanta, Chicago, Las Vegas, Miami, New York City, Houston, and Washington, D.C., according to a December 2017 study by the Georgetown center. The biometric exit program “stands on shaky legal ground,” according to the study, because Congress “has never clearly authorized the border collection of biometrics from U.S. citizens using face recognition technology.” DHS also has failed to conduct the public rulemaking process required before starting such a program, according to the center.

CBP hopes to extend the use of face recognition, including for U.S. citizens, throughout airports and to airlines in what it calls “The Biometric Pathway,” The Verge reported in May 2017.

“As soon as you check in for arrivals or departure, we’re going to stage your photo in that database,” John Wagner, deputy assistant commissioner at CBP, said at the May 2017 ConnectID conference in Washington, D.C., The Verge reported. “We want to make it available for every transaction in the airport where you have to show an ID today.”

Currently, photographs of citizens captured in the face recognition pilot tests are discarded. Wagner did not rule out keeping them in the future. “If we find a need to keep that, we’ll work through the privacy approvals to be able to do so,” Wagner said, according to the report.

Buolamwini noted that the databases of facial images used to “train” recognition systems account for some of the misidentification problems they display. Those databases “have already been documented to contain significant bias,” she wrote, even when collectors, such as the federal Intelligence Advanced Research Projects Activity (IARPA), strove for geographic diversity. The IARPA database, for example, overrepresented white males and underrepresented darker-skinned people, especially women. In response, Buolamwini built a new database, the Pilot Parliaments Benchmark, from photos of 1,270 members of parliament in three African and three European countries, and used it to test the three commercial systems.
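The imbalance Buolamwini describes can be made visible by simply tallying how a dataset’s images are distributed across skin-type and gender subgroups. The following sketch is only an illustration of that kind of audit; the metadata fields and numbers are assumptions, not the Pilot Parliaments Benchmark’s actual schema or composition.

```python
# Illustrative only: tally what share of a face dataset each subgroup represents,
# to surface the kind of skew described in the article.
from collections import Counter

def subgroup_shares(metadata):
    """metadata: iterable of (skin_type, gender) tuples, one per image."""
    counts = Counter(metadata)
    total = sum(counts.values())
    # Share of the dataset belonging to each (skin_type, gender) subgroup.
    return {group: count / total for group, count in counts.items()}

# Made-up example of a heavily skewed dataset (100 images):
images = [("lighter", "male")] * 70 + [("lighter", "female")] * 15 + \
         [("darker", "male")] * 10 + [("darker", "female")] * 5
print(subgroup_shares(images))
# {('lighter', 'male'): 0.7, ('lighter', 'female'): 0.15,
#  ('darker', 'male'): 0.1, ('darker', 'female'): 0.05}
```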

When Buolamwini shared her results with the companies, IBM said it had improved its software and is issuing a new system this month promising a tenfold increase in accuracy on darker-skinned women, The New York Times reported. Microsoft said it had “already taken steps to improve the accuracy of our facial recognition technology” and is investing “to recognize, understand and remove bias.” Megvii, the Chinese company that makes Face++, did not respond. Buolamwini has founded an organization, the Algorithmic Justice League, to fight bias in machine learning.

Homeland Security Today
The Government Technology & Services Coalition's Homeland Security Today (HSToday) is the premier news and information resource for the homeland security community, dedicated to elevating the discussions and insights that can support a safe and secure nation. A non-profit magazine and media platform, HSToday provides readers with the whole story, placing facts and comments in context to inform debate and drive realistic solutions to some of the nation’s most vexing security challenges.
