Customs and Border Protection’s Office of Field Operations launched a website this month intended to foster transparency with stakeholders and the public about its biometric entry/exit program.
The website details where facial comparison technology is currently in use and provides information on how to request alternative screening.
“Long before the COVID-19 crisis emerged, we have steadfastly worked to implement a biometric entry-exit process to expedite safe, secure travel. Biometric Facial Comparison provides a touchless, hands-free verification process that reduces the sharing of documents and spread of germs, protecting not only travelers, but airline and airport employees as well as our own Officers. You can learn more about the safety and health benefits of Biometric Facial Comparison,” OFO said in a message on the website. “As CBP continues to proudly protect our nation, we look forward to welcoming everyone back to travel when the time is right.”
Promoting facial recognition technology as a “better public health option,” the site explains the process: A traveler has his or her photo taken at the point of boarding or arrival where a passport would normally be presented for inspection. That photo is compared against an existing passport or visa photo. The CBP officer interviews the traveler “to validate results, establish the purpose and intent of travel, and determine admissibility,” and then “all traveler photos of U.S. citizens are deleted and no photos are ever shared with industry partners.”
Facial Comparison Technology for Entry is deployed at 18 airports (20 terminals total), including four Preclearance locations. Biometric Exit is deployed at 20 airports (21 terminals). Seven seaports use the technology for cruise line passengers.
Along land borders, biometric technology for people crossing in vehicles is available at the Veterans crossing in Brownsville, Texas, and the Peace Bridge in Buffalo, N.Y. Biometric Facial Comparison for pedestrians is available at Brownsville, El Paso, Nogales, Laredo, and San Luis, Ariz.
The website includes the privacy disclosure signage that travelers see when using one of the facial recognition entry/exit points.
Earlier this year, John Wagner, the Office of Field Operations deputy executive assistant commissioner who recently retired from CBP and was instrumental in advancing the agency’s use of biometrics, told Congress that CBP was moving its facial recognition technology to a new algorithm and maintained that software used by the Department of Homeland Security does not reflect racial or gender bias.
“Since CBP is using an algorithm from one of the highest-performing vendors identified in the report, we are confident that our results are corroborated with the findings of this report,” Wagner told the House Homeland Security Committee at a hearing on the National Institute of Standards and Technology’s recent report studying facial recognition hits and misses. “More specifically, the report indicates … the highest performing algorithms had minimal to undetectable levels of demographic-based error rates.”
Tests in the December report, Face Recognition Vendor Test (FRVT) Part 3: Demographic Effects, showed a wide range in accuracy across developers, with the most accurate algorithms producing far fewer errors. In one-to-one matching, NIST reported higher rates of false positives for Asian and African-American faces, notably African-American females, compared to Caucasians; elderly and young subjects were also more prone to false positives. U.S.-developed algorithms yielded high rates of false positives in one-to-one matching for Asians, African-Americans and native groups, while some algorithms developed in Asian countries showed “no such dramatic difference in false positives” in comparing Asian and Caucasian faces.
NIST tested 189 face recognition algorithms from 99 developers using four collections of photographs with 18.27 million images of 8.49 million people, using images provided by the State Department, DHS and FBI.
“The report also highlights some of the operational variables that impact error rates: gallery size, photo age, photo quality, numbers of photos of each subject in the gallery, camera quality, lighting, and human behavior factors all influence the accuracy of an algorithm,” Wagner noted to the committee. “That is why CBP has carefully constructed the operational variables in the deployment of the technology to ensure we can attain the highest levels of match rates, which remain in the 97 percent to 98 percent range.”
Wagner noted that NIST “did not test the specific CBP operational construct to measure the additional impact these variables may have, which is why we have recently entered into an MOU with NIST to evaluate our specific data.”
Wagner said that CBP had met multiple times with “representatives of the privacy advocacy community as well as discussions with the Privacy and Civil Liberties Oversight Board and the DHS Privacy and Integrity Advisory Committee.”