The Government Accountability Office (GAO) was asked to conduct a technology assessment on the use of forensic algorithms in federal law enforcement. Forensic algorithms help forensic experts partially automate the process of assessing whether evidence collected in an investigation may have originated from a particular individual, potentially increasing the speed of investigations and reducing human bias and error.
GAO is conducting its assessment in two phases. The first phase describes algorithms being used by federal law enforcement agencies and how these technologies work. The second phase will assess the approaches and challenges related to how federal law enforcement agencies apply these technologies and will identify policy options for addressing these challenges going forward.
GAO reported on May 12 that federal law enforcement agencies primarily use three types of forensic algorithms to help assess whether evidence collected in a criminal investigation may have originated from an individual: probabilistic genotyping software (PGS), latent print (fingerprint and palm print) analysis, and face recognition. To a lesser extent, agencies also use algorithms to compare iris images, speech, and handwriting.
Each type of algorithm uses different characteristics in its assessment. For example, probabilistic genotyping uses statistics to analyze biological samples found during a criminal investigation, assisting in comparisons to a known DNA sample taken from a suspect or to DNA profiles in a database of known persons. The Federal Bureau of Investigation (FBI) currently uses probabilistic genotyping and latent fingerprint algorithms to help assess whether evidence collected in a criminal investigation may have originated from an individual, and face recognition to generate investigative leads. The National Institute of Standards and Technology (NIST) and other organizations have developed standards to facilitate the transmission of data between agencies.
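The statistical comparison at the heart of probabilistic genotyping can be illustrated with a deliberately simplified likelihood-ratio calculation. The sketch below is a toy model for a clean, single-source sample at one genetic locus; actual PGS tools used by forensic laboratories model DNA mixtures, allele dropout, and stutter with far more sophisticated statistics, and the allele frequencies shown are hypothetical.

```python
# Toy likelihood-ratio (LR) calculation in the spirit of probabilistic
# genotyping. Illustrative only -- NOT any agency's actual software.

def single_locus_lr(evidence_alleles, suspect_alleles, allele_freqs):
    """Likelihood ratio at one locus for a single-source sample.

    H1: the suspect is the source, so the evidence must match the suspect.
    H2: an unrelated random person is the source, so the probability is the
        genotype frequency under Hardy-Weinberg equilibrium.
    """
    if set(evidence_alleles) != set(suspect_alleles):
        return 0.0  # exclusion: the evidence cannot have come from the suspect
    a, b = evidence_alleles
    if a == b:
        random_match = allele_freqs[a] ** 2            # homozygous: p^2
    else:
        random_match = 2 * allele_freqs[a] * allele_freqs[b]  # heterozygous: 2pq
    return 1.0 / random_match  # P(E|H1) = 1 for a clean match

# Hypothetical allele frequencies at one short tandem repeat (STR) locus
freqs = {"12": 0.10, "13": 0.25}
print(single_locus_lr(("12", "13"), ("12", "13"), freqs))  # 1 / (2*0.10*0.25) = 20.0
```

A likelihood ratio of 20 would mean the evidence is 20 times more probable if the suspect is the source than if an unrelated person is; real casework multiplies such ratios across many loci.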
GAO found four standards that agencies use to facilitate the transmission of data between agencies for examination by PGS, latent print, and face recognition algorithms: one international standard, one U.S. standard, and two agency-specific standards.
The international standard was developed jointly by the International Organization for Standardization (ISO) and the International Electrotechnical Commission (IEC) to enable interoperability and data interchange among biometric applications and systems. It includes guidance for fingerprints, facial images, and DNA data (used for PGS).
NIST developed the U.S. standard, which covers prints, facial images, and DNA data. The American National Standards Institute (ANSI)/NIST standard specifies a common format that federal agencies can use to exchange data across jurisdictional lines or between dissimilar systems made by different manufacturers. According to NIST officials, it was developed with criminal justice applications in mind.
The FBI developed a standard for electronically encoding and transmitting biometric image, identification, and arrest data known as the Electronic Biometric Transmission Specification (EBTS). This standard, based on the ANSI/NIST-ITL 1-2011, Update: 2015 standard, applies to the FBI’s database of biometric and criminal history information, the Next Generation Identification (NGI) System, and helps ensure that the data format for prints and facial images matches that of the NGI System. Similarly, the Department of Defense (DOD) developed its own EBTS, also based on the ANSI/NIST-ITL 1-2011, Update: 2015 standard, to interface with DOD’s biometric database.
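The common format these specifications build on is a tagged-field layout: each data element is labeled with a record type and field number so that dissimilar systems can parse one another's transmissions. The sketch below loosely follows the published pattern of ANSI/NIST-ITL traditional encoding (fields rendered as `record.field:value`, terminated by an ASCII group-separator character); it is illustrative only, omits mandatory elements such as the computed record length, and is not a conforming encoder.

```python
# Loose sketch of the tagged-field idea behind ANSI/NIST-ITL traditional
# encoding, on which the FBI and DOD EBTS are based. Illustrative only:
# not a conforming implementation of either specification.

GS = "\x1d"  # ASCII group separator, used here to terminate each field

def encode_fields(record_type, fields):
    """Render an ordered mapping of {field_number: value} as tagged fields."""
    return GS.join(f"{record_type}.{num:03d}:{val}" for num, val in fields.items())

# Hypothetical Type-1 (transaction header) fields; values are examples only
type1 = encode_fields(1, {
    2: "0500",       # 1.002 version number
    4: "CAR",        # 1.004 type of transaction
    8: "ORI123456",  # 1.008 originating agency identifier (made-up value)
})
print(type1)  # 1.002:0500<GS>1.004:CAR<GS>1.008:ORI123456
```

Because every field is self-labeling, a receiving system built by a different manufacturer can locate the fields it understands and skip the rest, which is what lets one format serve exchanges across jurisdictional lines.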
For its review, GAO obtained information from NIST, the Department of Justice, the Department of Homeland Security, and the Department of Defense; convened an interdisciplinary panel of 16 experts with assistance from the National Academies of Sciences, Engineering, and Medicine; conducted interviews with additional stakeholders, including nonprofit groups and legal experts; and conducted literature searches and reviewed relevant literature and case law.