Sunday, September 26, 2021

GAO Finds Cargo Screening Weaknesses as TSA Plans Covert Testing at Foreign Airports

The threat of explosives reaching the United States via air cargo is significant. Terrorists have previously attempted to use air cargo shipments to smuggle concealed explosives into the United States, and are likely to try again. The Transportation Security Administration (TSA) therefore requires air carriers to screen cargo by x-ray or other means before it enters the U.S. But the Government Accountability Office (GAO) has found weaknesses in TSA’s analysis of imaging technology.

TSA qualifies explosives detection systems for use in screening air cargo through the Air Cargo Screening Qualification Test process. Once a system is certified at DHS’s Transportation Security Laboratory against TSA threat detection standards, it can enter the air cargo screening qualification process. Following initial certification, the vendor submits a written proposal, known as a qualification data package, containing information about how the system meets TSA’s requirements for air cargo screening. After TSA reviews and accepts the qualification data package, TSA conducts two additional rounds of testing: factory testing and a field assessment. Factory testing consists of TSA’s review of vendor-provided data and independent audits of the system at the vendor’s factory to verify it meets requirements. After successful factory testing, the system is “approved” by TSA. An “approved” system then undergoes a field assessment where the system must meet all requirements in an air cargo screening environment. Following successful completion of the field assessment, the system is “qualified” by TSA for use.

From January 2020 through April 2021, TSA conducted a field assessment on the use of a computed tomography (CT)-based explosives detection system to screen air cargo as part of its ongoing process to qualify the system for use by air carriers. This type of system produces images of parcels that are examined by computer for signs of explosives. However, GAO found that TSA’s assessment did not fully meet three of five key design and evaluation practices. According to GAO, TSA incorporated the key practices of clarifying program goals and developing evaluation questions but only partially incorporated the other key practices of selecting an appropriate design approach, collecting all relevant data, and analyzing that data in a way that allows it to draw valid conclusions from the CT field assessment.

Since TSA officials cannot use live explosives in the field to measure the probability of detection, they rely on image quality testing, using a manufacturer’s test kit to compare system performance in the field with earlier tests performed in a laboratory with live explosives. However, GAO found that TSA did not validate that the test kit was an acceptable alternative test method for determining the CT system’s probability of detection in the field.

The watchdog noted in its July 29 report that “TSA did not independently validate that the test kit captures all ways system performance could degrade or collect any of the underlying quantitative data from the test kit”. TSA officials told GAO they did not validate the test kit because its performance was certified during laboratory testing at DHS’s Transportation Security Laboratory; however, officials from the Transportation Security Laboratory told GAO they do not certify the performance of test kits. 

Without a suitable alternative testing approach to determine the probability of detection, GAO is concerned that TSA will not have all relevant data to assess whether the CT system meets TSA’s detection standard requirements in the field and should be qualified for use by air carriers.

It is worth noting that in January 2021 TSA began efforts to expand its performance measurement beyond simply gauging air carriers’ compliance with air cargo security requirements. These efforts could yield additional measures of how effectively the various requirements prevent threats to air cargo security. Officials told GAO that TSA is working toward using air cargo covert testing at foreign airports as a potential proxy outcome measure for assessing the effectiveness of security requirements in preventing threats on flights to the United States.

GAO also considered U.S. Customs and Border Protection’s (CBP) role in securing air cargo. CBP shares information with air carriers through its Air Cargo Advanced Screening program (ACAS), which requires air carriers to submit shipment data on U.S.-bound air cargo prior to departure from last point-of-departure airports. ACAS serves as an information-sharing mechanism because it allows CBP to communicate potential cargo security concerns directly to air carriers. The watchdog noted that “TSA and CBP have separate procedures for assessing risk for inbound air cargo and share information to inform their respective risk assessments; however, they do not have a documented process that ensures the full exchange of relevant risk data”. Officials from both agencies told GAO that they share relevant threat information, but also stated that trend data from CBP’s ACAS, while provided to TSA’s Intelligence and Analysis office, is not provided directly to TSA’s International Risk Branch, which is responsible for assessing air cargo security risk. A senior TSA International Risk Branch official said they would benefit from receiving ACAS trend data as a potential input to the agency’s air cargo risk assessment.

GAO’s report makes four recommendations:

  • TSA and CBP should establish a documented process to ensure that relevant officials from both agencies are aware of and have access to applicable data to inform their inbound air cargo risk assessment efforts. 
  • Prior to designating the explosives detection system for air cargo screening currently under evaluation as “qualified” on the air cargo screening technology list, TSA should verify through additional data collection or analysis that the system’s probability of detection in the field matches the performance measured in laboratory testing.
  • TSA should ensure that necessary data are collected during field assessments to independently verify that the probability of detection of explosives detection systems for air cargo screening in the field matches the performance measured in laboratory testing, prior to designating systems as “qualified” on the air cargo screening technology list. GAO says TSA could provide this verification either through live explosives testing or, when operational considerations limit TSA’s ability to use live threat materials, TSA should use an independently validated, fully documented alternative testing strategy. 
  • Finally, TSA should ensure statistical techniques are used to analyze data from TSA field assessments, including data from the current field assessment, of explosives detection systems for air cargo screening, prior to designating systems as “qualified” on the air cargo screening technology list. This statistical analysis should include calculating error values for each quantitative measurement, identifying all necessary performance thresholds, and comparing the measured values and errors against each threshold to determine the statistical confidence of the results. 
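To make the fourth recommendation concrete: GAO is asking TSA to decide whether a system meets a detection threshold with statistical confidence, not just whether the measured value happens to exceed it. The sketch below illustrates one standard way to do that, using a Wilson score confidence interval on a binomial proportion. The numbers (188 detections in 200 runs, a 0.90 threshold) are hypothetical and not drawn from the GAO report; they simply show how a point estimate can exceed a threshold while the confidence interval does not.

```python
import math

def wilson_interval(successes: int, trials: int, z: float = 1.96) -> tuple:
    """95% Wilson score confidence interval for a binomial proportion,
    e.g. a probability of detection estimated from pass/fail test runs."""
    if trials <= 0:
        raise ValueError("trials must be positive")
    p = successes / trials
    denom = 1 + z**2 / trials
    center = (p + z**2 / (2 * trials)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / trials + z**2 / (4 * trials**2))
    return center - half, center + half

# Hypothetical field-assessment data: 188 detections in 200 test runs.
low, high = wilson_interval(188, 200)
threshold = 0.90  # hypothetical detection-standard threshold

# The system meets the threshold with statistical confidence only if the
# entire interval lies above it, not merely the point estimate.
meets = low >= threshold
print(f"PD estimate: {188/200:.3f}, 95% CI: [{low:.3f}, {high:.3f}], "
      f"meets threshold: {meets}")
```

Here the point estimate (0.94) is above the 0.90 threshold, but the lower confidence bound is not, so the result is statistically inconclusive; this is exactly the kind of distinction GAO says error values and thresholds are needed to draw.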

The Department of Homeland Security (DHS) agreed with each recommendation. It stated that in January 2021, TSA and CBP officials discussed possible data sources necessary to support improved risk analysis. They agreed to conduct a formal discussion to identify next steps, such as conducting a risk information exchange between TSA and CBP, identifying variables of interest and defining how best to exchange relevant information, and drafting a Memorandum of Understanding to guide the future exchange of relevant information. TSA and CBP estimated they would complete these tasks by February 28, 2022. 

For the second and third recommendations, TSA argued that its actions over the course of the field assessment to verify probability of detection with image quality techniques—specifically the use of a manufacturer-developed operational test kit during its field assessment—are sufficient to ensure that probability of detection in the field matches system performance measured in the laboratory. GAO countered that TSA’s use of an operational test kit—without independent validation—is “not an acceptable alternative to test the system’s probability of detection in the field”, adding that “while there is general industry acceptance of image quality testing for CT systems, to rely on a manufacturer-developed operational test kit to evaluate probability of detection in the field, TSA must independently validate that the operational test kit’s results correlate with the system’s ability to detect threats (as measured in laboratory testing) and accurately capture ways in which the system’s performance could degrade”.

To address the fourth recommendation, TSA stated it has plans to conduct statistical analysis on data from its field assessment. However, GAO said it had been unable to substantiate that TSA’s planned analysis will include all necessary statistical elements, as identified in its recommendation. The watchdog said that during the course of the review, TSA officials informed GAO they would include statistical analysis in their final field assessment report. “However, despite multiple requests, TSA has yet to provide documentation—such as a data analysis plan—of its plans to evaluate metrics using a statistical analysis that would include all the elements identified in our recommendation.”

TSA also stated that it did not set performance thresholds for some field assessment metrics because it did not need them to assess CT system performance in the field. However, GAO noted that variations in false alarm rates, such as those that TSA observed in its field assessment, can indicate changes to the system’s core performance. The watchdog therefore determined that establishing performance thresholds for these metrics would not be an “arbitrary constraint,” as TSA contends in its written comments, but would further ensure the CT system continues to meet key requirements when operating in a field environment. GAO also pointed out that TSA does evaluate these metrics qualitatively against the performance of CT systems used in its passenger checked baggage program, clearly recognizing the necessity of comparing data measured in this field assessment against a performance standard.

This last point illustrates that even though cargo security efforts have improved considerably, they are sometimes still playing catch-up to passenger screening. Given the very clear intent of those who continue to pursue air cargo as an attack vector, it is vital that cargo screening does not again become a weak link in the aviation security chain.

GAO’s report comes as TSA implements new International Civil Aviation Organization security standards for air cargo. The new standards, effective from July 1, require the U.S. and other countries to screen 100% of cargo before it is loaded on freighter aircraft, as they are currently required to do with belly freight. The increase in screening has caused backlogs and carriers have reported confusion surrounding the new documentation requirements.

Read the full report at GAO

Kylie Bielby
Kylie Bielby has more than 20 years' experience in reporting and editing a wide range of security topics, covering geopolitical and policy analysis to international and country-specific trends and events. Before joining GTSC's Homeland Security Today staff, she was an editor and contributor for Jane's, and a columnist and managing editor for security and counter-terror publications.
