This is the accessible text file for GAO report number GAO-14-357 entitled 'Advanced Imaging Technology: TSA Needs Additional Information before Procuring Next-Generation Systems' which was released on April 30, 2014.

This text file was formatted by the U.S. Government Accountability Office (GAO) to be accessible to users with visual impairments, as part of a longer term project to improve GAO products' accessibility. Every attempt has been made to maintain the structural and data integrity of the original printed product. Accessibility features, such as text descriptions of tables, consecutively numbered footnotes placed at the end of the file, and the text of agency comment letters, are provided but may not exactly duplicate the presentation or format of the printed version. The portable document format (PDF) file is an exact electronic replica of the printed version.

We welcome your feedback. Please E-mail your comments regarding the contents or accessibility features of this document to Webmaster@gao.gov.

This is a work of the U.S. government and is not subject to copyright protection in the United States. It may be reproduced and distributed in its entirety without further permission from GAO. Because this work may contain copyrighted images or other material, permission from the copyright holder may be necessary if you wish to reproduce this material separately.

United States Government Accountability Office:
GAO:

Report to Congressional Requesters:

March 2014:

Advanced Imaging Technology: TSA Needs Additional Information before Procuring Next-Generation Systems:

GAO-14-357:

GAO Highlights:

Highlights of GAO-14-357, a report to congressional requesters.

Why GAO Did This Study:

TSA accelerated the deployment of AIT systems, or full-body scanners, in response to the December 25, 2009, attempted terrorist attack on Northwest Airlines Flight 253. Pursuant to the Federal Aviation Administration Modernization and Reform Act of 2012, TSA was mandated to ensure that AIT systems were equipped with ATR software, which displays generic outlines of passengers rather than actual images, by June 1, 2013. All deployed AIT systems were equipped with ATR software by the deadline.

GAO was asked to evaluate TSA's AIT-ATR systems' effectiveness. This report addresses the extent to which (1) TSA collects and analyzes available information that could be used to enhance the effectiveness of the AIT-ATR system and (2) TSA has made progress toward enhancing AIT capabilities to detect concealed explosives and other threat items, and any challenges that remain. GAO analyzed testing results conducted by the Transportation Security Laboratory and TSA personnel at airports and interviewed DHS and TSA officials.

This is a public version of a classified report that GAO issued in December 2013. Information DHS and TSA deemed classified or sensitive has been omitted, including information and recommendations related to improving AIT capabilities.

What GAO Found:

The Department of Homeland Security's (DHS) Transportation Security Administration (TSA) does not collect or analyze available information that could be used to enhance the effectiveness of the advanced imaging technology (AIT) with automated target recognition (ATR) system.
Specifically, because TSA does not enforce compliance with its operational directive, it does not collect or analyze available data on drills using improvised explosive devices (IED) at the checkpoint, data that could provide insight into how well screening officers (SO) resolve anomalies identified by AIT systems, including objects that could pose a threat to an aircraft. TSA's operational directive requires personnel at airports to conduct drills to assess SO compliance with TSA's screening standard operating procedures and to train SOs to better resolve anomalies identified by AIT-ATR systems. GAO found that TSA personnel at about half of airports with AIT systems did not report any IED checkpoint drill results on those systems from March 2011 through February 2013. According to TSA, it does not ensure compliance with the directive at every airport because it is unclear which office should oversee enforcing the directive. Without data on IED checkpoint drills, TSA lacks insight into how well SOs resolve anomalies detected by AIT systems, information that could be used to help strengthen existing screening processes. By clarifying which office is responsible for overseeing TSA's operational directive, directing that office to ensure enforcement of the directive in conducting these drills, and analyzing the resulting data, TSA could identify potential weaknesses in the screening process.

Further, when determining AIT-ATR system effectiveness, TSA uses laboratory test results that do not reflect the combined performance of the technology, the personnel who operate it, and the process that governs AIT-related security operations. TSA officials agreed that it is important to analyze performance by including an evaluation of the technology, operators, and processes and stated that TSA is planning to assess the performance of all layers of security. By not measuring system effectiveness based on the performance of the technology and the SOs who operate it, or taking into account current processes and deployment strategies, DHS and TSA are not ensuring that future procurements meet mission needs.

TSA completed the installation of ATR software upgrades intended to address privacy concerns for all deployed AIT systems; however, it has not met proposed milestones for enhancing capabilities as documented in its AIT roadmap, a document that contains milestones for achieving enhanced capabilities to meet the agency's mission needs. For example, TSA began operational test and evaluation for Tier II upgrades 17 months after the expected start date. Moreover, TSA did not use available scientific research or information from experts at the national laboratories or vendors on the technological challenges that it faces in developing requirements and milestones; according to TSA, it instead relied on time frames proposed by vendors. TSA could better ensure that its roadmap reflects the true capabilities of the next generation of AIT systems by using scientific evidence and information from DHS's Science and Technology Directorate, the national laboratories, and vendors to develop a realistic schedule with achievable milestones that outlines the technological advancements, estimated time, and resources needed to achieve the enhanced capabilities outlined in the roadmap.

What GAO Recommends:

GAO recommends that TSA, among other things, clarify which office should oversee its operational directive, better measure system effectiveness, and develop a realistic schedule before procuring future generations of AIT systems.
TSA concurred with GAO's recommendations.

View [hyperlink, http://www.gao.gov/products/GAO-14-357]. For more information, contact Stephen M. Lord at (202) 512-4379 or lords@gao.gov.

[End of section]

Contents:

Letter:
Background:
TSA Does Not Collect or Analyze Data That Could Enhance System Performance:
TSA Has Enhanced Passenger Privacy, but Could Take Additional Steps to Enhance AIT Capabilities:
Conclusions:
Recommendations for Executive Action:
Agency Comments and Our Evaluation:

Appendix I: Transportation Security Administration (TSA) Has Taken Action toward Addressing Advanced Imaging Technology Utilization and Planning Challenges Identified in our January 2012 Report:
Appendix II: Objectives, Scope, and Methodology:
Appendix III: Comments from the Department of Homeland Security:
Appendix IV: GAO Contact and Staff Acknowledgments:

Figures:

Figure 1: Passenger Screening Process Using Advanced Imaging Technology with Automated Target Recognition (AIT-ATR) Systems and Outcomes:
Figure 2: Advanced Imaging Technology with Automated Target Recognition (AIT-ATR) Roadmap for Completing Tier II Upgrades:
Figure 3: Transportation Security Administration (TSA) Advanced Imaging Technology (AIT) Roadmap Milestones for Testing and Acquiring AIT-2 Systems:

Abbreviations:

AIT: Advanced Imaging Technology:
AIT-ATR: AIT systems equipped with ATR:
AIT-IO: AIT systems with image operators:
ATR: Automated Target Recognition:
ASAP: Aviation Screening Assessment Program:
BMI: Body Mass Index:
DHS: Department of Homeland Security:
DHS OIG: DHS Office of Inspector General:
GPRA: Government Performance and Results Act:
IED: Improvised Explosive Device:
SO: Screening Officers:
SOP: Standard Operating Procedures:
TSA: Transportation Security Administration:
TSL: Transportation Security Laboratory:
TSO: Transportation Security Officer:

[End of section]

United States Government Accountability Office:
GAO:
441 G St. N.W.
Washington, DC 20548:

March 31, 2014:

Congressional Requesters:

The Transportation Security Administration (TSA), a component of the Department of Homeland Security (DHS), accelerated the deployment of advanced imaging technology (AIT) systems, commonly referred to as full-body scanners, in response to the December 25, 2009, attempted terrorist attack on Northwest Airlines Flight 253. According to TSA officials, AIT systems provide enhanced security benefits compared with those of walk-through metal detectors by identifying nonmetallic threat objects and liquids. In addition, TSA officials stated that AIT systems provide additional deterrence to potential terrorists and enhance screening efficiencies when compared with physical pat-downs.

Following the accelerated deployment of AIT, the public and others raised privacy concerns because AIT systems produced images of passengers' bodies that image operators (IO) analyzed to identify objects or anomalies that could pose a threat to an aircraft or to the traveling public.[Footnote 1] To mitigate those concerns, TSA began installing automated target recognition (ATR) software on deployed AIT systems in July 2011. AIT systems equipped with ATR (AIT-ATR) automatically interpret the image and display anomalies on a generic outline of a passenger instead of displaying images of actual passenger bodies like the AIT systems that used IOs (AIT-IO). Screening officers (SO) use the generic image of a passenger to identify and resolve anomalies on site in the presence of the passenger.
In response to the December 25, 2009, bombing attempt, TSA increased the number of units it originally planned to procure and deploy from 878 to 1,800 AIT systems, but in 2012 it lowered the number of AIT systems it sought to procure to 1,250 as a result of implementing new risk-based screening measures and TSA Pre✓™ lanes.[Footnote 2] For fiscal years 2013 and 2014, TSA did not request additional funding to procure AIT systems. In 2013, TSA planned to further reduce the number of AIT systems to 878 in response to changing screening processes. As of March 2014, TSA had deployed about 740 AIT systems at almost 160 airports, and we estimate that TSA will spend over $3.5 billion in life cycle costs on deployed AIT-ATR systems and future AIT systems.

Pursuant to the Federal Aviation Administration Modernization and Reform Act of 2012, enacted in January 2012, TSA was mandated to ensure that AIT systems used to screen passengers were equipped with ATR software by June 1, 2012.[Footnote 3] Consistent with provisions of the law, TSA subsequently extended this deadline to June 1, 2013.[Footnote 4] According to TSA, all deployed AIT systems had been equipped with ATR software by the June 1, 2013, deadline.

In January 2012, we issued a classified report on TSA's procurement and deployment of AIT-IO systems that, among other things, assessed TSA's adherence to DHS acquisition guidance when procuring those systems. We found that TSA did not follow DHS acquisition guidance when procuring AIT-IO systems and that TSA procured and deployed a technology that met evolving requirements, but not the initial requirements included in its key acquisition document. Further, we reported that TSA did not have plans to require vendors to meet milestones used during the AIT acquisition. As a result, we recommended that TSA develop a roadmap that specifies development milestones for the technology and have DHS acquisition officials approve this roadmap, which would give TSA more assurance that limited taxpayer resources are used effectively.[Footnote 5] We also recommended that TSA make future procurements contingent on meeting those milestones and acknowledge in the roadmap any uncertainty regarding the attainment of those milestones. DHS agreed with these recommendations, and in February 2012, TSA completed a roadmap that contained milestones for achieving enhanced capabilities, which we discuss later in this report.

We also found that TSA had acquired AIT systems that were not used on a regular basis, and thus were not providing a security benefit. Therefore, we recommended that TSA evaluate the use of deployed AIT systems and redeploy systems that were not being extensively used. DHS concurred with this recommendation. Although TSA has taken steps to address our recommendation by developing and implementing mechanisms to better track the use of deployed AIT systems, we found during our most recent review that it has not fully addressed our recommendation because TSA has not ensured that the utilization data it collects are accurate, and, as a result, cannot use these data to inform future deployment decisions. For more information on TSA's efforts to address this recommendation, see appendix I.

As an update to our prior work, you asked us to evaluate TSA's efforts to enhance the effectiveness of AIT systems. Specifically, this report addresses the following questions:

1.
To what extent does TSA collect and analyze available information that could be used to enhance the effectiveness of the AIT-ATR system? 2. To what extent has TSA made progress toward enhancing AIT capabilities to detect concealed explosives and other threat items, and what challenges, if any, remain? To determine the extent to which TSA collects and analyzes available information that could be used to enhance the effectiveness of the entire AIT-ATR system, we analyzed improvised explosive device (IED) checkpoint drills conducted by TSA personnel at airports that submitted data to TSA from March 1, 2011, through February 28, 2013, under TSA's IED checkpoint drill operational directive. TSA's IED checkpoint drill operational directive requires personnel at airports to conduct drills to assess TSO compliance with TSA's screening standard operating procedures (SOP) and to train TSOs to better resolve anomalies identified by AIT-ATR systems.[Footnote 6] Among other things, we evaluated airport compliance with TSA's operational directive and Standards for Internal Control in the Federal Government to determine the extent to which TSA is monitoring compliance with its directive.[Footnote 7] Further, we analyzed laboratory test results of the AIT-ATR system and the AIT-IO system from calendar years 2009 through 2012 conducted by the Transportation Security Laboratory (TSL) to compare both systems' false alarm rates and conducted statistical analysis of those data. [Footnote 8] We visited the TSL in Atlantic City, New Jersey, to interview laboratory scientists responsible for testing and evaluating AIT-ATR systems and reviewed TSL documentation related to laboratory test plans, records, and final reports. Further, we assessed the extent to which laboratory test results demonstrated that the AIT-ATR system met requirements outlined in key acquisition practices established by GAO, because the AIT system is considered a large-scale acquisition program.[Footnote 9] We analyzed the adequacy of laboratory tests by comparing the testing design with generally accepted statistical methods used for data collection and analysis.[Footnote 10] We assessed the reliability of the laboratory and IED checkpoint drill data we used by interviewing officials responsible for capturing and monitoring the data about, among other things, applicable quality control procedures to maintain the integrity of the data, performing statistical tests on the data, and reviewing testing reports and related documentation. We determined these data were sufficiently reliable for the purposes of this report. We also interviewed TSA officials involved in AIT-ATR system deployment, training, and covert testing. We compared the extent to which TSA evaluated the performance of the entire system against key acquisition practices established by GAO, guidelines contained in DHS's Acquisition Directive 102-01, and TSL's Test Management Plan. [Footnote 11] We also visited a nonprobability sample of four U.S. airports to observe AIT-ATR systems and interviewed relevant TSA personnel who operate those systems to obtain their views on system performance.[Footnote 12] The information we obtained from those visits cannot be generalized to other airports, but provided perspectives of various AIT-ATR system users. 
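For illustration only, the kind of false alarm rate comparison described above can be sketched in Python as follows. This is a minimal example: the trial counts are hypothetical placeholders, since the actual laboratory results are sensitive and omitted from this report, and it is not a representation of the specific statistical tests GAO performed.

import math

# Hedged sketch: compare two false alarm rates with a two-proportion z-test.
# All counts are hypothetical placeholders, not Transportation Security
# Laboratory results.
def two_proportion_z(alarms_a, trials_a, alarms_b, trials_b):
    """Return both sample rates and the z statistic for their difference."""
    p_a = alarms_a / trials_a
    p_b = alarms_b / trials_b
    pooled = (alarms_a + alarms_b) / (trials_a + trials_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / trials_a + 1 / trials_b))
    return p_a, p_b, (p_a - p_b) / se

# Hypothetical laboratory trials for an AIT-ATR and an AIT-IO system.
p_atr, p_io, z = two_proportion_z(170, 1000, 120, 1000)
print(f"ATR rate: {p_atr:.1%}; IO rate: {p_io:.1%}; z = {z:.2f}")

Under these assumed counts, a z statistic well above about 2 would indicate that a difference of this size is unlikely to be due to chance at the 95 percent confidence level.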
To determine progress TSA has made and any challenges that remain toward enhancing AIT capabilities to detect concealed explosives and other threat items, we analyzed TSA's original AIT roadmap dated February 2012, as well as the October 2012 revision. To determine the extent to which TSA has met its projected time frames for AIT-ATR system upgrades and AIT-2 development, we reviewed actions taken by TSA testing officials and compared the actual dates for each milestone with the estimated dates documented in TSA's AIT roadmap. To determine the extent to which TSA's AIT roadmap contains fundamental elements of technology roadmaps, we analyzed and compared technology roadmapping guidance developed by the Department of Energy's Sandia National Laboratories with TSA's AIT roadmap.[Footnote 13] We also reviewed technology roadmaps for large-scale acquisition programs developed by various agencies and organizations, such as the Department of Defense, for examples of technology roadmaps that adhered to established guidance and compared these roadmaps with TSA's AIT roadmap. To determine the extent to which the milestones contained in TSA's AIT roadmap are attainable, we interviewed scientists from the Sandia National Laboratories and the Pacific Northwest National Laboratory; a leading AIT vendor; TSA acquisition officials; and a group of 12 experts identified by the National Academy of Sciences to discuss practices used to test the technical performance of threat detection technologies, which include AIT systems, at the developmental stage.[Footnote 14] Our interviews with these experts are illustrative and provide insights about testing best practices. We also reviewed prior GAO reports on (1) major acquisition programs to identify best practices for delivering capabilities within schedule and cost estimates and (2) key practices that can help sustain agency collaboration to leverage each other's resources and obtain additional benefits that would not be available if the agencies were working separately.[Footnote 15] More details on our scope and methodology can be found in appendix II.

This report is a public version of the prior classified report that we provided to you. DHS and TSA deemed portions of the information in that report secret or sensitive security information, which must be protected from public disclosure. Therefore, this report omits a research question and recommendation about the AIT-ATR system's effectiveness at detecting threats and the extent to which AIT-ATR system performance compares with AIT-IO system performance. This report also omits details related to TSA's tiered requirements for AIT systems; the results of our interviews with airport staff; information about SO performance at resolving anomalies identified by the AIT-ATR system; specific testing results depicting the AIT-ATR systems' false alarm rate; specific airport checkpoint drill requirements, including the number of airports that were required to conduct those drills; specific details pertaining to the number of years it would take to provide enhanced capabilities; and deficiencies identified during AIT testing. Although the information provided in this report is more limited in scope, the overall methodology used for both reports is the same.

We conducted this performance audit from September 2012 to March 2014 in accordance with generally accepted government auditing standards.
Those standards require that we plan and perform the audit to obtain sufficient, appropriate evidence to provide a reasonable basis for our findings and conclusions based on our audit objectives. We believe that the evidence obtained provides a reasonable basis for our findings and conclusions based on our audit objectives.

Background:

Roles and Responsibilities:

DHS and TSA share responsibility for the research, development, and deployment of passenger checkpoint screening technologies. The Aviation and Transportation Security Act established TSA as the federal agency with primary responsibility for securing the nation's civil aviation system, which includes the screening of all passengers and property transported to, from, and within the United States by commercial passenger aircraft.[Footnote 16] Additionally, the Homeland Security Act of 2002 established DHS and, within it, the Science and Technology Directorate for, among other things, conducting research, development, demonstration, and testing and evaluation activities relevant to DHS.[Footnote 17] DHS's Science and Technology Directorate is responsible for testing and evaluating aviation security technologies, including AIT systems, at the TSL on behalf of TSA.

Types of Testing Conducted on AIT-ATR Systems:

DHS and TSA conducted five types of tests to evaluate the performance of AIT-ATR systems.

Qualification testing. TSL conducted qualification tests in a laboratory setting to evaluate the technology's capabilities against TSA's procurement specification and detection standard that specified the required detection rate AIT systems must meet in order to qualify for procurement. Qualification tests evaluate the technology's detection of threat items that are not artfully concealed as they are in covert tests, but do not test the entire system, including the SO's interpretation and resolution of alarms. Qualification testing also includes testing of the system's false alarm rate. For the purposes of this report, we refer to qualification testing as laboratory testing.

Operational testing. TSA conducted operational tests that assessed the technology's detection performance, called threat-inject tests, at airports to evaluate the AIT-ATR systems' ability to function in an operational environment. Operational testing also assesses how well AIT systems are suited for use in a real-world, aviation checkpoint environment after systems have successfully completed qualification testing in a laboratory setting. For example, operational testing includes determining whether the system interfered with other equipment fielded at the checkpoint and whether the system met TSA's requirements. Further, DHS's acquisition policy requires that operational tests be conducted prior to an agency procuring a technology. According to TSA testing documentation, threat-inject tests are not intended to evaluate effectiveness of the entire AIT-ATR system, which includes the technology, the personnel who use the technology, and the processes that govern screening, in an operational setting.

Covert testing. TSA's Office of Inspection and the DHS Office of Inspector General conducted covert tests of AIT-ATR systems at the passenger checkpoint to identify vulnerabilities in TSA's screening process. According to TSA officials, those tests were intended to identify weaknesses in the technology, the operators who used it, and TSO compliance with SOPs by artfully concealing threat objects intended to simulate a likely terrorist attack.

Performance assessments.
TSA conducted covert performance assessments of TSO compliance with SOPs, under the Aviation Screening Assessment Program (ASAP), which TSA uses as a standard performance measurement for the Office of Management and Budget. According to TSA officials, ASAP assessments determine SO adherence to TSA's SOPs and are not intended to test AIT-ATR system capabilities.

Checkpoint drills. In accordance with TSA's IED checkpoint drill operational directive, TSA requires personnel at airports to conduct drills to assess TSO compliance with TSA's screening SOPs and to train TSOs to better resolve anomalies identified by AIT-ATR systems.[Footnote 18] TSA conducts those drills at airports using test kits that contain inert bombs, bomb parts, and other threat items. According to TSA officials, IED checkpoint drills assess SO adherence to TSA's SOPs and are not intended to test AIT-ATR system capabilities.

TSA's Screening Process:

TSA uses a multilayered security strategy aimed to enhance aviation security. Within those layers of security, TSA's airport passenger checkpoint screening system includes, among other things, (1) screening personnel; (2) SOPs that guide screening processes conducted by TSOs; and (3) technology, such as AIT-ATR systems, used to conduct screening of passengers.[Footnote 19] According to TSA, those elements collectively determine the effectiveness and efficiency of passenger checkpoint screening. In strengthening one or more elements of its checkpoint screening system, TSA aims to balance its security goals with the need to efficiently process passengers.

Passenger screening is a process by which TSOs inspect individuals and their property to deter and prevent an act of violence, such as carrying an explosive, weapon, or other prohibited item onboard an aircraft or into the airport sterile area--in general, an area of an airport for which access is controlled through screening of persons and property.[Footnote 20] TSOs inspect individuals for prohibited items at designated screening locations, referred to as checkpoints, where TSOs use technology and follow SOPs to screen passengers. According to TSA's SOP for AIT-ATR systems, three TSOs are required to operate lanes equipped with AIT systems: one divestiture officer (of either gender), one male SO, and one female SO.[Footnote 21]

TSA's Detection Standard and Tier Levels:

As we reported in January 2012, TSA's requirements for the AIT system have evolved over time. TSA continued to use those revised requirements to determine whether the AIT-ATR system met the agency's needs. Additionally, TSA used those requirements to evaluate the next generation of AIT systems, referred to as AIT-2. Further, TSA's requirements for AIT systems are based on tiers that correspond to the relative size of items that the AIT system must identify and requirements that the AIT system must meet, with Tier I being the level currently deployed AIT systems already meet and Tier IV being TSA's anticipated goal for AIT systems to meet. TSA's procurement of AIT-2 systems requires vendors to ensure AIT-2 systems meet Tier II requirements and provide faster throughput, among other things. TSA plans to seek proposals from AIT-2 vendors to provide Tier III and Tier IV capabilities by time frames specified in its AIT roadmap. TSA did not initially plan for AIT-IO systems to meet levels beyond Tier III, but included Tier IV in response to our recommendation.
TSA Does Not Collect or Analyze Data That Could Enhance System Performance:

TSA does not collect or analyze three types of available information that could be used to enhance the effectiveness of the entire AIT-ATR system. First, TSA does not collect or analyze available airport-level IED checkpoint drill data on SO performance at resolving alarms detected by the AIT-ATR system, data that could be used to identify weaknesses and enhance SO performance at resolving alarms at the checkpoint. Second, TSA is not analyzing AIT-ATR systems' false alarm rate in the field, even though available data could help it track the number of false alarms that occur on AIT-ATR systems and monitor the potential impacts those systems may have on operational costs. Third, TSA assesses overall AIT-ATR system performance using laboratory test results that do not reflect the combined performance of the technology, the personnel who operate it, and the process that governs AIT-related security operations.

TSA Does Not Collect or Analyze Airport Data on SO Performance on AIT-ATR Systems:

TSA does not collect or analyze IED checkpoint drill data because it does not ensure compliance with its operational directive that requires each airport to conduct IED checkpoint drills each week. Specifically, the operational directive, originally issued in February 2010 and updated in November 2012, requires TSA personnel at airports to conduct a certain number of IED drills per checkpoint lane every week at each airport. The total number of drills per pay period must be split evenly between carry-on baggage and passenger screening. Additionally, for those airports equipped with AIT systems, a certain percentage of on-person drills must be conducted on AIT systems and a certain percentage must be conducted on walk-through metal detectors.

TSA is not enforcing compliance with its directive, and as a result, data on SO performance are not being consistently collected or reported by approximately half of airports with AIT-ATR systems. For example, according to TSA data, we found that TSA personnel at almost half of the airports with AIT-IO or AIT-ATR systems did not report any IED checkpoint drill results on those systems from March 2011 through February 2013. Of the airports at which TSA personnel conducted IED checkpoint drills, the number of drills conducted varied from 1 to 8,645. Further, roughly four-fifths of the on-person IED drills were conducted by screening passengers with metal detectors, with the rest conducted by screening passengers with AIT systems, which did not comply with the directive's specified requirements on the number of drills that must be conducted on each type of technology. According to TSA officials, TSA's Office of Security Operations is responsible for overseeing compliance with the operational directive at airports, but it does not analyze the IED checkpoint drill data at the headquarters level.
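To illustrate the kind of headquarters-level analysis that the directive's drill data would support, the following minimal Python sketch flags airports that report no AIT drill results or fall below a required share of on-person drills conducted on AIT systems. Because the directive's actual drill counts and technology percentages are sensitive and redacted from this report, the threshold and the sample records below are hypothetical placeholders.

# Hedged sketch of a compliance check on IED checkpoint drill reporting.
REQUIRED_AIT_SHARE = 0.5  # hypothetical placeholder; the real figure is redacted

airport_drills = [
    # (airport, on-person drills on AIT, on-person drills on metal detectors)
    ("Airport A", 0, 412),    # hypothetical records, not TSA data
    ("Airport B", 230, 210),
    ("Airport C", 15, 605),
]

for airport, ait, wtmd in airport_drills:
    total = ait + wtmd
    if ait == 0:
        print(f"{airport}: no AIT drill results reported")
    elif ait / total < REQUIRED_AIT_SHARE:
        print(f"{airport}: only {ait / total:.0%} of on-person drills on AIT")
    else:
        print(f"{airport}: meets the AIT drill share requirement")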
Further, TSA officials told us that TSA formerly tracked the number of IED checkpoint drills in a monthly management report for federal security directors, but in fiscal year 2012, that report was replaced by an executive scorecard that tracks each airport's IED checkpoint drill pass rate but does not include the number of drills conducted.[Footnote 22] TSA officials stated that, because the details of the drills are not provided to headquarters or analyzed beyond the pass rate, federal security directors could conduct very few drills that are easy for SOs to identify in order to achieve a high pass rate. According to TSA officials, the agency does not ensure compliance with the directive at every airport because it is unclear which office within the Office of Security Operations should oversee enforcing the operational directive. According to officials from TSA's Office of Training and Workforce Engagement, that office had the ability to monitor the program until TSA began using federal security director scorecards in 2012, which are reviewed by the Office of Security Operations. As a result, it is still unclear which office is ultimately responsible for overseeing whether TSA is in compliance with the operational directive at airports.

Data on IED checkpoint drills could provide insight into how well SOs resolve anomalies detected by AIT systems, information that could be used to help strengthen the existing screening process. By not clarifying which office is responsible for overseeing TSA's IED checkpoint drill operational directive, directing that office to ensure enforcement of the directive in conducting these drills, and analyzing the data, TSA is missing an opportunity to identify any potential weaknesses in the screening process, since performance depends in part on the ability of SOs to accurately resolve anomalies.

TSA Does Not Analyze AIT-ATR Systems' False Alarm Rate in the Field:

TSA is not analyzing available data on the number of secondary screening pat-downs that SOs conduct as a result of an AIT-ATR system alarm, which indicates that the system has detected an anomaly. Analyzing this information could provide insight into the number of false alarms that occur in the field, which may affect operational costs.[Footnote 23] Specifically, when the AIT-ATR system identifies the presence of an anomaly, indicated by an alarm, the SO must resolve the anomaly by conducting a pat-down to determine whether the anomaly is a threat item. If the SO does not resolve the anomaly during the pat-down (i.e., does not locate an item in the area identified by the AIT-ATR system alarm), this may be attributed to either a false alarm (the AIT-ATR system identified an anomaly when none actually existed) or SO error (the SO did not identify an anomaly that was present). By not analyzing such operational data, TSA is limited in its understanding of the operational effectiveness of deployed AIT-ATR systems. TSA collected information on false alarm rates through laboratory testing conducted at TSL.
These laboratory test results demonstrated that AIT-ATR systems have a higher false alarm rate than AIT-IO systems.[Footnote 24] Our analysis showed that the AIT-ATR system's false alarm rate can be expected to range significantly based on the estimate's 95 percent confidence interval, which could have implications for SO performance at resolving alarms and operational costs.[Footnote 25] Although TSA's detection standard required AIT-ATR systems to meet a specific false alarm rate, TSL laboratory test results on the AIT-ATR system indicate that certain factors, such as body mass index (BMI) and headgear, such as turbans and wigs, may contribute to greater fluctuations in the false alarm rate, either above or below that threshold.[Footnote 26] For example, the false alarm rate for passengers with a normal BMI was less than the false alarm rates for overweight and obese passengers. Additionally, the AIT- ATR system had a higher false alarm rate when passengers wore turbans and wigs. While TSA did not include the false alarm rate as a key performance requirement that could be used as a basis to accept or reject AIT-ATR systems, higher false alarm rates could result in higher operational costs.[Footnote 27] According to TSA, the AIT-ATR systems' current false alarm rate could produce an increase in annual staffing costs in the field, but it has not conducted studies on this issue. According to DHS's Science and Technology Directorate, effective checkpoint screening technologies have lower false alarm rates, as well as higher throughput and lower costs of operations, which enhance the effectiveness and efficiency of how TSA screens passengers. TSA's Functional Requirements Document stated that AIT-ATR systems must have a data collection and reporting system that collects, stores, analyzes, and displays a summary report on the outcomes of scans. The AIT-ATR systems are required to provide, at a minimum, the total number of passengers scanned, total number of passengers on which the system detected anomalies, and the body location of where an anomaly was detected. TSA reported in its System Evaluation Report that the AIT-ATR system was equipped with that data collection and reporting system and the summary report. According to TSA, it verified that currently deployed AIT-ATR systems capture those data in operational testing and evaluation. However, TSA does not collect or analyze those data at headquarters. Rather, TSA gives TSA management at airports the discretion to determine how to use those data and whether to enter those data into TSA's centralized information management system. TSA officials agreed that collecting and analyzing operational data would provide useful information related to the impact of false alarm rates on operational costs, and collecting those data could be done on a selective basis so that it would not be too labor-intensive. According to TSA officials, TSA is in the process of networking all AIT-ATR systems so that information can be collected at the headquarters level, and when this process is complete, TSA would be able to centrally collect operational data that could provide information on secondary screening outcomes, which provide insight into the operational false alarm rate. TSA officials were not able to provide an estimate of when this will be completed. Given the potential staffing implications associated with a higher false alarm rate, it is important to fully understand the system's false alarm rate in the field. 
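As an illustration of the analysis that networked AIT-ATR systems could support, the following minimal Python sketch derives an upper bound on the operational false alarm rate from scan and pat-down outcome counts and translates it into a rough staffing cost. Every number below is a hypothetical placeholder, not TSA data, and unresolved pat-downs include SO error as well as false alarms, so the derived rate is only an upper bound.

# Hedged sketch: estimate the operational false alarm rate and its staffing cost.
passengers_scanned = 100_000     # hypothetical scans in a reporting period
unresolved_patdowns = 9_000      # hypothetical pat-downs that found no item

# Unresolved pat-downs reflect false alarms plus SO error, so this is an upper bound.
operational_false_alarm_rate = unresolved_patdowns / passengers_scanned
print(f"Operational false alarm rate (upper bound): {operational_false_alarm_rate:.1%}")

minutes_per_patdown = 1.5        # hypothetical average pat-down duration
hourly_cost = 30.0               # hypothetical fully loaded SO cost per hour
added_cost = unresolved_patdowns * minutes_per_patdown / 60 * hourly_cost
print(f"Estimated added screening cost for the period: ${added_cost:,.0f}")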
Without a complete understanding of how the systems perform in the field, TSA may be at risk of incurring significantly higher operational costs than anticipated. Although TSA officials stated that collecting such data could be labor-intensive if not collected selectively, the agency agreed that evaluating operational screening data in the field could provide useful information and that those data could be collected in such a way that doing so does not negatively affect operations. Standards for Internal Control in the Federal Government calls for agencies to identify, capture, and distribute operational data to determine whether an agency is meeting its goals and effectively using resources.[Footnote 28] By not establishing protocols that facilitate capturing operational data on passengers at the checkpoint once the AIT-ATR systems are networked together, TSA is unable to determine the extent to which AIT-ATR system false alarm rates affect operational costs and has less information for its decision-making process related to checkpoint screening.

Assessments of AIT-ATR System Performance Do Not Include the Performance of All System Factors:

According to TSA officials, checkpoint security is a function of technology, people, and the processes that govern them, but TSA does not include measures for each of those factors in determining overall AIT-ATR system performance. TSA evaluated the technology's performance at meeting certain requirements in the laboratory to determine system effectiveness. Laboratory test results provide important insights but do not accurately reflect how well the technology will perform in the field with actual human operators. Figure 1 illustrates the multiple outcomes of the AIT-ATR screening process. Although TSA conducted operational tests on the AIT-ATR system prior to procurement, when determining AIT-ATR system performance TSA does not assess how anomalies are resolved or consider how the technology, people, and processes function collectively as an entire system.

Figure 1: Passenger Screening Process Using Advanced Imaging Technology with Automated Target Recognition (AIT-ATR) Systems and Outcomes:

[Refer to PDF for image: illustration]

Passenger with prohibited item:
  Scanner identifies anomaly:
    Transportation Security Officer (TSO) identifies prohibited item: Passenger does not board plane with prohibited item; or:
    TSO does not identify prohibited item: Passenger boards plane with prohibited item.
  Scanner does not identify anomaly:
    Passenger boards plane with prohibited item.

Passenger without prohibited item:
  Scanner identifies anomaly:
    TSO screens passenger; False alarm; Passenger boards plane; or:
  Scanner does not identify anomaly:
    Passenger boards plane.

Source: GAO.

[End of figure]

TSA officials agreed that it is important to analyze performance by including an evaluation of the technology, operators, and processes, and stated that TSA is planning to assess the performance of all layers of security. According to TSA, the agency conducted operational tests on the AIT-ATR system, as well as follow-on operational tests as requested by DHS's Director of Operational Test and Evaluation, but those tests were not ultimately used to assess the effectiveness of the operators' ability to resolve alarms, as stated in DHS's Director of Operational Test and Evaluation's letter of assessment on the technology. TSL officials also agreed that qualification testing conducted in a laboratory setting is not always predictive of actual performance at detecting threat items.
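The outcomes in figure 1 can be used to show, in a simple hedged sketch, why laboratory results for the technology alone can overstate system effectiveness: a threat is interdicted only when the scanner alarms and the SO then resolves the alarm correctly. The probabilities below are hypothetical placeholders, not test results.

# Hedged sketch: end-to-end detection combines the scanner and its operator.
p_scanner_flags_threat = 0.90   # hypothetical rate at which the scanner flags a concealed item
p_so_resolves_alarm = 0.80      # hypothetical rate at which the SO then identifies the item

p_system_detects = p_scanner_flags_threat * p_so_resolves_alarm
print(f"End-to-end detection rate: {p_system_detects:.0%}")            # 72%
print(f"Threat boards despite screening: {1 - p_system_detects:.0%}")  # 28%

Under these assumed rates, a scanner that looks 90 percent effective in the laboratory yields only a 72 percent end-to-end detection rate once operator performance is included, which is why measuring the technology, people, and processes together matters.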
Further, laboratory testing does not evaluate the performance of SOs in resolving anomalies identified by the AIT-ATR system or TSA's current processes or deployment strategies. According to best practices related to federal acquisitions, technologies should be demonstrated to work in their intended environment.[Footnote 29] According to DHS's Acquisition Directive 102-01 and its associated guidebook, operational testing results should be used to evaluate the degree to which the system meets its requirements and can operate in the real world with real users like SOs. TSL's Test Management Plan for AIT systems stated that effectiveness must reflect performance under realistic or near-realistic operating conditions.[Footnote 30] Additionally, a group of experts on testing best practices assembled by the National Academy of Sciences concluded that agencies should include the human element when evaluating system performance. That group of experts also determined that agencies should determine system effectiveness by conducting performance testing in an operational setting in addition to laboratory testing, which could include SOs during testing.

TSA conducted operational tests, but it did not use those tests to determine AIT-ATR effectiveness. Instead, TSA used laboratory tests that did not factor in performance of the entire system that includes technology, people, and processes. However, AIT-ATR system effectiveness relies on both the technology's capability to identify threat items and its operators to resolve those threat items. Given that TSA is seeking to procure AIT-2 systems, DHS and TSA will be hampered in their ability to ensure that future procurements meet mission needs and perform as intended at airports without measuring system effectiveness based on the performance of the AIT-2 technology and SOs who operate the technology, while taking into account current processes and deployment strategies.

TSA Has Enhanced Passenger Privacy, but Could Take Additional Steps to Enhance AIT Capabilities:

TSA has enhanced passenger privacy by completing the installation of ATR software upgrades for all deployed AIT systems but could do more to provide enhanced AIT capabilities to meet the agency's mission needs. Moreover, the agency faces technological challenges in meeting its goals and milestones pertaining to enhancing AIT capabilities.

TSA Has Enhanced Privacy by Upgrading All Deployed Systems with ATR Software, but Has Not Met Its Goals Pertaining to Enhancing AIT Capabilities:

TSA has met milestones as documented in its roadmap pertaining to the installation of ATR software upgrades that were intended to address privacy concerns and improve operational efficiency for all deployed AIT systems in accordance with the statutory deadline included as part of the Federal Aviation Administration Modernization and Reform Act of 2012.[Footnote 31] However, it did not meet proposed milestones documented in its AIT roadmap to provide enhanced capabilities to meet the agency's mission needs. For example, the February 2012 AIT roadmap estimated that TSA would complete installation of Tier II ATR software upgrades for currently deployed AIT systems by December 2012. TSA's updated October 2012 AIT roadmap revised this date to March 2013. According to TSA testing documentation, during operational testing conducted from May through June 2012 at an airport test site, the AIT-ATR Tier II system demonstrated limitations due to noncompliance with certain requirements.
Accordingly, TSA decided not to pursue fielding of the Tier II system based on particular deficiencies identified during operational testing. The vendor of this system submitted a new version of the AIT-ATR system for laboratory testing to TSL. In September 2013, the new version had passed laboratory testing and was undergoing operational test and evaluation. As shown in figure 2, TSA began operational test and evaluation for Tier II upgrades 17 months after the expected start date articulated in its October 2012 roadmap. According to TSA, it completed operational test and evaluation in January 2014. According to the time frames in TSA's revised roadmap, it would take an additional 7 months from January 2014 to complete Tier II upgrades. However, TSA had estimated that it would provide Tier III capabilities by the end of fiscal year 2014.

Figure 2: Advanced Imaging Technology with Automated Target Recognition (AIT-ATR) Roadmap for Completing Tier II Upgrades:

[Refer to PDF for image: illustration]

Advanced Imaging Technology (AIT) with Automated Target Recognition (ATR):

February 2012 roadmap: February 2012-December 2012: Tier II ATR upgrades complete (original date).
Expected operational test and evaluation (OT&E) start date based on October 2012 roadmap: April 2012.
October 2012 roadmap: October 2012-March 2013: Tier II ATR upgrades complete (revised date).
Past scheduled completion date: March 2013-September 2013; 17 months behind schedule.
Actual OT&E start date: September 2013.

Source: GAO.

[End of figure]

Although TSA experienced challenges and schedule slippages related to meeting Tier II requirements for the currently deployed AIT systems, in September 2012, TSA made contract awards to purchase and test the next generation of AIT systems (referred to as AIT-2) from three vendors.[Footnote 32] These systems are required to be equipped with ATR software and must be capable of meeting enhanced requirements (qualified at least at the Tier II level), among other things. The updated October 2012 roadmap contained milestones for testing and acquiring AIT-2 systems, which TSA has not met. Specifically, TSA is about 9 months behind schedule for AIT-2 testing and procurement, as depicted in figure 3. For example, the roadmap indicated that TSA would begin qualification testing and evaluation for AIT-2 during the first quarter of fiscal year 2013, would complete that testing by January 2013, and would complete deployment by March 2014. However, TSA did not initiate qualification testing until July 2013 (about 9 months behind schedule) because all three vendors had difficulty providing qualification data packages verifying that the vendors had met contractual requirements and that the systems were ready to begin testing. Accordingly, as of March 2014, TSA is not on track to meet the March 2014 deployment milestone, and these efforts have not resulted in enhanced AIT capabilities because currently deployed AIT-ATR systems are qualified at the same Tier I level as the systems originally deployed in 2009.

Figure 3: Transportation Security Administration (TSA) Advanced Imaging Technology (AIT) Roadmap Milestones for Testing and Acquiring AIT-2 Systems:

[Refer to PDF for image: illustration]

Advanced Imaging Technology-2 (AIT-2):

February 2012 roadmap: March 2012-December 2013;
Issue AIT-2 request for proposal: March 2012;
AIT-2 deployment complete (original date): December 2013.
October 2012 roadmap: October 2012-March 2014;
Expected qualification test and evaluation (QT&E) start date based on October 2012 roadmap: October 2012;
Actual QT&E start date: July 2013;
AIT-2 deployment complete (revised date): March 2014.

Source: GAO.

[End of figure]

We have reported in the past few years that although AIT systems and the associated software have been in development for over two decades, TSA has faced challenges in developing and meeting program requirements in some of its aviation security programs, including AIT.[Footnote 33] Best practices for acquisition programs state that when key technologies are immature at the start of development, programs are at higher risk of being unable to deliver on schedule.[Footnote 34] As we concluded in January 2012, at the start of AIT development, TSA did not fully adhere to DHS acquisition guidance and procured AIT systems without meeting all key requirements. According to best practices on major acquisitions, realistic program baselines with stable requirements for cost, schedule, and performance are important to delivering capabilities within schedule and cost estimates.[Footnote 35]

In its AIT roadmap, TSA describes the time frames as notional and explains that establishing definitive timelines for reaching defined, additional tiers is difficult because of intricate dependencies that are outside of the program's control and may vary by manufacturer. However, TSA officials stated that they did not use available scientific research or evidence to help assess how long it would take to develop enhanced capabilities. In setting these time frames, TSA officials told us that TSA did not seek input from national laboratories that have conducted technology assessments and explosives research on behalf of DHS's Science and Technology Directorate, nor did it evaluate vendor data to determine the capabilities of the technology. According to experts we interviewed from Sandia National Laboratories, accurately determining realistic time frames in which vendors would be able to provide enhanced capabilities would require an evaluation of proprietary vendor data to understand how well the technology can meet requirements at a specific tier level. Since TSA did not have access to proprietary data, according to TSA officials, it instead relied on notional time frames proposed by the AIT vendors, which comprised estimates for when the vendors expected to be able to develop and deliver AIT systems that would meet TSA's requirements.

TSA's October 2012 AIT roadmap contains only one key element of a technology roadmap, estimated time frames for achieving each milestone, and does not describe the steps or activities needed to achieve each milestone.[Footnote 36] Moreover, in April 2012, the vendor for currently deployed AIT systems provided TSA with a detailed plan for delivering a system that could meet Tier III requirements, with proposed milestones and time frames for achieving each milestone. Although TSA relied on discussions with this vendor to estimate roadmap time frames, the agency did not incorporate details from the vendor's plan into its roadmap. According to a representative from this vendor, TSA did not consult with the vendor regarding the risks and limitations of its proposed time frames, including how long it might take to develop various hardware or software modifications, nor did it provide feedback to the vendor after the proposal was submitted.
The vendor's April 2012 plan states that after the Tier II system has met TSA's requirements, it would take the vendor several years to develop and deliver a Tier III system for TSA to test, followed by an operational test and evaluation system validation phase that would take several months. In addition, according to experts we interviewed from the national laboratories that contributed to the development of imaging technology, the milestones contained in TSA's October 2012 roadmap are not achievable because the roadmap did not reflect the time needed to make sufficient improvements to the technology to ensure that it would be able to meet additional tier levels. TSA did not incorporate available information from the national laboratories and vendors into its updated roadmap. As a result, the roadmap underestimated the length of time it would take to develop and deploy AIT-ATR Tier III systems.[Footnote 37] As discussed later in this report, moving forward, it will be important for TSA to incorporate scientific evidence and information from DHS's Science and Technology Directorate and the national laboratories, as well as nonproprietary information and data provided by vendors, into the next revision of its AIT roadmap to ensure that the time frames for achieving future goals and milestones are realistic and achievable.

TSA Faces Technological Challenges in Meeting Future Goals and Milestones:

Consistent with the Homeland Security Act of 2002, as amended, the DHS Science and Technology Directorate has responsibility for coordinating and integrating the research, development, demonstration, testing, and evaluation activities of the department, as well as for working with private sector stakeholders to develop innovative approaches to produce and deploy the best available technologies for homeland security missions.[Footnote 38] Moreover, we have previously identified key practices that can help sustain agency collaboration and concluded that collaborating agencies can look for opportunities to address resource needs by leveraging each other's resources, thus obtaining additional benefits that would not be available if they were working separately.[Footnote 39]

According to TSA officials, the agency recognizes the need to develop achievable milestones based on scientific evidence and is in the process of developing a roadmap for the entire passenger screening program. They explained that they plan to collaborate with the DHS Science and Technology Directorate to determine milestones for the new roadmap that will be based on a scientific analysis of technology capabilities as well as ongoing research and development efforts. TSA officials stated that they plan to update the AIT roadmap using this new approach and expect the AIT roadmap to be completed by September 30, 2014. A group of experts moderated by GAO in June 2013 stated that DHS must have personnel with technical expertise in ATR software and AIT system development who are engaged throughout the developmental process to ensure that vendors are providing improved capabilities over time. According to these experts, it is important to leverage the technical expertise of academia and the national laboratories to improve capabilities over time and provide insight into reasonable time frames for meeting future tiers.
In September 2011, we reported that given continuing budget pressures combined with the focus on performance envisioned in the Government Performance and Results Act (GPRA) Modernization Act of 2010, federal agencies must undertake fundamental reexaminations of their operations and programs to identify ways to operate more efficiently.[Footnote 40] While there are various approaches that vendors could take to make needed improvements to the technology, including hardware modifications, software developments, or incorporating new imaging techniques to provide enhanced capabilities, these approaches could take years to develop and would require significant investment of resources. Moreover, according to scientists we interviewed from the national laboratories, there are several ways to improve ATR software algorithms to enhance system capabilities; however, there is little market incentive for existing vendors to invest in making these improvements or for new vendors to enter the relatively small airport checkpoint market, since one vendor has already met TSA's current requirements. Further, 2 of the 12 experts identified by the National Academy of Sciences with whom we spoke stated that establishing clear requirements would incentivize vendors to improve performance over time. Without such incentives, according to these experts, it is unlikely that vendors will invest in making the needed improvements to meet TSA's mission needs. According to a representative from the vendor of currently deployed AIT systems, moving from Tier II to Tier III presents new technological challenges because meeting additional tiers will require the development of more targeted algorithms. Accordingly, to develop these new algorithms, vendors would have to build new data sets, conduct research, and invest additional resources before accurately determining realistic time frames for meeting Tier III and Tier IV requirements. Therefore, given the current state of the technology as well as the amount of research that has to be conducted on developing algorithms that can meet Tier III and Tier IV requirements, neither TSA nor the AIT vendors can reliably predict how long it will take to meet Tier IV requirements. Because TSA has revised its requirements over time, scientists from the national laboratories noted, vendors have little incentive to meet additional tier levels since they are meeting TSA's current requirements. In addition, TSA has not obtained the necessary information to accurately understand the future state of the technology. Thus, the agency has little assurance that vendors will provide AIT-ATR systems that meet Tier IV requirements within TSA's estimated time frames. As a result, the future capabilities of the technology and the time frames in which those capabilities will be delivered remain unknown. Given these challenges, TSA will be unable to ensure that its roadmap reflects the true capabilities of the next generation of AIT-2 systems without the use of scientific evidence and information from DHS's Science and Technology Directorate and the national laboratories, as well as nonproprietary information and data provided by vendors, to develop a realistic schedule with achievable milestones that outlines the technological advancements, estimated time, and resources needed to achieve TSA's Tier IV end state.

Conclusions:

TSA has deployed nearly 740 AIT systems and will spend an estimated $3.5 billion in life cycle costs on deployed AIT-ATR systems and future AIT-2 systems.
However, TSA faces challenges in managing its AIT program because it is not using all of the data it collects to inform its decisions. For example, TSA does not enforce compliance with its operational directive requiring each airport to conduct IED checkpoint drills each week, nor does it collect or use IED checkpoint drill data on SO performance. Additionally, TSA is not analyzing available data on the number of secondary screening pat-downs that SOs conduct when the system indicates that it has detected an anomaly, data that could provide insight into the number of false alarms that occur in the field and the extent to which these alarms affect operational costs. TSA could improve the overall performance of the AIT system and better inform its decision-making process related to checkpoint screening by clarifying which office is responsible for overseeing TSA's operational directive, directing that office to enforce compliance with the directive, and analyzing the IED checkpoint data to identify any potential weaknesses in the airport screening process, as well as by establishing protocols that facilitate capturing operational data on passengers at the checkpoint to determine the extent to which AIT-ATR system false alarm rates affect operational costs. Although AIT systems and the associated software have been in development for over two decades, TSA has not used available information from the scientific community and vendors to understand the technological advancements that need to be made and to determine the time frames in which AIT systems will meet Tier IV requirements. Therefore, the milestones that TSA uses to guide its procurement of this technology do not incorporate scientific evidence from the national laboratories or vendors that could be used to produce an accurate, realistic roadmap. TSA would have more assurance that its $3.5 billion investment in AIT provides effective security benefits by (1) measuring system effectiveness based on the performance of the AIT-2 technology and the SOs who operate it, while taking into account current processes and deployment strategies, and (2) using scientific evidence and information from DHS's Science and Technology Directorate and the national laboratories, as well as information and data provided by vendors, to develop a realistic schedule with achievable milestones that outlines the technological advancements, estimated time, and resources needed to achieve TSA's Tier IV end state.

Recommendations for Executive Action:

To help ensure that TSA improves SO performance on AIT-ATR systems and uses resources effectively, the Administrator of the Transportation Security Administration should take the following two actions:

* clarify which office is responsible for overseeing TSA's IED screening checkpoint drills operational directive, direct that office to ensure enforcement of the directive in conducting these drills, and analyze the data to identify any potential weaknesses in the screening process, and:

* establish protocols that facilitate the capturing of operational data on secondary screening of passengers at the checkpoint to determine the extent to which AIT-ATR system false alarm rates affect operational costs once AIT-ATR systems are networked together.
To help ensure that TSA invests in screening technology that meets mission needs, the Administrator of the Transportation Security Administration should ensure that the following two actions are taken before procuring AIT-2 systems:

* measure system effectiveness based on the performance of the AIT-2 technology and the screening officers who operate the technology, while taking into account current processes and deployment strategies, and:

* use scientific evidence and information from DHS's Science and Technology Directorate and the national laboratories, as well as information and data provided by vendors, to develop a realistic schedule with achievable milestones that outlines the technological advancements, estimated time, and resources needed to achieve TSA's Tier IV end state.

Agency Comments and Our Evaluation:

We provided a draft of this report to DHS for comment. On March 21, 2014, DHS provided written comments, which are reprinted in appendix III, and technical comments, which we incorporated as appropriate. DHS generally concurred with our four recommendations and described actions taken, underway, or planned to implement each recommendation. Specifically:

* In response to the recommendation that TSA clarify which office is responsible for overseeing TSA's Improvised Explosive Device Screening Checkpoint Drills operational directive, instruct the responsible office to enforce the directive, and analyze the drill data to identify any potential weaknesses in the screening process, DHS stated that TSA's Office of Security Operations will initiate a review of programs that contribute to assessing screening performance, with consideration of the findings identified in our report. TSA anticipates that it will complete this review by the end of fiscal year 2014 and stated that, by September 30, 2014, the operations directive will be amended to assign responsibility to one office. We believe that these are beneficial steps that would address our recommendation, provided that TSA directs the office to ensure enforcement of the directive in conducting the drills and uses the data to identify any potential weaknesses in the screening process, as we recommended.

* In response to our recommendation that TSA establish protocols to help determine the extent to which AIT-ATR system false alarm rates affect operational costs once AIT-ATR systems are networked together, DHS stated that TSA will monitor, update, and report the results of its efforts to capture operational data on the secondary screening of passengers resulting from AIT-ATR false alarms and evaluate the associated impacts on operational costs based on existing staffing levels. Once implemented, the new reporting mechanism will address our recommendation, provided that it captures sufficient information to determine the extent to which AIT-ATR system false alarm rates affect operational costs.
* In response to the recommendation that TSA measure system effectiveness based on the performance of the AIT-2 technology and the screening officers who operate the technology, while taking into account current processes and deployment strategies before procuring AIT-2 systems, DHS stated that TSA considers several factors when measuring system effectiveness, including documented deployment strategies; airport needs and conditions, such as ceiling height and checkpoint space; TSA security operations processes and procedures; feedback from transportation security officers who operate the AIT-ATR systems; and concept of operations and formal operational and functional requirements documents. Further, DHS stated that TSA's testing process enables TSA to determine whether technologies meet required standards and are feasible for use in the airport environment, and that the system evaluation report for AIT-2--which will document system effectiveness using information from the laboratory and operational test reports--will state whether or not the next-generation AIT system has an acceptable operationally effective and suitable rating for use within an airport environment. While these are beneficial practices, we believe that it would be preferable for TSA to measure the AIT-2 system's overall probability of detection by including in its assessment an evaluation of screening officer performance at resolving alarms detected by the technology, as we recommended, since AIT system effectiveness relies on both the technology's capability to detect items and screening officers' ability to correctly resolve alarms. In addition, DHS stated that TSA is currently implementing the Transportation Security Capability Analysis Process, which will be used to better understand TSA's requirements and better articulate those requirements and needs for acquisition and requirements documentation. This is an important first step toward addressing our recommendation, provided that TSA uses this process to determine the overall effectiveness of its system based on the performance of the AIT-2 technology as well as the screening officers who operate the technology, and not solely on the capabilities of current AIT technology, as has been done in the past.

* In response to the recommendation that TSA use scientific evidence and information from DHS's Science and Technology Directorate and the national laboratories, as well as information and data provided by vendors, to develop a realistic schedule with achievable milestones that outlines the technological advancements, estimated time, and resources needed to achieve TSA's Tier IV end state, DHS stated that TSA has initiated an effort to complete a more comprehensive technology roadmap that forecasts technology progression through detection tiers, estimates the cost to mature the technology, and includes a timeline with supporting narrative. TSA expects this roadmap to be completed by September 30, 2014. We believe that these are beneficial actions that could help TSA address the weaknesses identified in this report, and we will continue to work with TSA to monitor its progress on the proposed steps.

As agreed with your offices, unless you publicly announce its contents earlier, we plan no further distribution of this report until 30 days after its issue date.
At that time, we will send copies of this report to the Secretary of Homeland Security, the TSA Administrator, the House Homeland Security Committee, the House Subcommittee on Oversight and Management Efficiency, the House Subcommittee on Transportation Security, and other interested parties. In addition, the report is available at no charge on the GAO website at [hyperlink, http://www.gao.gov]. If you or your staff have any questions about this report, please contact me at (202) 512-4379 or lords@gao.gov. Contact points for our Offices of Congressional Relations and Public Affairs may be found on the last page of this report. GAO staff who made major contributions to this report are listed in appendix IV.

Signed by: Stephen M. Lord: Managing Director, Forensic Audits and Investigative Services:

List of Requesters:

The Honorable Michael T. McCaul: Chairman: Committee on Homeland Security: House of Representatives:

The Honorable Bennie G. Thompson: Ranking Member: Committee on Homeland Security: House of Representatives:

The Honorable Jeff Duncan: Chairman: Subcommittee on Oversight and Management Efficiency: Committee on Homeland Security: House of Representatives:

The Honorable Cedric L. Richmond: Ranking Member: Subcommittee on Transportation Security: Committee on Homeland Security: House of Representatives:

The Honorable Richard Hudson: House of Representatives:

The Honorable Sheila Jackson Lee: House of Representatives:

The Honorable Michael D. Rogers: House of Representatives:

[End of section]

Appendix I: Transportation Security Administration (TSA) Has Taken Action toward Addressing Advanced Imaging Technology Utilization and Planning Challenges Identified in our January 2012 Report:

In January 2012, we concluded that TSA had acquired advanced imaging technology (AIT) systems that were not being used on a regular basis and thus were not providing a security benefit. For example, we found that 32 of 486 AIT systems had been used on less than 5 percent of the days since their deployment, and that 112 of 486 AIT systems had been used on less than 30 percent of the days since their deployment. Further, we observed that at 5 of the 12 airports we visited, AIT systems were deployed but were not regularly used. For example, at 1 airport we observed that TSA had deployed 3 AIT systems in an area that typically handles approximately 230 passengers. TSA officials informed us at the time that 2 of the AIT systems were seldom used because of the lack of passengers and said they believed the AIT systems had been deployed based on the availability of space. In addition, we observed instances in which AIT systems were not being used because of maintenance problems that affected how often the deployed AIT systems screened passengers. On the basis of these observations, we concluded that there were concerns about how effectively deployed AIT systems were being used. Accordingly, we recommended that TSA evaluate the utilization of currently deployed AIT systems and potentially redeploy AIT systems based on utilization data, so that systems not being extensively used could provide enhanced security benefits at airports. The Department of Homeland Security (DHS) agreed, and TSA has taken steps to address our recommendation but has not fully addressed its intent. Specifically, TSA took the following actions.

Develop and track AIT utilization metrics.
TSA officials we spoke with in October 2012 stated that, in response to our January 2012 report, they revised TSA's metric for measuring utilization to more accurately reflect the amount of time AIT systems were being used. According to TSA's field guide issued in March 2012, TSA measures AIT utilization as the percentage of passengers screened by AIT systems. To track AIT utilization based on this metric, TSA developed specific targets for airports to meet that are based on passenger throughput and the hours that AIT systems are in operation at an airport. However, the target TSA establishes for an airport is reduced to account for AIT systems that are not operational because of maintenance problems or that are not being used because of lane closures, staffing restrictions, or low passenger volume. Accordingly, the methodology TSA employs does not accurately measure the extent to which AIT systems are being used, since the metric tracks utilization only during the periods when the systems are in use, as illustrated after this discussion. Furthermore, to calculate airport targets and track AIT utilization, TSA relies on data submitted by airports into its centralized information management system. However, in September 2013, the DHS Office of Inspector General (DHS OIG) reported that TSA did not have adequate internal controls to ensure accurate data on AIT utilization.[Footnote 41] Specifically, the OIG found that TSA's utilization data were unreliable because (1) AIT throughput data recorded in the centralized information management system differed from data in the source documents, (2) AIT throughput data on the source documents were not recorded in the centralized information management system, (3) the starting AIT count differed from the previous day's ending AIT count, and (4) AIT throughput source documentation was missing. Further, because airports record and enter AIT throughput data into the centralized information management system manually, the process is prone to inaccurate recording of information and does not provide an audit trail to validate data accuracy. Accordingly, without reliable throughput data, TSA decision makers cannot accurately measure AIT utilization at airports.

Redeploy AIT systems used infrequently.

In the spring of 2013, TSA redeployed some infrequently used AIT systems to airports with higher passenger volumes. For example, following the decision to remove backscatter X-ray systems from all airports, TSA officials stated that they incorporated a risk-based approach that entailed replacing frequently used backscatter X-ray systems with millimeter-wave systems and not deploying millimeter-wave systems to locations where the backscatter X-ray systems had been used less frequently.[Footnote 42] Specifically, for locations that had used the backscatter X-ray systems less than 5 percent of the time, other screening measures were implemented in lieu of deploying millimeter-wave AIT systems. However, without accurate and reliable utilization data, TSA decision makers cannot ensure the optimal use of deployed AIT systems. Further, in May 2013, TSA officials stated that utilization data are not generally used to make deployment decisions, such as how many AIT systems should be deployed to which airports. Accordingly, TSA is not using the utilization data it collects to inform its deployment decisions.
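The utilization measurement issue described above can be illustrated with a brief sketch. The figures and variable names below are invented for illustration and are not TSA data; the sketch simply shows how a system that sits idle for half of a day can still appear highly utilized when the target is reduced to its operational hours:

# Invented figures for illustration only; not TSA data.
passengers_total = 3000              # all passengers through the checkpoint in a day
passengers_while_operational = 1500  # passengers arriving while the AIT system was running
passengers_screened_by_ait = 1200    # passengers the AIT system actually screened

# Utilization measured against a target reduced for downtime,
# in the manner the field guide describes:
rate_vs_reduced_target = passengers_screened_by_ait / passengers_while_operational

# Share of all checkpoint passengers screened by AIT, counting the
# hours the system sat idle:
rate_vs_all_passengers = passengers_screened_by_ait / passengers_total

print(rate_vs_reduced_target)  # 0.8
print(rate_vs_all_passengers)  # 0.4

Under the reduced target, the hypothetical system appears 80 percent utilized even though only 40 percent of the checkpoint's passengers were screened by AIT.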
While the actions TSA has taken represent important steps toward addressing our recommendation, ensuring that the utilization data it collects are accurate, and using these data to inform future deployment decisions, would help ensure the effective utilization and redistribution of AIT systems and the efficient use of taxpayer resources.

[End of section]

Appendix II: Objectives, Scope, and Methodology:

This report answers the following questions:

1. To what extent does TSA collect and analyze available information that could be used to enhance the performance of AIT systems equipped with ATR (AIT-ATR)?

2. To what extent has TSA made progress toward enhancing AIT capabilities to detect concealed explosives and other threat items, and what challenges, if any, remain?

To determine the extent to which TSA collects and analyzes available information to improve the performance of screening officers (SO) responsible for resolving anomalies identified by ATR software, we analyzed improvised explosive device (IED) checkpoint drills conducted by TSA personnel at airports that submitted data to TSA from March 1, 2011, through February 28, 2013, under TSA's IED checkpoint drill operational directive.[Footnote 43] TSA's IED checkpoint drill operational directive requires personnel at airports to conduct drills to assess Transportation Security Officer (TSO) compliance with TSA's screening standard operating procedures (SOP) and to train TSOs to better resolve anomalies identified by AIT-ATR systems.[Footnote 44] To determine whether airports were in compliance with TSA's operational directive, we analyzed the number and percentage of drills conducted on AIT systems and on other passenger screening methods at the checkpoint and evaluated whether, overall, airports with AIT systems had conducted the required proportion of AIT drills relative to other passenger-screening drills. Additionally, we evaluated airport compliance with TSA's operational directive against Standards for Internal Control in the Federal Government to determine the extent to which TSA is monitoring compliance with its directive.[Footnote 45] We also reviewed TSA's AIT deployment schedules to determine which type of AIT-ATR system airports had, the dates those systems were first deployed, and the dates systems were upgraded with ATR capability, in order to assess how airport performance at resolving anomalies identified by the AIT-ATR system varied. Further, we analyzed laboratory test results of the AIT-ATR system and the AIT systems that used image operators (AIT-IO) from calendar years 2009 through 2012 conducted by the Transportation Security Laboratory (TSL).[Footnote 46] We analyzed these data using statistical methods that estimated how the false alarm rates varied according to various characteristics of the mock passenger. We assessed whether the laboratory tests complied with statistical principles by comparing the testing design to generally accepted statistical principles used for data collection.[Footnote 47] We calculated the false alarm rates using two statistical methods, bias-corrected cluster bootstrap resampling and random effects methods, to estimate the sampling error of the AIT-ATR systems' estimated false alarm rates. We used each of these methods to estimate the 95 percent confidence intervals of the false alarm rates and achieved similar results with either method.
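For illustration, a minimal sketch of the cluster bootstrap idea follows, written in Python with invented data; all names and figures are hypothetical. It resamples whole mock passengers (the clusters), rather than individual scans, so that the confidence interval reflects the correlation among repeated scans of the same passenger. The sketch uses a simple percentile interval; the bias correction and the random effects method described above are omitted for brevity:

import numpy as np

# Illustrative only: invented data, not TSL test results. Each inner
# list holds one mock passenger's repeated scans (1 = false alarm).
trials = [
    [0, 1, 0, 0],
    [0, 0, 0, 0],
    [1, 1, 0, 1],
    [0, 0, 1, 0],
]

rng = np.random.default_rng(12345)

def cluster_bootstrap_ci(clusters, n_boot=10000, alpha=0.05):
    # Resample whole clusters (passengers) with replacement, pooling
    # each resample's scans to recompute the overall false alarm rate.
    clusters = [np.asarray(c, dtype=float) for c in clusters]
    n = len(clusters)
    rates = np.empty(n_boot)
    for b in range(n_boot):
        draw = rng.integers(0, n, size=n)
        rates[b] = np.concatenate([clusters[i] for i in draw]).mean()
    # Simple percentile interval; a bias-corrected variant would
    # adjust these quantiles.
    return np.quantile(rates, [alpha / 2, 1 - alpha / 2])

point_estimate = np.concatenate([np.asarray(c) for c in trials]).mean()
low, high = cluster_bootstrap_ci(trials)
print(f"false alarm rate {point_estimate:.2f}, 95% CI [{low:.2f}, {high:.2f}]")

Resampling by passenger rather than by scan matters because repeated scans of the same person are not independent; ignoring that clustering would understate the sampling error.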
We assessed the extent to which laboratory test results demonstrated that the AIT-ATR system met requirements, as called for by key acquisition practices established by GAO, because the AIT system is considered a large-scale acquisition program.[Footnote 48] We analyzed the adequacy of the laboratory tests by comparing the testing design with generally accepted statistical methods used for data collection and analysis.[Footnote 49] We assessed the reliability of the laboratory and IED checkpoint drill data we used by interviewing officials responsible for capturing and monitoring the data about, among other things, applicable quality control procedures to maintain the integrity of the data; by performing statistical tests on the data; and by reviewing testing reports and related documentation. We determined that these data were sufficiently reliable for the purposes of this report. Furthermore, we compared the extent to which TSA evaluated the performance of the entire system against key acquisition practices established by GAO, DHS's Acquisition Directive 102-01, and TSL's Test Management Plan.[Footnote 50] We also visited a nonprobability sample of four U.S. airports to observe AIT-ATR systems and interview relevant TSA personnel. We interviewed a total of 46 TSA personnel, selected by airport officials, who operate AIT-ATR systems, to obtain their views on system performance, and six Transportation Security Specialists for Explosives to discuss airport IED checkpoint drills.[Footnote 51] We selected these airports based on airport category and AIT-ATR system deployment.[Footnote 52] The information we obtained from these visits cannot be generalized to other airports, but it provided us with the perspectives of various participants in the deployment of AIT units at airports across the country. We also interviewed TSA officials involved in AIT-ATR deployment, training, and covert testing. We visited TSL in Atlantic City, New Jersey, to interview laboratory scientists responsible for testing and evaluating AIT-ATR systems, and we reviewed TSL documentation related to laboratory test plans, records, and final reports. We interviewed knowledgeable agency officials from TSA, TSL, and DHS's Science and Technology Directorate to better understand how AIT-ATR and AIT-IO system performance was assessed.

To determine the progress TSA has made toward enhancing AIT capabilities and any challenges that remain, we analyzed TSA's original AIT roadmap, dated February 2012, as well as the October 2012 revision. To determine the extent to which TSA has met its projected time frames for AIT-ATR system upgrades and development of the next generation of AIT systems, referred to as AIT-2, we reviewed actions taken by TSA testing officials and compared the actual dates for each milestone with the estimated dates documented in TSA's AIT roadmap. We also reviewed a leading AIT vendor's technology plan for meeting additional tiers to determine the extent to which TSA's AIT roadmap contained achievable time frames for meeting future tier levels. We further reviewed several technology roadmaps for large-scale acquisition programs developed by other agencies and organizations, such as the Department of Defense, as well as technology roadmapping guidance developed by Sandia National Laboratories, to enhance our understanding of the fundamental elements of technology roadmaps.[Footnote 53] We then compared this guidance with TSA's AIT roadmap to determine the extent to which TSA's roadmap contained these elements.
We also reviewed prior GAO reports on (1) major acquisition programs, to identify best practices for delivering capabilities within schedule and cost estimates, and (2) key practices that can help sustain agency collaboration, whereby collaborating agencies leverage each other's resources to obtain additional benefits that would not be available if they were working separately.[Footnote 54] To determine the challenges TSA faces in enhancing AIT capabilities, we interviewed scientists from the Department of Energy's Sandia National Laboratories and Pacific Northwest National Laboratory to obtain their views on the current and future capabilities of the technology and the scientific advancements that would need to occur to enable the development of future tier levels. We also interviewed a leading AIT vendor to obtain its views on the extent to which TSA obtained input from the vendor related to its ability to meet future tiers within expected time frames, as well as the risks and limitations associated with pursuing alternative approaches for developing successive tiers. We further interviewed TSA acquisition officials to obtain the agency's views on the vendors' ability to meet future tiers within estimated time frames. Last, we interviewed 12 experts identified by the National Academy of Sciences to obtain their views on best practices for testing detection technologies such as AIT-ATR systems.[Footnote 55] Our interviews with these experts are illustrative and provide insights about testing best practices.

We conducted this performance audit from September 2012 to March 2014 in accordance with generally accepted government auditing standards. Those standards require that we plan and perform the audit to obtain sufficient, appropriate evidence to provide a reasonable basis for our findings and conclusions based on our audit objectives. We believe that the evidence obtained provides a reasonable basis for our findings and conclusions based on our audit objectives.

[End of section]

Appendix III: Comments from the Department of Homeland Security:

U.S. Department of Homeland Security: Washington, DC 20528:

March 21, 2014:

Stephen M. Lord: Director, Homeland Security & Justice Issues: U.S. Government Accountability Office: 441 G Street, NW: Washington, DC 20548:

Re: Draft Report GAO-14-357, "Advanced Imaging Technology: TSA Needs Additional Information before Procuring Next-Generation Systems"

Dear Mr. Lord:

Thank you for the opportunity to review and comment on this draft report. The U.S. Department of Homeland Security (DHS) appreciates the U.S. Government Accountability Office's (GAO's) work in planning and conducting its review and issuing this report. The Transportation Security Administration (TSA) is aware that the Advanced Imaging Technology (AIT) units are of particular importance and remains committed to continuously assessing and improving the AIT program's testing procedures. The FAA Modernization and Reform Act of 2012 mandated that TSA use AIT equipped with Automated Target Recognition (ATR) for screening of passengers. After AIT image operator units were removed from airports earlier in calendar year 2013, TSA used Performance Measurement Information System (PMIS) data to capture usage rates and the percentage of passengers (PAX) screened to determine deployment strategies for AIT units equipped with ATR. Consequently, TSA was able to increase PAX screened using AIT while decreasing the size of the AIT fleet by approximately 75 units.
Enhancing the AIT Testing Process:

TSA is refining its Test and Evaluation (T&E) process to achieve higher statistical significance and confidence intervals, both in developmental testing and operational testing. TSA is using these enhanced test results for the current AIT-2 (second generation) procurement, which will then inform the Agency's deployment decisions. In addition, the ongoing AIT roadmap, a document that contains milestones for achieving enhanced detection capabilities to meet the Agency's mission needs, will be matched against available testing results to further refine AIT capabilities and help vendors develop a realistic schedule with achievable milestones. TSA expects the updated AIT roadmap to be completed by September 30, 2014.

Third-Party Collaboration:

In partnership with the screening technology industry, TSA continues to develop and enhance the screening capabilities (i.e., Tier Capabilities) of the AIT system. Through this collaborative process, TSA will ultimately achieve Tier IV screening capabilities and, through competitive procurement, incentivize industry to produce the most effective AIT system. In addition, through numerous Industry Days held throughout 2013 by both TSA and DHS's Science and Technology Directorate (S&T), TSA was afforded insight into industry capabilities, both current and expected. This extensive engagement with industry subject matter experts on the capabilities of AIT technology will allow TSA the opportunity to ensure achievement of detection and operational requirements, as well as the procurement strategy and schedule to promote competition and to drive AIT enhanced capabilities. TSA has taken a number of steps to address the issues discussed in this report and is actively developing plans to ensure that airports across the country are operating with the latest acceptable detection capability. TSA has been responsive to industry, and extensive communication has taken place to promote competition. Communication of TSA's goals, objectives, and requirements ensures success of both TSA and industry in the competitive procurement process. Our ongoing progress demonstrates our commitment to TSA's mission of securing our Nation's transportation systems.

The draft report contained four recommendations with which the Department concurs. Specifically, GAO recommended that the Administrator of TSA:

Recommendation 1: Clarify which office is responsible for overseeing TSA's IED [Improvised Explosive Device] Screening Checkpoint drills operational directive, direct the office to ensure enforcement of the directive in conducting these drills, and analyze the data to identify any potential weaknesses in the screening process.

Response: Concur. TSA concurs that the oversight of Operations Directive (OD) OD-400-50-1-12A, Improvised Explosive Device Screening Checkpoint Drills, should be revisited. To accomplish this, TSA's Office of Security Operations will initiate a review of programs that contribute to assessing screening performance with consideration of the findings identified in the GAO report. This review, which will include assessing organizational oversight and enforcement responsibilities, will aim to ensure effective use of resources and alignment to operational directives and include procedures for ongoing analysis in support of operational goals. TSA's Office of Training and Workforce Engagement and the Office of Security Capabilities will be invited to participate in the review process.
TSA anticipates that it will complete its review of programs that contribute to assessing screening performance by the end of Fiscal Year 2014. At the conclusion of the program review, the Operations Directive will be amended to assign responsibility to one office. Estimated Completion Date (ECD): September 30, 2014.

Recommendation 2: Establish protocols that facilitate the capturing of operational data on secondary screening of passengers at the checkpoint to determine the extent to which AIT-ATR system false alarm rates affect operational costs once AIT-ATR systems are networked together.

Response: Concur. TSA's Office of Security Capabilities (TSA/OSC) periodically captures operational data on the secondary screening of passengers at the checkpoint for the various types of secondary screening methods, which include patdowns as a result of AIT-ATR false alarms. The data elements for each of the various checkpoint processes, including secondary screening of passengers, are defined in the Checkpoint Data Element standards. TSA/OSC will monitor, update, and report results of capturing operational data on the secondary screening of passengers resulting from AIT-ATR false alarms and evaluate the associated impacts to operational costs on the basis of existing staffing levels. In addition, through the process of operational T&E of any system improvement, the metrics associated with the process of screening passengers are evaluated. This includes the data associated with the secondary screening of passengers. These data are presented to both the user and program office in a System Evaluation Report (SER), which contains the evaluation of system effectiveness and suitability based on these data, as well as data from other reliable and relevant test events. ECD: To Be Determined (TBD).

Recommendation 3: Prior to procuring AIT-2 systems, measure system effectiveness based on the performance of the AIT-2 technology and SOs [Screening Officers] who operate the technology, while taking into account current processes and deployment strategies.

Response: Concur. TSA/OSC considers several factors when measuring system effectiveness. TSA/OSC deployment strategies are documented and considered when developing technology test requirements used to measure system effectiveness. TSA/OSC takes into consideration airport needs and conditions such as ceiling height and checkpoint space. Required equipment dimensions are captured in requirements documents. TSA/OSC security operations processes and procedures used within the airport environment are documented and considered when developing technology test requirements. TSA/OSC gathers feedback from Transportation Security Officers regarding technology deployment needs and concerns. TSA's next generation AIT systems are currently being tested at the Transportation Security Laboratory and TSA Security Integration Facility in laboratory environments to determine their effectiveness. Pending the outcome of the laboratory testing, the systems may be tested in an airport environment to determine operational suitability. TSA's testing process enables TSA/OSC to determine if technologies meet required standards and are feasible for use in the airport environment. At the completion of testing, TSA's laboratory and operational test results are documented in formal test reports and used by TSA/OSC in determining if a system is operationally effective and suitable for use within an airport environment.
A TSA Systems Evaluator will prepare a formal SER that documents system effectiveness using information from the laboratory and operational test reports. The SER will state whether or not the next generation AIT has an acceptable operationally effective and suitable rating for use within an airport environment. Concept of Operations (ConOps) and formal operational and functional requirements documents are taken into account when developing the test methodology. The ConOps is also where the processes and employment strategy for the technology are taken into account. TSA is currently implementing the Transportation Security Capability Analysis Process (TSCAP). At the macro level, TSCAP will be used to better understand the overarching holistic security system architecture and gain insight into TSA's needs, or requirements. At the micro level, TSCAP will be used to better articulate those requirements and needs for acquisition and requirements documentation. ECD: TBD.

Recommendation 4: Prior to procuring AIT-2 systems, use scientific evidence and information from DHS's Science and Technology Directorate, and the national laboratories, as well as information and data provided by vendors to develop a realistic schedule with achievable milestones that outlines the technological advancements, estimated time, and resources needed to achieve TSA's Tier IV end state.

Response: Concur. TSA/OSC has initiated an effort to complete a technology roadmap that:

* Forecasts technology progression through detection tiers;

* Estimates cost to mature; and

* Results in a timeline/roadmap with supporting narrative.

A variety of experts will be engaged from academia, S&T, and national laboratories to gain insights on technology limitations, possible future concepts, and potential timelines for achieving advances in capability. The output of this effort will be a comprehensive roadmap that will include a narrative documenting historical progression and supporting future maturation forecasts for AIT. ECD: September 30, 2014.

Again, thank you for the opportunity to review and comment on this draft report. Technical comments were previously provided under separate cover. Please feel free to contact me if you have any questions. We look forward to working with you in the future.

Sincerely,

Signed by: Jim H. Crumpacker: Director: Departmental GAO-OIG Liaison Office:

[End of section]

Appendix IV: GAO Contact and Staff Acknowledgments:

GAO Contact: Stephen M. Lord at (202) 512-4379 or at lords@gao.gov.

Staff Acknowledgments: In addition to the contact named above, David Bruno, Assistant Director; David Alexander; Carl Barden; Carissa Bryant; Susan Czachor; Emily Gunn; Tom Lombardi; Lara Miklozek; Tim Persons; Doug Sloane; and Jeff Tessin made key contributions to this report.

[End of section]

Footnotes:

[1] IOs examined passenger images in remotely located rooms to determine whether an anomaly was present on a passenger. IOs communicated to screening officers, who are typically Transportation Security Officers (TSO), whether additional passenger screening was necessary, but IOs did not have the ability to view the actual person represented by the image.

[2] Risk-based screening measures include expedited screening procedures for children 12 and younger and adults 75 and older, among other things.
TSA's Pre✓™ is a pre-screening, trusted-traveler initiative that makes risk assessments on passengers who voluntarily participate and, based on the outcome of the risk assessments, enables eligible participants to undergo expedited screening.

[3] See Pub. L. No. 112-95, § 826, 126 Stat. 11, 132-33 (2012) (codified at 49 U.S.C. § 44901(l)). Further, in February 2012, TSA issued a request for vendors to provide a second generation of AIT systems, referred to as AIT-2, which would be required to have ATR software, among other things, as discussed later in this report.

[4] See 49 U.S.C. § 44901(l)(3) (authorizing TSA to issue one or more extensions, each lasting no longer than 1 year, if TSA determined that AIT-ATR systems were not substantially as effective at screening passengers as the previous systems or if additional testing of such software was necessary).

[5] According to technology roadmapping guidance developed by Sandia National Laboratories, a technology roadmap documents the critical system requirements, the product and process performance targets, and the technology alternatives and milestones for meeting those targets. In effect, a technology roadmap identifies alternate technology "roads" for meeting certain performance objectives. DHS's investment review board includes acquisition officials that review acquisition programs for proper management, oversight, accountability, and alignment with the department's strategic functions.

[6] For purposes of this report, unless otherwise noted, references to TSOs include any nonfederal screeners employed by a private company under contract to TSA to provide screening services at an airport participating in TSA's Screening Partnership Program. See 49 U.S.C. § 44920.

[7] TSA, Operations Directive Improvised Explosive Device Screening Checkpoint Drills (400-50-1-12A) (Washington, D.C.: Nov. 15, 2012). GAO, Standards for Internal Control in the Federal Government, [hyperlink, http://www.gao.gov/products/GAO/AIMD-00-21.3.1] (Washington, D.C.: Nov. 1, 1999).

[8] TSL, a component of DHS's Science and Technology Directorate, is responsible for conducting research, development, testing, and evaluation of technologies for the department.

[9] GAO, Homeland Security: DHS Requires More Disciplined Investment Management to Help Meet Mission Needs, [hyperlink, http://www.gao.gov/products/GAO-12-833] (Washington, D.C.: Sept. 18, 2012). We identified key acquisition management practices by reviewing 17 prior GAO reports examining DHS, the Department of Defense, the National Aeronautics and Space Administration, and private sector organizations.

[10] For examples of statistical methods for data collection and testing design, see Sharon Lohr, Sampling: Design and Analysis, Duxbury Press, New York, New York, 1999.

[11] [hyperlink, http://www.gao.gov/products/GAO-12-833]. DHS, Acquisition Instruction/Guidebook 102-01-001, Version 1.9 (Nov. 7, 2008). TSL, Management Plan for the Qualification Test and Evaluation of Advanced Imaging Technology (AIT) with Automated Target Recognition (ATR) (Aug. 12, 2010).

[12] We selected these airports based on AIT-ATR system deployment and airport category, which corresponds to TSA's classification of airports into one of five security risk categories. We interviewed a total of 46 TSA personnel, selected by airport officials, who operate AIT-ATR systems to obtain their views on system performance.
We also interviewed six of TSA's Transportation Security Specialists for Explosives to discuss airport IED checkpoint drills and SO performance at resolving anomalies identified by AIT-ATR systems. Transportation Security Specialists for Explosives are responsible for conducting operational checkpoint drills to train TSOs on resolving alarms and better adhering to TSA's SOPs, as well as to serve as liaisons with local law enforcement. SOPs establish processes for implementing security measures.

[13] Department of Energy, Fundamentals of Technology Roadmapping (Washington, D.C.: April 1997). As described in this report, technology roadmapping is an effective tool for technology planning and coordination, which fits within a broader set of planning activities and provides information to make better technology investment decisions by identifying critical technologies and technology gaps and identifying ways to leverage research and development investments.

[14] The National Academy of Sciences identified experts by conducting searches of internal databases and external websites, obtaining recommendations from members of the National Academies' National Materials and Manufacturing Board and the Computer Sciences and Telecommunications Board, and using connections with experts who assisted on prior studies conducted by the National Academy of Sciences.

[15] (1) GAO, Best Practices: An Integrated Portfolio Management Approach to Weapon System Investments Could Improve DOD's Acquisition Outcomes, [hyperlink, http://www.gao.gov/products/GAO-07-388] (Washington, D.C.: Mar. 30, 2007). In this report, we identified best practices by reviewing related professional and academic publications and interviewing knowledgeable officials from five successful commercial companies. While the products developed by these companies range from heavy construction equipment and high-end electronics to pharmaceuticals and household items, each of the companies manages a large, diversified portfolio of products; spends billions of dollars annually on research and development; and has thousands of employees worldwide. Therefore, we concluded that these best practices could be applied to other large-scale acquisitions. (2) GAO, Results-Oriented Government: Practices That Can Help Enhance and Sustain Collaboration among Federal Agencies, [hyperlink, http://www.gao.gov/products/GAO-06-15] (Washington, D.C.: Oct. 25, 2005). In this report, we reviewed academic literature and prior GAO and Congressional Research Service reports, and interviewed experts in coordination and collaboration whom we identified from federal and state organizations, to derive a set of practices that we believe can help enhance and sustain federal agency collaborative efforts and that are consistent with results-oriented performance management and agency requirements under the Government Performance and Results Act of 1993.

[16] See Pub. L. No. 107-71, 115 Stat. 597 (2001). For purposes of this report, "commercial passenger aircraft" refers to a U.S.- or foreign-flagged air carrier operating under TSA-approved security programs with regularly scheduled passenger operations to or from a U.S. airport. See 49 C.F.R. pts. 1544-46.

[17] See Pub. L. No. 107-296, tit. III, 116 Stat. 2135, 2163-77 (2002) (codified as amended at 6 U.S.C. §§ 181-195c).

[18] TSA, Operations Directive Improvised Explosive Device Screening Checkpoint Drills (400-50-1-12A) (Washington, D.C.: Nov. 15, 2012).
[19] Screening personnel include TSOs, behavior detection officers, and other personnel responsible for the performance of screening functions.

[20] See 49 C.F.R. § 1540.5.

[21] The divestiture officer is responsible for instructing passengers on what items to remove before entering the AIT, directing passengers through the AIT, and directing passengers who "opt out" of AIT-ATR screening to a TSO of the same gender to conduct a standard pat-down. The SO is responsible for operating the AIT-ATR system, informing passengers of the appropriate stance, observing passengers as they are screened, and conducting either a visual inspection or a standard pat-down of anomaly locations identified by the AIT-ATR system.

[22] TSA's Office of Security Operations Executive Scorecard contains performance measures that federal security directors use to assess and track airport performance against stated goals. Among other responsibilities, federal security directors manage TSA personnel at airports.

[23] TSOs are to conduct secondary screening when the AIT-ATR system detects an anomaly and indicates the location of the anomaly with a box on its generic image of a passenger's body. TSOs are to pat down that body location to resolve the anomaly.

[24] TSA defined the false alarm rate as the measured fraction of test cases in which an alarm was indicated for the presence of an undivested object when none was actually present.

[25] The 95 percent confidence interval represents the range over which the AIT-ATR system's performance would have varied 95 percent of the time in additional tests. For an example of statistical methods of data collection and testing design, see Lohr, Sampling: Design and Analysis.

[26] BMI is a number calculated from a person's weight and height and is considered a fairly reliable indicator of body fatness by the Centers for Disease Control and Prevention. BMI categories include underweight (BMI below 18.5), normal (BMI from 18.5 to 24.9), overweight (BMI from 25.0 to 29.9), and obese (BMI of 30.0 or above).

[27] Key performance requirements, technically referred to as key performance parameters, are system characteristics that are considered critical or essential. Failure to meet a key performance parameter could be the basis for rejecting a system solution.

[28] [hyperlink, http://www.gao.gov/products/GAO/AIMD-00-21.3.1].

[29] We identified eight key practice areas for program management of major acquisitions, which include demonstrating technology, design, and manufacturing maturity to ensure that technology works prior to deployment. Specifically, prior to the start of system development, critical technologies should be demonstrated to work in their intended environment. Likewise, prior to a production decision and deployment, a fully integrated, capable prototype should demonstrate that the system will work as intended in a reliable manner. We identified key acquisition management practices by reviewing 17 prior GAO reports examining DHS, the Department of Defense, the National Aeronautics and Space Administration, and private sector organizations. See [hyperlink, http://www.gao.gov/products/GAO-12-833].

[30] TSL, Management Plan for the Qualification Test and Evaluation of Advanced Imaging Technology (AIT) with Automated Target Recognition (ATR) (Aug. 12, 2010).

[31] See 49 U.S.C. § 44901(l). On March 26, 2013, TSA published a Notice of Proposed Rulemaking in the Federal Register soliciting public comment on the use of AIT as a primary means for screening passengers. See 78 Fed. Reg.
18,287 (Mar. 26, 2013). The public comment period closed on June 24, 2013. In meeting the requirement to upgrade all AIT systems with ATR software, TSA terminated its contract with one AIT system vendor and removed all of these systems from airports because the vendor was unable to develop ATR software by the June 2013 deadline.

[32] These were low-rate initial production contract awards, which allowed TSA to purchase AIT-2 systems for testing, based on vendor proposals stating that the vendors can meet the requirements, without a guarantee that TSA will purchase future systems.

[33] See, for example, GAO, Homeland Security: DHS and TSA Continue to Face Challenges Developing and Acquiring Screening Technologies, [hyperlink, http://www.gao.gov/products/GAO-13-469T] (Washington, D.C.: May 8, 2013). TSA has also faced challenges in developing and meeting program requirements for explosives detection systems used to screen checked baggage.

[34] GAO, Best Practices: Using a Knowledge-Based Approach to Improve Weapon Acquisition, [hyperlink, http://www.gao.gov/products/GAO-04-386SP] (Washington, D.C.: January 2004). In this report, we identified acquisition best practices by reviewing 13 prior GAO reports on major acquisition systems. We determined that these best practices could be applied to other large-scale acquisitions.

[35] GAO, Best Practices: An Integrated Portfolio Management Approach to Weapon System Investments Could Improve DOD's Acquisition Outcomes, [hyperlink, http://www.gao.gov/products/GAO-07-388] (Washington, D.C.: Mar. 30, 2007). In this report, we identified best practices by reviewing related professional and academic publications and interviewing knowledgeable officials from five successful commercial companies. While the products developed by these companies range from heavy construction equipment and high-end electronics to pharmaceuticals and household items, each of the companies manages a large, diversified portfolio of products; spends billions of dollars annually on research and development; and has thousands of employees worldwide. Therefore, we concluded that these best practices could be applied to other large-scale acquisitions.

[36] According to technology roadmapping guidance developed by Sandia National Laboratories, a technology roadmap documents the critical system requirements, the product and process performance targets, and the technology alternatives and milestones for meeting those targets. In effect, a technology roadmap identifies alternative technology "roads" for meeting certain performance objectives. Department of Energy, Fundamentals of Technology Roadmapping (Washington, D.C.: April 1997).

[37] The AIT vendor's plan is based on the assumption that its AIT-ATR system has passed the Tier II requirements, because the Tier II ATR software is the baseline for the Tier III development effort.

[38] See 6 U.S.C. § 182.

[39] GAO, Results-Oriented Government: Practices That Can Help Enhance and Sustain Collaboration among Federal Agencies, [hyperlink, http://www.gao.gov/products/GAO-06-15] (Washington, D.C.: Oct. 25, 2005).
In this report, we reviewed academic literature and prior GAO and Congressional Research Service reports, and interviewed experts in coordination and collaboration whom we identified from federal and state organizations, to derive a set of practices that we believe can help enhance and sustain federal agency collaborative efforts and that are consistent with results-oriented performance management and agency requirements under the Government Performance and Results Act of 1993.

[40] Pub. L. No. 111-352, 124 Stat. 3866 (2011). GAO, Streamlining Government: Key Practices from Select Efficiency Initiatives Should Be Shared Governmentwide, [hyperlink, http://www.gao.gov/products/GAO-11-908] (Washington, D.C.: Sept. 25, 2011). In this report, we selected federal initiatives that were being implemented departmentwide; involved reexamining federal programs and their related processes or structures, or streamlining or consolidating existing processes to become more efficient; and were identified by the Office of Management and Budget (OMB) or government management experts as having potentially promising practices that may be adapted by other federal agencies.

[41] Department of Homeland Security, Office of Inspector General, Transportation Security Administration's Deployment and Use of Advanced Imaging Technology, OIG-13-120 (September 2013).

[42] Backscatter X-ray technology uses a low-level X-ray to produce an X-ray image, while millimeter-wave technology beams millimeter-wave radio-frequency energy over the body's surface to produce a three-dimensional image. Since the backscatter vendor was unable to develop Automated Target Recognition (ATR) software by the June 2013 statutory deadline, as extended by TSA, to upgrade all deployed AIT systems with the software, TSA terminated its contract with this vendor and removed all of these systems from airports in order to meet the requirement.

[43] TSA, Operations Directive Improvised Explosive Device Screening Checkpoint Drills (400-50-1-12A) (Washington, D.C.: Nov. 15, 2012).

[44] For purposes of this report, unless otherwise noted, references to TSOs include any nonfederal screeners employed by a private company under contract to TSA to provide screening services at an airport participating in TSA's Screening Partnership Program. See 49 U.S.C. § 44920.

[45] GAO, Standards for Internal Control in the Federal Government, [hyperlink, http://www.gao.gov/products/GAO/AIMD-00-21.3.1] (Washington, D.C.: Nov. 1, 1999).

[46] TSL, a component of DHS's Science and Technology Directorate, is responsible for conducting research, development, testing, and evaluation of technologies for the department.

[47] For examples of statistical methods for data collection and testing design, see Sharon Lohr, Sampling: Design and Analysis, Duxbury Press, New York, New York, 1999.

[48] GAO, Homeland Security: DHS Requires More Disciplined Investment Management to Help Meet Mission Needs, [hyperlink, http://www.gao.gov/products/GAO-12-833] (Washington, D.C.: Sept. 18, 2012). We identified key acquisition management practices by reviewing 17 prior GAO reports examining DHS, the Department of Defense, the National Aeronautics and Space Administration, and private sector organizations.

[49] For examples of statistical methods for data collection and testing design, see Sharon Lohr, Sampling: Design and Analysis.
[50] [hyperlink, http://www.gao.gov/products/GAO-12-833]. DHS, Acquisition Instruction/Guidebook 102-01-001, Version 1.9 (Nov. 7, 2008); TSL, Management Plan for the Qualification Test and Evaluation of Advanced Imaging Technology (AIT) with Automated Target Recognition (ATR) (Aug. 12, 2010).

[51] Transportation Security Specialists for Explosives are responsible for conducting operational checkpoint drills to train TSOs on resolving alarms and better adhering to TSA's SOPs, as well as to serve as liaisons with local law enforcement. SOPs establish processes for implementing security measures.

[52] TSA classifies airports into one of five security risk categories.

[53] Department of Energy, Fundamentals of Technology Roadmapping (Washington, D.C.: April 1997). As described in this report, technology roadmapping is an effective tool for technology planning and coordination, which fits within a broader set of planning activities and provides information to make better technology investment decisions by identifying critical technologies and technology gaps and identifying ways to leverage research and development investments.

[54] (1) GAO, Best Practices: An Integrated Portfolio Management Approach to Weapon System Investments Could Improve DOD's Acquisition Outcomes, [hyperlink, http://www.gao.gov/products/GAO-07-388] (Washington, D.C.: Mar. 30, 2007). In this report, we identified best practices by reviewing related professional and academic publications and interviewing knowledgeable officials from five successful commercial companies. While the products developed by these companies range from heavy construction equipment and high-end electronics to pharmaceuticals and household items, each of the companies manages a large, diversified portfolio of products; spends billions of dollars annually on research and development; and has thousands of employees worldwide. Therefore, we concluded that these best practices could be applied to other large-scale acquisitions. (2) GAO, Results-Oriented Government: Practices That Can Help Enhance and Sustain Collaboration among Federal Agencies, [hyperlink, http://www.gao.gov/products/GAO-06-15] (Washington, D.C.: Oct. 25, 2005). In this report, we reviewed academic literature and prior GAO and Congressional Research Service reports, and interviewed experts in coordination and collaboration whom we identified from federal and state organizations, to derive a set of practices that we believe can help enhance and sustain federal agency collaborative efforts and that are consistent with results-oriented performance management and agency requirements under the Government Performance and Results Act of 1993.

[55] The National Academy of Sciences identified those experts by conducting searches of internal databases and external websites, obtaining recommendations from members of the National Academies' National Materials and Manufacturing Board and the Computer Sciences and Telecommunications Board, and using connections with experts who assisted on prior studies conducted by the National Academy of Sciences.

[End of section]

GAO's Mission:

The Government Accountability Office, the audit, evaluation, and investigative arm of Congress, exists to support Congress in meeting its constitutional responsibilities and to help improve the performance and accountability of the federal government for the American people.
GAO examines the use of public funds; evaluates federal programs and policies; and provides analyses, recommendations, and other assistance to help Congress make informed oversight, policy, and funding decisions. GAO's commitment to good government is reflected in its core values of accountability, integrity, and reliability.

Obtaining Copies of GAO Reports and Testimony:

The fastest and easiest way to obtain copies of GAO documents at no cost is through GAO's website [hyperlink, http://www.gao.gov]. Each weekday afternoon, GAO posts on its website newly released reports, testimony, and correspondence. To have GAO e-mail you a list of newly posted products, go to [hyperlink, http://www.gao.gov] and select "E-mail Updates."

Order by Phone:

The price of each GAO publication reflects GAO's actual cost of production and distribution and depends on the number of pages in the publication and whether the publication is printed in color or black and white. Pricing and ordering information is posted on GAO's website, [hyperlink, http://www.gao.gov/ordering.htm]. Place orders by calling (202) 512-6000, toll free (866) 801-7077, or TDD (202) 512-2537. Orders may be paid for using American Express, Discover Card, MasterCard, Visa, check, or money order. Call for additional information.

Connect with GAO:

Connect with GAO on Facebook, Flickr, Twitter, and YouTube. Subscribe to our RSS Feeds or E-mail Updates. Listen to our Podcasts. Visit GAO on the web at [hyperlink, http://www.gao.gov].

To Report Fraud, Waste, and Abuse in Federal Programs:

Contact: Website: [hyperlink, http://www.gao.gov/fraudnet/fraudnet.htm]; E-mail: fraudnet@gao.gov; Automated answering system: (800) 424-5454 or (202) 512-7470.

Congressional Relations:

Katherine Siggerud, Managing Director, siggerudk@gao.gov, (202) 512-4400: U.S. Government Accountability Office: 441 G Street NW, Room 7125: Washington, DC 20548.

Public Affairs:

Chuck Young, Managing Director, youngc1@gao.gov, (202) 512-4800: U.S. Government Accountability Office: 441 G Street NW, Room 7149: Washington, DC 20548.

[End of document]