
TSA Modernization: Use of Sound Program Management and Oversight Practices Is Needed to Avoid Repeating Past Problems

GAO-18-46 Published: Oct 17, 2017. Publicly Released: Oct 17, 2017.

Highlights

What GAO Found

The Transportation Security Administration's (TSA) new strategy for the Technology Infrastructure Modernization (TIM) program includes using Agile software development, but the program fully implemented only two of six leading practices necessary to ensure successful Agile adoption. Specifically, Department of Homeland Security (DHS) and TSA leadership fully committed to adopting Agile, and TSA provided Agile training. Nonetheless, the program had not defined key roles and responsibilities, prioritized system requirements, or implemented automated capabilities that are essential to effective Agile adoption. Until TSA adheres to all leading practices for Agile implementation, the program puts at risk its ability to deliver a quality system that strengthens and enhances the sophistication of TSA's security threat assessments and credentialing programs.

TSA and DHS fully implemented one of the key practices for overseeing the TIM program by establishing a process for ensuring corrective actions are identified and tracked. However, TSA and DHS did not fully implement the remaining three key practices, which impedes the effectiveness of their oversight. Specifically,

  • TSA and DHS documented selected policies and procedures for governance and oversight of the TIM program, but they did not develop or finalize other key oversight and governance documents. For example, TSA officials developed a risk management plan tailored for Agile; however, they did not update the TIM system life-cycle plan to reflect the Agile governance framework they were using.
  • The TIM program management office conducted frequent performance reviews, but did not establish thresholds or targets for oversight bodies to use to ensure that the program was meeting acceptable levels of performance. In addition, department-level oversight bodies have focused on reviewing selected program life-cycle metrics for the TIM program; however, they did not measure the program against the rebaselined cost, or important Agile release-level metrics.
  • TIM's reported performance data were not always complete and accurate. For example, program officials reported that they were testing every line of code, even though they were unable to confirm that they were actually doing so, thus calling into question the accuracy of the data reported.

These gaps in oversight and governance of the TIM program were due to, among other things, TSA officials not updating key program management documentation and DHS leadership not obtaining consensus on needed oversight and governance changes related to Agile programs. Given that TIM is a historically troubled program and is at least 6 months behind its rebaselined schedule, it is especially concerning that TSA and DHS have not fully implemented oversight and governance practices for this program. Until TSA and DHS fully implement these practices to ensure the TIM program meets its cost, schedule, and performance targets, the program is at risk of repeating past mistakes and not delivering the capabilities that were initiated 9 years ago to protect the nation's transportation infrastructure.

Why GAO Did This Study

TSA conducts security threat assessment screening and credentialing activities for millions of workers and travelers in the maritime, surface, and aviation transportation industries who are seeking access to transportation systems. In 2008, TSA initiated the TIM program to enhance the sophistication of its security threat assessments and to improve the capacity of its supporting systems. However, the program experienced significant cost overruns, schedule delays, and performance issues, and was suspended in January 2015 while TSA established a new strategy. The program was rebaselined in September 2016 and is estimated to cost approximately $1.27 billion and be fully operational by 2021 (about $639 million more and 6 years later than originally planned).

GAO was asked to review the TIM program's new strategy. This report determined, among other things, the extent to which (1) TSA implemented selected key practices for transitioning to Agile software development for the program; and (2) TSA and DHS are effectively overseeing the program's cost, schedule, and performance. GAO compared program documentation to key practices identified by the Software Engineering Institute and the Office of Management and Budget as being critical to transitioning to Agile and to overseeing and governing programs.

Recommendations

GAO is making 14 recommendations, including that DHS should prioritize requirements and obtain leadership consensus on oversight and governance changes. DHS concurred with all 14 recommendations.

Recommendations for Executive Action

Agency Affected Recommendation Status
Department of Homeland Security The TSA Administrator should ensure that the TIM program management office establishes and implements specific time frames for determining key strategic implementation details, including how the program will transition from the current state to the final TIM state. (Recommendation 1)
Closed – Implemented
In November 2017, TSA updated its implementation plans for the TIM program, including roadmaps that clearly showed the key steps needed to transition from the current state to the final state, such as when each user group would be transitioned to the new system and when each legacy system would be decommissioned. As a result, TSA and DHS oversight bodies have better assurance that the program will ultimately deliver its complete solution.
Department of Homeland Security The TSA Administrator should ensure that the TIM program management office establishes a schedule that provides planned completion dates based on realistic estimates of how long it will take to deliver capabilities. (Recommendation 2)
Closed – Implemented
In November 2017, TSA updated its schedule for the TIM program with more realistic completion dates and the program consistently met these planned dates, such as deploying initial PreCheck capabilities by November 2017 and fully deploying all remaining PreCheck capabilities by June 2018. As a result, TSA has better assurance that the TIM program meets the dates it has committed to.
Department of Homeland Security The TSA Administrator should ensure that the TIM program management office establishes new time frames for implementing the actions identified in the organizational change management strategy and effectively executes against these time frames. (Recommendation 3)
Closed – Implemented
In January 2019, although TSA had not implemented all actions in the organizational change management strategy, it had addressed the underlying concern behind our recommendation: effectively coordinating and communicating with TIM's end users. The program had completed several Agile software releases that iteratively incorporated end-user input and worked to enhance communication with end users. As a result, TSA has greater assurance that it is communicating effectively with stakeholders, and TIM's end users have indicated they are satisfied with the significant progress made with the system.
Department of Homeland Security The TSA Administrator should ensure that the TIM program management office defines and documents the roles and responsibilities among product owners, the solution team, and any other relevant stakeholders for prioritizing and approving Agile software development work. (Recommendation 4)
Closed – Implemented
In response to our recommendation, TSA further clarified and documented the roles and responsibilities of key TIM program stakeholders in prioritizing and approving the program's Agile software development work. As a result, TSA has better assurance that the TIM program's key stakeholders can effectively establish priorities, approve user stories, and decide whether completed work meets the acceptance criteria for the TIM program.
Department of Homeland Security The TSA Administrator should ensure that the TIM program management office establishes specific prioritization levels for current and future features and user stories. (Recommendation 5)
Closed – Implemented
In response to our recommendation, TSA consistently prioritized TIM program requirements (features and user stories) with its end users from October 2017 through September 2018. As a result, the program was able to deliver functionality that aligned with the highest needs of its end users.
Department of Homeland Security The TSA Administrator should ensure that the TIM program management office implements automated Agile management testing and deployment tools, as soon as possible. (Recommendation 6)
Closed – Implemented
In November 2018, TSA demonstrated that it had implemented all 19 of the software testing and deployment tools planned for the TIM program. As a result, TSA has better assurance that the TIM program can test and deploy software at increased efficiency levels and better ensure product quality.
Department of Homeland Security The TSA Administrator should ensure that the TIM program management office updates the Systems Engineering Life Cycle Tailoring Plan to reflect the current governance framework and milestone review processes. (Recommendation 7)
Closed – Implemented
In November 2018, TSA officials provided an updated version of the TIM program's Systems Engineering Life Cycle Tailoring Plan, which reflected the Agile development governance approach the program had been using. As a result, TSA has established a clearly documented and repeatable governance process to effectively oversee the program.
Department of Homeland Security The TSA Administrator should ensure that the TIM program management office establishes thresholds or targets for acceptable performance levels. (Recommendation 8)
Closed – Implemented
In response to our recommendation, TSA began measuring and monitoring TIM's performance against DHS' core Agile metrics and key performance parameters, which establish thresholds for acceptable performance levels. From April 2017 through September 2018, TIM largely met or exceeded these thresholds. As a result, TSA has established mechanisms to help ensure the TIM program is meeting acceptable performance levels.
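Threshold-based monitoring of this kind can be illustrated with a short sketch. All metric names and threshold values below are hypothetical examples, not DHS's actual core Agile metrics or key performance parameters:

```python
# Illustrative sketch: checking a release's Agile metrics against
# acceptable-performance thresholds (all names/values hypothetical).

# Hypothetical "minimum acceptable" floors for higher-is-better metrics.
THRESHOLDS = {
    "sprint_commitment_met_pct": 85.0,  # % of committed points delivered
    "unit_test_coverage_pct": 80.0,     # % of code covered by unit tests
}

def evaluate_release(metrics: dict) -> list:
    """Return (metric, value, threshold, meets_threshold) findings."""
    findings = []
    for name, floor in THRESHOLDS.items():
        value = metrics.get(name)
        ok = value is not None and value >= floor
        findings.append((name, value, floor, ok))
    return findings

# Hypothetical release data: one metric passes, one falls short.
release = {"sprint_commitment_met_pct": 92.0, "unit_test_coverage_pct": 74.5}
for name, value, floor, ok in evaluate_release(release):
    status = "meets" if ok else "BELOW"
    print(f"{name}: {value} ({status} threshold {floor})")
```

The point of the sketch is the oversight mechanism itself: without explicit floors like these, a program can report metrics without any basis for deciding whether performance is acceptable.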
Department of Homeland Security The TSA Administrator should ensure that the TIM program management office begins collecting and reporting on Agile-related cost metrics. (Recommendation 9)
Closed – Implemented
In September 2018, TSA began collecting and reporting on the TIM program's Agile development cost performance to DHS, as part of the department's oversight of the program. As a result, TSA has better assurance that oversight bodies have complete information by which to monitor TIM program costs.
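One common Agile cost metric that such reporting can include is cost per story point delivered. The figures in this sketch are hypothetical and are not TIM program data:

```python
# Illustrative sketch: cost per story point delivered, a simple
# Agile cost metric (hypothetical figures, not TIM program data).

def cost_per_point(release_cost_usd: float, points_delivered: int) -> float:
    """Average cost of delivering one accepted story point in a release."""
    if points_delivered <= 0:
        raise ValueError("release must have accepted story points")
    return release_cost_usd / points_delivered

# Hypothetical release: $450,000 spent, 300 story points accepted.
print(cost_per_point(450_000, 300))  # 1500.0
```

Tracked release over release, a rising cost per point can give oversight bodies an early signal that delivery efficiency is degrading, before schedule slips become visible.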
Department of Homeland Security The TSA Administrator should ensure that the TIM program management office ensures that program velocity is measured and reported consistently. (Recommendation 10)
Closed – Implemented
In response to our recommendation, TSA provided several performance reports, ranging from June through November 2017, that demonstrated consistent reporting of the TIM program's velocity. As a result, TSA has better assurance that it is using accurate program velocity data to more reliably forecast the TIM program's ability to meet its cost, schedule, and performance targets.
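Consistent velocity reporting depends on applying one fixed definition of completed work across sprints. A minimal sketch, using hypothetical sprint data rather than TIM figures:

```python
# Illustrative sketch: consistent velocity measurement
# (hypothetical sprint data; one fixed definition of "done").

def rolling_velocity(completed_points: list, window: int = 3) -> float:
    """Average story points accepted per sprint over the most recent
    `window` sprints, usable to forecast future sprint capacity."""
    if not completed_points:
        raise ValueError("need at least one completed sprint")
    recent = completed_points[-window:]
    return sum(recent) / len(recent)

# Hypothetical: points accepted as done in each of the last five sprints.
done_points = [21, 18, 25, 22, 24]
print(rolling_velocity(done_points))  # mean of the last three sprints
```

If "done" were counted differently in different sprints (e.g., sometimes including untested work), the resulting velocities would not be comparable, which is why GAO emphasized consistent measurement and reporting.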
Department of Homeland Security The TSA Administrator should ensure that the TIM program management office ensures that unit test coverage for software releases is measured and reported accurately. (Recommendation 11)
Closed – Implemented
In October 2018, TSA demonstrated that the TIM program was using an automated software tool to measure and report on the percentage of unit test code coverage for software releases. As a result, TSA has more accurate data on how much of the program's code is getting tested to help ensure product quality.
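The coverage percentage such a tool reports can be sketched in a few lines. This is a simplification for illustration; real tools such as coverage.py instrument the code under test to record which lines actually execute:

```python
# Illustrative sketch: how an automated tool derives a unit test
# line-coverage percentage (simplified; real tools instrument the
# code under test to record executed lines).

def coverage_pct(executable_lines: set, executed_lines: set) -> float:
    """Percent of executable lines exercised by the test suite."""
    if not executable_lines:
        return 100.0
    covered = executable_lines & executed_lines
    return 100.0 * len(covered) / len(executable_lines)

# Hypothetical module: 10 executable lines, tests exercise 8 of them.
module_lines = set(range(1, 11))
hit_lines = {1, 2, 3, 4, 5, 6, 8, 9}
print(f"{coverage_pct(module_lines, hit_lines):.1f}% line coverage")  # 80.0%
```

Measuring coverage this way, from recorded execution rather than self-reported claims, is what lets a program substantiate statements about how much of its code is being tested.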
Department of Homeland Security The Secretary of Homeland Security should direct the Under Secretary for Management to ensure that appropriate DHS leadership reaches consensus on needed oversight and governance changes related to the frequency of reviewing Agile programs, and then documents and implements associated changes. (Recommendation 12)
Closed – Implemented
In February 2020, DHS revised its governance and oversight procedures to reflect more frequent reviews of Agile development programs that are aligned with their development release cycles. As a result, DHS has increased assurance that its Agile programs will plan to undergo more frequent oversight reviews.
Department of Homeland Security The Secretary of Homeland Security should direct the Under Secretary for Management to ensure that the Office of the Chief Technology Officer completes guidance for Agile programs to use for collecting and reporting on performance metrics. (Recommendation 13)
Closed – Implemented
In October 2017, DHS provided its completed guidance, which included recommended practices for collecting and reporting on Agile performance metrics, as well as a set of core Agile performance metrics that programs should report to the department. As a result, DHS has better assurance that Agile development programs will report informative performance metrics to oversight entities so that they can ensure the programs are effectively delivering their intended capabilities and outcomes.
Department of Homeland Security The Secretary of Homeland Security should direct the Under Secretary for Management to ensure that DHS-level oversight bodies review key Agile performance and cost metrics for the TIM program and use them to inform management oversight decisions. (Recommendation 14)
Closed – Implemented
In response to our recommendation, DHS oversight bodies began reviewing key Agile performance and cost metrics for TSA's Technology Infrastructure Modernization Program and used them to inform management oversight decisions. As a result, DHS has better assurance that its oversight bodies could identify early indicators of any issues with the program and call for course correction, as needed.


Topics

Best practices, Data integrity, Documentation, Homeland security, Performance measures, Corrective action, Prioritizing, Policies and procedures, Program management, Program evaluation, Requirements definition, Risk management, Security threat assessments, Software, Systems development life cycle, Testing, Transportation security, Cost and schedule performance