
Homeland Security: DHS Could Strengthen Acquisitions and Development of New Technologies

GAO-11-829T Published: Jul 15, 2011. Publicly Released: Jul 15, 2011.

Highlights

This testimony discusses our past work examining the Department of Homeland Security's (DHS) progress and challenges in developing and acquiring new technologies to address homeland security needs. DHS acquisition programs represent hundreds of billions of dollars in life-cycle costs and support a wide range of missions and investments, including border surveillance and screening equipment, nuclear detection equipment, and technologies used to screen airline passengers and baggage for explosives. Since its creation in 2003, DHS has spent billions of dollars developing and procuring technologies and other countermeasures to address various threats and to conduct its missions. Within DHS, the Science and Technology Directorate (S&T) conducts general research and development and oversees the testing and evaluation efforts of DHS components, which are responsible for developing, testing, and acquiring their own technologies. This testimony focuses on the findings of our prior work related to DHS's efforts to acquire and deploy new technologies. Our past work has identified three key challenges: (1) developing technology program requirements, (2) conducting and completing testing and evaluation of technologies, and (3) incorporating information on costs and benefits in making technology acquisition decisions. This statement will also discuss recent DHS efforts to strengthen its investment and acquisition processes.

We have identified technologies that DHS has deployed that have not met key performance requirements. For example, in June 2010, we reported that over half of the 15 DHS programs we reviewed awarded contracts to initiate acquisition activities without component or department approval of documents essential to planning acquisitions, setting operational requirements, and establishing acquisition program baselines.

Our prior work has also shown that failure to resolve problems discovered during testing can lead to costly redesign and rework later, and that addressing such problems during the testing and evaluation phase, before moving to the acquisition phase, can help agencies avoid future cost overruns. Specifically:

(1) In March 2011, we reported that the independent testing and evaluation of SBInet's Block 1 capability to determine its operational effectiveness and suitability was not complete at the time DHS reached its decision regarding the future of SBInet or requested fiscal year 2012 funding to deploy the new Alternative (Southwest) Border Technology.

(2) In September 2010, we reported that S&T's plans for conducting operational testing of container security technologies did not reflect all of the operational scenarios that CBP was considering for implementation.

(3) In October 2009, we reported that TSA deployed explosives trace portals, a technology for detecting traces of explosives on passengers at airport checkpoints, even though TSA officials were aware that tests conducted during 2004 and 2005 on earlier models of the portals suggested the portals did not perform reliably in an airport environment. TSA also lacked assurance that the portals would meet functional requirements in airports within estimated costs, and the machines were more expensive to install and maintain than expected. In June 2006, TSA halted deployment of the explosives trace portals because of performance problems and high installation costs.

Our prior work has shown that cost-benefit analyses help congressional and agency decision makers assess and prioritize resource investments and consider potentially more cost-effective alternatives; without this information, agencies are at risk of cost overruns, missed deadlines, and performance shortfalls. However, we have reported that DHS has not consistently included these analyses in its acquisition decision making. Specifically:

(1) In March 2011, we reported that the Secretary of Homeland Security's decision to end the SBInet program was informed by, among other things, an independent analysis of cost-effectiveness. However, it was not clear how DHS used the results to determine the appropriate technology plans and budget decisions, especially since the results of SBInet's operational effectiveness testing were not complete at the time of the Secretary's decision. Furthermore, the cost analysis was limited in scope and did not consider all technology solutions because the first phase of the analysis had to be completed in 6 weeks.

(2) In October 2009, we reported that TSA had not yet completed a cost-benefit analysis to prioritize and fund its technology investments for screening passengers at airport checkpoints. One reason TSA had difficulty developing such an analysis was that it had not yet developed life-cycle cost estimates for its various screening technologies.

(3) In June 2009, we reported that DHS's cost analysis of the Advanced Spectroscopic Portal (ASP) program did not provide a sound analytical basis for DHS's decision to deploy the portals.


Topics

Baggage screening, Border security, Cost effectiveness analysis, Evaluation, Explosives detection systems, Federal procurement, Homeland security, Investments, Operational testing, Passenger screening, Procurement planning, Program management, Requirements definition, Research and development, Strategic planning, Technology, Testing