This is the accessible text file for GAO report number GAO-12-833 entitled 'Homeland Security: DHS Requires More Disciplined Investment Management to Help Meet Mission Needs' which was released on September 19, 2012. This text file was formatted by the U.S. Government Accountability Office (GAO) to be accessible to users with visual impairments, as part of a longer term project to improve GAO products' accessibility. Every attempt has been made to maintain the structural and data integrity of the original printed product. Accessibility features, such as text descriptions of tables, consecutively numbered footnotes placed at the end of the file, and the text of agency comment letters, are provided but may not exactly duplicate the presentation or format of the printed version. The portable document format (PDF) file is an exact electronic replica of the printed version. We welcome your feedback. Please E-mail your comments regarding the contents or accessibility features of this document to Webmaster@gao.gov. This is a work of the U.S. government and is not subject to copyright protection in the United States. It may be reproduced and distributed in its entirety without further permission from GAO. Because this work may contain copyrighted images or other material, permission from the copyright holder may be necessary if you wish to reproduce this material separately. United States Government Accountability Office: GAO: Report to Congressional Requesters: September 2012: Homeland Security: DHS Requires More Disciplined Investment Management to Help Meet Mission Needs: GAO-12-833: GAO Highlights: Highlights of GAO-12-833, a report to congressional requesters. Why GAO Did This Study: DHS invests extensively in major acquisition programs to develop new systems that help the department execute its many critical missions. In 2011, DHS reported to Congress that it planned to invest $167 billion in these major acquisition programs. 
We previously found that DHS had not managed its investments effectively, and its acquisition management activities have been on GAO's High-Risk List since 2005. This report addresses the extent to which (1) major DHS acquisition programs face key challenges; (2) DHS has policies and processes to effectively manage individual acquisition programs; (3) DHS has policies and processes to effectively manage its portfolio of acquisition programs as a whole; and (4) DHS has taken actions to address the high-risk acquisition management issues GAO has identified in previous reports. GAO surveyed all 77 major program offices DHS identified in 2011 (92 percent response rate), reviewed available documentation of acquisition decisions from November 2008 to April 2012, and interviewed officials at DHS headquarters and components.

What GAO Found:

Nearly all of the Department of Homeland Security (DHS) program managers GAO surveyed reported their programs had experienced significant challenges. Sixty-eight of the 71 respondents reported that they experienced funding instability, faced workforce shortfalls, or saw their planned capabilities change after initiation, and most survey respondents reported a combination of these challenges. DHS lacks the data needed to accurately measure program performance, but GAO was able to use survey results, information DHS provided to Congress, and an internal DHS review from March 2012 to identify 42 programs that experienced cost growth, schedule slips, or both. GAO gained insight into the magnitude of the cost growth for 16 of the 42 programs, whose combined costs increased from $19.7 billion in 2008 to $52.2 billion in 2011, an aggregate increase of 166 percent. DHS acquisition policy reflects many key program management practices that could help mitigate program risks. It requires programs to develop documents demonstrating critical knowledge that would help leaders make better-informed investment decisions when managing individual programs.
However, DHS has not consistently met these requirements. The department has verified that only four programs documented all of the critical knowledge the policy requires to proceed with acquisition activities. Officials explained that DHS's culture has emphasized the need to rapidly execute missions more than sound acquisition management practices. Most major programs lack reliable cost estimates, realistic schedules, and agreed-upon baseline objectives, limiting DHS leadership's ability to effectively manage those programs and provide information to Congress. DHS recognizes the need to implement its acquisition policy more consistently, but significant work remains. DHS acquisition policy does not fully reflect several key portfolio management practices, such as allocating resources strategically, and DHS has not yet re-established an oversight board to manage its investment portfolio across the department. As a result, DHS has largely made investment decisions on a program-by-program and component-by-component basis. The widespread risk of poorly understood cost growth, coupled with the fiscal challenges facing the federal government, makes it essential that DHS allocate resources to its major programs in a deliberate manner. DHS plans to develop stronger portfolio-management policies and processes, but until it does so, DHS programs are more likely to experience additional funding instability, which will increase the risk of further cost growth and schedule slips. These outcomes, combined with a tighter budget, could prevent DHS from developing needed capabilities. DHS has introduced seven initiatives that could improve acquisition management by addressing longstanding challenges GAO and DHS survey respondents have identified, such as funding instability and acquisition workforce shortfalls. Implementation plans are still being developed, and DHS is still working to address critical issues.
Because of this, it is too early to determine whether the DHS initiatives will be effective, as GAO has previously established that agencies must sustain progress over time to address management challenges. DHS is also pursuing a tiered-governance structure, but it must reduce risks and improve program outcomes before regularly delegating major milestone decision authority. What GAO Recommends: GAO recommends that DHS modify its policy to better reflect key program and portfolio management practices, ensure acquisition programs fully comply with DHS acquisition policy, prioritize major acquisition programs departmentwide and account for anticipated resource constraints, and document prerequisites for delegating major milestone decision authority. DHS concurred with all of GAO’s recommendations, and noted its progress on a number of fronts, which is accounted for in the report. View [hyperlink, http://www.gao.gov/products/GAO-12-833]. For more information, contact John Hutton at (202) 512-4841 or huttonj@gao.gov. 
[End of section]

Contents:

Letter:

Background:
Nearly All Program Managers Surveyed Reported Significant Challenges:
Programs Proceed without Meeting Sound DHS Acquisition Policy:
DHS Needs Policy and Process Enhancements to Effectively Manage Its Portfolio of Investments:
DHS Acquisition Management Initiatives Target Longstanding Challenges, but Key Implementation Issues Remain:
Conclusions:
Recommendations for Executive Action:
Agency Comments and Our Evaluation:

Appendix I: Objectives, Scope, and Methodology:
Appendix II: Key Acquisition Management Practices:
Appendix III: Program Office Survey Results:
Appendix IV: Major DHS Acquisition Programs and their Key Acquisition Documents:
Appendix V: Comments from the Department of Homeland Security:
Appendix VI: GAO Contact and Staff Acknowledgments:

Related GAO Products:

Tables:

Table 1: DHS Acquisition Levels for Major Acquisition Programs:
Table 2: Key DHS Acquisition Documents Requiring Department-level Approval:
Table 3: GAO Assessment of DHS's Acquisition Policy Compared to Key Program-management Practices:
Table 4: GAO Assessment of DHS's Acquisition Policy Compared to Key Portfolio-management Practices:
Table 5: GAO Assessment of DHS Acquisition Management Initiatives:
Table 6: Programs that responded to our survey:
Table 7: Programs that did not respond to our survey:
Table 8: Programs canceled in 2011:

Figures:

Figure 1: DHS Acquisition Life Cycle and Document Requirements for Major Acquisition Programs:
Figure 2: DHS Acquisition Managers:
Figure 3: Breakdown of 68 Major Acquisition Programs Experiencing One or More Significant Management Challenges:
Figure 4: Number of Major Acquisition Programs Experiencing Cost Growth or Schedule Slips:
Figure 5: Programs that Changed Planned Capabilities and Experienced Cost Growth or Schedule Slips:
Figure 6: Reasons Programs Changed Their Planned Capabilities:
Figure 7: Effects of Funding Instability on Programs, as Identified by Program Managers:
Figure 8: Reasons Programs Experienced Funding Instability:
Figure 9: Functional Areas Contributing to Workforce Shortfalls:
Figure 10: DHS Acquisition Documents Requiring Department-level Approval:
Figure 11: Programs That Have Key Acquisition Documents Approved in Accordance with AD 102:
Figure 12: Programs Reviewed by DHS's Executive Review Board:
Figure 13: Programs with Department-approved APBs, Reliable Cost Estimates, and Realistic Schedules:
Figure 14: Councils and Offices in DHS's Proposed IILCM:
Figure 15: Acquisition Management Challenges and Corresponding Initiatives:
Figure 16: Initiatives that Slipped from DHS's Original January 2011 Schedule to the June 2012 Update:
Figure 17: DHS's Proposed IRB/ESC Governance Structure:

Abbreviations:

AD 102: Acquisition Management Directive 102-01:
AOA: Analysis of Alternatives:
APB: Acquisition Program Baseline:
CAE: Component Acquisition Executive:
CASR: Comprehensive Acquisition Status Report:
CRC: Capabilities and Requirements Council:
DHS: Department of Homeland Security:
ESC: Executive Steering Committee:
FYHSP: Future Years Homeland Security Program:
IILCM: Integrated Investment Life Cycle Model:
IT: Information Technology:
IRB: Investment Review Board:
JRC: Joint Requirements Council:
MAOL: Major Acquisition Oversight List:
MNS: Mission Need Statement:
OCFO: Office of the Chief Financial Officer:
PA&E: Office of Program Analysis and Evaluation:
PARM: Office of Program Accountability and Risk Management:
QPAR: Quarterly Program Accountability Report:
USM: Under Secretary for Management:

[End of section]

United States Government Accountability Office:
Washington, DC 20548:

September 18, 2012:

Congressional Requesters:

The Department of Homeland Security (DHS) invests extensively in acquisition programs to develop new systems that help the department execute its many critical missions.
DHS is acquiring systems to help secure the border, facilitate trade, screen travelers, enhance cyber security, improve disaster response, and execute a wide variety of other operations. Several of DHS's major acquisition programs--the department's most expensive and critical investments--existed prior to the creation of DHS and were managed by one of the 22 separate agencies that merged to form the department. These acquisition programs are now managed by senior officials at DHS headquarters and 12 component agencies. In 2011, DHS reported to Congress that it planned to ultimately invest $167 billion in its major acquisition programs. In fiscal year 2012, DHS reported it was investing more than $18 billion in the department's acquisition programs.

DHS acquisition management issues have been highlighted in our High-Risk List since 2005.[Footnote 1] Over the past several years, our work has identified significant shortcomings in the department's ability to manage an expanding portfolio of complex acquisitions.[Footnote 2] In 2008, DHS revised its acquisition review process to include more detailed guidance for key acquisition decision events, documentation requirements, and the roles and responsibilities of DHS decision makers. In January 2010, DHS again updated its acquisition policy, but later that year, we found that the department still was not effectively carrying out its acquisition management responsibilities.[Footnote 3] We have previously established that a program must have a sound business case that includes firm requirements, a knowledge-based acquisition strategy, and realistic cost estimates in order to reduce program challenges.[Footnote 4] These conditions provide a program a reasonable chance of overcoming challenges while still delivering on time and within budget.
Because DHS invests significant resources developing capabilities to support the department's mission, you asked us to assess the extent to which (1) DHS's major acquisition programs face challenges that increase the risk of poor outcomes; (2) DHS has policies and processes in place to effectively manage individual acquisition programs; (3) DHS has policies and processes in place to effectively manage its portfolio of acquisition programs as a whole; and (4) DHS has taken actions to address the high-risk acquisition management issues we have identified in previous reports. In addition to this report, we are also issuing a report focused on the performance of DHS's major information technology (IT) investments.[Footnote 5]

To determine the extent to which major acquisition programs identified by DHS in 2011 face challenges, we surveyed all 77 major program offices from January to March 2012 and achieved a 92 percent response rate.[Footnote 6] Through our survey, we collected information regarding major programs' performance, program managers' understanding of acquisition guidance, challenges developing requirements, and funding issues. We also reviewed all available documentation of department-level acquisition decisions from November 2008 to April 2012 and interviewed acquisition officials at DHS headquarters and components to understand program risks, management challenges, and data limitations--particularly data limitations regarding program performance. Further, we reviewed resource plans and DHS performance reports to establish the extent to which major acquisition programs are achieving their cost, schedule, and capability objectives.
To determine the extent to which DHS has policies in place to effectively manage individual acquisition programs, as well as the department's acquisition portfolio as a whole, we compared our key acquisition management practices to DHS acquisition policy, and identified the extent to which DHS has implemented its policy.[Footnote 7] We also met with DHS officials to discuss our analysis, identify relevant sections of the policy that we had not yet accounted for, and solicit their thoughts on those key practices that were not reflected in the policy. To determine the extent to which DHS has taken actions to address the high-risk acquisition management issues we have identified in previous reports, we analyzed the department's recently proposed efforts to address high-risk acquisition management challenges, including the department's progress in implementing new initiatives and any challenges DHS must overcome moving forward. We conducted this performance audit from August 2011 to September 2012 in accordance with generally accepted government auditing standards. Those standards require that we plan and perform the audit to obtain sufficient, appropriate evidence to provide a reasonable basis for our findings and conclusions based on our audit objectives. We believe that the evidence obtained provides a reasonable basis for our findings and conclusions based on our audit objectives. Background: DHS invests in major acquisition programs to develop capabilities intended to improve its ability to execute its mission. DHS generally defines major programs as those expected to cost at least $300 million over their respective life cycles, and many are expected to cost more than $1 billion. DHS Acquisition Management Directive 102-01 (AD 102) and DHS Instruction Manual 102-01-001 (Guidebook), which includes 12 appendixes, establish the department's policies and processes for managing these major acquisition programs. 
DHS issued the initial version of AD 102 in 2008 in an effort to establish an acquisition management system that effectively provides required capability to operators in support of the department's missions.[Footnote 8] AD 102 establishes that DHS's Chief Acquisition Officer--currently the Under Secretary for Management (USM)--is responsible for the management and oversight of the department's acquisition policies and procedures.[Footnote 9] The USM, Deputy Secretary, and Component Acquisition Executives (CAE) are the Acquisition Decision Authorities for DHS's major acquisition programs. Table 1 identifies how DHS categorizes the 77 major acquisition programs it identified in 2011.

Table 1: DHS Acquisition Levels for Major Acquisition Programs:

Level: 1;
Life-cycle cost: Greater than or equal to $1 billion;
Acquisition Decision Authority: Deputy Secretary, Chief Acquisition Officer, or the Under Secretary for Management;
Number of programs in fiscal year 2011: 43.

Level: 2;
Life-cycle cost: $300 million or more, but less than $1 billion;
Acquisition Decision Authority: Chief Acquisition Officer, Under Secretary for Management, or the Component Acquisition Executive;
Number of programs in fiscal year 2011: 34.

Source: GAO analysis of AD 102 and DHS's fiscal year 2011 Major Acquisition Oversight List.

Note: Currently, the USM is designated DHS's Chief Acquisition Officer, but another official could serve in that role in the future.

[End of table]

Non-major acquisition programs expected to cost less than $300 million are designated Level 3. An acquisition may be raised to a higher acquisition level if (a) its importance to DHS's strategic and performance plans is disproportionate to its size, (b) it has high executive visibility, (c) it impacts more than one component, (d) it has significant program or policy implications, or (e) the Deputy Secretary, Chief Acquisition Officer, or Acquisition Decision Authority recommends an increase to a higher level.
Level 1 and 2 acquisitions may be delegated to components through formal letters of delegation from the Acquisition Decision Authority.

The Acquisition Life Cycle and Key Documents:

The Acquisition Decision Authority is responsible for reviewing and approving the movement of DHS's major acquisition programs through four phases of the acquisition life cycle at a series of five predetermined Acquisition Decision Events. These five Acquisition Decision Events provide the Acquisition Decision Authority an opportunity to assess whether a major program is ready to proceed through the life-cycle phases. The four phases of the acquisition life cycle, as established in AD 102, are:

1. Need phase: Department officials identify that there is a need, consistent with DHS's strategic plan, justifying an investment in a new capability and the establishment of an acquisition program to produce that capability.

2. Analyze/Select phase: The Acquisition Decision Authority designates a qualified official to manage the program, and this program manager subsequently reviews alternative approaches to meeting the need and recommends the best option to the Acquisition Decision Authority.

3. Obtain phase: The program manager develops, tests, and evaluates the selected option. During this phase, programs may proceed through ADE 2B, which focuses on the cost, schedule, and performance parameters for each of the program's projects, and ADE 2C, which focuses on low-rate initial production issues.

4. Produce/Deploy/Support phase: DHS delivers the new capability to its operators and maintains the capability until it is retired. This phase includes sustainment, which begins when a capability has been fielded for operational use. Sustainment involves the supportability of fielded systems through disposal, including maintenance and the identification of cost reduction opportunities. This phase tends to account for up to 70 percent of life-cycle costs.
Figure 1 depicts the acquisition life cycle.

Figure 1: DHS Acquisition Life Cycle and Document Requirements for Major Acquisition Programs:

[Refer to PDF for image: illustration]

Acquisition life-cycle phase: Need;
Number of programs in fiscal year 2011[A]: 1;
Acquisition Decision Events (ADE): ADE 1;
Documents requiring department-level approval[B]: MNS, CDP.

Acquisition life-cycle phase: Analyze/Select;
Number of programs in fiscal year 2011[A]: 4;
Acquisition Decision Events (ADE): ADE 2A[C];
Documents requiring department-level approval[B]: APB, ILSP, AP, LCCE[D], ORD.

Acquisition life-cycle phase: Obtain;
Number of programs in fiscal year 2011[A]: 20;
Acquisition Decision Events (ADE): ADE 2B;
Documents requiring department-level approval[B]: APB, ILSP, AP, LCCE[D], TEMP;
Acquisition Decision Events (ADE): ADE 2C;
Acquisition Decision Events (ADE): ADE 3;
Documents requiring department-level approval[B]: APB, ILSP, AP, LCCE[D].

Acquisition life-cycle phase: Produce/Deploy/Support;
Number of programs in fiscal year 2011[A]: 51.

MNS: Mission Need Statement;
CDP: Capability Development Plan;
APB: Acquisition Program Baseline;
ILSP: Integrated Logistics Support Plan;
AP: Acquisition Plan;
LCCE: Life-cycle Cost Estimate;
ORD: Operational Requirements Document;
TEMP: Test and Evaluation Master Plan.

Source: GAO analysis of AD 102 and survey data.

[A] Programs with multiple increments are categorized based on their most mature increment. We were unable to identify the phase for one program.

[B] Documents identified for ADE 2B are required for capital assets. Programs providing services only require an APB and AP at ADE 2B.

[C] Analyses of Alternatives and Concepts of Operations are approved at the component level at ADE 2A.

[D] Level 2 programs' life-cycle cost estimates do not require department-level approval.
[End of figure]

An important aspect of the Acquisition Decision Events is the review and approval of key acquisition documents critical to establishing the need for a major program, its operational requirements, an acquisition baseline, and testing and support plans. AD 102--and the associated DHS Instruction Manual 102-01-001 and appendixes--provide more detailed guidance for preparing these documents than DHS's predecessor policy. See table 2 for descriptions of the key acquisition documents requiring department-level approval before a program moves to the next acquisition phase.

Table 2: Key DHS Acquisition Documents Requiring Department-level Approval:

Document: Mission Need Statement;
Description: Provides a high-level description of the mission need, whether from a current or impending gap. Outlines only the concept of the solution to fill the gap and does not provide information on specific types of acquisitions that could provide that capability.

Document: Capability Development Plan;
Description: Serves as the agreement between the component head, program manager, and the Acquisition Decision Authority on the activities, cost, and schedule for the work to be performed in the Analyze/Select phase.

Document: Operational Requirements Document;
Description: Provides a number of performance parameters that must be met by a program to provide useful capability to the operator by closing the capability gaps identified in the Mission Need Statement.

Document: Acquisition Plan;
Description: Provides a top-level plan for the overall acquisition approach. Describes why the solution is in the government's best interest and why it is the most likely to succeed in delivering capabilities to operators.

Document: Integrated Logistics Support Plan;
Description: Defines the strategy for ensuring the supportability and sustainment of a future capability. Provides critical insight into the approach, schedule, and funding requirements for integrating supportability requirements into the systems engineering process.

Document: Life-Cycle Cost Estimate;
Description: Provides an exhaustive and structured accounting of all resources and associated cost elements required to develop, produce, deploy, and sustain a particular program.

Document: Acquisition Program Baseline;
Description: Establishes a program's critical baseline cost, schedule, and performance parameters. Expresses the parameters in measurable, quantitative terms, which must be met in order to accomplish the investment's goals.

Document: Test and Evaluation Master Plan;
Description: Documents the overarching test and evaluation approach for the acquisition program. Describes the Developmental and Operational Test and Evaluation needed to determine a system's technical performance, operational effectiveness/suitability, and limitations.

Source: DHS acquisition policy.

[End of table]

Acquisition Management Officials:

DHS acquisition policy requires that the DHS Investment Review Board (IRB) support the Acquisition Decision Authority by reviewing major acquisition programs for proper management, oversight, accountability, and alignment to the department's strategic functions at Acquisition Decision Events and other meetings as needed. DHS acquisition policy establishes that the IRB shall be chaired by the Acquisition Decision Authority and consist of individuals who manage DHS's mission objectives, resources, and contracts, including:

* Under Secretary for Science and Technology,
* Assistant Secretary for Policy,
* General Counsel,
* Chief Financial Officer,
* Chief Procurement Officer,
* Chief Information Officer,
* Chief Human Capital Officer,
* Chief Administrative Services Officer,
* Chief Security Officer,
* CAE responsible for the program being reviewed, and:
* User representatives from component(s) sponsoring the capability.
The Office of Program Accountability and Risk Management (PARM) is responsible for DHS's overall acquisition governance process, supports the IRB, and reports directly to the USM. PARM, which is led by an executive director, develops and updates program management policies and practices, oversees the acquisition workforce, provides support to program managers, and collects program performance data. In March 2012, PARM issued its first Quarterly Program Accountability Report, which provided an independent evaluation of major programs' health and risks.

The department's program management offices are responsible for planning and executing DHS's individual programs within cost, schedule, and performance goals. The program managers provide the IRB key information by preparing required acquisition documents that contain critical knowledge about their respective programs, facilitating the governance process. Nearly all of DHS's program management offices are located within 12 of the department's component agencies, such as the Transportation Security Administration or U.S. Customs and Border Protection.[Footnote 10] Within these components, CAEs are responsible for establishing acquisition processes and overseeing the execution of their respective portfolios. Additionally, under AD 102, the USM can delegate Acquisition Decision Authority to CAEs for programs with life-cycle cost estimates between $300 million and $1 billion. Figure 2 depicts the relationship between acquisition managers at the department, component, and program level.

Figure 2: DHS Acquisition Managers:

[Refer to PDF for image: illustration]

Top level:
Secretary[A];
Deputy Secretary[A].

Second level:
USM[A]:
* PARM[A];
* OCFO/PA&E[A];
Component head[B]:
* CAE[B]:
- Program management office[C];
Component head[B]:
* CAE[B]:
- Program management office[C];
Component head[B]:
* CAE[B]:
- Program management office[C].

[A] Department level.
[B] Component level.
[C] Program level.
Source: GAO analysis of DHS acquisition policy.

[End of figure]

The Office of Program Analysis and Evaluation (PA&E), within the Office of the Chief Financial Officer (OCFO), is responsible for advising the USM, among others, on resource allocation issues. PA&E coordinates with DHS's Office of Policy on the department's long-term strategic planning efforts, analyzing budget submissions, cost estimates, and resource constraints. PA&E also oversees the development of the Future Years Homeland Security Program (FYHSP). DHS is required to submit the FYHSP to Congress annually with each budget request. The FYHSP is DHS's 5-year funding plan for programs approved by the Secretary that are to support the DHS strategic plan. The FYHSP provides a detailed account of time-phased resource requirements for each component, as well as programs' cost estimates, milestones, and performance measures.[Footnote 11]

Nearly All Program Managers Surveyed Reported Significant Challenges:

Nearly all of the program managers we surveyed reported their programs had experienced significant challenges that increased the risk of poor outcomes, particularly cost growth and schedule slips. Sixty-eight of the 71 programs that responded to our survey reported that they experienced funding instability, faced workforce shortfalls, or saw their planned capabilities change after initiation. Most program managers reported a combination of these challenges, as illustrated in figure 3.

Figure 3: Breakdown of 68 Major Acquisition Programs Experiencing One or More Significant Management Challenges:

[Refer to PDF for image: overlapping spheres]

Planned program changes: 43 programs;
Funding instability: 61 programs;
Workforce shortfalls: 51 programs.

Source: GAO analysis of survey data.
Note: Fifty-nine survey respondents reported whether their programs' planned capabilities changed since design and development activities began; 71 survey respondents reported whether their programs experienced funding instability; 62 survey respondents reported whether their programs experienced workforce shortfalls.

[End of figure]

We have previously reported that these challenges increase the likelihood that acquisition programs will cost more and take longer to deliver capabilities than expected.[Footnote 12] Although DHS lacks the reliable cost estimates and realistic schedules needed to accurately measure program performance, it has submitted some cost information to Congress, and PARM conducted an internal review of its major acquisition programs in March 2012. We used this information and our survey results to identify 42 programs that experienced cost growth, schedule slips, or both.[Footnote 13] Cost information DHS submitted to Congress provides insight into the magnitude of the cost growth for 16 of the 42 programs. Using this information, we found total project costs increased from $19.7 billion in 2008 to $52.2 billion in 2011, an aggregate increase of 166 percent. See figure 4.

Figure 4: Number of Major Acquisition Programs Experiencing Cost Growth or Schedule Slips:

[Refer to PDF for image: illustration]

Cost growth: 13 programs;
Cost growth & schedule slip: 14 programs.

Combined:
Total magnitude of cost growth undefined: 11 programs;
Total magnitude of cost growth reported to Congress: 16 programs:
2008: $19.7 billion;
2011: $52.2 billion.

Schedule slip: 15 programs.

Source: GAO analysis of DHS and survey data.

Note: We calculated the magnitude of the total-project cost growth using DHS FYHSPs issued in 2008 and 2011. DHS did not submit FYHSPs in 2009 or 2010. FYHSP cost estimates are presented in then-year dollars.
Additionally, 40 survey respondents reported having a DHS-approved Acquisition Program Baseline and whether their program had experienced cost growth or schedule slips.

[End of figure]

We have previously reported that cost growth and schedule slips can lead to reduced capabilities, decreasing the value provided to the operator--as well as the value of the resources invested in the programs.[Footnote 14] This poor performance threatens the department's ability to successfully field the capabilities it is pursuing.

Most Major Programs Reported Their Planned Capabilities Changed after Initiation:

Prior to entering the Obtain phase, programs are to establish the specific capabilities they plan to develop to improve DHS's ability to execute its mission. Forty-three survey respondents reported that their programs changed planned capabilities after the initiation of design and development activities, which occurs between ADE 2B and testing. We have previously found that both increases and decreases in planned capabilities are associated with cost growth and schedule slips.[Footnote 15] We have found that increasing planned capabilities can lead to cost growth or schedule slips because programs are more costly to change after they begin development activities. Alternatively, we have stated that programs may choose to decrease their planned capabilities in response to cost growth or schedule slips in an effort to maintain affordability or deliver certain capabilities when needed. At DHS, we found that more than half of the 43 programs that reported changing their capabilities had experienced cost growth or schedule slips, regardless of whether their planned capabilities increased, decreased, or both. See figure 5.
Figure 5: Programs that Changed Planned Capabilities and Experienced Cost Growth or Schedule Slips: [Refer to PDF for image: stacked horizontal bar graph] 17 programs increased some planned capabilities: Cost growth: 5; Cost growth & schedule slip: 2; Schedule slip: 3; No reported growth or schedule slip: 7. 9 programs both increased and decreased some planned capabilities: Cost growth: 2; Cost growth & schedule slip: 3; Schedule slip: 3; No reported growth or schedule slip: 1. 17 programs decreased some planned capabilities: Cost growth: 4; Cost growth & schedule slip: 2; Schedule slip: 4; No reported growth or schedule slip: 7. Source: GAO analysis of survey data. Note: Fifty-nine survey respondents replied to whether their program had changed planned capabilities. Forty respondents reported having a DHS-approved Acquisition Program Baseline and whether their program had experienced cost growth or schedule slips. [End of figure] The 43 survey respondents that reported their planned capabilities changed identified five key reasons for the changes. Nineteen of the 43 survey respondents reported more than one reason. See figure 6. Figure 6: Reasons Programs Changed Their Planned Capabilities: [Refer to PDF for image: horizontal bar graph] Reasons programs' planned capabilities changed: Funding availability; Number of programs that decreased planned capabilities: 21; Number of programs that increased planned capabilities: 11. Reasons programs' planned capabilities changed: Changes in program schedule; Number of programs that decreased planned capabilities: 15; Number of programs that increased planned capabilities: 8. Reasons programs' planned capabilities changed: Operator input; Number of programs that decreased planned capabilities: 2; Number of programs that increased planned capabilities: 21.
Reasons programs' planned capabilities changed: Technology development efforts/availability; Number of programs that decreased planned capabilities: 6; Number of programs that increased planned capabilities: 11. Reasons programs' planned capabilities changed: Mission change; Number of programs that decreased planned capabilities: 2; Number of programs that increased planned capabilities: 9. Source: GAO analysis of survey data. Note: Fifty-nine survey respondents replied to whether their programs' planned capabilities changed; 43 reported reasons their planned capabilities changed; 19 of the 43 reported multiple reasons. [End of figure] Survey respondents identified operator input as the most common reason for increasing planned capabilities after the initiation of development efforts, even though officials at the department, component, and program levels all said that operator input is very useful at the initiation of design and development. For example, in 2011, we reported that the U.S. Citizenship and Immigration Services's Transformation program did not fully define its planned capabilities before it awarded a contract to develop a new system to enhance the adjudication of applications. After the contract was awarded, the program office worked with those officials most familiar with adjudication operations and discovered that the functions were more complex than expected. As a result, the program office revised the requirements, and the deployment date for key capabilities slipped from April 2011 to October 2012.[Footnote 16] Alternatively, DHS program managers identified funding availability as the most common reason for decreasing planned capabilities after the initiation of development efforts.
In the past, we have stated that agencies may reduce planned capabilities in this manner when their programs experience cost growth.[Footnote 17] Decreasing planned capabilities in response to affordability concerns may be fiscally responsible, but as a result, operators may not receive the capability originally agreed upon to address existing capability gaps. Most Major Programs Reported They Experienced Funding Instability: DHS is required to establish out-year funding levels for programs annually in the FYHSP. Changes to planned out-year funding levels create funding instability, which we have previously found increases the risk of cost growth, schedule slips, and capability shortfalls.[Footnote 18] Sixty-one survey respondents reported that their programs have experienced funding instability, and we found that 44 of the 61 programs had also realized cost growth, schedule slips, or capability reductions.[Footnote 19] Additionally, 29 survey respondents reported that their programs had to resequence the delivery of certain capabilities.[Footnote 20] For example, Coast Guard officials told us they deferred some of the HH-60 helicopter's capabilities because of funding constraints across their portfolio of programs. The Coast Guard delayed delivery of a dedicated radar for searching the surface of the water in order to replace critical components, such as main rotor blades, as planned. Figure 7 identifies how program managers reported funding instability has affected their programs. Figure 7: Effects of Funding Instability on Programs, as Identified by Program Managers: [Refer to PDF for image: vertical bar graph] Effect of funding instability: Cost growth; Number of programs: 19. Effect of funding instability: Schedule slips; Number of programs: 23. Effect of funding instability: Capability reductions; Number of programs: 8. Effect of funding instability: Resequenced segments; Number of programs: 29. Source: GAO analysis of survey data.
Note: Seventy-one survey respondents replied to whether their program experienced funding instability; 61 survey respondents reported whether they experienced an effect of funding instability; 40 reported at least one effect; 30 of the 40 reported multiple effects. [End of figure] Forty-five of the 61 survey respondents that reported their programs experienced funding instability also reported reasons for the funding instability. Twenty-two survey respondents reported more than one reason. See figure 8. Figure 8: Reasons Programs Experienced Funding Instability: [Refer to PDF for image: horizontal bar graph] Reasons programs' out-year funding changed: Difference between congressional mark and submitted budget; Number of programs with decreased out-year funding: 17; Number of programs with increased out-year funding: 5. Reasons programs' out-year funding changed: Another program's funding needs; Number of programs with decreased out-year funding: 18; Number of programs with increased out-year funding: 1. Reasons programs' out-year funding changed: Continuing resolution; Number of programs with decreased out-year funding: 10; Number of programs with increased out-year funding: 3. Reasons programs' out-year funding changed: Mission/requirements change; Number of programs with decreased out-year funding: 7; Number of programs with increased out-year funding: 3. Reasons programs' out-year funding changed: Life-cycle cost estimate does not reflect true cost; Number of programs with decreased out-year funding: 3; Number of programs with increased out-year funding: 4. Reasons programs' out-year funding changed: Program delay due to technical changes; Number of programs with decreased out-year funding: 3; Number of programs with increased out-year funding: 0. Source: GAO analysis of survey data. 
Note: Seventy-one survey respondents replied to whether their program experienced funding instability; 45 reported reasons for their funding instability; 22 of the 45 reported multiple reasons for funding instability. [End of figure] Eighteen survey respondents reported that their program experienced a funding decrease because of another program's funding needs. We have previously reported that agencies often change funding levels in this manner when they commit to more programs than they can afford.[Footnote 21] A PA&E official told us that DHS's resource requirements exceed the department's funding levels, and that the department has allowed major acquisition programs to advance through the acquisition life cycle without identifying how they will be funded. Furthermore, a PA&E official stated that DHS has not been able to determine the magnitude of its forthcoming funding gap because cost estimates are unreliable. The director of the department's cost analysis division determined that only 12 major acquisition programs met most of DHS's criteria for reliable cost estimates in the division's review of the components' fiscal year 2013 budget submissions.[Footnote 22] In 2010, we reported that DHS officials had difficulty managing major programs because they lacked accurate cost estimates.[Footnote 23] Given the fiscal challenges facing the federal government, funding shortfalls may become an increasingly common challenge at DHS, leading to further cost growth that widens the gap between resource requirements and available funding.
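The aggregate cost growth discussed above can be reproduced with simple arithmetic. The following is a minimal illustrative sketch using the rounded FYHSP totals for the 16 programs reported to Congress ($19.7 billion in 2008, $52.2 billion in 2011); the rounded inputs yield roughly 165 percent, so the reported 166 percent presumably reflects the unrounded then-year estimates:

```python
# Illustrative check of the aggregate cost growth for the 16 programs
# whose totals DHS reported to Congress (rounded FYHSP figures from
# this report; then-year dollars).
cost_2008 = 19.7  # total project costs in 2008, billions
cost_2011 = 52.2  # total project costs in 2011, billions

growth_pct = (cost_2011 - cost_2008) / cost_2008 * 100
# Rounded inputs give roughly 165 percent; the reported 166 percent
# presumably reflects the unrounded estimates.
print(f"Aggregate increase: {growth_pct:.0f} percent")
```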
Most Major Programs Reported They Experienced Workforce Shortfalls: DHS acquisition policy establishes that each program office should be staffed with personnel who have appropriate qualifications and experience in key disciplines, such as systems engineering, logistics, and financial management.[Footnote 24] Fifty-one survey respondents reported that their programs had experienced workforce shortfalls--specifically a lack of government personnel--increasing the likelihood their programs will perform poorly in the future. We have previously reported that a lack of adequate staff in DHS program offices--both in terms of skill and staffing levels--increased the risk of insufficient program planning and contractor oversight, which is often associated with cost growth and schedule slips.[Footnote 25] Figure 9 below identifies the functional areas where DHS acquisition programs reported workforce shortfalls. Figure 9: Functional Areas Contributing to Workforce Shortfalls: [Refer to PDF for image: pie-chart] No workforce shortfall: 11 programs; Workforce shortfall: 51 programs. Programs that reported a shortfall by functional area: Program management: 49; Business functions[A]: 47; Engineering and technical[B]: 43. Source: GAO analysis of survey data. Note: Sixty-two survey respondents reported whether their program experienced workforce shortfalls in government full-time-equivalents in three functional areas; 46 of the 51 reported shortfalls in multiple functional areas. [A] Includes auditing, business, cost estimating, financial management, property management, and purchasing. [B] Includes systems planning, research, development, and engineering; life-cycle logistics; test and evaluation; production; quality and manufacturing; and facilities engineering. [End of figure] We found that 29 of the 51 DHS programs that identified workforce shortfalls had also experienced cost growth or schedule slips.
[Footnote 26] The workforce shortfalls have led to insufficient program planning, hindering the development of key acquisition documents intended to inform senior-level decision making. For example, CAEs and program managers said that workforce shortfalls limited program management offices' interaction with stakeholders and operators, and delayed or degraded test plans and cost estimates. In addition, a PARM official explained that DHS has had to rely on contractors to produce cost estimates because of workforce shortfalls, and the quality of these cost estimates has varied. The USM has stated that properly staffing programs is one of DHS's biggest challenges, and we have previously reported that the capacity of the federal government's acquisition workforce has not kept pace with increased spending for increasingly complex purchases.[Footnote 27] PARM officials told us that the IRB's program reviews include assessments of the program office workforce, but that the IRB considers staffing issues a relatively low priority, and we found the IRB has formally documented workforce-related challenges for only 11 programs. Programs Proceed without Meeting Sound DHS Acquisition Policy: DHS acquisition policy reflects many key program management practices.[Footnote 28] It requires programs to develop documents demonstrating critical knowledge that would help leaders make better informed investment decisions when managing individual programs. This knowledge would help DHS mitigate the risks of cost growth and schedule slips resulting from funding instability, workforce shortfalls, and planned-capability changes. However, as of April 2012, the department had only verified that four programs documented all of the critical knowledge required to progress through the acquisition life cycle. In most instances, DHS leadership has allowed programs it has reviewed to proceed with acquisition activities without meeting these requirements. 
Officials explained that DHS's culture has emphasized the need to rapidly execute missions more than sound acquisition management practices, and we have found that most of the department's major programs are at risk of cost growth and schedule slips as a result. In addition, they lack the reliable cost estimates, realistic schedules, and agreed-upon baseline objectives that DHS acknowledges are needed to accurately track program performance, limiting DHS leadership's ability to effectively manage those programs and provide information to Congress. DHS recognizes the need to implement its acquisition policy more consistently, but significant work remains. DHS Acquisition Policy Generally Reflects Key Program Management Practices: In 2005, we reported that DHS established an investment review process that adopted many practices to reduce risk and increase the chances for successful outcomes.[Footnote 29] In 2010, we reported that AD 102 provided more detailed guidance for preparing key acquisition documents than the department's predecessor policy. In October 2011, DHS updated the Guidebook and its appendixes, and we have found that it establishes a knowledge-based acquisition policy for program management that is largely consistent with key practices. A knowledge-based approach to capability development allows developers to be reasonably certain, at critical points in the acquisition life cycle, that their products are likely to meet established cost, schedule, and performance objectives.[Footnote 30] This knowledge provides them with information needed to make sound investment decisions, and it would help DHS address the significant challenges we identified across its acquisition programs: funding instability, workforce shortfalls, and planned-capability changes. 
Over the past several years, our work has emphasized the importance of obtaining key knowledge at critical points in major system acquisitions and, based on this work, we have identified eight key practice areas for program management. These key practice areas are summarized in table 3, along with our assessment of DHS's acquisition policy. Table 3: GAO Assessment of DHS's Acquisition Policy Compared to Key Program-management Practices: GAO key practice area: Identify and validate needs; Summary of key practices: Current capabilities should be identified to determine if there is a gap between the current and needed capabilities. A need statement should be informed by a comprehensive assessment that considers the organization's overall mission; GAO assessment of DHS acquisition policy: DHS policy reflects key practices. GAO key practice area: Assess alternatives to select most appropriate solution; Summary of key practices: Analyses of Alternatives should be conducted early in the acquisition process to compare key elements of competing solutions, including performance, costs, and risks. Moreover, these analyses should assess many alternatives across multiple concepts; GAO assessment of DHS acquisition policy: DHS policy reflects key practices. GAO key practice area: Clearly establish well-defined requirements; Summary of key practices: Requirements should be well defined and include input from operators and stakeholders. Programs should be grounded in well-understood concepts of how systems would be used and likely requirements costs; GAO assessment of DHS acquisition policy: DHS policy reflects key practices. GAO key practice area: Develop realistic cost estimates and schedules; Summary of key practices: A cost estimate should be well documented, comprehensive, accurate, and credible. A schedule should identify resources needed to do the work and account for how long all activities will take. 
Additionally, a schedule should identify relationships between sequenced activities; GAO assessment of DHS acquisition policy: DHS policy reflects key practices. GAO key practice area: Secure stable funding that matches resources to requirements; Summary of key practices: Programs should make trade-offs as necessary when working in a constrained budget environment; GAO assessment of DHS acquisition policy: DHS policy reflects key practices. GAO key practice area: Demonstrate technology, design, and manufacturing maturity; Summary of key practices: Capabilities should be demonstrated and tested prior to system development, making a production decision, and formal operator acceptance; GAO assessment of DHS acquisition policy: DHS policy partially reflects key practices. GAO key practice area: Utilize milestones and exit criteria; Summary of key practices: Milestones and exit criteria--specific accomplishments that demonstrate progress--should be used to determine that a program has developed required and appropriate knowledge prior to moving forward to the next acquisition phase; GAO assessment of DHS acquisition policy: DHS policy substantially reflects key practices. GAO key practice area: Establish an adequate workforce; Summary of key practices: Acquisition personnel should have appropriate qualifications and experience. Program managers should stay on until the end of an acquisition life-cycle phase to assure accountability. Government and contractor staff should also remain consistent; GAO assessment of DHS acquisition policy: DHS policy partially reflects key practices. Source: GAO analysis of DHS acquisition policy. Note: Appendixes I and II present a more detailed description of key program-management practices and how we assessed them. [End of table] We found that DHS's acquisition policy generally reflects key program-management practices, including some intended to help develop knowledge at critical points in the acquisition life cycle.
Furthermore, the revised policy the department issued in October 2011 better reflects two key practice areas by bolstering exit criteria and taking steps to establish an adequate acquisition workforce. Specifically, the revised Guidebook and its appendixes require that refined cost estimates be reviewed at major milestones after the program baseline has been established, and used to determine whether a program has developed appropriate knowledge to move forward in the acquisition life cycle. These reviews can help reduce risk and the potential for unexpected cost and schedule growth. Additionally, the revised policy establishes that major program offices should be staffed with personnel with appropriate qualifications and experience in key acquisition disciplines. We have previously identified that the magnitude and complexity of the DHS acquisition portfolio demands a capable and properly trained workforce and that workforce shortfalls increase the risk of poor acquisition outcomes. The policy revisions could help mitigate this risk. However, there are three areas where DHS could further enhance acquisition oversight: * The policy requires that DHS test technologies and manufacturing processes, but it does not require that 1) programs demonstrate technologies in a realistic environment prior to initiating development activities at the outset of the Obtain phase, or 2) manufacturing processes be tested prior to production. These practices decrease the risk that rework will be required, which can lead to additional cost growth and schedule slips. * The policy requires that DHS establish exit criteria for programs moving to the next acquisition phase, and standardizes document requirements across all major programs, but it does not require that 1) exit criteria be quantifiable to the extent possible, or 2) consistent information be used across programs when approving progress within the Obtain phase, specifically at ADE 2B and 2C. 
These practices decrease the risk that a program will make an avoidable error because management lacks information needed to leverage lessons learned across multiple program reviews. * The policy requires that program managers be certified at an appropriate level, but it does not state that they should remain with their programs until the next major milestone when possible. This practice decreases the risk that program managers will not be held accountable for their decisions, such as proceeding without reliable cost estimates or realistic schedules. PARM officials generally acknowledged DHS has opportunities to strengthen its program-management guidance. Officials reported that they are currently in the process of updating AD 102, which they plan to complete by the end of fiscal year 2012. They also plan to issue revisions to the associated guidebook and appendixes in phases. PARM officials told us that they plan to structure the revised acquisition policy by function, consolidating guidance for financial management, systems engineering, reporting requirements, and so forth. PARM officials anticipate that this organization will make it easier for users to identify relevant information as well as streamline the internal review process for future updates. DHS Has Approved Few Programs' Key Acquisition Documents: DHS acquisition policy establishes several key program-management practices through document requirements. AD 102 requires that major acquisition programs provide the IRB documents demonstrating the critical knowledge needed to support effective decision making before progressing through the acquisition life cycle. For example, programs must document that they have assessed alternatives to select the most appropriate solution through a formal Analysis of Alternatives report, which must be approved by component-level leadership. Figure 10 identifies acquisition documents that must be approved at the department level and their corresponding key practice areas.
Figure 10: DHS Acquisition Documents Requiring Department-level Approval: [Refer to PDF for image: table] GAO key practice area: Identify and validate needs; Key acquisition documents required by DHS acquisition policy: Mission Need Statement; Phase for which document is required: Analyze/Select. GAO key practice area: Clearly establish well-defined requirements; Key acquisition documents required by DHS acquisition policy: Operational Requirements Document; Phase for which document is required: Obtain. GAO key practice area: Secure stable funding that matches resources to requirements; Key acquisition documents required by DHS acquisition policy: Acquisition Program Baseline; Phase for which document is required: Obtain. GAO key practice area: Develop realistic cost estimates and schedules; Key acquisition documents required by DHS acquisition policy: Acquisition Program Baseline; Integrated Logistics Support Plan; Phase for which document is required: Obtain. GAO key practice area: Demonstrate technology, design, and manufacturing maturity; Key acquisition documents required by DHS acquisition policy: Test and Evaluation Master Plan; Phase for which document is required: Produce/Deploy/Support. Source: GAO analysis of DHS acquisition policy. [End of figure] DHS acquisition policy requires these documents, but the department generally has not implemented its acquisition policy as intended, and in practice the department has not adhered to key program management practices.
DHS's efforts to implement the department's acquisition policy have been complicated by the large number of legacy programs initiated before the department was created, including 11 programs that PARM officials told us were in sustainment when AD 102 was signed.[Footnote 31] We found that the department has only approved four programs' required documents in accordance with DHS policy: the National Cybersecurity and Protection System, the Next Generation Network, the Offshore Patrol Cutter, and the Passenger Screening Program. Additionally, we found that 32 programs had none of the required documents approved by the department. See figure 11. Figure 11: Programs That Have Key Acquisition Documents Approved in Accordance with AD 102: [Refer to PDF for image: pie-chart] Programs have all documents approved by the department as required: 4; Programs have some documents approved by the department as required: 30; Programs have no documents approved by the department as required: 32. Source: GAO analysis of survey data, DHS acquisition decision memoranda, and DHS acquisition policy. Note: Appendix IV identifies which documents have been approved for each of the 71 programs that responded to our survey. Five programs are not accounted for in this figure because their documents did not require department-level approval. See appendix IV for additional information. [End of figure] According to PARM officials, 10 of the 32 programs with no documents approved were in sustainment at the time AD 102 was signed, as was 1 of the 30 programs with some documents approved--the Application Support Center. DHS approved the Application Support Center's Mission Need Statement and Acquisition Program Baseline in March 2011. Since 2008, DHS leadership--through the IRB or its predecessor body the Acquisition Review Board--has formally reviewed 49 of the 71 major programs that responded to our survey.
It permitted 43 of those programs to proceed with acquisition activities without verifying the programs had developed the knowledge required for AD 102's key acquisition documents. See figure 12. Figure 12: Programs Reviewed by DHS's Executive Review Board: [Refer to PDF for image: vertical bar graph] 71 major acquisition programs responded to our survey; 49 out of the 71 programs that responded to our survey have been formally reviewed by a DHS executive board since 2008; 43 programs proceeded without demonstrating critical knowledge; 1 program's critical-knowledge requirements were waived; 1 program did not proceed; 4 programs demonstrated critical knowledge before proceeding. Source: GAO analysis of DHS documents. [End of figure] Officials from half of the CAE offices we spoke to reported that DHS's culture has emphasized the need to rapidly execute missions more than sound acquisition management practices. PARM officials agreed, explaining that DHS has permitted programs to advance without department-approved acquisition documents because DHS had an operational need for the promised capabilities, but the department could not approve the documents in a timely manner. PARM officials explained that, in certain instances, programs were not capable of documenting knowledge, while in others, PARM lacked the capacity to validate that the documented knowledge was adequate. In 2008 and 2010, we reported that several programs were permitted to proceed with acquisition activities on the condition they complete key action items in the future.[Footnote 32] However, PARM officials told us that many of these action items were not addressed in a timely manner. Additionally, program managers reported that there has been miscommunication between DHS headquarters and program offices regarding implementation of the acquisition policy, and we found that DHS headquarters and program managers often had a different understanding of whether their programs were in compliance with AD 102.
For example, DHS headquarters officials told us that 19 of the 40 programs that reported through our survey that they had department-approved acquisition program baselines (APBs) in fact did not. Because DHS has not generally implemented its acquisition policy, senior leaders lack the critical knowledge needed to accurately track program performance: (1) department-approved APBs, (2) reliable cost estimates, and (3) realistic schedules. Specifically, at the beginning of 2012, DHS leadership had approved APBs for less than one-third of the 63 programs we reviewed that are required to have one based on their progression through the acquisition life cycle. Additionally, we found that none of the programs with a department-approved APB also met DHS's criteria for both reliable cost estimates and realistic schedules, which are key components of the APB. This raises questions about the quality of those APBs that have been approved, as well as the value of the DHS review process in practice. Figure 13 identifies how many programs currently have department-approved APBs, reliable cost estimates, and realistic schedules. Figure 13: Programs with Department-approved APBs, Reliable Cost Estimates, and Realistic Schedules: [Refer to PDF for image: illustration of concentric circles] 63 programs required to have department-approved APBs: 20 programs have department-approved APBs: 6 programs met criteria for reliable cost estimates; 4 programs met criteria for realistic schedules. Source: GAO analysis of DHS acquisition decision memoranda, cost estimate data, and survey data. Note: We analyzed DHS acquisition decision memorandums to determine whether a program had an approved APB, and we reviewed an internal DHS assessment to determine the reliability of programs' cost estimates. Sixty-eight survey respondents replied to our scheduling questions. [End of figure] Acquisition Program Baselines: The APB is a critical tool for managing an acquisition program.
According to DHS's acquisition Guidebook, the program baseline is the agreement between program, component, and department level officials, establishing how systems will perform, when they will be delivered, and what they will cost.[Footnote 33] In practice, when the Acquisition Decision Authority approves a program's APB, among other things, it is concurring that the proposed capability is worth the estimated cost. However, we found that DHS plans to spend more than $105 billion on programs lacking current, department-approved APBs. Specifically, when DHS submitted the FYHSP to Congress in 2011, it reported that 34 of the 43 programs lacking department-approved APBs were expected to cost $108.8 billion over their acquisition life cycles. DHS did not provide cost estimates for the other 9 programs because the data were unreliable. In addition to overall cost, schedule, and performance goals, the APB also contains intermediate metrics to measure a program's progress in achieving those goals. These intermediate metrics allow managers to take corrective actions earlier in the acquisition life cycle. DHS's lack of APBs, PARM officials explained, makes it more difficult to manage program performance. In March 2012, PARM reported that 32 programs had experienced significant cost growth or schedule slips in its internal Quarterly Program Accountability Report. However, DHS has only formally established that 8 of its programs have fallen short of their cost, schedule, or performance goals, because approximately three-quarters of the programs PARM identified lack the current, department-approved APBs needed to authoritatively measure performance. Cost Estimates and Schedules: To accurately assess a program's performance, managers need accurate cost and schedule information. 
However, DHS acquisition programs generally do not have reliable cost estimates and realistic schedules, as required by DHS policy.[Footnote 34] In June 2012, the department reported to GAO that its senior leaders lacked confidence in the performance data they receive, hindering their efforts to manage risk and allocate resources. PARM and PA&E officials said that cost estimates provided by program management offices often understate likely costs. PA&E officials explained that many programs have not included operations and maintenance activities in their cost estimates, which we have previously reported can account for 60 percent or more of a program's total costs.[Footnote 35] The director of the department's cost analysis division determined that only 12 major acquisition programs met most of DHS's criteria for reliable cost estimates, and 6 of these programs lacked current, department-approved APBs.[Footnote 36] Additionally, only 12 program offices reported that they fully adhered to DHS's scheduling guidance, which requires that programs sequence all activities, examine the effects of any delays, update schedules to ensure validity, and so forth. Eight of these programs lacked department-approved APBs.[Footnote 37] DHS's lack of reliable performance data not only hinders its internal acquisition management efforts, but also limits Congressional oversight. Congress mandated the department submit the Comprehensive Acquisition Status Report (CASR) to the Senate and House Committees on Appropriations as part of the President's fiscal year 2013 budget, which was submitted in February 2012. However, DHS told us that it did not submit the CASR until August 2012. Congress mandated DHS produce the CASR in order to obtain information necessary for in-depth congressional oversight, including life-cycle cost estimates, schedules, risk ratings, and out-year funding levels for all major programs.
The CASR has the potential to greatly enhance oversight efforts by establishing a common understanding of the status of all major programs. DHS Recognizes the Need to Implement Its Acquisition Policy More Consistently, but Significant Work Remains: In April 2012, PARM officials told us that DHS had begun to implement its acquisition policy in a more disciplined manner. They told us that they had adequate capacity to review programs, and would no longer advance programs through the acquisition life cycle until DHS leadership verified the programs had developed critical knowledge. For example, in February 2012, the IRB denied a request from the BioWatch Gen 3 program--which is developing a capability to detect airborne biological agents--to solicit proposals from contractors because its draft APB was not valid. PARM officials said they are using a risk-based approach to prioritize the approval of the department's APBs. Specifically, they explained that one of their fiscal year 2011 initiatives was to attain department-level approval of APBs for all Level 1 programs in the Obtain phase of the acquisition life cycle. However, we found that only 8 of the 19 programs PARM said fell into this category had current, department-approved APBs as of September 2012. In an effort to improve the consistency of performance data reported by program managers, PARM officials stated that they are establishing scorecards to assess cost estimates and standard work breakdown structures for IT programs. The PARM officials also explained that CAEs' performance evaluations now include an assessment of the completeness and accuracy of performance data reported for their respective programs. However, DHS must overcome significant challenges in order to improve the reliability of performance data and meet key requirements in the department's acquisition policy. 
For example, department and component-level officials told us that program managers do not report on their programs in a consistent manner. Additionally, DHS officials told us that they lack cost estimating capacity throughout the department and that they must rely heavily on contractors, which do not consistently provide high-quality deliverables. In August 2012, a PARM official stated that DHS was in the process of hiring eight additional government cost estimators to support programs. DHS Needs Policy and Process Enhancements to Effectively Manage Its Portfolio of Investments: DHS acquisition policy does not fully reflect several key portfolio-management practices, such as allocating resources strategically, and DHS has not yet reestablished an oversight board to manage its investment portfolio across the department. As a result, DHS has largely made investment decisions on a program-by-program and component-by-component basis. The widespread risk of poorly understood cost growth, coupled with the fiscal challenges facing the federal government, makes it essential that DHS allocate resources to its major programs in a deliberate manner. DHS plans to develop stronger portfolio-management policies and processes, but until it does so, DHS programs are more likely to experience additional funding instability in the future, which will increase the risk of further cost growth and schedule slips. These outcomes, combined with a tighter budget, could prevent DHS from developing needed capabilities. DHS Acquisition Policy Shortfalls Hinder Portfolio Management Efforts: In our past work, we have found that successful commercial companies use a disciplined and integrated approach to prioritize needs and allocate resources.[Footnote 38] As a result, they can avoid pursuing more projects than their resources can support, and better optimize the return on their investment. 
This approach, known as portfolio management, requires companies to view each of their investments as contributing to a collective whole, rather than as independent and unrelated. With this enterprise perspective, companies can effectively (1) identify and prioritize opportunities, and (2) allocate available resources to support the highest priority--or most promising--opportunities. Over the past several years, we have examined the practices that private and public sector entities use to achieve a balanced mix of new projects, and based on this work, we have identified four key practice areas for portfolio management, summarized in table 4, along with our assessment of DHS acquisition policy. Table 4: GAO Assessment of DHS's Acquisition Policy Compared to Key Portfolio-management Practices: GAO key practice area: Clearly define and empower leadership; Summary of key practices: Portfolio managers, with the support of cross-functional teams, should be empowered to make investment decisions, and held accountable for outcomes; GAO assessment of DHS acquisition policy: DHS policy partially reflects key practices. GAO key practice area: Establish standard assessment criteria, and demonstrate comprehensive knowledge of the portfolio; Summary of key practices: Investments should be ranked and selected using a disciplined process to assess the costs, benefits, and risks of alternative products to ensure transparency and comparability across alternatives; GAO assessment of DHS acquisition policy: DHS policy partially reflects key practices. 
GAO key practice area: Prioritize investments by integrating the requirements, acquisition, and budget processes; Summary of key practices: Organizations should use long-range planning and an integrated approach to prioritize needs and allocate resources in accordance with strategic goals, so they can avoid pursuing more products than they can afford and optimize return on investment; GAO assessment of DHS acquisition policy: DHS policy partially reflects key practices. GAO key practice area: Continually make go/no-go decisions to rebalance the portfolio; Summary of key practices: Reviews should be scheduled (1) annually to consider proposed changes, (2) as new opportunities are identified, (3) whenever a program breaches its objectives, and (4) after investments are completed. Information gathered during these reviews should be used to adjust and balance the portfolio to achieve strategic outcomes; GAO assessment of DHS acquisition policy: DHS policy minimally reflects key practices. Source: GAO analysis of DHS acquisition policy. Note: Appendixes I and II present a more detailed description of key portfolio management practices and how we assessed them. [End of table] We found that DHS's acquisition policy reflects some key portfolio-management practices. DHS has not designated individual portfolio managers, but it requires that the department's Chief Acquisition Officer--currently the USM--be supported by the IRB, which includes officials representing key functional areas, such as budget, procurement, IT, and human capital. DHS's acquisition policy also establishes that requirements, acquisition, and budget processes should be connected to promote stability. However, as acknowledged by DHS officials, the policy does not reflect several other key portfolio-management practices: * The policy does not empower portfolio managers to decide how best to invest resources. 
This practice increases the likelihood that resources will be invested effectively, and that portfolio managers will be held accountable for outcomes. * The policy does not establish that investments should be ranked and selected using a disciplined process. This practice increases the likelihood that the portfolio will be balanced with risk spread across products. * The policy does not establish that (1) resource allocations should align with strategic goals, or (2) the investment review policy should use long-range planning. These practices increase the likelihood that the right amount of funds will be delivered to the right projects, maximizing return on investments. * The policy does not require portfolio reviews (1) annually to consider proposed changes, (2) as new opportunities are identified, or (3) whenever a program breaches its objectives. These practices provide opportunities for leaders to increase the value of investments, determine whether the investments are still relevant and affordable, and help keep programs within cost and schedule targets. PARM officials acknowledge that the department does not currently have a policy that addresses these key portfolio-management practices. Further, they told us that there has been less focus on portfolio management than program management to date because the acquisition process is still relatively immature. As a result, DHS largely makes investment decisions on a program-by-program and component-by-component basis. In our work at the Department of Defense, we have found this approach hinders efforts to achieve a balanced mix of programs that are affordable and feasible and that provide the greatest return on investment.[Footnote 39] PARM officials anticipate that DHS will improve its portfolio-management guidance in the future by formalizing its proposed Integrated Investment Life Cycle Model (IILCM). 
In January 2011, DHS presented a vision of the IILCM as a means to better integrate investment management functions, including requirements development, resource allocation, and program governance. DHS explained that the IILCM would ensure mission needs drive investment decisions and establish a common framework for monitoring and assessing the department's investments. The IILCM would be implemented through the creation of several new department-level councils, as illustrated in figure 14, which would identify priorities and capability gaps. Figure 14: Councils and Offices in DHS's Proposed IILCM: [Refer to PDF for image: illustrated table] Department Strategy Council (proposed): * Provides strategic direction; * Focuses on mission outcomes. Functional Coordination Offices: Screening (existing); Law enforcement (proposed); Domain awareness (proposed); Enterprise business services (proposed). Capabilities and Requirements Council (proposed): * Conducts portfolio analysis; * Makes trade-off decisions; * Harmonizes operational requirements; * Assesses alternatives. Program Review Board (existing): * Considers affordability; * Prioritizes within fiscal constraints; * Formulates budget; * Allocates resources. Investment Review Board (existing): * Manages programs; * Establishes baselines; * Manages risk. Source: DHS documents. [End of figure] DHS Investment Process Shortfalls Hinder Portfolio Management Efforts: In 2003, DHS established the Joint Requirements Council (JRC) to identify crosscutting opportunities and common requirements among DHS components, and help determine how DHS should use its resources. However, as we have previously reported, the JRC stopped meeting in 2006.[Footnote 40] In 2008, we recommended that the JRC be reinstated, or that DHS establish another joint requirements oversight board. At that time, DHS officials recognized that strengthening the JRC was a top priority. 
The department has proposed the creation of a Capabilities and Requirements Council (CRC) to serve in a role similar to that of the JRC, but the CRC is not yet established. In the absence of a JRC, or the proposed CRC, DHS budget officials explained that it is difficult to develop a unified strategy to guide trade-offs between programs because of the diversity of the department's missions. Poor program outcomes, coupled with a tighter budget, could prevent DHS from developing needed capabilities. In our work at the Department of Defense, we have found that agencies must prioritize investments, or programs will continually compete for funding by promising more capabilities than they can deliver while underestimating costs.[Footnote 41] We also found that success was measured in terms of keeping a program alive rather than efficiently delivering the capabilities needed.[Footnote 42] It appears the lack of prioritization is affecting DHS in the same way. As discussed earlier in our assessment of program challenges, 18 of the department's programs reported DHS decreased their out-year funding levels because of another program's funding needs, and 61 programs reported they experienced some form of funding instability. Until recently, the responsibility for balancing portfolios has fallen on components. However, DHS policy officials noted that component-level officials have a relatively limited perspective focused on those programs under their authority, making it more difficult to ensure the alignment of mission needs to department-level goals. Additionally, component-level officials can only make trade-offs across the portion of the DHS portfolio that falls under their purview, limiting opportunities to increase the department's return on its investments. The USM and PARM officials have stated they recognize the value of portfolio management, and they have taken some steps to fill the gap left without a functioning JRC or CRC. 
A PARM official stated that, starting in 2012, PARM is collaborating with the Offices of the Chief Information, Financial, and Procurement Officers, as well as the Office of Policy, to conduct portfolio reviews from a functional, cross-component perspective. In the past, PARM's portfolio reviews focused on each component individually. This new functional approach is establishing portfolios based on departmentwide missions, such as domain awareness or screening, and PARM officials intend to produce trade-off recommendations for prioritizing funding across different components. They also intend to use functional portfolio reviews to provide greater insight into the effects of funding instability, and the USM has stated that the portfolio reviews will inform the department's fiscal year 2014 budget. DHS intends for the proposed CRC to make trade-offs across the functional portfolios. PARM's Quarterly Program Accountability Report (QPAR), issued in March 2012, also has the potential to inform DHS's portfolio management efforts. In developing the QPAR, PARM used a standardized set of five criteria to measure the value of each program: mission alignment, architectural maturity, capability gap, mission criticality, and DHS benefit. This allowed PARM to identify 48 high-value and 13 low-value programs. However, the QPAR does not recommend using the information to prioritize resource allocations, which would address a key portfolio management practice. Further, DHS's widespread lack of department-approved Mission Need Statements (MNS) undermines efforts to improve portfolio management and prioritize investments. The MNS links capability gaps to the acquisitions that will fill those gaps, making it a critical tool for prioritizing programs. The MNS also provides formal executive-level acknowledgment that there is a mission need justifying the allocation of DHS's limited resources. 
However, only about 40 percent of DHS's major acquisition programs have a department-approved MNS. DHS Acquisition Management Initiatives Target Longstanding Challenges, but Key Implementation Issues Remain: DHS has introduced seven initiatives that could improve acquisition management by addressing longstanding challenges we have identified--such as funding instability and acquisition workforce shortfalls--which DHS survey respondents also identified in 2012. Implementation plans are still being developed for all of these initiatives, and DHS is still working to address critical issues, particularly capacity questions. Because of this, it is too early to determine whether the DHS initiatives will be effective, as we have previously established that agencies must sustain progress over time to address management challenges.[Footnote 43] DHS is also pursuing a tiered-governance structure that it has begun to implement for IT acquisitions. Before the department can regularly delegate ADE decision authority through this tiered-governance structure, DHS must successfully implement its seven acquisition management initiatives and apply its knowledge-based acquisition policy on a more consistent basis to reduce risks and improve program outcomes. DHS Initiatives Are Intended to Address Longstanding Acquisition Management Challenges: In 2005, we identified acquisition management as a high-risk area at DHS.[Footnote 44] Since then, we have issued multiple reports identifying acquisition management challenges.[Footnote 45] In 2008, we made several recommendations intended to help DHS address those challenges, and in September 2010, we provided DHS a list of specific acquisition management outcomes the department must achieve to help address the high-risk designation. This list largely drew from our past recommendations, and stressed that the department must implement its knowledge-based acquisition policy consistently. 
DHS has generally concurred with our recommendations, but still faces many of the same challenges we have previously identified. In 2011, DHS began to develop initiatives to address these challenges, and DHS has continued to evolve these plans in 2012. In January 2011, DHS produced the initial iteration of its Integrated Strategy for High Risk Management in order to measure progress in addressing acquisition management challenges we had identified, as well as financial management, human capital, IT, and management integration issues. The department subsequently produced updates in June 2011, December 2011, and June 2012. These updates present the department's progress in developing and implementing its initiatives. Additionally, in December 2011, DHS issued the Program Management and Execution Playbook (Playbook), which expounded on some of those initiatives, and introduced a vision for a "more mature, agile, and effective process for program governance and execution." Figure 15 identifies seven key DHS initiatives and how they correspond to acquisition management challenges we have identified. Figure 15: Acquisition Management Challenges and Corresponding Initiatives: [Refer to PDF for image: table] DHS initiative: The Integrated Investment Life Cycle Model; Description of initiative: This initiative is intended to strengthen strategic decision making by establishing a repeatable decision making process at critical phases of the investment life cycle; it envisions the creation of (1) a Department Strategy Council to provide strategic direction, (2) Functional Coordination Offices to support functional portfolios, and (3) a Capabilities and Requirements Council to conduct portfolio analyses, make trade-off decisions, and inform the budget-formulation process; Corresponding challenge(s) identified by GAO: Gaps in portfolio management policies/processes; Poorly defined requirements; Funding instability. 
DHS initiative: Acquisition Workforce Development; Description of initiative: This initiative is intended to establish a certification program to train and develop the workforce; Corresponding challenge(s) identified by GAO: Acquisition workforce shortfalls, including cost estimators. DHS initiative: Program Management Corps; Description of initiative: This initiative is intended to increase the number of acquisition staff and fill key program office and component oversight positions; Corresponding challenge(s) identified by GAO: Acquisition workforce shortfalls, including cost estimators. DHS initiative: Procurement Staffing Model; Description of initiative: This initiative is intended to develop a model to determine the optimal number of personnel to properly award and administer contracts; Corresponding challenge(s) identified by GAO: Acquisition workforce shortfalls, including cost estimators. DHS initiative: Centers of Excellence; Description of initiative: This initiative is intended to bring together program managers, senior leadership staff, and subject matter experts to promote best practices and provide expert counsel, technical guidance, and acquisition management tools; Corresponding challenge(s) identified by GAO: Acquisition workforce shortfalls, including cost estimators; Miscommunication between DHS headquarters and program offices regarding the implementation of acquisition policy. DHS initiative: Component Acquisition Executive Structure; Description of initiative: This initiative is intended to strengthen the role of the Component Acquisition Executive and improve acquisition management; Corresponding challenge(s) identified by GAO: Miscommunication between DHS headquarters and program offices regarding the implementation of acquisition policy; Limited component-level acquisition management capacity. 
DHS initiative: Business Intelligence; Description of initiative: This initiative is intended to develop a tool to improve investment governance by providing access to accurate program data and metrics; Corresponding challenge(s) identified by GAO: Unreliable performance data, including program schedules. Source: GAO analysis of DHS initiatives. [End of figure] As envisioned, the DHS initiatives would better position the department to implement its knowledge-based acquisition policy on a more consistent basis to reduce risks and ultimately improve individual program outcomes. The initiatives would also help address challenges identified by survey respondents in 2012, particularly funding instability and acquisition workforce shortfalls. Additionally, the IILCM would enhance DHS's ability to effectively manage its acquisition portfolio as a whole. Some Initiatives Are Taking Longer to Implement Than Originally Envisioned, and Capacity Issues Could Create Further Challenges: DHS has made progress implementing some of the initiatives intended to address the challenges we have identified. In June 2012, DHS reported that all of its components had an approved CAE in place and the Procurement Staffing Model had been completed. In August 2012, DHS told us that eight Centers of Excellence had been chartered. However, from January 2011 to June 2012, the schedules for four of the seven initiatives slipped by at least 6 months, including the schedule for the IILCM, which slipped by a year. In March 2012, an official responsible for the IILCM initiative stated that many acquisition officials throughout the department do not yet understand the intended benefits of the IILCM. Thirty-two survey respondents reported that they were not at all familiar with the initiative, as opposed to nine that reported they were very familiar with the IILCM.[Footnote 46] Additionally, officials from three CAE offices, including two CAEs, told us that they were not familiar with the IILCM. 
Previously, we have reported that it is important to involve employees and obtain their ownership when transforming organizations.[Footnote 47] Figure 16 identifies the schedule slips and their causes. Figure 16: Initiatives that Slipped from DHS's Original January 2011 Schedule to the June 2012 Update: [Refer to PDF for image: table] Initiative: Integrated Investment Life Cycle Model; Activity that slipped: Initiate operations; Reason for slip: DHS had not yet dedicated sufficient resources to the effort; Fiscal year 2011, Q4 to Fiscal year 2012, Q4. Initiative: Acquisition Workforce Development; Activity that slipped: Revise training curriculum; Reason for slip: The scope of this initiative expanded significantly after lead responsibility was transferred from PARM to the Office of the Chief Procurement Officer at the beginning of fiscal year 2012; Fiscal year 2011, Q4 to Fiscal year 2013, Q2. Initiative: Program Management Corps; Activity that slipped: Prepare vacancy announcements; Reason for slip: DHS had difficulty defining its desired acquisition workforce; Fiscal year 2012, Q1 to Fiscal year 2012, Q4. Initiative: Business Intelligence; Activity that slipped: Implement full operational capability; Reason for slip: DHS increased the number of performance requirements for the initiative from 150 to 500 to support the production of the QPAR and CASR; Fiscal year 2012, Q2 to Fiscal year 2013, Q1. Source: GAO analysis of DHS planning documents. [End of figure] Moving forward, all seven acquisition management initiatives face significant implementation challenges that could affect DHS's efforts to address the challenges we have identified. We have previously established that agencies must have top leadership commitment, adequate capacity, corrective action plans, and performance measures to address high-risk issues.[Footnote 48] We have also established that agencies must demonstrate progress in implementing corrective actions. 
Table 5 summarizes our assessment of DHS's initiatives, identifying challenges to successful implementation. Table 5: GAO Assessment of DHS Acquisition Management Initiatives: DHS initiative: The Integrated Investment Life Cycle Model; Implementation challenges: DHS is still developing guidance for implementing and operating the IILCM, and it has not yet developed a resource estimate or dedicated a funding source; GAO's assessment: Questionable capacity. DHS initiative: Acquisition Workforce Development; Implementation challenges: DHS reports it currently has adequate resources to develop and deliver training, but future training requirements are better understood for personnel who award and administer contracts than for other disciplines, such as systems engineers and cost estimators; GAO's assessment: Questionable capacity. DHS initiative: Program Management Corps; Implementation challenges: DHS has reported critical resource shortfalls; the department previously identified the need for 150 additional positions; additionally, performance metrics are subjective, such as: "Ensure Program Managers are engaged in advancing the department's acquisition and program management capabilities"; GAO's assessment: Questionable capacity and lacks objective measures. DHS initiative: Procurement Staffing Model; Implementation challenges: DHS reports it has developed the model; however, the initiative is only intended to identify workforce needs and DHS has not established an initiative to meet those needs; GAO's assessment: Questionable capacity and inadequate corrective action plan. DHS initiative: Centers of Excellence; Implementation challenges: DHS officials anticipate the Centers of Excellence will face capacity challenges; additionally, performance metrics have not yet been established; GAO's assessment: Questionable capacity and lacks objective measures. 
DHS initiative: Component Acquisition Executive Structure; Implementation challenges: DHS has established meaningful performance measures, such as percentage of components with appropriate staffing levels, but it has reported critical resource shortfalls; GAO's assessment: Questionable capacity. DHS initiative: Business Intelligence; Implementation challenges: DHS reports that the initiative has critical resource shortfalls, and the corrective action plan does not explicitly address the enduring need to improve the quality of underlying performance data, particularly cost estimates and schedules; GAO's assessment: Questionable capacity and inadequate corrective action plan. Source: GAO analysis of DHS initiatives. [End of table] Capacity issues resulting from resource shortfalls have the potential to undermine DHS's efforts to address longstanding acquisition management challenges. DHS officials said the department established Centers of Excellence in an effort to increase the effectiveness of its limited number of skilled acquisition personnel. Additionally, the department plans to reassign high-performing staff to high-priority programs in order to cope with workforce shortfalls. However, DHS officials anticipate that capacity issues will endure, and in June 2012, the department reported that it may require 250 personnel to implement the IILCM, which would further strain DHS's limited capacity. Acquisition Management and Program Performance Must Improve Before Delegating Major Milestone Decision Authority: In December 2011, DHS presented a vision of acquisition management that includes a new tiered-governance structure dividing acquisition decision authority between the headquarters-level IRB and a group of component-level Executive Steering Committees (ESC). See figure 17. 
According to the Playbook, DHS will adopt this tiered-governance structure because the current process does not provide consistent or timely decisions, ensure appropriate stakeholders are involved, or allow high risk/impact issues to get sufficient senior-level attention. In fiscal year 2011, DHS established ESCs for 14 IT programs. Figure 17: DHS's Proposed IRB/ESC Governance Structure: [Refer to PDF for image: governance structure] Department-level officials: IRB. Devolution of decision authority: Component-level officials: ESC: Program; Program; Program. ESC: Program; Program. ESC: Program. Source: GAO analysis of DHS documents. [End of figure] According to the Playbook, the ESCs would be the primary decision-making authority for each program, approving ADE decisions. Additionally, many ESCs would not include headquarters-level officials and the program management offices overseen by the ESCs would provide the necessary administrative support. Programs would be elevated for IRB-level reviews if they experienced difficulty with schedule, budget, or scope. Since the Playbook was issued, PARM officials and DHS's Chief Information Officer clarified that ADE decision authority should not be delegated to component-level officials unless the USM has approved a program's APB and the program is being executed within agreed-upon cost, schedule, and performance thresholds. However, DHS has not clearly documented these conditions as prerequisites for delegating ADE decision authority. Additionally, as we have identified, DHS lacks the knowledge needed to effectively manage its programs, nearly all of which are at risk of poor outcomes. Because of this, DHS generally is not prepared to delegate ADE decision authority to component-level officials. Conclusions: DHS has a diverse, critical, and challenging mission that requires it to respond to an ever-evolving range of threats. 
Given this mission, it is important that DHS maintain an agile and flexible management approach in its day-to-day operations. However, DHS must adopt a more disciplined and systematic approach for managing its major investments, which are intended to help meet critical mission needs. DHS has taken some steps to improve investment management, but most of its major acquisition programs continue to cost more than expected, take longer to deploy than planned, or deliver less capability than promised. These outcomes are largely the result of DHS's lack of adherence to key knowledge-based program management practices, even though many are reflected in the department's own acquisition policy. DHS leadership has authorized and continued to invest in major acquisition programs even though the vast majority of those programs lack foundational documents demonstrating the knowledge needed to help manage risks and measure performance. This limits DHS's ability to proactively identify and address the challenges facing individual programs. Further, although the department's acquisition policy contains many key practices that help reduce risks and increase the chances for successful outcomes, the policy does not include certain program management practices that could further enhance acquisition management. For example, the policy does not require that programs demonstrate technologies in a realistic environment prior to initiating development activities, or that exit criteria be quantifiable to the extent possible. Cost growth and schedule slips at the individual program level complicate DHS's efforts to manage its investment portfolio as a whole. When programs encounter setbacks, the department has often redirected funding to troubled programs at the expense of others, which in turn are more likely to struggle. Additionally, DHS acquisition policy does not fully reflect key portfolio-management practices that would help improve investment management across the department. 
For example, the policy does not empower portfolio managers to invest resources in a disciplined manner or establish that investments should be ranked and selected using a disciplined process. DHS acknowledges the importance of having strong portfolio-management practices. However, DHS does not have a process to systematically prioritize its major investments to ensure that the department's acquisition portfolio is consistent with DHS's anticipated resource constraints, which is particularly important because of the diversity of the department's missions. Since 2008, we have emphasized the need for DHS to re-establish an oversight board dedicated to addressing portfolio management challenges. DHS has produced plans to establish such a board, but the concept is still under development. It is essential that DHS take a more disciplined acquisition management approach moving forward, particularly as the department must adjust to a period of governmentwide funding constraints. Without greater discipline, decisionmakers will continue to lack critical information, and the department will likely continue to pay more than expected for less capability than promised, which will ultimately hinder DHS's day-to-day operations and its ability to execute its mission. Further, Congress's ability to assess DHS funding requests and conduct oversight will remain limited. To its credit, DHS has undertaken a variety of initiatives over the past two years designed to address the department's longstanding acquisition management challenges, such as increasing acquisition management capabilities at the component level. However, more disciplined program and portfolio management at the department level is needed before DHS can regularly delegate major milestone decision authority to component-level officials. 
Widespread challenges--including funding instability and acquisition workforce shortfalls--cost growth, and schedule slips indicate how much further DHS must go to improve acquisition outcomes. Recommendations for Executive Action: We recommend that the Secretary of Homeland Security direct the Under Secretary for Management to take the following five actions to help mitigate the risk of poor acquisition outcomes and strengthen the department's investment management activities: * Modify DHS acquisition policy to more fully reflect the following program management practices: - Require that (1) programs demonstrate technologies in a realistic environment prior to initiating development activities, and (2) manufacturing processes be tested prior to production; - Require that (1) exit criteria be quantifiable to the extent possible, and (2) consistent information be used across programs at ADE 2B and 2C; - State that program managers should remain with their programs until the next major milestone when possible; * Modify DHS acquisition policy to more fully reflect the following portfolio management practices: - Empower portfolio managers to decide how best to invest resources; - Establish that investments should be ranked and selected using a disciplined process; - Establish that (1) resource allocations should align with strategic goals, and (2) the investment review policy should use long-range planning; and: - Require portfolio reviews (1) annually to consider proposed changes, (2) as new opportunities are identified, and (3) whenever a program breaches its objectives; * Ensure all major acquisition programs fully comply with DHS acquisition policy by obtaining department-level approval for key acquisition documents before approving their movement through the acquisition life cycle; * Once the department's acquisition programs comply with DHS acquisition policy, prioritize major acquisition programs departmentwide and ensure that the department's acquisition 
portfolio is consistent with DHS's anticipated resource constraints; and: * Clearly document that department-level officials should not delegate ADE decision authority to component-level officials for programs lacking department-approved APBs or not meeting agreed-upon cost, schedule, and performance thresholds. Agency Comments and Our Evaluation: DHS provided us with written comments on a draft of this report. In its comments, DHS concurred with all five of our recommendations and noted that two should be closed based on actions taken. The department's written comments are reprinted in appendix V. DHS also provided technical comments that we incorporated into the report as appropriate. DHS identified specific actions the department would take to address three of our recommendations. DHS stated that it was in the process of revising its policy to more fully reflect key program management practices. Additionally, DHS stated that it would continue to mature and solidify the portfolio review process over the next few years, and that it would revise its policy to reflect this process. DHS anticipates that this effort will also help the department prioritize its major acquisition programs departmentwide, and help ensure that the department's acquisition portfolio is consistent with anticipated resource constraints. DHS concurred with and requested we close our recommendation that the department ensure all acquisition programs fully comply with DHS acquisition policy by obtaining department-level approval for key acquisition documents before approving their movement through the acquisition life cycle. DHS stated that, in effect, its executive review board is approving a program's documents when it advances the program, thus satisfying this recommendation. 
As we noted in our report, DHS officials told us in April 2012 that the department has begun to implement its acquisition policy in a more disciplined manner and that it will no longer advance programs through the acquisition life cycle until DHS leadership verifies the programs have developed critical knowledge. However, it would be premature to close this recommendation until DHS demonstrates, over time, the consistent verification of the critical knowledge captured in key documents, especially as we found that nearly all of the department's major acquisition programs lack at least some of these acquisition documents. DHS also concurred with and requested we close our recommendation that the department clearly document that department-level officials should not delegate ADE decision authority to component-level officials for programs lacking department-approved APBs or not meeting agreed-upon cost, schedule, and performance thresholds. DHS stated that it amended AD 102 to clarify that decision authority for any program that breaches an approved APB's cost, schedule, or performance parameters will not be delegated to component-level officials, thus satisfying this recommendation. However, the amendment DHS provided does not include this language or clearly document the department's stated position. For this reason, it would be premature to close this recommendation at this time. In addition to commenting on our recommendations, the department made a number of observations on our draft report. For example, DHS stated that the report references many practices that occurred prior to the time period of the audit, and that the department has made measurable progress on a number of fronts. While we reviewed investment management activities going back to November 2008 to coincide with the issuance of AD 102, we also accounted for progress made through August 2012 by assessing ongoing DHS initiatives intended to address investment management challenges in the future. 
DHS also noted that our survey of 71 programs captured valuable information, but suggested the survey data cannot be generalized and expressed concern that it would be used as the basis for a recommendation. To clarify, none of the recommendations in this report are based on the survey data. In the absence of reliable program data, we surveyed program managers to obtain their perspectives on challenges facing the department's acquisition programs, and we obtained responses from 92 percent of the major acquisition programs DHS identified in 2011. DHS noted that programs can experience cost growth and schedule slips without a "breach." We recognize the validity of this point, and our findings are consistent with this position. DHS incorrectly suggested that our data sources for quantifying cost growth--the 2008 and 2011 versions of the Future Years Homeland Security Program (FYHSP)--did not consistently account for costs beyond the initial five-year period. However, these two FYHSPs aggregated funding levels for each program to produce a total project cost. To measure total project cost growth for the 16 programs, as depicted in figure 4, we compared the total project costs reported in the 2008 FYHSP to the total project costs reported in the 2011 FYHSP. Thus, we measured changes in total project costs, not just costs over two different five-year periods. As agreed with your offices, unless you publicly announce the contents of this report earlier, we plan no further distribution until September 19, 2012. At that time, we will send copies to the Secretary of Homeland Security. In addition, the report will be available at no charge on the GAO website at [hyperlink, http://www.gao.gov]. If you or your staff have any questions about this report, please contact me at (202) 512-4841 or huttonj@gao.gov. Contact points for our Offices of Congressional Relations and Public Affairs may be found on the last page of this report. 
GAO staff who made key contributions to this report are listed in appendix VI. Signed by: John P. Hutton: Director, Acquisition and Sourcing Management: List of Requesters: The Honorable Joseph I. Lieberman: Chairman: The Honorable Susan M. Collins: Ranking Member: Committee on Homeland Security and Governmental Affairs: United States Senate: The Honorable Thomas R. Carper: Chairman: Subcommittee on Federal Financial Management, Government Information, Federal Services and International Security: Committee on Homeland Security and Governmental Affairs: United States Senate: The Honorable Michael T. McCaul: Chairman: Subcommittee on Oversight, Investigations, and Management: Committee on Homeland Security: House of Representatives: [End of section] Appendix I: Objectives, Scope, and Methodology: The objectives of this review were to assess the Department of Homeland Security's (DHS) acquisition management activities. Specifically, we assessed the extent to which: (1) DHS's major acquisition programs face challenges that increase the risk of poor outcomes; (2) DHS has policies and processes in place to effectively manage individual acquisition programs; (3) DHS has policies and processes in place to effectively manage its portfolio of acquisition programs as a whole; and (4) DHS has taken actions to resolve the high-risk acquisition management issues we have identified in previous reports. To answer these questions, we reviewed 77 of the 82 programs DHS included in its fiscal year 2011 Major Acquisition Oversight List (MAOL), which identified each program the department designated as a major acquisition in 2011.[Footnote 49] We excluded 5 programs that were canceled in 2011; these are identified in appendix IV. The 77 selected programs were sponsored by 12 different components and departmental offices. 
To determine the extent to which major DHS acquisition programs face challenges increasing the risk of poor outcomes, we surveyed the program managers for all 77 programs, and received usable responses from 71 programs (92 percent response rate). Appendix III presents the survey questions we asked, and summarizes the responses we received. The web-based survey was administered from January 12, 2012, to March 30, 2012. Respondents were sent an e-mail invitation to complete the survey on a GAO web server using a unique username and password. During the data collection period, nonrespondents received a reminder e-mail and phone call. Because this was not a sample survey, it has no sampling errors. The practical difficulties of conducting any survey may also introduce nonsampling errors, such as difficulties interpreting a particular question, which can introduce unwanted variability into the survey results. We took steps to minimize nonsampling errors by pretesting the questionnaire in person with program management officials for five different programs, each in a different component. We conducted pretests to make sure that the questions were clear and unbiased, the data and information were readily obtainable, and that the questionnaire did not place an undue burden on respondents. Additionally, a senior methodologist within GAO independently reviewed a draft of the questionnaire prior to its administration. We made appropriate revisions to the content and format of the questionnaire after the pretests and independent review. All data analysis programs used to generate survey results were independently verified for accuracy. 
To determine the extent to which major DHS acquisition programs face challenges increasing the risk of poor outcomes, we also reviewed the 2008 and 2011 versions of the Future Years Homeland Security Program (FYHSP), all acquisition decision memoranda documenting DHS executive review board decisions from November 2008 to April 2012, the Office of Program Accountability and Risk Management's (PARM) initial Quarterly Program Assessment Report (QPAR), issued March 2012, and other management memos identifying available program-performance data. The survey results and documentation review allowed us to identify program performance and the reasons for any poor performance. We also interviewed individuals at the component and department levels to enhance our understanding of common challenges. At the component level, we interviewed six of the eight Component Acquisition Executives who had been designated by the USM, and interviewed representatives of the remaining two. At the department level, we interviewed policy, budget, and acquisition oversight officials, including the Deputy Assistant Secretary for the Office of Strategic Plans, the department's Chief Information Officer, the Executive Director of PARM, and the Director of Program Analysis and Evaluation (PA&E). These officials provided a strategic perspective on program management challenges, and shared valuable insights regarding the limitations of available program performance data. Based on their input, we chose to use FYHSP data to calculate cost growth for individual programs where possible because the document is provided to Congress and constitutes DHS's most authoritative out-year funding plan. 
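The cost-growth measurement described above amounts to aggregating each program's annual FYHSP funding levels into a total project cost and then computing the percent change between the two FYHSP versions. The sketch below illustrates that calculation; the program names, dollar figures (in billions), and function names are hypothetical, not actual FYHSP data.

```python
# Illustrative sketch of the total-project-cost growth comparison described
# above. All program names and dollar figures (in billions) are hypothetical.

def total_cost(funding_levels):
    """Aggregate a program's annual FYHSP funding levels into a total project cost."""
    return sum(funding_levels)

def percent_growth(baseline, current):
    """Percent change from the baseline total to the current total."""
    return (current - baseline) / baseline * 100

# Hypothetical annual funding levels reported in the two FYHSP versions.
fyhsp_2008 = {"Program A": [1.0, 1.5, 1.0], "Program B": [2.0, 2.5, 2.0]}
fyhsp_2011 = {"Program A": [1.5, 2.0, 2.5], "Program B": [3.0, 3.5, 3.0]}

baseline = sum(total_cost(levels) for levels in fyhsp_2008.values())  # 10.0
current = sum(total_cost(levels) for levels in fyhsp_2011.values())   # 15.5
print(f"Aggregate growth: {percent_growth(baseline, current):.0f}%")  # Aggregate growth: 55%
```

Because both FYHSPs report total project costs rather than rolling five-year windows, the comparison captures changes in lifetime program cost, which is the point the report makes in responding to DHS's comment.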
To determine the extent to which DHS policies and processes are in place to effectively manage individual acquisition programs, as well as the department's acquisition portfolio as a whole, we identified key acquisition management practices and assessed the extent to which DHS policies and processes reflected those practices. We identified the key practices through a review of previous GAO reports, which are listed in appendix II. We compared DHS Acquisition Directive 102-01 (AD 102), an associated guidebook--DHS Instruction Manual 102-01-001--and the guidebook's 12 appendixes to those key practices, and identified the extent to which they were reflected in the department's acquisition policy using a basic scoring system. If the DHS policy reflected a particular key practice, we assigned the policy a score of 5 for that practice. If the policy did not reflect the key practice, we assigned it a score of 1. We then took the average score for all the key practices in a particular area--as identified in appendix II--to establish an overall score for each key practice area. We concluded that key practice areas scoring a 5 were reflected in the policy, those scoring a 4 were substantially reflected, those scoring a 3 were partially reflected, and those scoring a 2 were minimally reflected. We subsequently met with PARM officials to discuss our analysis, identify relevant sections of the policy that we had not yet accounted for, and solicit their thoughts on those key practices that were not reflected in the policy. In order to assess DHS's processes for implementing its policy, we surveyed program managers, and interviewed component- and department-level officials. We also reviewed DHS's plans for the Integrated Investment Life Cycle Model (IILCM), which is being designed to better integrate the department's investment management functions. 
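The scoring system described above is simple enough to sketch in a few lines. The snippet below is illustrative only: the function names and the sample reflected/not-reflected judgments are hypothetical, and because the report does not specify how non-integer averages were handled, the threshold mapping shown is one plausible reading.

```python
# Illustrative sketch of the scoring system described above.
# Judgments and threshold handling for non-integer averages are assumptions.

def area_score(practice_reflected):
    """Average the per-practice scores: 5 if a practice is reflected in policy, 1 if not."""
    scores = [5 if reflected else 1 for reflected in practice_reflected]
    return sum(scores) / len(scores)

def rating(avg):
    """Map an average area score to the qualitative rating used in the report."""
    if avg >= 5:
        return "reflected"
    if avg >= 4:
        return "substantially reflected"
    if avg >= 3:
        return "partially reflected"
    if avg >= 2:
        return "minimally reflected"
    return "not reflected"

# Hypothetical key practice area with four practices, three reflected in policy.
judgments = [True, True, True, False]
avg = area_score(judgments)  # (5 + 5 + 5 + 1) / 4 = 4.0
print(rating(avg))           # substantially reflected
```

Under this reading, an area needs every practice reflected to score a 5, while a single unreflected practice drops a four-practice area to "substantially reflected."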
Further, we reviewed all acquisition decision memoranda documenting DHS executive review board decisions from November 2008 to April 2012, the March 2012 QPAR, and other management memos identifying available program-performance data, and any limitations of that data. To determine the extent to which DHS has taken actions to resolve the high-risk acquisition management issues we have identified in previous reports and this audit, we reviewed the first three versions of the DHS Integrated Strategy for High Risk Management--issued in January, June, and December 2011. We also reviewed the DHS Program Management and Execution Playbook, issued in December 2011. We identified initiatives intended to improve acquisition management, the department's progress in implementing those initiatives, and enduring challenges confronting the department. We also surveyed program managers, and interviewed component and department-level officials to obtain their perspectives on the initiatives. We conducted this performance audit from August 2011 to September 2012 in accordance with generally accepted government auditing standards. Those standards require that we plan and perform the audit to obtain sufficient, appropriate evidence to provide a reasonable basis for our findings and conclusions based on our audit objectives. We believe that the evidence obtained provides a reasonable basis for our findings and conclusions based on our audit objectives. [End of section] Appendix II: Key Acquisition Management Practices: To determine the extent to which the Department of Homeland Security (DHS) has policies and processes in place to effectively manage individual acquisition programs, and the department's acquisition portfolio as a whole, we identified key acquisition management practices established in our previous reports examining DHS, the Department of Defense, NASA, and private sector organizations. 
The specific program- and portfolio-management practices, as well as the reports where we previously identified the value of those practices, are presented below. Key Program Management Practices: The following list identifies several key practices that can improve outcomes when managing an individual program. Identify and validate needs: * A need statement should be informed by a comprehensive assessment that considers the organization's overall mission: * Current capabilities should be identified to determine if there is a gap between the current and needed capabilities: * Noncapital alternatives and the modernization of existing assets should be evaluated before deciding how best to meet the gap: * Needs should be communicated within the context of a business case: Assess alternatives to select most appropriate solution: * Analyses of alternatives (AOA) should compare the performance, costs, and risks of competing solutions, and identify the most promising system solution to acquire: * AOAs should be conducted early in the acquisition process, before requirements are set: * AOAs should be sufficiently broad to assess many alternatives across multiple concepts: Clearly establish well-defined requirements: * Programs should be grounded in well-understood concepts of how systems would be used and likely requirements costs: * Operators and stakeholders should be involved in the development of requirements: * Firm requirements should be presented in a business case at the outset of a program: * Requirements should be well defined to ensure clear communication about what the government needs: Develop realistic cost estimates and schedules: * A cost estimate should be well documented, and include source data, clearly detailed calculations, and explanations of why particular methods were chosen: * A cost estimate should be comprehensive enough to ensure that cost elements are neither omitted nor double counted: * A cost estimate should be accurate, unbiased, and based on 
an assessment of most likely costs: * A cost estimate should be credible, and discuss any limitations of the analysis because of uncertainty or assumptions; uncertainty analyses and independent cost estimates should be conducted: * A schedule should account for how long all activities will take--including a risk analysis--and identify resources needed to do the work: * A schedule should identify relationships between sequenced activities--including the critical path representing the schedule's longest total duration--and how much a predecessor activity can slip before affecting successor activities: * A schedule should be planned so that critical project dates can be met, and continuously updated using logic and durations: Secure stable funding that matches resources to requirements: * Technologies, time, funding, and other resources should be consistent with established requirements before development begins: * Programs should invest in systems engineering resources early, and make trade-offs by reducing or deferring requirements, in order to address capability needs in achievable increments with shorter cycle times and more predictable funding needs: * Prior to the initiation of system development activities, realistic cost estimates--validated against independent cost estimates--should be developed to establish a sound basis for acquiring new systems: * Acquisition strategies should provide sufficient time and money for design activities before construction start: * Projects should be budgeted in useful segments when dealing with a capped budget environment: Demonstrate technology, design, and manufacturing maturity: * Program officials should maintain regular communication with the contractor to track technical performance, risks, and issues: * Prior to the start of system development, critical technologies should be demonstrated to work in their intended environment: * Prior to a critical design review, design should be stable, and a prototype should 
demonstrate that the design can meet requirements: * Prior to a production decision, (a) a fully integrated, capable prototype should demonstrate that the system will work as intended in a reliable manner; and (b) the program should demonstrate that manufacturing processes are stable: * Prior to formal operator acceptance, operators should participate in the testing of system functionality: Utilize milestones and exit criteria: * Exit criteria and decision reviews should be used to determine that product managers captured required and appropriate knowledge--including a more refined cost estimate--before a program moves forward to the next acquisition phase: * To the extent possible, exit criteria should be quantifiable, and decision reviews should be consistent across programs: * Independent program assessments should be conducted at each key decision point: Establish an adequate program workforce: * The right people, with the right skill sets, should be assigned to the right programs: * Program managers should stay on until the next major milestone to assure accountability; government and contractor staff should also remain consistent: Key Portfolio Management Practices: The following list identifies several key practices that can improve outcomes when managing a portfolio of multiple programs. 
Clearly define and empower leadership: * Those responsible for product investment decisions and oversight should be clearly identified and held accountable for outcomes: * Portfolio managers should be empowered to make decisions about the best way to invest resources: * Portfolio managers should be supported with cross-functional teams composed of representatives from key functional areas: Establish standard assessment criteria, and demonstrate comprehensive knowledge of the portfolio: * Specific criteria should be used to ensure transparency and comparability across alternatives: * Investments should be ranked and selected using a disciplined process to assess the costs, benefits, and risks of alternative products: * Knowledge should encompass the entire portfolio, including needs, gaps, and how to best meet the gaps: Prioritize investments by integrating the requirements, acquisition, and budget processes: * Requirements, acquisition, and budget processes should be connected to promote stability and accountability: * Organizations should use an integrated approach to prioritize needs and allocate resources, so they can avoid pursuing more products than they can afford, and optimize return on investment: * Resource allocation across the portfolio should align with strategic goals/objectives, and investment review policy should use long-range planning: Continually make go/no-go decisions to rebalance the portfolio: * Program requirements should be reviewed annually to make recommendations on proposed changes/descoping options: * As potential new products are identified, portfolios should be rebalanced based on those that add the most value: * If project estimates breach established thresholds, the product should be immediately reassessed within the context of the portfolio to determine whether it is still relevant and affordable: * Agencies should use information gathered from post-implementation reviews of investments, as well as information learned from other 
organizations, to fine-tune the investment process and the portfolios to shape strategic outcomes: Previous Reports Establishing Key Acquisition Management Practices: Information Technology: Critical Factors Underlying Successful Major Acquisitions. [hyperlink, http://www.gao.gov/products/GAO-12-7]. Washington, D.C.: October 21, 2011. Acquisition Planning: Opportunities to Build Strong Foundations for Better Services Contracts. [hyperlink, http://www.gao.gov/products/GAO-11-672]. Washington, D.C.: August 9, 2011. NASA: Assessments of Selected Large-Scale Projects. [hyperlink, http://www.gao.gov/products/GAO-11-239SP]. Washington, D.C.: March 3, 2011. Defense Acquisitions: Assessments of Selected Weapon Programs. [hyperlink, http://www.gao.gov/products/GAO-11-233SP]. Washington, D.C.: March 29, 2011. Defense Acquisitions: Strong Leadership Is Key to Planning and Executing Stable Weapon Programs. [hyperlink, http://www.gao.gov/products/GAO-10-522]. Washington, D.C.: May 6, 2010. Defense Acquisitions: Assessments of Selected Weapon Programs. [hyperlink, http://www.gao.gov/products/GAO-10-388SP]. Washington, D.C.: March 30, 2010. Defense Acquisitions: Many Analyses of Alternatives Have Not Provided a Robust Assessment of Weapon Systems Options. [hyperlink, http://www.gao.gov/products/GAO-09-665]. Washington, D.C.: September 24, 2009. Department of Homeland Security: Billions Invested in Major Programs Lack Appropriate Oversight. [hyperlink, http://www.gao.gov/products/GAO-09-29]. Washington, D.C.: November 18, 2008. GAO Cost Estimating and Assessment Guide: Best Practices for Developing and Managing Capital Program Costs. [hyperlink, http://www.gao.gov/products/GAO-09-3SP]. Washington, D.C.: March 2009. Defense Acquisitions: Sound Business Case Needed to Implement Missile Defense Agency's Targets Program. [hyperlink, http://www.gao.gov/products/GAO-08-1113]. Washington, D.C.: September 26, 2008. 
Defense Acquisitions: A Knowledge-Based Funding Approach Could Improve Major Weapon System Program Outcomes. [hyperlink, http://www.gao.gov/products/GAO-08-619]. Washington, D.C.: July 2, 2008. Defense Acquisitions: Realistic Business Cases Needed to Execute Navy Shipbuilding Programs. [hyperlink, http://www.gao.gov/products/GAO-07-943T]. Washington, D.C.: July 24, 2007. Best Practices: An Integrated Portfolio Management Approach to Weapon System Investments Could Improve DOD's Acquisition Outcomes. [hyperlink, http://www.gao.gov/products/GAO-07-388]. Washington, D.C.: March 30, 2007. Best Practices: Better Support of Weapon System Program Managers Needed to Improve Outcomes. [hyperlink, http://www.gao.gov/products/GAO-06-110]. Washington, D.C.: November 30, 2005. NASA's Space Vision: Business Case for Prometheus 1 Needed to Ensure Requirements Match Available Resources. [hyperlink, http://www.gao.gov/products/GAO-05-242]. Washington, D.C.: February 28, 2005. Information Technology Investment Management: A Framework for Assessing and Improving Process Maturity. [hyperlink, http://www.gao.gov/products/GAO-04-394G]. Washington, D.C.: March 2004. Executive Guide: Leading Practices in Capital Decision-Making. [hyperlink, http://www.gao.gov/products/GAO/AIMD-99-32]. Washington, D.C.: December 1998. [End of section] Appendix III: Program Office Survey Results: To help determine the extent to which major Department of Homeland Security (DHS) acquisition programs face challenges increasing the risk of poor outcomes, we surveyed the program managers for all 77 programs, and received usable responses from 71 programs (92 percent response rate). The web-based survey was administered from January 12, 2012, to March 30, 2012. We present the survey questions we asked and summarize the responses we received below. 
Introduction: The Senate Homeland Security and Governmental Affairs Committee has asked the Government Accountability Office (GAO) to examine the progress the Department of Homeland Security (DHS) has made in improving its acquisition management. In order to gain the program manager's perspective on acquisition management activities being implemented and challenges that may impede achieving a program's cost, schedule, and performance objectives, we are conducting a survey of all major DHS acquisition programs. Since we do not plan on scheduling meetings with each program, this survey is your primary opportunity to share how you are managing your program and any challenges you face. Additionally, information from this survey will help determine whether DHS's planned acquisition management initiatives may mitigate challenges facing program management offices or whether any additional efforts may be needed. We ask that you please complete this survey within 2 weeks of receipt. The survey should take about 60 to 90 minutes to complete and can be completed over multiple sittings. Your responses can be saved and accessed at a later date. If you are unsure of how to respond to a question, please contact us for assistance. We will not release individually identifiable data outside of GAO, unless required by law or requested by a member of Congress. We will generally report the results of this survey in the aggregate, but if we incorporate individual responses into the report, we will do so in a manner designed to ensure that individual respondents cannot be identified. Thank you for your time and assistance. Section I: Program Profile: 1. What is your program's name? data intentionally not reported: 2. How many years and/or months of experience do you have as a program manager at DHS and outside of DHS? a. At DHS: Year(s): Mean: 5.6; Median: 5; Minimum: 0; Maximum: 31; Number of respondents: 70. Month(s): Mean: 5.3; Median: 6; Minimum: 0; Maximum: 10; Number of respondents: 56. 
b. Outside of DHS: Year(s): Mean: 10.6; Median: 10; Minimum: 0; Maximum: 40; Number of respondents: 52. Month(s): Mean: 2.1; Median: 0; Minimum: 0; Maximum: 11; Number of respondents: 28. 3. In what phase(s) of the DHS Acquisition Directive (AD) 102 acquisition lifecycle is your program currently? (select all that apply): 1. Need (Prior to Acquisition Decision Event (ADE) 1): Not checked: 64; checked: 6; Number of respondents: 70. 2. Analyze/Select (Between ADE 1 and ADE 2): Not checked: 60; checked: 10; Number of respondents: 70. 3. Obtain (Between ADE 2 and ADE 3): Not checked: 42; checked: 28; Number of respondents: 70. 4. Production/deploy/support (Post ADE 3): Not checked: 23; checked: 47; Number of respondents: 70. 5. Do not know: Not checked: 69; checked: 1; Number of respondents: 70. Section II: DHS Acquisition Guidance, AD 102: 4. How often, if at all, do you refer to each of the following resources to help you manage your acquisition program? DHS AD 102 guidebook and appendices: Daily: 1; Weekly: 12; Monthly: 17; Semi-annually: 20; Annually: 7; I do not refer to this source: 11; Number of respondents: 68. Component's guidance: Daily: 4; Weekly: 17; Monthly: 28; Semi-annually: 7; Annually: 4; I do not refer to this source: 9; Number of respondents: 69. Component Acquisition Executive (CAE) staff: Daily: 7; Weekly: 23; Monthly: 19; Semi-annually: 10; Annually: 5; I do not refer to this source: 4; Number of respondents: 68. DHS Acquisition Program Management Division (APMD)/Program Analysis and Risk Management (PARM): Daily: 1; Weekly: 9; Monthly: 17; Semi-annually: 23; Annually: 10; I do not refer to this source: 9; Number of respondents: 69. Other (please specify below): Daily: 3; Weekly: 4; Monthly: 7; Semi-annually: 2; Annually: 0; I do not refer to this source: 10; Number of respondents: 26. If other resource, please specify: data intentionally not reported: 5. 
Which of the following DHS-provided opportunities, if any, has your program management office used to understand DHS acquisition guidance, AD 102, and if used, how useful was the opportunity, if at all?
1. Check box at left if your program is not required to follow AD 102, and then click here to skip to question 13: Not checked: 68; checked: 3; Number of respondents: 71.

Training session(s) on AD 102 hosted by DHS headquarters:
Did your program management office use this opportunity? Yes: 30; No, the opportunity is available but we did not use it: 10; No, opportunity is not available: 12; Do not know: 11; Number of respondents: 63.
If used, how useful was the opportunity, if at all? Very useful: 11; Somewhat useful: 16; Not at all useful: 1; No opinion: 2; Number of respondents: 30.

Manuals and templates for implementing AD 102 provided by DHS headquarters:
Did your program management office use this opportunity? Yes: 50; No, the opportunity is available but we did not use it: 8; No, opportunity is not available: 4; Do not know: 4; Number of respondents: 66.
If used, how useful was the opportunity, if at all? Very useful: 17; Somewhat useful: 29; Not at all useful: 2; No opinion: 2; Number of respondents: 50.

Direct support for your program from DHS Acquisition Program Management Division (APMD)/Program Analysis and Risk Management (PARM):
Did your program management office use this opportunity? Yes: 49; No, the opportunity is available but we did not use it: 9; No, opportunity is not available: 4; Do not know: 3; Number of respondents: 65.
If used, how useful was the opportunity, if at all? Very useful: 24; Somewhat useful: 22; Not at all useful: 2; No opinion: 1; Number of respondents: 49.

Other (please specify below):
Did your program management office use this opportunity? Yes: 17; No, the opportunity is available but we did not use it: 1; No, opportunity is not available: 3; Do not know: 6; Number of respondents: 27.
If used, how useful was the opportunity, if at all? Very useful: 11; Somewhat useful: 2; Not at all useful: 1; No opinion: 2; Number of respondents: 16.
If other opportunity, please specify: data intentionally not reported.

6. How clear or unclear is the DHS AD 102 acquisition guidance and framework for managing the following types of acquisitions?
Developmental: Very clear: 11; Somewhat clear: 22; Neither clear nor unclear: 4; Somewhat unclear: 4; Very unclear: 0; No opinion: 22; Number of respondents: 63.
Non-developmental: Very clear: 4; Somewhat clear: 23; Neither clear nor unclear: 5; Somewhat unclear: 5; Very unclear: 1; No opinion: 24; Number of respondents: 62.
IT systems: Very clear: 12; Somewhat clear: 19; Neither clear nor unclear: 5; Somewhat unclear: 5; Very unclear: 1; No opinion: 22; Number of respondents: 64.
Services: Very clear: 7; Somewhat clear: 21; Neither clear nor unclear: 6; Somewhat unclear: 4; Very unclear: 4; No opinion: 20; Number of respondents: 62.

7. How clear or unclear is DHS AD 102 acquisition guidance, including the guidebook and appendices, regarding each of the following?
Understanding the Acquisition Review Board (ARB)'s roles and responsibilities: Very clear: 23; Somewhat clear: 26; Neither clear nor unclear: 3; Somewhat unclear: 5; Very unclear: 1; No opinion: 7; Number of respondents: 65.
Understanding the Component Acquisition Executive (CAE)'s roles and responsibilities: Very clear: 25; Somewhat clear: 20; Neither clear nor unclear: 7; Somewhat unclear: 5; Very unclear: 1; No opinion: 7; Number of respondents: 65.
Implementing the Systems Engineering Life Cycle (SELC) framework to ensure capabilities are effectively delivered: Very clear: 15; Somewhat clear: 24; Neither clear nor unclear: 6; Somewhat unclear: 7; Very unclear: 3; No opinion: 10; Number of respondents: 65.
Conducting testing and evaluation: Very clear: 13; Somewhat clear: 28; Neither clear nor unclear: 4; Somewhat unclear: 6; Very unclear: 3; No opinion: 11; Number of respondents: 65.
Managing and prioritizing requirements across multiple DHS components: Very clear: 4; Somewhat clear: 18; Neither clear nor unclear: 10; Somewhat unclear: 12; Very unclear: 5; No opinion: 16; Number of respondents: 65.
Aligning AD 102 guidance with Capital Planning and Investment Control (CPIC) and Planning, Programming, Budgeting, and Execution (PPBE) guidance to match resources and requirements: Very clear: 10; Somewhat clear: 24; Neither clear nor unclear: 9; Somewhat unclear: 9; Very unclear: 5; No opinion: 8; Number of respondents: 65.
Aligning DHS and your component's guidance: Very clear: 8; Somewhat clear: 20; Neither clear nor unclear: 8; Somewhat unclear: 10; Very unclear: 8; No opinion: 9; Number of respondents: 63.
Other (please specify below): Very clear: 2; Somewhat clear: 2; Neither clear nor unclear: 1; Somewhat unclear: 1; Very unclear: 1; No opinion: 15; Number of respondents: 22.
[End of table]
If other, please specify: data intentionally not reported.

8. How clear or unclear is DHS AD 102 acquisition guidance, including the guidebook and appendices, on how to develop each of the following key acquisition documents?
Mission Needs Statement (MNS): Very clear: 27; Somewhat clear: 25; Neither clear nor unclear: 2; Somewhat unclear: 2; Very unclear: 0; No opinion: 8; Number of respondents: 64.
Operational Requirements Document (ORD): Very clear: 23; Somewhat clear: 24; Neither clear nor unclear: 4; Somewhat unclear: 3; Very unclear: 2; No opinion: 7; Number of respondents: 63.
Life Cycle Cost Estimate (LCCE): Very clear: 13; Somewhat clear: 28; Neither clear nor unclear: 8; Somewhat unclear: 4; Very unclear: 4; No opinion: 7; Number of respondents: 64.
Acquisition Program Baseline (APB): Very clear: 24; Somewhat clear: 22; Neither clear nor unclear: 6; Somewhat unclear: 4; Very unclear: 2; No opinion: 6; Number of respondents: 64.
Analysis of Alternatives (AOA): Very clear: 19; Somewhat clear: 24; Neither clear nor unclear: 9; Somewhat unclear: 2; Very unclear: 3; No opinion: 7; Number of respondents: 64.
Test and Evaluation Master Plan (TEMP): Very clear: 18; Somewhat clear: 21; Neither clear nor unclear: 7; Somewhat unclear: 5; Very unclear: 2; No opinion: 11; Number of respondents: 64.
Integrated Logistics Support Plan (ILSP): Very clear: 15; Somewhat clear: 23; Neither clear nor unclear: 8; Somewhat unclear: 6; Very unclear: 3; No opinion: 9; Number of respondents: 64.

9. How long is the average component and DHS review period for key acquisition documents required by AD 102 (e.g., MNS, ORD, LCCE, and APB)?
Component review: Less than 1 month: 6; 1 month to 3 months: 22; 3 months to 6 months: 20; 6 months to 1 year: 6; More than 1 year: 2; Not applicable: 5; Number of respondents: 61.
DHS review: Less than 1 month: 1; 1 month to 3 months: 15; 3 months to 6 months: 24; 6 months to 1 year: 10; More than 1 year: 5; Not applicable: 8; Number of respondents: 63.

10. After an Acquisition Review Board (ARB) review, how adequately does an Acquisition Decision Memo (ADM) communicate action items?
Very adequately: 26; Somewhat adequately: 25; Not at all adequately: 2; No opinion: 6; Not applicable: 5; Number of respondents: 64.

11. How has the introduction of AD 102 helped or hindered your ability to manage your program's cost and schedule and the overall acquisition program?
Program's cost and schedule: Significantly helped: 8; Somewhat helped: 16; Neither helped nor hindered: 22; Somewhat hindered: 7; Significantly hindered: 3; No opinion: 9; Number of respondents: 65.
Overall ability to manage your acquisition program: Significantly helped: 7; Somewhat helped: 21; Neither helped nor hindered: 18; Somewhat hindered: 6; Significantly hindered: 5; No opinion: 8; Number of respondents: 65.

12. If you would like to elaborate on any of your previous responses regarding the clarity and/or implementation of DHS acquisition guidance (AD 102), please use the following space.
data intentionally not reported.

Section III: Program Performance:

13. Does your program have a DHS-approved Acquisition Program Baseline (APB)?
Yes: 40; No (please explain below): 31; Number of respondents: 71.

13a. If your program does not have a DHS-approved APB, please explain why it does not have one in the box below.
data intentionally not reported.

After answering 13a above:

14. How does your program's current projected cost compare against its DHS-approved APB?
Cost is below the APB: 8; Cost meets the APB: 23; Cost exceeds the APB by less than 8 percent: 4; Cost exceeds the APB by 8 percent or more: 5; Number of respondents: 40.

15. How does your program's current projected schedule compare to its DHS-approved APB?
Schedule is ahead of the APB: 3; Schedule meets the APB: 21; Schedule is behind the APB: 16; Number of respondents: 40.

16. How do your program's current planned system capabilities compare to its DHS-approved APB?
System capabilities exceed APB: 1; System capabilities meet APB: 31; System capabilities fall short of APB: 7; Number of respondents: 39.

17. How frequently, if at all, does your program management office use each of the following performance metrics to monitor your program's progress?
Earned Value Management data: Used all or most of the time: 24; Used sometimes: 9; Used rarely or not at all: 9; Do not know: 1; Not applicable: 27; Number of respondents: 70.
Integrated Master Schedule: Used all or most of the time: 42; Used sometimes: 17; Used rarely or not at all: 2; Do not know: 1; Not applicable: 8; Number of respondents: 70.
Technology readiness levels to assess the maturity of critical technologies: Used all or most of the time: 19; Used sometimes: 22; Used rarely or not at all: 11; Do not know: 1; Not applicable: 17; Number of respondents: 70.
Percentage of design drawings completed: Used all or most of the time: 20; Used sometimes: 5; Used rarely or not at all: 12; Do not know: 3; Not applicable: 29; Number of respondents: 69.
Percentage of production processes under statistical control: Used all or most of the time: 16; Used sometimes: 8; Used rarely or not at all: 14; Do not know: 1; Not applicable: 31; Number of respondents: 70.
Other (please specify below): Used all or most of the time: 11; Used sometimes: 0; Used rarely or not at all: 2; Do not know: 1; Not applicable: 9; Number of respondents: 23.
If other performance metrics, please specify: data intentionally not reported.

Section IV: Cost Estimates and Program Schedule:

18. If your cost estimate was developed after AD 102 went into effect, to what extent did the GAO Cost Estimating Guide (i.e., Appendix I to DHS's AD 102 guidebook) inform your efforts to develop your program's LCCE?
Greatly informed: 29; Somewhat informed: 17; Did not inform: 1; Program does not have a LCCE: 2; Not applicable, cost estimate was developed prior to AD 102: 21; Number of respondents: 70.

19. What obstacles, if any, did your program management office encounter when using the GAO Cost Estimating Guide to develop your program's LCCE?
data intentionally not reported.

20. How fully, if at all, does your integrated master schedule (IMS) account for each of the following practices?
Capturing and sequencing all activities: Fully accounts for: 38; Partially accounts for: 21; Does not account for: 4; Do not know: 5; Number of respondents: 68.
Assigning resources (i.e., dollars, FTEs, etc.) to all activities: Fully accounts for: 12; Partially accounts for: 32; Does not account for: 19; Do not know: 5; Number of respondents: 68.
Establishing duration of all activities, noting start and end dates, and keeping durations as short as possible: Fully accounts for: 38; Partially accounts for: 20; Does not account for: 4; Do not know: 5; Number of respondents: 67.
Establishing a critical path to examine the effects of any activities slipping along this path: Fully accounts for: 38; Partially accounts for: 18; Does not account for: 7; Do not know: 5; Number of respondents: 68.
Identifying float (i.e., the time that a predecessor activity can slip before the delay affects successor activities) between activities: Fully accounts for: 33; Partially accounts for: 25; Does not account for: 4; Do not know: 5; Number of respondents: 67.
Updating the schedule to ensure valid status dates are captured: Fully accounts for: 42; Partially accounts for: 19; Does not account for: 2; Do not know: 5; Number of respondents: 68.

Section V: Requirements:

21. Have each of the following documents been approved by DHS leadership for your program?
Mission Needs Statement (MNS): Yes, approved: 44; Submitted to DHS leadership, not yet approved: 6; Not approved: 14; Do not know: 4; Number of respondents: 68.
Operational Requirements Document (ORD): Yes, approved: 29; Submitted to DHS leadership, not yet approved: 10; Not approved: 22; Do not know: 5; Number of respondents: 66.
Test and Evaluation Master Plan (TEMP): Yes, approved: 31; Submitted to DHS leadership, not yet approved: 7; Not approved: 22; Do not know: 6; Number of respondents: 66.
Integrated Logistics Support Plan (ILSP): Yes, approved: 28; Submitted to DHS leadership, not yet approved: 9; Not approved: 23; Do not know: 8; Number of respondents: 68.

22. When setting operational requirements, which of the following processes best describes your program's efforts to consider alternatives at the program level?
An Analysis of Alternatives (AOA) was conducted prior to operational requirements being set: 27; An AOA was conducted after operational requirements were set: 18; A trade-off analysis was conducted prior to operational requirements being set: 5; A trade-off analysis was conducted after operational requirements were set: 1; No AOA or trade-off analysis was conducted: 5; Do not know: 3; Not applicable, operational requirements have not yet been set: 4; Number of respondents: 63.

23. At the initiation of design and development activities, how useful were each of the following stakeholders' contributions when defining proposed system capabilities/requirements?
1. Check box at left if design and development activities have not yet been initiated, and then click here to skip to question 28: Not checked: 61; checked: 10; Number of respondents: 71.
End-user(s): Very useful: 46; Somewhat useful: 9; Not useful: 1; No opinion: 2; Did not seek input from stakeholder: 0; Number of respondents: 58.
Subject Matter Expert(s): Very useful: 51; Somewhat useful: 5; Not useful: 0; No opinion: 2; Did not seek input from stakeholder: 0; Number of respondents: 58.
Systems Engineer(s): Very useful: 37; Somewhat useful: 13; Not useful: 0; No opinion: 2; Did not seek input from stakeholder: 4; Number of respondents: 56.
Component's CAE office: Very useful: 12; Somewhat useful: 18; Not useful: 9; No opinion: 8; Did not seek input from stakeholder: 11; Number of respondents: 58.
APMD/PARM: Very useful: 7; Somewhat useful: 14; Not useful: 5; No opinion: 12; Did not seek input from stakeholder: 20; Number of respondents: 58.
Test & Evaluation and Standards Division: Very useful: 16; Somewhat useful: 12; Not useful: 2; No opinion: 11; Did not seek input from stakeholder: 16; Number of respondents: 57.
DHS Office of the Chief Information Officer: Very useful: 13; Somewhat useful: 9; Not useful: 9; No opinion: 6; Did not seek input from stakeholder: 21; Number of respondents: 58.
DHS Office of Finance (including PA&E): Very useful: 4; Somewhat useful: 10; Not useful: 7; No opinion: 10; Did not seek input from stakeholder: 27; Number of respondents: 58.
DHS Office of Policy: Very useful: 6; Somewhat useful: 8; Not useful: 7; No opinion: 10; Did not seek input from stakeholder: 25; Number of respondents: 56.
DHS Acquisition Review Board (ARB): Very useful: 15; Somewhat useful: 14; Not useful: 4; No opinion: 7; Did not seek input from stakeholder: 17; Number of respondents: 57.
Federal Partners: Very useful: 27; Somewhat useful: 14; Not useful: 1; No opinion: 7; Did not seek input from stakeholder: 9; Number of respondents: 58.
Other (please specify below): Very useful: 9; Somewhat useful: 2; Not useful: 0; No opinion: 1; Did not seek input from stakeholder: 5; Number of respondents: 17.
If other stakeholders, please specify: data intentionally not reported.

24. How many key performance parameters (KPPs) did the program have at development start, and how many KPPs does the program currently have?
24a. At start: Mean: 7.0; Median: 5; Minimum: 0; Maximum: 51; Number of respondents: 52.
24b. Currently: Mean: 8.0; Median: 5; Minimum: 0; Maximum: 51; Number of respondents: 53.

25. If your program's KPPs have changed or been redefined since development activities began (ADE 2A), how clear or unclear was the process for each of the following?
1. Check box at left if there have been no revisions to the KPPs, and then click here to skip to question 27: Not checked: 28; checked: 32; Number of respondents: 60.
Revising key acquisition documents to reflect the change in requirements: Very clear: 10; Somewhat clear: 10; Neither clear nor unclear: 1; Somewhat unclear: 1; Very unclear: 1; No opinion: 3; Number of respondents: 26.
Obtaining component leadership approval: Very clear: 13; Somewhat clear: 10; Neither clear nor unclear: 1; Somewhat unclear: 1; Very unclear: 0; No opinion: 1; Number of respondents: 26.
Obtaining DHS leadership approval: Very clear: 10; Somewhat clear: 8; Neither clear nor unclear: 2; Somewhat unclear: 3; Very unclear: 1; No opinion: 2; Number of respondents: 26.

26. Which of the following are reasons your program's KPPs have changed or been redefined since development activities began (ADE 2A)?
To clearly communicate performance parameters to the contractor: A reason: 9; Not a reason: 12; Do not know: 3; Number of respondents: 24.
To make requirements measurable for testing: A reason: 12; Not a reason: 10; Do not know: 2; Number of respondents: 24.
Desired performance could not be met with current technology: A reason: 6; Not a reason: 15; Do not know: 3; Number of respondents: 24.
Associated capabilities were determined unnecessary: A reason: 6; Not a reason: 14; Do not know: 2; Number of respondents: 22.
To demonstrate traceability between MNS, ORD, and TEMP: A reason: 10; Not a reason: 11; Do not know: 3; Number of respondents: 24.
Other (please specify below): A reason: 4; Not a reason: 1; Do not know: 1; Number of respondents: 6.
If other reasons, please specify: data intentionally not reported.

27. Since your program's design and development activities began (ADE 2A), how have each of the following factors affected your planned capabilities, if at all?
Funding availability: Resulted in increased planned capabilities: 11; Planned capabilities remained the same: 16; Resulted in decreased planned capabilities: 21; Not a factor: 9; Do not know: 1; Number of respondents: 58.
Changes in the program's schedule: Resulted in increased planned capabilities: 8; Planned capabilities remained the same: 18; Resulted in decreased planned capabilities: 15; Not a factor: 16; Do not know: 0; Number of respondents: 57.
Mission(s) changed: Resulted in increased planned capabilities: 9; Planned capabilities remained the same: 11; Resulted in decreased planned capabilities: 2; Not a factor: 35; Do not know: 0; Number of respondents: 57.
Key Performance Parameter(s) (KPP) changed: Resulted in increased planned capabilities: 5; Planned capabilities remained the same: 13; Resulted in decreased planned capabilities: 2; Not a factor: 37; Do not know: 0; Number of respondents: 57.
End-user(s) input: Resulted in increased planned capabilities: 21; Planned capabilities remained the same: 19; Resulted in decreased planned capabilities: 2; Not a factor: 15; Do not know: 0; Number of respondents: 57.
Technology development efforts/availability: Resulted in increased planned capabilities: 11; Planned capabilities remained the same: 15; Resulted in decreased planned capabilities: 6; Not a factor: 25; Do not know: 0; Number of respondents: 57.
Other (please specify below): Resulted in increased planned capabilities: 0; Planned capabilities remained the same: 0; Resulted in decreased planned capabilities: 2; Not a factor: 8; Do not know: 2; Number of respondents: 12.
[End of table]
If other factors, please specify: data intentionally not reported.

Section VI: Technology Maturity:

28. Prior to the initiation of development activities (ADE 2A), how many of your program's critical technologies demonstrated full functionality in realistic environments, if any?
All critical technologies demonstrated full functionality: 25; Some critical technologies demonstrated full functionality: 19; No critical technologies demonstrated full functionality: 1; Do not know: 5; Not applicable, critical technologies had not been identified at that time: 5; Not applicable, the program has not initiated development activities: 11; Number of respondents: 66.

29. Prior to the initiation of low-rate initial production (ADE 2C), how many reliability goals were met, if any, by production-representative prototypes demonstrated in the intended environment?
Prototypes met all reliability goals: 14; Prototypes met some reliability goals: 13; Prototypes met no reliability goals: 0; Do not know: 5; Not applicable, the program has not initiated low-rate initial production: 14; Not applicable, the program will not use low-rate initial production: 21; Number of respondents: 67.

30. Has the program used an independent testing authority?
Yes: 35; No: 15; Do not know: 0; Not applicable: 15; Number of respondents: 65.

Section VII: Resource Allocation:

31. Does your program rely on any funds other than DHS appropriations?
Yes: 21; No: 49; Do not know: 0; Number of respondents: 70.

32. Does your program use a five-year funding plan to project resource needs?
Yes: 65; No: 3; Do not know: 0; Number of respondents: 68.

33. Did your program's funding levels in each of the following budget documents meet your program's required funding needs as reflected in your APB?
1. Check box at left if your program does not have an APB, and then click here to skip to question 34: Not checked: 56; checked: 15; Number of respondents: 71.
At program start, component's commitment in a Resource Allocation Plan (RAP): Yes, funds were equivalent: 38; No, funds in the document were above the APB: 2; No, funds in the document were below the APB: 6; Do not know: 5; Not applicable: 4; Number of respondents: 55.
At program start, DHS's commitment in a Resource Allocation Decision (RAD): Yes, funds were equivalent: 33; No, funds in the document were above the APB: 0; No, funds in the document were below the APB: 8; Do not know: 6; Not applicable: 7; Number of respondents: 54.
Component's FY11 RAP: Yes, funds were equivalent: 29; No, funds in the document were above the APB: 3; No, funds in the document were below the APB: 16; Do not know: 2; Not applicable: 2; Number of respondents: 52.
DHS's FY11 RAD: Yes, funds were equivalent: 30; No, funds in the document were above the APB: 1; No, funds in the document were below the APB: 17; Do not know: 2; Not applicable: 2; Number of respondents: 52.
OMB's FY11 decision: Yes, funds were equivalent: 30; No, funds in the document were above the APB: 2; No, funds in the document were below the APB: 16; Do not know: 2; Not applicable: 1; Number of respondents: 51.
FY11 budget request submitted to Congress: Yes, funds were equivalent: 31; No, funds in the document were above the APB: 1; No, funds in the document were below the APB: 15; Do not know: 1; Not applicable: 4; Number of respondents: 52.
FY11 Congressional enacted funds: Yes, funds were equivalent: 28; No, funds in the document were above the APB: 2; No, funds in the document were below the APB: 17; Do not know: 1; Not applicable: 4; Number of respondents: 52.

34. Which of the following events, if any, have contributed to overall funding instability (e.g., a change in planned out-year funding from one five-year funding plan to the next five-year funding plan)?
1. Check box at left if anticipated funding levels have not changed from one year to the next, and then click here to skip to question 36: Not checked: 61; checked: 10; Number of respondents: 71.
Congressional mark did not match the President's budget: Event occurred and out-year funding increased: 5; Event occurred but no change to out-year funding: 4; Event occurred and out-year funding decreased: 17; Event did not occur: 17; Not applicable: 15; Number of respondents: 58.
Continuing Resolution: Event occurred and out-year funding increased: 3; Event occurred but no change to out-year funding: 30; Event occurred and out-year funding decreased: 10; Event did not occur: 2; Not applicable: 14; Number of respondents: 59.
Program delays resulting from technical challenges: Event occurred and out-year funding increased: 0; Event occurred but no change to out-year funding: 12; Event occurred and out-year funding decreased: 3; Event did not occur: 25; Not applicable: 18; Number of respondents: 58.
Mission/requirements change: Event occurred and out-year funding increased: 3; Event occurred but no change to out-year funding: 7; Event occurred and out-year funding decreased: 7; Event did not occur: 25; Not applicable: 16; Number of respondents: 58.
Another program's acquisition funding levels affected this program's own planned funding: Event occurred and out-year funding increased: 1; Event occurred but no change to out-year funding: 2; Event occurred and out-year funding decreased: 18; Event did not occur: 17; Not applicable: 19; Number of respondents: 57.
Original cost estimates did not reflect true costs: Event occurred and out-year funding increased: 4; Event occurred but no change to out-year funding: 7; Event occurred and out-year funding decreased: 3; Event did not occur: 26; Not applicable: 18; Number of respondents: 58.
Other (please specify below): Event occurred and out-year funding increased: 2; Event occurred but no change to out-year funding: 0; Event occurred and out-year funding decreased: 3; Event did not occur: 0; Not applicable: 12; Number of respondents: 17.
If other events, please specify: data intentionally not reported.

35. If your program has experienced funding instability (e.g., a change in planned out-year funding from one five-year funding plan to the next five-year funding plan), did it affect your program in each of the following ways?
Increased costs by less than eight percent: Yes: 10; No: 24; Do not know: 3; Not applicable: 21; Number of respondents: 58.
Increased costs by eight percent or more: Yes: 9; No: 24; Do not know: 4; Not applicable: 21; Number of respondents: 58.
Resequencing of program's discrete segments, increments, or delivery of capabilities: Yes: 29; No: 9; Do not know: 2; Not applicable: 18; Number of respondents: 58.
Pushed out delivery and caused a schedule breach: Yes: 23; No: 13; Do not know: 3; Not applicable: 18; Number of respondents: 57.
Reduced performance parameters: Yes: 8; No: 29; Do not know: 2; Not applicable: 16; Number of respondents: 55.
Other (please specify below): Yes: 7; No: 1; Do not know: 0; Not applicable: 14; Number of respondents: 22.
If other events, please specify: data intentionally not reported.

36. If a gap existed between FY11 enacted funding and FY11 required funding, how effectively were you, as a program manager, able to directly communicate the impact on your program to DHS and component leadership?
1. Check box at left if your program did not experience a gap between FY11 enacted and required funding, and then click here to skip to question 37: Not checked: 36; checked: 35; Number of respondents: 71.
Component leadership (Component head, CAE, etc.): Very effectively: 19; Somewhat effectively: 5; Not effectively: 1; No opinion: 8; No direct communication: 2; Number of respondents: 35.
DHS leadership (Deputy Secretary, USM, PARM officials, etc.): Very effectively: 14; Somewhat effectively: 3; Not effectively: 1; No opinion: 7; No direct communication: 10; Number of respondents: 35.

37. If you would like to elaborate on how resource allocation (i.e., funding) has affected the program's ability to achieve cost, schedule, and performance goals, please use the following space.
data intentionally not reported.

Section VIII: Workforce:

38. Since the program was initially staffed, how many program managers have overseen the program management office (PMO)?
Mean: 2.5; Median: 2; Minimum: 0; Maximum: 12; Number of respondents: 61.
[End of table]
1. Check box at left if do not know: Not checked: 63; checked: 8; Number of respondents: 71.

39.
What is the number of government FTEs in your PMO for each of the following functional areas?

Program Management:
Number of government FTEs staffed at initiation of development activities: Mean: 3.5; Median: 2; Minimum: 1; Maximum: 15; Number of respondents: 51.
Number of government FTEs currently staffed: Mean: 4.7; Median: 3; Minimum: 0; Maximum: 23; Number of respondents: 60.
Number of government FTEs currently identified as a need by the program: Mean: 5.9; Median: 4; Minimum: 1; Maximum: 18; Number of respondents: 49.

Business functions (includes auditing, business, cost estimating, financial management, property management, and purchasing):
Number of government FTEs staffed at initiation of development activities: Mean: 5.6; Median: 2; Minimum: 1; Maximum: 63; Number of respondents: 43.
Number of government FTEs currently staffed: Mean: 10.7; Median: 3; Minimum: 1; Maximum: 252; Number of respondents: 53.
Number of government FTEs currently identified as a need by the program: Mean: 14.4; Median: 4; Minimum: 1; Maximum: 346; Number of respondents: 47.

Engineering and technical (includes systems planning, research, development and engineering; life cycle logistics; test and evaluation; production, quality and manufacturing; and facilities engineering):
Number of government FTEs staffed at initiation of development activities: Mean: 4.9; Median: 3; Minimum: 0; Maximum: 25; Number of respondents: 40.
Number of government FTEs currently staffed: Mean: 15.5; Median: 8; Minimum: 1; Maximum: 150; Number of respondents: 48.
Number of government FTEs currently identified as a need by the program: Mean: 13.7; Median: 6; Minimum: 1; Maximum: 98; Number of respondents: 43.

40. Please describe the source(s) and/or method(s) used for identifying the numbers of FTEs in question 39.
data intentionally not reported.

41. Please use the following space to comment on how personnel shortfalls, if any, have affected your program.
1. Check box at left if there have been no personnel shortages, and then click here to skip to question 42: Not checked: 52; checked: 19; Number of respondents: 71.
data intentionally not reported.

Section IX: Communication to DHS:

42. How effective, if at all, are each of the following tools for communicating your program's performance to DHS leadership?
Next-generation Periodic Reporting System (nPRS): Very effective: 8; Somewhat effective: 34; Not at all effective: 13; No opinion: 10; Do not use: 4; Number of respondents: 69.
Investment Management System (IMS): Very effective: 10; Somewhat effective: 35; Not at all effective: 7; No opinion: 13; Do not use: 4; Number of respondents: 69.
ARB reviews: Very effective: 25; Somewhat effective: 27; Not at all effective: 2; No opinion: 7; Do not use: 8; Number of respondents: 69.
Informal communication with DHS HQ: Very effective: 28; Somewhat effective: 23; Not at all effective: 1; No opinion: 6; Do not use: 11; Number of respondents: 69.
Other (please specify below): Very effective: 8; Somewhat effective: 2; Not at all effective: 0; No opinion: 2; Do not use: 3; Number of respondents: 15.
If other tools, please specify: data intentionally not reported.

43. If you reported that any of the tools listed above are not at all effective in communicating your program's current performance to DHS headquarters, or if you have any concerns related to communication tools, please elaborate below.
data intentionally not reported.

44. For the data entered into nPRS, who is responsible for entering and/or validating the information?
1. Check box at left if your program does not report into nPRS, and then click here to skip to question 45: Not checked: 65; checked: 6; Number of respondents: 71.
Program Manager: Enters Information: 4; Validates Information: 50; Does not have a role in data entry or validation: 6; Do not know: 1; Number of respondents: 61.
Program Management Staff other than Program Manager: Enters Information: 37; Validates Information: 17; Does not have a role in data entry or validation: 5; Do not know: 1; Number of respondents: 60.
Contractor in the Program Management Office: Enters Information: 39; Validates Information: 1; Does not have a role in data entry or validation: 15; Do not know: 1; Number of respondents: 56.
Other (please specify below): Enters Information: 3; Validates Information: 3; Does not have a role in data entry or validation: 6; Do not know: 3; Number of respondents: 15.
If other personnel, please specify: data intentionally not reported.

45. For the data entered into IMS, who is responsible for entering and/or validating the information?
1. Check box at left if your program does not report into IMS, and then click here to skip to question 46: Not checked: 66; checked: 5; Number of respondents: 71.
Program Manager: Enters Information: 1; Validates Information: 51; Does not have a role in data entry or validation: 7; Do not know: 2; Number of respondents: 61.
Program Management Staff other than Program Manager: Enters Information: 30; Validates Information: 19; Does not have a role in data entry or validation: 6; Do not know: 4; Number of respondents: 59.
Contractor in the Program Management Office: Enters Information: 38; Validates Information: 4; Does not have a role in data entry or validation: 11; Do not know: 4; Number of respondents: 57.
Other (please specify below): Enters Information: 1; Validates Information: 3; Does not have a role in data entry or validation: 7; Do not know: 5; Number of respondents: 16.
If other personnel, please specify: data intentionally not reported.

Section X: DHS Acquisition Management Initiatives:

46. How familiar are you with each of the following DHS initiatives?
Establishing the Integrated Investment Life Cycle Model (IILCM): Very familiar: 9; Somewhat familiar: 24; Not at all familiar: 32; No opinion: 2; Number of respondents: 67.
Establishing PARM: Very familiar: 19; Somewhat familiar: 38; Not at all familiar: 9; No opinion: 2; Number of respondents: 68.
Establishing the Program Manager Corps: Very familiar: 12; Somewhat familiar: 22; Not at all familiar: 31; No opinion: 2; Number of respondents: 67.
Empowering the Component Acquisition Executives (CAEs): Very familiar: 19; Somewhat familiar: 27; Not at all familiar: 18; No opinion: 2; Number of respondents: 66.
Establishing Functional Coordination Office (e.g., Screening Coordination Office): Very familiar: 8; Somewhat familiar: 11; Not at all familiar: 45; No opinion: 2; Number of respondents: 66.
Creating Executive Steering Councils for program governance: Very familiar: 23; Somewhat familiar: 21; Not at all familiar: 23; No opinion: 1; Number of respondents: 68.
Forming the Capabilities and Requirements Council: Very familiar: 4; Somewhat familiar: 22; Not at all familiar: 40; No opinion: 1; Number of respondents: 67.
Developing APEX, a decision support tool owned by PARM to capture and synthesize information from nPRS and IMS: Very familiar: 5; Somewhat familiar: 24; Not at all familiar: 37; No opinion: 2; Number of respondents: 68.

47. How helpful, if at all, will the following DHS initiatives be in helping you manage your acquisition program?
Establishing the Integrated Investment Life Cycle Model (IILCM): Very helpful: 8; Somewhat helpful: 18; Not at all helpful: 8; No opinion: 32; Number of respondents: 66.
Establishing PARM: Very helpful: 9; Somewhat helpful: 32; Not at all helpful: 5; No opinion: 21; Number of respondents: 67.
Establishing the Program Manager Corps: Very helpful: 13; Somewhat helpful: 20; Not at all helpful: 6; No opinion: 26; Number of respondents: 65.
Empowering the Component Acquisition Executives (CAEs): Very helpful: 25; Somewhat helpful: 21; Not at all helpful: 7; No opinion: 13; Number of respondents: 66.
Establishing Functional Coordination Office (e.g.
Screening Coordination Office): Very helpful: 4; Somewhat helpful: 13; Not at all helpful: 9; No opinion: 37; Number of respondents: 63. Creating Executive Steering Councils for program governance: Very helpful: 18; Somewhat helpful: 17; Not at all helpful: 8; No opinion: 24; Number of respondents: 67. Forming the Capabilities and Requirements Council: Very helpful: 7; Somewhat helpful: 14; Not at all helpful: 8; No opinion: 36; Number of respondents: 65. Developing APEX, a decision support tool owned by PARM to capture and synthesize information from nPRS and IMS: Very helpful: 4; Somewhat helpful: 16; Not at all helpful: 9; No opinion: 38; Number of respondents: 67. 48. Please use the following space to describe any additional actions that DHS could implement that would help you better manage your acquisition program (i.e. improvements for acquisition governance and document development). data intentionally not reported: Section XI: Summary Statements: 49. Please identify any significant challenges affecting your program's ability to achieve program objectives (i.e. cost, schedule, and capabilities) that have not been adequately addressed above. data intentionally not reported: 50. If you would like, please identify any practices your program has found significantly helpful in managing your program. data intentionally not reported: [End of section] Appendix IV: Major DHS Acquisition Programs and their Key Acquisition Documents: Table 6 below identifies the 71 major Department of Homeland Security (DHS) acquisition programs that responded to our survey. It consists of all the programs DHS included in its 2011 Major Acquisition Oversight List, with the exception of the 6 programs that did not respond to our survey (see table 7), and the 5 programs that were canceled in 2011 (see table 8). 
Table 6 also identifies whether each program's Mission Need Statement (MNS), Operational Requirements Document (ORD), Acquisition Program Baseline (APB), Integrated Logistics Support Plan (ILSP), and Test and Evaluation Master Plan (TEMP) have been approved at the department level. Table 6: Programs that responded to our survey: Sponsor: Analysis and Operations (A&O); Program: Common Operational Picture (COP); Level: 2; Information Technology (IT) vs. Non-IT: IT; Department-approved documents: MNS: [Empty]; Department-approved documents: ORD: [Empty]; Department-approved documents: APB: [Empty]; Department-approved documents: ILSP: [Empty]; Department-approved documents: TEMP: [Empty]. Sponsor: Chief Administrative Officer (CAO); Program: St. Elizabeth's; Level: 2; Information Technology (IT) vs. Non-IT: Non-IT; Department-approved documents: MNS: [Empty]; Department-approved documents: ORD: [Empty]; Department-approved documents: APB: [Empty]; Department-approved documents: ILSP: [Empty]; Department-approved documents: TEMP: [Empty]. Sponsor: Chief Human Capital Officer; Program: HR-IT; Level: 2; Information Technology (IT) vs. Non-IT: IT; Department-approved documents: MNS: [Empty]; Department-approved documents: ORD: [Empty]; Department-approved documents: APB: [Empty]; Department-approved documents: ILSP: [Empty]; Department-approved documents: TEMP: [Empty]. Sponsor: Chief Information Officer; Program: Infrastructure Transformation Program (ITP); Level: 1; Information Technology (IT) vs. Non-IT: IT; Department-approved documents: MNS: [Empty]; Department-approved documents: ORD: [Empty]; Department-approved documents: APB: [Empty]; Department-approved documents: ILSP: [Empty]; Department-approved documents: TEMP: [Empty]. Sponsor: Customs and Border Protection (CBP); Program: Automated Commercial Environment (ACE)/International Trade Data System (ITDS); Level: 1; Information Technology (IT) vs. 
Non-IT: IT; Department-approved documents: MNS: [Check]; Department-approved documents: ORD: [Empty]; Department-approved documents: APB: [Empty]; Department-approved documents: ILSP: [Empty]; Department-approved documents: TEMP: [Empty]. Sponsor: CBP; Program: Automated Targeting System (ATS) Maintenance[H]; Level: 2; Information Technology (IT) vs. Non-IT: IT; Department-approved documents: MNS: [Empty]; Department-approved documents: ORD: [Empty]; Department-approved documents: APB: [Empty]; Department-approved documents: ILSP: [Empty]; Department-approved documents: TEMP: [Empty]. Sponsor: CBP; Program: Border Patrol Facilities; Level: 1; Information Technology (IT) vs. Non-IT: Non-IT; Department-approved documents: MNS: [Empty]; Department-approved documents: ORD: [Empty]; Department-approved documents: APB: [Empty]; Department-approved documents: ILSP: [Empty]; Department-approved documents: TEMP: [Empty]. Sponsor: CBP; Program: Facilities Management and Engineering Tactical Infrastructure (FM&E TI); Level: 1; Information Technology (IT) vs. Non-IT: Non-IT; Department-approved documents: MNS: [Empty]; Department-approved documents: ORD: [Empty]; Department-approved documents: APB: [Empty]; Department-approved documents: ILSP: [Empty]; Department-approved documents: TEMP: [Empty]. Sponsor: CBP; Program: Fleet Management Program (FMP)[H]; Level: 1; Information Technology (IT) vs. Non-IT: Non-IT; Department-approved documents: MNS: [Empty]; Department-approved documents: ORD: [Empty]; Department-approved documents: APB: [Empty]; Department-approved documents: ILSP: [Empty]; Department-approved documents: TEMP: [Empty]. Sponsor: CBP; Program: Land Ports of Entry Modernization; Level: 2; Information Technology (IT) vs. Non-IT: Non-IT; Department-approved documents: MNS: [Empty]; Department-approved documents: ORD: [Empty]; Department-approved documents: APB: [Empty]; Department-approved documents: ILSP: [Empty]; Department-approved documents: TEMP: [Empty]. 
Sponsor: CBP; Program: Non-Intrusive Inspection (NII) Systems Program; Level: 1; Information Technology (IT) vs. Non-IT: IT; Department-approved documents: MNS: [Empty]; Department-approved documents: ORD: [Empty]; Department-approved documents: APB: [Empty]; Department-approved documents: ILSP: [Empty]; Department-approved documents: TEMP: [Empty]. Sponsor: CBP; Program: SAP[H]; Level: 2; Information Technology (IT) vs. Non-IT: IT; Department-approved documents: MNS: [Empty]; Department-approved documents: ORD: [Empty]; Department-approved documents: APB: [Empty]; Department-approved documents: ILSP: [Empty]; Department-approved documents: TEMP: [Empty]. Sponsor: CBP; Program: Strategic Air and Marine Plan; Level: 1; Information Technology (IT) vs. Non-IT: Non-IT; Department-approved documents: MNS: [Empty]; Department-approved documents: ORD: [Empty]; Department-approved documents: APB: [Empty]; Department-approved documents: ILSP: [Empty]; Department-approved documents: TEMP: [Empty]. Sponsor: CBP; Program: Tactical Communication (TAC-COM); Level: 2; Information Technology (IT) vs. Non-IT: IT; Department-approved documents: MNS: [Check]; Department-approved documents: ORD: [Empty]; Department-approved documents: APB: [Empty]; Department-approved documents: ILSP: [Empty]; Department-approved documents: TEMP: Document was not required at the time of our review. Sponsor: CBP; Program: TECS Modernization[D]; Level: 1; Information Technology (IT) vs. Non-IT: IT; Department-approved documents: MNS: [Check]; Department-approved documents: ORD: [Empty]; Department-approved documents: APB: [Empty]; Department-approved documents: ILSP: [Check]; Department-approved documents: TEMP: [Check]. Sponsor: CBP; Program: Transportation; Level: 1; Information Technology (IT) vs. 
Non-IT: Non-IT; Department-approved documents: MNS: [Empty]; Department-approved documents: ORD: [Empty]; Department-approved documents: APB: [Check]; Department-approved documents: ILSP: [Empty]; Department-approved documents: TEMP: Document was not required at the time of our review. Sponsor: CBP; Program: Western Hemisphere Travel Initiative (WHTI); Level: 1; Information Technology (IT) vs. Non-IT: IT; Department-approved documents: MNS: [Empty]; Department-approved documents: ORD: [Empty]; Department-approved documents: APB: [Check]; Department-approved documents: ILSP: [Check]; Department-approved documents: TEMP: [Empty]. Sponsor: Domestic Nuclear Detection Office; Program: Advanced Spectroscopic Portal (ASP); Level: 1; Information Technology (IT) vs. Non-IT: Non-IT; Department-approved documents: MNS: [Empty]; Department-approved documents: ORD: [Empty]; Department-approved documents: APB: [Empty]; Department-approved documents: ILSP: [Empty]; Department-approved documents: TEMP: Document was not required at the time of our review. Sponsor: Federal Emergency Management Agency (FEMA); Program: Housing Inspection Services (HIS); Level: 2; Information Technology (IT) vs. Non-IT: Non-IT; Department-approved documents: MNS: [Empty]; Department-approved documents: ORD: Document was not required at the time of our review. Department-approved documents: APB: Document was not required at the time of our review. Department-approved documents: ILSP: Document was not required at the time of our review. Department-approved documents: TEMP: Document was not required at the time of our review. Sponsor: FEMA; Program: Integrated Public Alert and Warning System (IPAWS); Level: 2; Information Technology (IT) vs. Non-IT: Non-IT; Department-approved documents: MNS: [Empty]; Department-approved documents: ORD: [Check]; Department-approved documents: APB: [Empty]; Department-approved documents: ILSP: [Empty]; Department-approved documents: TEMP: [Empty]. 
Sponsor: FEMA; Program: Logistics Supply Chain Management System (LSCMS); Level: 2; Information Technology (IT) vs. Non-IT: IT; Department-approved documents: MNS: [Empty]; Department-approved documents: ORD: [Empty]; Department-approved documents: APB: [Empty]; Department-approved documents: ILSP: [Empty]; Department-approved documents: TEMP: Document was not required at the time of our review. Sponsor: FEMA; Program: Risk Mapping, Analysis and Planning (Risk Map)[A]; Level: 1; Information Technology (IT) vs. Non-IT: Non-IT; Department-approved documents: MNS: Document was not required at the time of our review. Department-approved documents: ORD: Document was not required at the time of our review. Department-approved documents: APB: [Check] Document was not required at the time of our review. Department-approved documents: ILSP: [Check] Document was not required at the time of our review. Department-approved documents: TEMP: Document was not required at the time of our review. Sponsor: Immigration and Customs Enforcement (ICE); Program: Atlas; Level: 2; Information Technology (IT) vs. Non-IT: IT; Department-approved documents: MNS: [Empty]; Department-approved documents: ORD: [Empty]; Department-approved documents: APB: [Empty]; Department-approved documents: ILSP: [Empty]; Department-approved documents: TEMP: [Empty]. Sponsor: ICE; Program: Detention and Removal Operations (DROM); Level: 2; Information Technology (IT) vs. Non-IT: IT; Department-approved documents: MNS: [Check]; Department-approved documents: ORD: [Empty]; Department-approved documents: APB: [Empty]; Department-approved documents: ILSP: [Empty]; Department-approved documents: TEMP: [Empty]. Sponsor: ICE; Program: DRO Electronic Health Record (EHR) System; Level: 2; Information Technology (IT) vs. Non-IT: IT; Department-approved documents: MNS: [Empty]; Department-approved documents: ORD: Document was not required at the time of our review. 
Department-approved documents: APB: Document was not required at the time of our review. Department-approved documents: ILSP: Document was not required at the time of our review. Department-approved documents: TEMP: Document was not required at the time of our review. Sponsor: ICE; Program: Enforcement Information Sharing (EIS); Level: 2; Information Technology (IT) vs. Non-IT: IT; Department-approved documents: MNS: [Empty]; Department-approved documents: ORD: [Empty]; Department-approved documents: APB: [Empty]; Department-approved documents: ILSP: [Empty]; Department-approved documents: TEMP: [Empty]. Sponsor: ICE; Program: Student and Exchange Visitor Information System (SEVIS I & II)[D]; Level: 2; Information Technology (IT) vs. Non-IT: IT; Department-approved documents: MNS: [Empty]; Department-approved documents: ORD: [Empty]; Department-approved documents: APB: [Empty]; Department-approved documents: ILSP: [Empty]; Department-approved documents: TEMP: [Empty]. Sponsor: ICE; Program: TECS Modernization; Level: 1; Information Technology (IT) vs. Non-IT: IT; Department-approved documents: MNS: [Check]; Department-approved documents: ORD: [Empty]; Department-approved documents: APB: [Check]; Department-approved documents: ILSP: [Check]; Department-approved documents: TEMP: [Check] Document was not required at the time of our review. Sponsor: National Protection and Programs Directorate (NPPD); Program: Federal Protective Services; Level: 2; Information Technology (IT) vs. Non-IT: Non-IT; Department-approved documents: MNS: [Empty]; Department-approved documents: ORD: [Empty]; Department-approved documents: APB: [Empty]; Department-approved documents: ILSP: [Empty]; Department-approved documents: TEMP: [Empty]. Sponsor: NPPD; Program: Infrastructure Information Collection Program and Visualization (IICV IICP)[G]; Level: 2; Information Technology (IT) vs. 
Non-IT: IT; Department-approved documents: MNS: [Check]; Department-approved documents: ORD: [Empty]; Department-approved documents: APB: [Empty]; Department-approved documents: ILSP: [Empty]; Department-approved documents: TEMP: [Empty]. Sponsor: NPPD; Program: National Cybersecurity and Protection System (NCPS); Level: 1; Information Technology (IT) vs. Non-IT: IT; Department-approved documents: MNS: [Check]; Department-approved documents: ORD: [Check]; Department-approved documents: APB: [Check]; Department-approved documents: ILSP: [Check]; Department-approved documents: TEMP: [Check]. Sponsor: NPPD; Program: Next Generation Network (NGN); Level: 1; Information Technology (IT) vs. Non-IT: IT; Department-approved documents: MNS: [Check]; Department-approved documents: ORD: [Check]; Department-approved documents: APB: [Check]; Department-approved documents: ILSP: [Check]; Department-approved documents: TEMP: [Check] Document was not required at the time of our review. Sponsor: NPPD; Program: United States Visitor and Immigrant Status Indicator Technology (US-VISIT); Level: 1; Information Technology (IT) vs. Non-IT: IT; Department-approved documents: MNS: [Check]; Department-approved documents: ORD: [Check]; Department-approved documents: APB: [Check]; Department-approved documents: ILSP: [Check]; Department-approved documents: TEMP: [Empty]. Sponsor: Office of Health Affairs; Program: Bio Watch Gen-3; Level: 1; Information Technology (IT) vs. Non-IT: Non-IT; Department-approved documents: MNS: [Empty]; Department-approved documents: ORD: [Empty]; Department-approved documents: APB: [Empty]; Department-approved documents: ILSP: [Empty]; Department-approved documents: TEMP: Document was not required at the time of our review. Sponsor: Science and Technology (S&T); Program: National Bio and Agro-Defense Facility (NBAF); Level: 1; Information Technology (IT) vs. 
Non-IT: Non-IT; Department-approved documents: MNS: [Empty]; Department-approved documents: ORD: [Empty]; Department-approved documents: APB: [Empty]; Department-approved documents: ILSP: [Empty]; Department-approved documents: TEMP: Document was not required at the time of our review. Sponsor: S&T; Program: National Biodefense Analysis and Countermeasures Center (NBACC) Facility; Level: 1; Information Technology (IT) vs. Non-IT: Non-IT; Department-approved documents: MNS: [Empty]; Department-approved documents: ORD: [Empty]; Department-approved documents: APB: [Empty]; Department-approved documents: ILSP: [Empty]; Department-approved documents: TEMP: [Empty]. Sponsor: Transportation Security Administration (TSA); Program: Electronic Baggage Screening Program (EBSP); Level: 1; Information Technology (IT) vs. Non-IT: Non-IT; Department-approved documents: MNS: [Empty]; Department-approved documents: ORD: [Empty]; Department-approved documents: APB: [Empty]; Department-approved documents: ILSP: [Check]; Department-approved documents: TEMP: [Empty]. Sponsor: TSA; Program: Field Real Estate Management (FREM); Level: 2; Information Technology (IT) vs. Non-IT: Non-IT; Department-approved documents: MNS: [Empty]; Department-approved documents: ORD: [Empty]; Department-approved documents: APB: [Empty]; Department-approved documents: ILSP: [Empty]; Department-approved documents: TEMP: [Empty]. Sponsor: TSA; Program: HRAccess[H]; Level: 1; Information Technology (IT) vs. Non-IT: Non-IT; Department-approved documents: MNS: [Empty]; Department-approved documents: ORD: [Empty]; Department-approved documents: APB: [Empty]; Department-approved documents: ILSP: [Empty]; Department-approved documents: TEMP: [Empty]. Sponsor: TSA; Program: National Explosives Detection Canine Team Program (K9) System; Level: 2; Information Technology (IT) vs. 
Non-IT: Non-IT; Department-approved documents: MNS: [Empty]; Department-approved documents: ORD: [Empty]; Department-approved documents: APB: [Empty]; Department-approved documents: ILSP: [Empty]; Department-approved documents: TEMP: [Empty]. Sponsor: TSA; Program: Information Technology Infrastructure Program (ITIP)[H]; Level: 1; Information Technology (IT) vs. Non-IT: IT; Department-approved documents: MNS: [Empty]; Department-approved documents: ORD: [Empty]; Department-approved documents: APB: [Empty]; Department-approved documents: ILSP: [Empty]; Department-approved documents: TEMP: [Empty]. Sponsor: TSA; Program: Passenger Screening Program (PSP); Level: 1; Information Technology (IT) vs. Non-IT: Non-IT; Department-approved documents: MNS: [Check]; Department-approved documents: ORD: [Check]; Department-approved documents: APB: [Check]; Department-approved documents: ILSP: [Check]; Department-approved documents: TEMP: [Check]. Sponsor: TSA; Program: Screening Partnership Program[H]; Level: 1; Information Technology (IT) vs. Non-IT: Non-IT; Department-approved documents: MNS: [Empty]; Department-approved documents: ORD: [Empty]; Department-approved documents: APB: [Empty]; Department-approved documents: ILSP: [Empty]; Department-approved documents: TEMP: [Empty]. Sponsor: TSA; Program: Secure Flight; Level: 1; Information Technology (IT) vs. Non-IT: IT; Department-approved documents: MNS: [Check]; Department-approved documents: ORD: [Empty]; Department-approved documents: APB: [Check]; Department-approved documents: ILSP: [Check]; Department-approved documents: TEMP: [Empty]. Sponsor: TSA; Program: Security Technology Integrated Program (STIP); Level: 2; Information Technology (IT) vs. Non-IT: IT; Department-approved documents: MNS: [Empty]; Department-approved documents: ORD: [Check]; Department-approved documents: APB: [Check]; Department-approved documents: ILSP: [Empty]; Department-approved documents: TEMP: [Empty]. 
Sponsor: TSA; Program: Specialized Training[H]; Level: 2; Information Technology (IT) vs. Non-IT: Non-IT; Department-approved documents: MNS: [Empty]; Department-approved documents: ORD: [Empty]; Department-approved documents: APB: [Empty]; Department-approved documents: ILSP: [Empty]; Department-approved documents: TEMP: [Empty]. Sponsor: TSA; Program: Transportation Worker Identification Credentialing (TWIC)[H]; Level: 1; Information Technology (IT) vs. Non-IT: IT; Department-approved documents: MNS: [Empty]; Department-approved documents: ORD: [Empty]; Department-approved documents: APB: [Empty]; Department-approved documents: ILSP: [Empty]; Department-approved documents: TEMP: [Empty]. Sponsor: TSA; Program: TTAC Infrastructure Modernization (TIM); Level: 2; Information Technology (IT) vs. Non-IT: IT; Department-approved documents: MNS: [Check]; Department-approved documents: ORD: [Check]; Department-approved documents: APB: [Check]; Department-approved documents: ILSP: [Check]; Department-approved documents: TEMP: [Empty]. Sponsor: United States Coast Guard (USCG); Program: C4ISR[D]; Level: 1; Information Technology (IT) vs. Non-IT: IT; Department-approved documents: MNS: [Check]; Department-approved documents: ORD: [Check]; Department-approved documents: APB: [Empty]; Department-approved documents: ILSP: [Check]; Department-approved documents: TEMP: [Check] Document was not required at the time of our review. Sponsor: USCG; Program: CG Logistics Information Management System (CG-LIMS)[E]; Level: 2; Information Technology (IT) vs. Non-IT: IT; Department-approved documents: MNS: [Empty]; Department-approved documents: ORD: [Check] Document was not required at the time of our review. Department-approved documents: APB: Document was not required at the time of our review. Department-approved documents: ILSP: Document was not required at the time of our review. Department-approved documents: TEMP: Document was not required at the time of our review. 
Sponsor: USCG; Program: Core Accounting System (CAS)[H]; Level: 2; Information Technology (IT) vs. Non-IT: IT; Department-approved documents: MNS: [Empty]; Department-approved documents: ORD: [Empty]; Department-approved documents: APB: [Empty]; Department-approved documents: ILSP: [Empty]; Department-approved documents: TEMP: [Empty]. Sponsor: USCG; Program: Fast Response Cutter (FRC)[D]; Level: 1; Information Technology (IT) vs. Non-IT: Non-IT; Department-approved documents: MNS: [Check]; Department-approved documents: ORD: [Check]; Department-approved documents: APB: [Empty]; Department-approved documents: ILSP: [Check]; Department-approved documents: TEMP: Document was not required at the time of our review. Sponsor: USCG; Program: HC-130H Conversion/Sustainment Projects; Level: 1; Information Technology (IT) vs. Non-IT: Non-IT; Department-approved documents: MNS: [Check]; Department-approved documents: ORD: [Empty]; Department-approved documents: APB: [Check]; Department-approved documents: ILSP: [Empty]; Department-approved documents: TEMP: [Check] Document was not required at the time of our review. Sponsor: USCG; Program: HC-130 J Fleet Introduction; Level: 2; Information Technology (IT) vs. Non-IT: Non-IT; Department-approved documents: MNS: [Check]; Department-approved documents: ORD: [Empty]; Department-approved documents: APB: [Check]; Department-approved documents: ILSP: [Check]; Department-approved documents: TEMP: [Empty]. Sponsor: USCG; Program: HC-144A Maritime Patrol Aircraft (MPA)[D]; Level: 1; Information Technology (IT) vs. Non-IT: Non-IT; Department-approved documents: MNS: [Check]; Department-approved documents: ORD: [Check]; Department-approved documents: APB: [Empty]; Department-approved documents: ILSP: [Check]; Department-approved documents: TEMP: [Check] Document was not required at the time of our review. Sponsor: USCG; Program: HH-60 Conversion Projects[D]; Level: 1; Information Technology (IT) vs. 
Non-IT: Non-IT; Department-approved documents: MNS: [Check]; Department-approved documents: ORD: [Check]; Department-approved documents: APB: [Empty]; Department-approved documents: ILSP: [Check]; Department-approved documents: TEMP: [Check]. Sponsor: USCG; Program: HH-65 Conversion/Sustainment Projects; Level: 1; Information Technology (IT) vs. Non-IT: Non-IT; Department-approved documents: MNS: [Check]; Department-approved documents: ORD: [Empty]; Department-approved documents: APB: [Check]; Department-approved documents: ILSP: [Check]; Department-approved documents: TEMP: [Check]. Sponsor: USCG; Program: Interagency Operations Center (IOC); Level: 2; Information Technology (IT) vs. Non-IT: IT; Department-approved documents: MNS: [Check]; Department-approved documents: ORD: [Check]; Department-approved documents: APB: [Check]; Department-approved documents: ILSP: [Check]; Department-approved documents: TEMP: [Empty]. Sponsor: USCG; Program: Medium Endurance Cutter Sustainment[B]; Level: 1; Information Technology (IT) vs. Non-IT: Non-IT; Department-approved documents: MNS: [Check] Document was not required at the time of our review. Department-approved documents: ORD: Document was not required at the time of our review. Department-approved documents: APB: Document was not required at the time of our review. Department-approved documents: ILSP: Document was not required at the time of our review. Department-approved documents: TEMP: Document was not required at the time of our review. Sponsor: USCG; Program: National Security Cutter (NSC); Level: 1; Information Technology (IT) vs. Non-IT: Non-IT; Department-approved documents: MNS: [Check]; Department-approved documents: ORD: [Empty]; Department-approved documents: APB: [Check]; Department-approved documents: ILSP: [Empty]; Department-approved documents: TEMP: [Check] Document was not required at the time of our review. 
Sponsor: USCG; Program: Nationwide Automatic Identification System (NAIS)[D]; Level: 1; Information Technology (IT) vs. Non-IT: IT; Department-approved documents: MNS: [Check]; Department-approved documents: ORD: [Empty]; Department-approved documents: APB: [Empty]; Department-approved documents: ILSP: [Empty]; Department-approved documents: TEMP: [Check] Document was not required at the time of our review.

Sponsor: USCG; Program: Offshore Patrol Cutter (OPC); Level: 1; Information Technology (IT) vs. Non-IT: Non-IT; Department-approved documents: MNS: [Check]; Department-approved documents: ORD: [Check]; Department-approved documents: APB: [Check]; Department-approved documents: ILSP: [Check]; Department-approved documents: TEMP: [Check] Document was not required at the time of our review.

Sponsor: USCG; Program: Patrol Boats Sustainment[B]; Level: 2; Information Technology (IT) vs. Non-IT: Non-IT; Department-approved documents: MNS: [Check] Document was not required at the time of our review. Department-approved documents: ORD: Document was not required at the time of our review. Department-approved documents: APB: Document was not required at the time of our review. Department-approved documents: ILSP: Document was not required at the time of our review. Department-approved documents: TEMP: Document was not required at the time of our review.

Sponsor: USCG; Program: Rescue 21; Level: 1; Information Technology (IT) vs. Non-IT: IT; Department-approved documents: MNS: [Empty]; Department-approved documents: ORD: [Empty]; Department-approved documents: APB: [Check]; Department-approved documents: ILSP: [Check]; Department-approved documents: TEMP: [Empty].

Sponsor: USCG; Program: Response Boat-Medium (RB-M)[B]; Level: 1; Information Technology (IT) vs. Non-IT: Non-IT; Department-approved documents: MNS: Document was not required at the time of our review. Department-approved documents: ORD: Document was not required at the time of our review. Department-approved documents: APB: Document was not required at the time of our review. Department-approved documents: ILSP: Document was not required at the time of our review. Department-approved documents: TEMP: Document was not required at the time of our review.

Sponsor: USCG; Program: Unmanned Aircraft Systems (UAS)[C, F]; Level: 1; Information Technology (IT) vs. Non-IT: Non-IT; Department-approved documents: MNS: [Check] Document was not required at the time of our review. Department-approved documents: ORD: Document was not required at the time of our review. Department-approved documents: APB: Document was not required at the time of our review. Department-approved documents: ILSP: Document was not required at the time of our review. Department-approved documents: TEMP: Document was not required at the time of our review.

Sponsor: United States Citizenship and Immigration Services (USCIS); Program: Application Support Center (ASC)[H]; Level: 2; Information Technology (IT) vs. Non-IT: Non-IT; Department-approved documents: MNS: [Check]; Department-approved documents: ORD: [Empty]; Department-approved documents: APB: [Check]; Department-approved documents: ILSP: [Empty]; Department-approved documents: TEMP: [Empty].

Sponsor: USCIS; Program: Benefit Provision-Verification Information System (VIS); Level: 2; Information Technology (IT) vs. Non-IT: IT; Department-approved documents: MNS: [Check]; Department-approved documents: ORD: [Empty]; Department-approved documents: APB: [Empty]; Department-approved documents: ILSP: [Empty]; Department-approved documents: TEMP: [Empty].

Sponsor: USCIS; Program: Integrated Document Production (IDP)[H]; Level: 2; Information Technology (IT) vs. Non-IT: IT; Department-approved documents: MNS: [Empty]; Department-approved documents: ORD: [Empty]; Department-approved documents: APB: [Empty]; Department-approved documents: ILSP: [Empty]; Department-approved documents: TEMP: [Empty].

Sponsor: USCIS; Program: Transformation; Level: 1; Information Technology (IT) vs. Non-IT: IT; Department-approved documents: MNS: [Empty]; Department-approved documents: ORD: [Check]; Department-approved documents: APB: [Check]; Department-approved documents: ILSP: [Empty]; Department-approved documents: TEMP: Document was not required at the time of our review.

Sponsor: United States Secret Service; Program: IT Modernization (ITM); Level: 2; Information Technology (IT) vs. Non-IT: IT; Department-approved documents: MNS: [Empty]; Department-approved documents: ORD: [Check]; Department-approved documents: APB: [Check]; Department-approved documents: ILSP: [Check]; Department-approved documents: TEMP: Document was not required at the time of our review.

Source: GAO analysis of DHS and survey data.

[A] Program's document requirements were waived because it is in sustainment.

[B] Program's document approval authority was delegated to a component.

[C] Program does not yet require any documents because it is in the Need phase.

[D] DHS reported that the program breached its baseline after the department approved its APB. An APB breach of cost, schedule, or performance is an inability to meet the threshold value set for each. Within 90 days of the breach, DHS acquisition policy requires that either a new APB be approved, baseline-revision recommendations be made to the Acquisition Decision Authority, or the program be back within APB parameters.

[E] Program was designated a non-major acquisition program in April 2012.

[F] Program has two MNSs. The land-based MNS has been approved at the department level. The cutter-based MNS has not.

[G] Program was reorganized as the Critical Infrastructure Technology and Architecture (CITA) program in fiscal year 2012.

[H] PARM officials stated the program was in sustainment at the time AD 102 was signed.
[End of table]

Table 7 identifies the programs that were included in DHS's 2011 Major Acquisition Oversight List, but did not respond to our survey.

Table 7: Programs that did not respond to our survey:

Sponsor: A&O; Program: Homeland Security Information Network (HSIN); Level: 2; IT vs. Non-IT: IT.
Sponsor: CBP; Program: Advanced Passenger Information System (APIS); Level: 2; IT vs. Non-IT: IT.
Sponsor: I&A; Program: National Security System Program (NSSP); Level: 1; IT vs. Non-IT: IT.
Sponsor: ICE; Program: Detention and Removal Operations (DRO); Level: 2; IT vs. Non-IT: Non-IT.
Sponsor: ICE; Program: Tactical Communications (TAC-COM); Level: 1; IT vs. Non-IT: IT.
Sponsor: USCG; Program: Coastal Patrol Boat; Level: 1; IT vs. Non-IT: Non-IT.

Source: GAO analysis of DHS and survey data.

[End of table]

Table 8 identifies the programs that were included in DHS's 2011 Major Acquisition Oversight List, but were canceled in 2011.

Table 8: Programs canceled in 2011:

Sponsor: CAO; Program: Electronic Records Management System (ERMS); Level: 2; IT vs. Non-IT: IT.
Sponsor: Chief Financial Officer; Program: Transformation and Systems Consolidation (TASC); Level: 1; IT vs. Non-IT: IT.
Sponsor: CBP; Program: Secure Border Initiative Network (SBInet); Level: 1; IT vs. Non-IT: IT.
Sponsor: FEMA; Program: Grants Management Integrated Environment (GMIE); Level: 2; IT vs. Non-IT: IT.
Sponsor: Intelligence and Analysis (I&A); Program: Online Tracking Information System; Level: 2; IT vs. Non-IT: IT.

Source: GAO analysis of DHS data.

[End of table]

[End of section]

Appendix V: Comments from the Department of Homeland Security:

U.S. Department of Homeland Security:
Washington, DC 20528:

August 11, 2012:

John P. Hutton:
Director, Acquisition and Sourcing Management:
U.S. Government Accountability Office:
441 G Street, NW:
Washington, DC 20548:

Re: Draft Report GAO-12-833, "DHS Requires More Disciplined Investment Management to Help Meet Mission Needs."

Dear Mr.
Hutton: Thank you for the opportunity to comment on the draft report on the U.S. Department of Homeland Security (DHS) investment management processes and performance. We appreciate the U.S. Government Accountability Office's (GAO's) work in conducting its review.

As described in our response to the June 2011 Integrated Strategy for High Risk Management and the December 2011 Program Management and Execution Playbook, the Department has been implementing a multi-faceted strategy to transform investment management and improve program performance. We are proud of our significant progress and committed to addressing the fundamental management challenges facing the Department.

While GAO acknowledges some of the Department's progress in acquisition management, the report references many practices that occurred prior to the time period of the audit. During the period of the audit, we made measurable progress on a number of fronts. For example, we increased program compliance with Acquisition Management Directive (AMD) 102-01 by requiring programs to have the critical thinking in place before moving through the Acquisition Review Board (ARB) process. We increased program support by launching eight Centers of Excellence (COEs) for Acquisition and Program Management, which are now providing best practices, guidance, and resources to help keep programs on track. Since their inception in Fiscal Year (FY) 2012, COEs have supported more than 100 programs on the Major Acquisition Oversight List. In addition, we successfully launched a Decision Support Tool, enabling the Department to more accurately assess and track the health of major investment programs. The Department's program performance data are more accurate, complete, and transparent, and leadership has greater awareness of potential program risks, not only in the ARB, but between formal program reviews as well.
Further, we recognize that the program survey that GAO conducted captured valuable information; however, without context for this type of subjective information, we are concerned that it would be used as a basis for a recommendation. Without substantiated objective data, in our view, survey data cannot be generalized as a whole. For example, cost and schedule slippages (which are referenced throughout the report) are identified in the Quarterly Program Accountability Reports and receive further analysis; however, this is very different from a program breach.

Additionally, GAO noted in its report analysis that the magnitude of the total-project cost growth was based on the DHS Future Years Homeland Security Programs (FYHSPs) issued in 2008 and 2011 (presented in then-year dollars). Using this method of analysis would demonstrate growth for any program that continues beyond the initial 5-year period, because each year an additional year is added to the FYHSP 5-year plan.

As GAO recognized, all the Department's portfolio reviews in 2012 were conducted jointly across the Department's lines of business and the Office of Policy. Additionally, in a recent GAO report, "DHS Needs to Further Define and Implement Its New Governance Process," GAO-12-818, the Department was praised for its progress in portfolio management. DHS is moving forward with its portfolio management strategy to gain a deeper understanding of its major investments from a functional, enterprise-wide perspective and to provide an assessment of the proposed funding profiles to support the FY 2014-2018 resource allocation decision.

Again, DHS appreciates GAO's work in performing this audit and the recommendations made to further enhance the acquisition policy and processes. The Department concurs with all the recommendations and submits that two of the five recommendations have been fully addressed.
Recommendation 1: Modify DHS acquisition policy to more fully reflect the following program-management practices:

A. Require that 1) programs demonstrate technologies in a realistic environment prior to initiating development activities, and 2) manufacturing processes be tested prior to production;

B. Require that 1) exit criteria be quantifiable to the extent possible, and 2) consistent information be used across programs at Acquisition Decision Event (ADE) 2B and 2C;

C. State that program managers should remain with their programs until the next major milestone when possible.

Response: Concur. The Department's acquisition policy, AMD 102-01 and supporting instructions, currently addresses all the elements recommended by GAO. DHS recognizes, however, the need for greater clarity. A draft policy revision was submitted through the DHS directive process in May 2012. The modification will clarify the requirements suggested by GAO, remove conflicting guidance, and enable the Department to more rapidly respond to programs' needs by facilitating the development, approval, and delivery of much-needed topic-specific guidance for programs in areas such as portfolio governance and cost/schedule monitoring.

While we are in the process of strengthening our policy, our main focus is to provide best practices, build capabilities, and address gaps through our COEs for Acquisition and Program Management initiatives. As reviewed with GAO, a large part of satisfying this recommendation will be through best practices provided to programs through the COEs. The Office of Program Accountability and Risk Management (PARM) supported the formation of eight COEs, which have begun providing program offices best practices, guidance, and expertise in their respective disciplines. The core team for each COE contains a dedicated federal full-time-equivalent employee along with voluntary subject matter expert participation from the Components and DHS lines of business.
The COE initiative supports the effort to remove DHS from the GAO high-risk list by building program management capabilities, sharing best practices across Components, and proactively identifying and addressing program gaps before they become major problems. DHS will publish best practices through the COEs that support the acquisition process, which will include addressing GAO's recommendations.

Recommendation 2: Modify DHS acquisition policy to more fully reflect the following portfolio-management practices: Empower portfolio managers to decide how best to invest resources:

A. Establish that investments should be ranked and selected using a disciplined process;

B. Establish that 1) resource allocations should align with strategic goals, and 2) the investment review policy should use long-range planning; and

C. Require portfolio reviews 1) annually to consider proposed changes, 2) as new opportunities are identified, and 3) whenever a program breaches its objectives.

Response: Concur. In 2012, the Department piloted Joint Functional Portfolio Reviews to aid in business, financial, and programmatic decision making across the investment lifecycle. The objective is to continue to mature and solidify the process over the next few years. The Joint Portfolio Reviews are planned to continue on an annual basis. DHS will modify the Acquisition Management 102 policy to codify this process.

Recommendation 3: Ensure all acquisition programs fully comply with DHS acquisition policy by obtaining department-level approval for key acquisition documents before approving their movement through the acquisition life cycle.

Response: Concur. AMD 102-01 requires programs to demonstrate the necessary critical thinking and planning to support execution of the Department's programs. The ARB determines a course of action on the basis of risks and benefits of execution, as well as the documentation.
In effect, when the ARB advances a program, it is "approving" the program's legacy documents. DHS submits that this recommendation has been satisfied and requests closure from GAO.

Recommendation 4: Once the department's acquisition programs comply with DHS acquisition policy, prioritize major acquisition programs department-wide and ensure that the department's acquisition portfolio is consistent with DHS's anticipated resource constraints.

Response: Concur. As part of the Department's annual Planning, Programming, Budgeting, and Execution activities, the Department works to ensure that the acquisition portfolio falls within the anticipated budget constraints. As the executive office for program execution, PARM is responsible, with guidance and oversight from the Chief Acquisition Officer, for the principal DHS policy for acquisition management, AMD 102-01. PARM is working with Component Acquisition Executives, program managers, and other stakeholders within DHS to change the construct of AMD 102-01 to provide a functionally structured policy with the flexibility, through an innovative structure, that enables DHS to streamline and improve the policy on the basis of stakeholder feedback without needing to reopen the Department's policy change process for the entire Directive and Instruction set. The new process will also facilitate development of new guidebooks addressing areas such as portfolio management, cost/schedule monitoring, and service programs. This is a future long-term objective and is directly aligned with Recommendation 2.

Recommendation 5: Clearly document that department-level officials should not delegate ADE decision authority to component-level officials for programs lacking department-approved APBs or not meeting agreed-upon cost, schedule, and performance thresholds.

Response: Concur.
In February 2012, the Under Secretary for Management signed an amendment to DHS AMD 102-01, section VI (Acquisition Levels and Acquisition Decision Authority), to "clarify that the official identified as the Acquisition Decision Authority may designate the Chair or Member of a duly chartered Executive Steering Committee for a program to discharge specific Acquisition Decision Authority functions for that program, provided that the Acquisition Decision Authority maintains oversight over and accountability for the acquisition." Additionally, any ESC Charter submitted for approval must contain the following language: "...As a condition of such designation, the program must not be in breach of any cost, schedule, or performance parameter of the ARB-approved Acquisition Program Baseline (APB). Upon a finding of such a breach by the ARB, the Acquisition Decision Authority official would reassume all acquisition decisions for the program until such time as the ARB determines, in writing, that the program has remediated the APB items in breach, re-baselined the program with an ARB-approved APB, and the designee is authorized to assume acquisition decision authority once again."

DHS submits that this recommendation has been satisfied and requests closure from GAO.

Again, thank you for the opportunity to review and comment on this draft report. Technical comments were previously provided under separate cover. Please feel free to contact me if you have any questions. We look forward to working with you in the future.

Sincerely,

Signed by: [Illegible] for: Jim H.
Crumpacker:
Director:
Departmental GAO-OIG Liaison Office:

[End of section]

Appendix VI: GAO Contact and Staff Acknowledgments:

GAO Contact: John Hutton, (202) 512-4841 or huttonj@gao.gov:

Staff Acknowledgments: In addition to the contact named above, Katherine Trimble (Assistant Director), Nathan Tranquilli (Analyst-in-Charge), John Crawford, David Garcia, Jill Lacey, Sylvia Schatz, Rebecca Wilson, Candice Wright, and Andrea Yohe made key contributions to this report.

[End of section]

Related GAO Products:

Immigration Benefits: Consistent Adherence to DHS's Acquisition Policy Could Help Improve Transformation Program Outcomes. [hyperlink, http://www.gao.gov/products/GAO-12-66]. Washington, D.C.: November 22, 2011.

Coast Guard: Action Needed As Approved Deepwater Program Remains Unachievable. [hyperlink, http://www.gao.gov/products/GAO-11-743]. Washington, D.C.: July 28, 2011.

Secure Border Initiative: Controls over Contractor Payment for the Technology Component Need Improvement. [hyperlink, http://www.gao.gov/products/GAO-11-68]. Washington, D.C.: May 25, 2011.

Defense Acquisitions: Assessments of Selected Weapon Programs. [hyperlink, http://www.gao.gov/products/GAO-11-233SP]. Washington, D.C.: March 29, 2011.

Opportunities to Reduce Potential Duplication in Government Programs, Save Tax Dollars, and Enhance Revenue. [hyperlink, http://www.gao.gov/products/GAO-11-318SP]. Washington, D.C.: March 1, 2011.

High-Risk Series: An Update. [hyperlink, http://www.gao.gov/products/GAO-11-278]. Washington, D.C.: February 2011.

Department of Homeland Security: Assessments of Selected Complex Acquisitions. [hyperlink, http://www.gao.gov/products/GAO-10-588SP]. Washington, D.C.: June 30, 2010.

The Office of Management and Budget's Acquisition Workforce Development Strategic Plan for Civilian Agencies. [hyperlink, http://www.gao.gov/products/GAO-10-459R]. Washington, D.C.: April 23, 2010.

Defense Acquisitions: Measuring the Value of DOD's Weapon Programs Requires Starting with Realistic Baselines. [hyperlink, http://www.gao.gov/products/GAO-09-543T]. Washington, D.C.: April 1, 2009.

Department of Homeland Security: Billions Invested in Major Programs Lack Appropriate Oversight. [hyperlink, http://www.gao.gov/products/GAO-09-29]. Washington, D.C.: November 18, 2008.

Homeland Security: Challenges in Creating an Effective Acquisition Organization. [hyperlink, http://www.gao.gov/products/GAO-06-1012T]. Washington, D.C.: July 27, 2006.

Homeland Security: Successes and Challenges in DHS's Efforts to Create an Effective Acquisition Organization. [hyperlink, http://www.gao.gov/products/GAO-05-179]. Washington, D.C.: March 29, 2005.

Results-Oriented Cultures: Implementation Steps to Assist Mergers and Organizational Transformations. [hyperlink, http://www.gao.gov/products/GAO-03-669]. Washington, D.C.: July 2, 2003.

Best Practices: Better Matching of Needs and Resources Will Lead to Better Weapon System Outcomes. [hyperlink, http://www.gao.gov/products/GAO-01-288]. Washington, D.C.: March 8, 2001.
[End of section]

Footnotes:

[1] GAO, High-Risk Series: An Update, [hyperlink, http://www.gao.gov/products/GAO-05-207] (Washington, D.C.: January 2005).

[2] For examples, see GAO, Department of Homeland Security: Billions Invested in Major Programs Lack Appropriate Oversight, [hyperlink, http://www.gao.gov/products/GAO-09-29] (Washington, D.C.: November 18, 2008); Department of Homeland Security: Assessments of Selected Complex Acquisitions, [hyperlink, http://www.gao.gov/products/GAO-10-588SP] (Washington, D.C.: June 30, 2010); Secure Border Initiative: Controls over Contractor Payment for the Technology Component Need Improvement, [hyperlink, http://www.gao.gov/products/GAO-11-68] (Washington, D.C.: May 25, 2011); High-Risk Series: An Update, [hyperlink, http://www.gao.gov/products/GAO-11-278] (Washington, D.C.: February 2011); Opportunities to Reduce Potential Duplication in Government Programs, Save Tax Dollars, and Enhance Revenue, [hyperlink, http://www.gao.gov/products/GAO-11-318SP] (Washington, D.C.: March 1, 2011); Coast Guard: Action Needed As Approved Deepwater Program Remains Unachievable, [hyperlink, http://www.gao.gov/products/GAO-11-743] (Washington, D.C.: July 28, 2011).

[3] [hyperlink, http://www.gao.gov/products/GAO-10-588SP].

[4] GAO, Defense Acquisitions: Managing Risk to Achieve Better Outcomes, [hyperlink, http://www.gao.gov/products/GAO-10-374T] (Washington, D.C.: Jan. 20, 2010).

[5] This forthcoming report assesses the extent to which subsidiary projects within DHS IT investments are meeting their cost and schedule commitments, and the primary causes of any commitment shortfalls. The report focuses on investments that had one or more subsidiary projects exceeding 10 percent of their cost and schedule commitments. A project is a temporary effort (e.g., 9 months) to accomplish a unique product or service, such as adding enhancements to a system.
[6] DHS originally identified 82 major acquisition programs in the 2011 major acquisition oversight list, but five of those programs were subsequently canceled in 2011. Seventy-one program managers responded to the survey. See appendix IV.

[7] Appendix II identifies key acquisition management practices established in our previous reports examining DHS, the Department of Defense, the National Aeronautics and Space Administration, and private sector organizations.

[8] The interim version of AD 102 replaced Management Directive 1400, which had governed major acquisition programs since 2006. DHS originally established an investment review process in 2003 to provide departmental oversight of major investments throughout their life cycles, and to help ensure that funds allocated for investments through the budget process are well spent. DHS issued an updated version of AD 102 in January 2010 and subsequently updated the guidebook and appendices.

[9] The Secretary of DHS designated the USM the department's Chief Acquisition Officer in April 2011.

[10] According to DHS's fiscal year 2011 Major Acquisition Oversight List, the department's major acquisition programs were sponsored by 12 different component agencies, as identified in appendix IV.

[11] 6 U.S.C. § 454.

[12] GAO, Defense Acquisitions: Assessments of Selected Weapon Programs, [hyperlink, http://www.gao.gov/products/GAO-10-388SP] (Washington, D.C.: March 30, 2010); Best Practices: Better Matching of Needs and Resources Will Lead to Better Weapon System Outcomes, [hyperlink, http://www.gao.gov/products/GAO-01-288] (Washington, D.C.: March 8, 2001); [hyperlink, http://www.gao.gov/products/GAO-10-588SP]; [hyperlink, http://www.gao.gov/products/GAO-11-743].

[13] The cost estimates DHS reported to Congress were presented in then-year dollars.
[14] GAO, Best Practices: Better Support of Weapon System Program Managers Needed to Improve Outcomes, [hyperlink, http://www.gao.gov/products/GAO-06-110] (Washington, D.C.: November 30, 2005); [hyperlink, http://www.gao.gov/products/GAO-10-588SP].

[15] [hyperlink, http://www.gao.gov/products/GAO-10-388SP].

[16] GAO, Immigration Benefits: Consistent Adherence to DHS's Acquisition Policy Could Help Improve Transformation Program Outcomes, [hyperlink, http://www.gao.gov/products/GAO-12-66] (Washington, D.C.: November 22, 2011).

[17] GAO, Defense Acquisitions: Assessments of Selected Weapon Programs, [hyperlink, http://www.gao.gov/products/GAO-11-233SP] (Washington, D.C.: March 29, 2011).

[18] [hyperlink, http://www.gao.gov/products/GAO-01-288].

[19] Seventy-one survey respondents replied to whether their program experienced funding instability.

[20] Forty survey respondents reported at least one effect of funding instability.

[21] GAO, Defense Acquisitions: A Knowledge-Based Funding Approach Could Improve Major Weapon System Program Outcomes, [hyperlink, http://www.gao.gov/products/GAO-08-619] (Washington, D.C.: July 2, 2008).

[22] The department's cost analysis division was combined with the acquisition program management division to create PARM in 2011. The Cost Estimating and Analysis center of excellence is now responsible for supporting cost analyses within DHS.

[23] [hyperlink, http://www.gao.gov/products/GAO-10-588SP].

[24] DHS Instruction Manual 102-01-001, Appendix E, Acquisition Program Office and Component Acquisition Executive Core Staffing Requirements (Oct. 1, 2011).

[25] [hyperlink, http://www.gao.gov/products/GAO-10-588SP].

[26] Sixty-two survey respondents reported whether their program experienced a workforce shortfall in government full-time equivalents in three functional areas; 40 reported that their program had a DHS-approved Acquisition Program Baseline and whether the program had experienced cost growth and schedule slips.
[27] GAO, The Office of Management and Budget's Acquisition Workforce Development Strategic Plan for Civilian Agencies, [hyperlink, http://www.gao.gov/products/GAO-10-459R] (Washington, D.C.: April 23, 2010).

[28] DHS acquisition policy consists of AD 102-01, an associated guidebook, and 12 appendices.

[29] GAO, Homeland Security: Successes and Challenges in DHS's Efforts to Create an Effective Acquisition Organization, [hyperlink, http://www.gao.gov/products/GAO-05-179] (Washington, D.C.: March 29, 2005).

[30] In our past work examining weapon acquisition issues and best practices for product development, we have found that leading commercial firms pursue an acquisition approach that is anchored in knowledge, whereby high levels of product knowledge are demonstrated by critical points in the acquisition process. See [hyperlink, http://www.gao.gov/products/GAO-11-233SP].

[31] Sustainment begins when a capability has been fielded for operational use, and it involves the supportability of fielded systems through disposal, including maintenance and the identification of cost reduction opportunities. System operations, support, and sustainment costs can approach 70 percent of life-cycle costs.

[32] [hyperlink, http://www.gao.gov/products/GAO-09-29]; [hyperlink, http://www.gao.gov/products/GAO-10-588SP].

[33] DHS Acquisition Instruction/Guidebook 102-01-001, Appendix K, Acquisition Program Baseline (Oct. 1, 2011).

[34] DHS's criteria for assessing the cost estimates are based on GAO, Cost Estimating and Assessment Guide: Best Practices for Developing and Managing Capital Program Costs, [hyperlink, http://www.gao.gov/products/GAO-09-3SP] (Washington, D.C.: March 2009).

[35] [hyperlink, http://www.gao.gov/products/GAO-10-588SP].

[36] The department's cost analysis division was combined with the acquisition program management division to create PARM in 2011.
The Cost Estimating and Analysis center of excellence is now responsible for supporting cost analyses within DHS.

[37] One of these eight programs--the Medium Endurance Cutter Sustainment program--is not required to have a department-approved APB.

[38] GAO, Best Practices: An Integrated Portfolio Management Approach to Weapon System Investments Could Improve DOD's Acquisition Outcomes, [hyperlink, http://www.gao.gov/products/GAO-07-388] (Washington, D.C.: March 30, 2007).

[39] [hyperlink, http://www.gao.gov/products/GAO-07-388].

[40] [hyperlink, http://www.gao.gov/products/GAO-09-29].

[41] [hyperlink, http://www.gao.gov/products/GAO-08-619].

[42] [hyperlink, http://www.gao.gov/products/GAO-06-110].

[43] [hyperlink, http://www.gao.gov/products/GAO-11-278].

[44] [hyperlink, http://www.gao.gov/products/GAO-05-207].

[45] For examples, see [hyperlink, http://www.gao.gov/products/GAO-09-29]; [hyperlink, http://www.gao.gov/products/GAO-10-588SP]; [hyperlink, http://www.gao.gov/products/GAO-11-68]; [hyperlink, http://www.gao.gov/products/GAO-11-278]; [hyperlink, http://www.gao.gov/products/GAO-11-318SP]; [hyperlink, http://www.gao.gov/products/GAO-11-743].

[46] Sixty-seven survey respondents reported on their familiarity with the IILCM.

[47] GAO, Results-Oriented Cultures: Implementation Steps to Assist Mergers and Organizational Transformations, [hyperlink, http://www.gao.gov/products/GAO-03-669] (Washington, D.C.: July 2, 2003).

[48] [hyperlink, http://www.gao.gov/products/GAO-11-278].

[49] Under Secretary for Management, DHS, Fiscal Year 2011--Major Acquisition Oversight List, memorandum (Washington, D.C.: Jan. 28, 2011).

[End of section]

GAO’s Mission:

The Government Accountability Office, the audit, evaluation, and investigative arm of Congress, exists to support Congress in meeting its constitutional responsibilities and to help improve the performance and accountability of the federal government for the American people.
GAO examines the use of public funds; evaluates federal programs and policies; and provides analyses, recommendations, and other assistance to help Congress make informed oversight, policy, and funding decisions. GAO’s commitment to good government is reflected in its core values of accountability, integrity, and reliability.

Obtaining Copies of GAO Reports and Testimony:

The fastest and easiest way to obtain copies of GAO documents at no cost is through GAO’s website [hyperlink, http://www.gao.gov]. Each weekday afternoon, GAO posts on its website newly released reports, testimony, and correspondence. To have GAO e-mail you a list of newly posted products, go to [hyperlink, http://www.gao.gov] and select “E-mail Updates.”

Order by Phone:

The price of each GAO publication reflects GAO’s actual cost of production and distribution and depends on the number of pages in the publication and whether the publication is printed in color or black and white. Pricing and ordering information is posted on GAO’s website, [hyperlink, http://www.gao.gov/ordering.htm]. Place orders by calling (202) 512-6000, toll free (866) 801-7077, or TDD (202) 512-2537. Orders may be paid for using American Express, Discover Card, MasterCard, Visa, check, or money order. Call for additional information.

Connect with GAO:

Connect with GAO on Facebook, Flickr, Twitter, and YouTube. Subscribe to our RSS Feeds or E-mail Updates. Listen to our Podcasts. Visit GAO on the web at [hyperlink, http://www.gao.gov].

To Report Fraud, Waste, and Abuse in Federal Programs:

Contact: Website: [hyperlink, http://www.gao.gov/fraudnet/fraudnet.htm]; E-mail: fraudnet@gao.gov; Automated answering system: (800) 424-5454 or (202) 512-7470.

[End of document]

Congressional Relations:

Katherine Siggerud, Managing Director, siggerudk@gao.gov, (202) 512-4400, U.S. Government Accountability Office, 441 G Street NW, Room 7125, Washington, DC 20548.
Public Affairs: Chuck Young, Managing Director, youngc1@gao.gov, (202) 512-4800 U.S. Government Accountability Office, 441 G Street NW, Room 7149 Washington, DC 20548.