This is the accessible text file for GAO report number GAO-14-309 entitled 'Major Automated Information Systems: Selected Defense Programs Need to Implement Key Acquisition Practices' which was released on March 27, 2014. This text file was formatted by the U.S. Government Accountability Office (GAO) to be accessible to users with visual impairments, as part of a longer term project to improve GAO products' accessibility. Every attempt has been made to maintain the structural and data integrity of the original printed product. Accessibility features, such as text descriptions of tables, consecutively numbered footnotes placed at the end of the file, and the text of agency comment letters, are provided but may not exactly duplicate the presentation or format of the printed version. The portable document format (PDF) file is an exact electronic replica of the printed version. We welcome your feedback. Please E-mail your comments regarding the contents or accessibility features of this document to Webmaster@gao.gov. This is a work of the U.S. government and is not subject to copyright protection in the United States. It may be reproduced and distributed in its entirety without further permission from GAO. Because this work may contain copyrighted images or other material, permission from the copyright holder may be necessary if you wish to reproduce this material separately. United States Government Accountability Office: GAO: Report to Congressional Committees: March 2014: Major Automated Information Systems: Selected Defense Programs Need to Implement Key Acquisition Practices: GAO-14-309: GAO Highlights: Highlights of GAO-14-309, a report to congressional committees. Why GAO Did This Study: The National Defense Authorization Act for Fiscal Year 2012 mandated that GAO select and assess DOD MAIS programs annually through March 2018. This report discusses the results of GAO's second annual assessment. Based on the act's requirements, GAO's objectives were to (1) describe the extent to which selected MAIS programs have changed their planned cost and schedule estimates and met performance targets; (2) assess selected MAIS programs' actions to manage risks; and (3) assess the extent to which selected MAIS programs used key information technology acquisition best practices. To do so, GAO selected 15 of the 42 DOD MAIS programs based on several factors, including representation from multiple DOD components, and summarized the results of analyses of cost, schedule, and performance across the programs. Further, GAO selected 3 of the 15 programs (1 each from DHA, DLA, and Navy) and assessed them against best practices for risk management, requirements management, and project monitoring and control. What GAO Found: Of the 15 selected Department of Defense (DOD) major automated information system (MAIS) programs, 13 had cost information available (2 did not, due to revisions to requirements and changes in scope). Of these 13 programs, 11 experienced changes in their cost estimates, including 7 that experienced increases ranging from 4 to 2,233 percent and 4 that experienced decreases ranging from 4 to 86 percent. Two programs remained unchanged in their cost goals. Additionally, of 14 programs that had schedule information available (1 did not due to revisions to requirements), 13 experienced schedule changes—including 12 that had slippages ranging from a few months to 6 years, and 1 that accelerated its schedule. One program remained on schedule. 
Further, of 11 programs that had system performance data available, 3 programs met their system performance targets, while 8 did not fully meet their targets. The three programs selected for analysis of risk management demonstrated mixed progress in effectively defining and managing risks. Specifically, the Defense Health Agency's (DHA) Theater Medical Information Program – Joint Increment 2 had implemented key risk management practices. While the Navy's Global Combat Support System – Marine Corps program did not, among other things, update its risk tracking log during a 5-month period in 2013, the program recently updated its risks and mitigation plans, which should help the program to more effectively manage risks going forward. The Defense Logistics Agency's (DLA) Defense Agencies Initiative program had taken steps to implement selected risk management practices, including establishing a risk management board. However, the program was still in the early stages of identifying risks and had not yet identified a comprehensive set of program risks, nor consistently evaluated and categorized its risks. Until this program maintains a complete risk log and mitigation plans, and accurately evaluates and categorizes its risks, it will lack assurance that it is appropriately mitigating all identified risks. The three programs also demonstrated mixed progress in implementing key requirements management and project monitoring and control best practices. Specifically, the Navy program had implemented all key requirements management best practices and the DLA program had recently taken steps to do so. However, while the DHA program had implemented many requirements management best practices, it had not maintained complete traceability between its requirements and work products. Additionally, the DHA program had not updated its capabilities baseline to reflect program scope changes. Until DHA implements these requirements management best practices, stakeholders will lack assurance that the system will have all intended functionality to meet users' needs. Regarding project monitoring and control, each of the programs lacked key practices. For instance, the DLA program had not tracked significant deviations in performance. Additionally, the Navy program did not always take timely corrective actions to address issues. Further, the DHA program did not use earned value management to track contractor performance in meeting planned cost targets, even though these data were being collected and certain contracts met DOD's threshold for its use; as such, the program had not effectively determined progress against the plan. Until the three programs implement these project monitoring best practices, they will be limited in their ability to manage the programs. What GAO Recommends: GAO recommends that DOD direct the programs to address respective weaknesses in their risk management, requirements management, and project monitoring and control practices. DOD concurred with six of GAO's recommendations and partially concurred with the remaining two. GAO maintains that it is important that the DHA program trace all capabilities to their associated requirements and update its capabilities baseline to reflect program scope changes. View [hyperlink, http://www.gao.gov/products/GAO-14-309]. For more information, contact Carol R. Cha at (202) 512-4456 or chac@gao.gov. 
[End of section] Contents: Letter: Background: Most Selected Programs Changed Their Planned Cost and Schedule Estimates, and Over Half Did Not Fully Meet System Performance Targets: Selected Programs' Implementation of Key Risk Management Practices Varied in Effectiveness: Mixed Progress in Applying Key IT Acquisition Best Practices: Conclusions: Recommendations for Executive Action: Agency Comments and Our Evaluation: Appendix I: Objectives, Scope, and Methodology: Appendix II: Profiles of Selected DOD MAIS Programs: Appendix III: Comments from the Department of Defense: Appendix IV: GAO Contact and Staff Acknowledgments: Tables: Table 1: Summary of Cost, Schedule, and System Performance Results for the Selected Programs: Table 2: Changes in Selected Programs' First Approved Baseline Estimates and Latest Planned Total Life-cycle Cost Estimates: Table 3: Selected MAIS Programs' Schedule Slippages Compared to First Approved Baseline Schedules: Table 4: Causes for Schedule Slippages among 12 Selected Programs: Figures: Figure 1: Simplified DOD Organizational Structure: Figure 2: Defense Acquisition Management System Framework: Figure 3: Business Capability Life-cycle Acquisition Model: Abbreviations: AFIPPS: Air Force Integrated Personnel and Pay System: AOC-WS: Air and Space Operations Center-Weapon System: APB: acquisition program baseline: BITI Wireless: Base Information Transport Infrastructure Wireless: CMMI-ACQ: Capability Maturity Model® Integration for Acquisition: DAI: Defense Agencies Initiative: DCGS-A: Distributed Common Ground System-Army: DHA: Defense Health Agency: DOD: Department of Defense: GCCS-M: Global Command and Control System - Maritime: GCSS-MC: Global Combat Support System-Marine Corps: iEHR: Integrated Electronic Health Record: IPPS-A: Integrated Personnel and Pay System-Army: ISPAN: Integrated Strategic Planning and Analysis Network: IT: information technology: IV&V: independent verification and validation: JMS: Joint Space Operations Center Mission System: JPI: Joint Personnel Identification: MAIS: major automated information system: NGEN: Next Generation Enterprise Network: PMBOK®: Project Management Body of Knowledge: TMIP-J: Theater Medical Information Program-Joint: VA: Department of Veterans Affairs: [End of section] United States Government Accountability Office: GAO: 441 G St. N.W. Washington, DC 20548: March 27, 2014: Congressional Committees: The Department of Defense (DOD) is one of the largest and most complex organizations in the world. To meet its mission, it relies heavily on the use of information technology (IT). In this regard, according to DOD's IT investment portfolio for fiscal year 2012, the department spent approximately $35.0 billion for its IT investments.[Footnote 1] Of this amount, DOD officials reported that at least $4.5 billion was spent on major automated information system (MAIS) programs, which are intended to help the department sustain its key operations.[Footnote 2] DOD IT investments that fall within one of the following categories are designated as MAIS programs: (1) program costs in any single year exceed $32 million, (2) total program acquisition costs exceed $126 million, or (3) total life-cycle costs exceed $378 million.[Footnote 3] The Secretary of Defense can also use discretion to designate a program as a MAIS if it does not meet these cost thresholds. 
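To make these designation thresholds concrete, the following short Python sketch applies them; it is our illustration only (the function and parameter names are hypothetical, not drawn from DOD or GAO materials), with dollar amounts in millions:

def is_mais(single_year_cost, acquisition_cost, life_cycle_cost,
            secretary_designated=False):
    # A program is designated a MAIS if any statutory cost threshold is
    # exceeded or if the Secretary of Defense designates it at his or her
    # discretion.
    return (single_year_cost > 32        # program costs in any single year
            or acquisition_cost > 126    # total program acquisition costs
            or life_cycle_cost > 378     # total life-cycle costs
            or secretary_designated)

For example, is_mais(10, 100, 400) returns True: the $400 million life-cycle cost exceeds the $378 million threshold even though the other two thresholds are not met.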
The National Defense Authorization Act for Fiscal Year 2012 mandated that we select, assess, and report on DOD MAIS programs annually through March 2018.[Footnote 4] This is the second assessment in our series of annual reviews. Our objectives for this assessment were to (1) describe the extent to which selected MAIS programs have changed their planned cost and schedule estimates and met performance targets, (2) assess selected MAIS programs' actions to manage risks, and (3) assess the extent to which selected MAIS programs have used key IT acquisition best practices. To accomplish the first objective, we selected 15 of the 42 MAIS programs listed in DOD's April 2012 MAIS oversight list for evaluation.[Footnote 5] To select these programs, we first identified programs that met several criteria, including those that had established an acquisition program baseline (APB),[Footnote 6] represented multiple DOD components, and were not included in our first MAIS review.[Footnote 7] This analysis resulted in a selection of nine programs. Next, we selected five additional programs that had been without APBs for the longest periods of time.[Footnote 8] The final program was selected because, of the remaining MAIS programs, it had not established an APB for the longest period of time, was an enterprise resource planning system, and had the largest planned life-cycle costs. To determine the extent to which each of the 15 programs had changed their planned cost and schedule estimates, we compared the program's best cost (in then-year dollars) and schedule estimates established in the first APB (where available) to the latest planned total life-cycle cost and schedule estimates.[Footnote 9] For the programs that had not established APBs, we compared the cost and schedule estimates established in these programs' initial estimates to the latest planned total life-cycle cost estimates (in then-year dollars) and schedule estimates.[Footnote 10] In order to determine whether the programs experienced significant or critical deviations in their cost and schedule estimates, we compared any deviations to thresholds established by statute.[Footnote 11] Specifically, according to the statute, a program is considered to have undergone a "significant" change when it has (1) experienced a schedule change that will cause a delay of more than 6 months but less than a year; (2) experienced an estimated development or full life-cycle cost increase of at least 15 percent, but less than 25 percent over the original estimate; or (3) experienced a significant, adverse change in the expected performance of the system. A program is considered to have undergone a "critical" change when it has (1) experienced a schedule change that will cause a delay of 1 year or more; (2) experienced an estimated development or full life-cycle cost increase of 25 percent or more over the original estimate; (3) failed to achieve a full deployment decision within 5 years after the milestone A decision for the program or, if there was no milestone A decision, the date when the preferred alternative was selected for the program;[Footnote 12] or (4) experienced a change in the expected performance of the system or major IT investment to be acquired under the program that will undermine the ability of the system to perform the functions anticipated.
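Because the cost and schedule portions of these statutory thresholds are simple arithmetic rules, they can be expressed compactly. The following minimal Python sketch reflects our reading of the criteria described above; it omits the performance-based criteria and the 5-year full deployment decision test, which depend on judgment and milestone dates rather than arithmetic, and its names are ours, not DOD's:

def percent_increase(original, latest):
    # Percent growth of the latest estimate over the original estimate.
    return (latest - original) / original * 100

def classify_change(original_cost, latest_cost, delay_months):
    # Apply the statutory cost-growth and schedule-delay thresholds.
    growth = percent_increase(original_cost, latest_cost)
    if growth >= 25 or delay_months >= 12:
        return "critical"     # 25 percent or more, or a delay of 1 year or more
    if growth >= 15 or delay_months > 6:
        return "significant"  # at least 15 but less than 25 percent, or a delay
                              # of more than 6 months but less than a year
    return "none"

Using rounded figures reported later in this assessment, TMIP-J Increment 2's growth from $67.7 million to $1.58 billion works out to percent_increase(67.7, 1580), or roughly 2,234 percent (consistent with the 2,233 percent computed from unrounded program data), far beyond the 25 percent critical change threshold; classify_change(67.7, 1580, 0) accordingly returns "critical".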
Additionally, to determine whether system performance targets were met, we analyzed each program's system performance targets against actual performance data, and reviewed the results of operational assessments and program evaluations conducted on the systems. We then aggregated and summarized the results of our cost, schedule, and performance analyses across the programs, as well as developed individual program profiles, which are presented in appendix II. To address the second objective, we selected 3 of the 15 programs from the first objective, including 2 programs that had established APBs and had the highest planned total life-cycle costs, and 1 program that had been without a baseline for the longest period of time.[Footnote 13] To assess each program's actions to manage risks, we identified key risk management practices from the Software Engineering Institute's Capability Maturity Model® Integration for Acquisition (CMMI-ACQ) and the Project Management Institute's Guide to the Project Management Body of Knowledge (PMBOK®), and assessed each of the 3 programs against these criteria.[Footnote 14] Specifically, for each of the 3 selected programs, we analyzed risk management documentation, such as risk logs and mitigation plans, to identify levels of risks and determine the status of each program's key risks and the actions that were taken to manage these risks. Additionally, we interviewed program officials about the risks and risk management practices that they used. To address the third objective, we selected the same three programs as in objective two to determine the extent to which each program was implementing (1) requirements management and (2) project monitoring and control best practices, as defined by CMMI-ACQ and PMBOK®. We also assessed these programs against key best practices for employing independent verification and validation (IV&V).[Footnote 15] To determine the extent to which each program's acquisition practices were consistent with these best practices, we assessed program management and systems documentation, such as program requirements and program management reports. We also interviewed program officials to obtain additional information on each program's IT management processes in these areas. We conducted this performance audit from April 2013 to March 2014 in accordance with generally accepted government auditing standards. Those standards require that we plan and perform the audit to obtain sufficient, appropriate evidence to provide a reasonable basis for our findings and conclusions based on our audit objectives. We believe that the evidence obtained provides a reasonable basis for our findings and conclusions based on our audit objectives. See appendix I for a more detailed discussion of our objectives, scope, and methodology. Background: DOD is a massive and complex organization. It includes the Office of the Secretary of Defense, the Joint Chiefs of Staff, the military departments, numerous defense agencies and field activities, and various unified combatant commands that contribute to the oversight of DOD's acquisition programs. Figure 1 presents a simplified depiction of DOD's organizational structure. Figure 1: Simplified DOD Organizational Structure: [Refer to PDF for image: Organizational Structure] Top level: Secretary of Defense; * Deputy Secretary of Defense. 
Second level, reporting to Secretary of Defense: * Department of the Army; * Department of the Navy; * Department of the Air Force; * Office of the Secretary of Defense: - DOD field activities; - Defense agencies; * Inspector General; * Joint Chiefs of Staff: - Combatant commands[A]. Source: GAO analysis based on DOD data. [A] The Chairman of the Joint Chiefs of Staff serves as the spokesperson for the commanders of the combatant commands, particularly for the operational requirements of the commands. [End of figure] In support of its military operations, DOD performs an assortment of interrelated and interdependent business functions, such as logistics management, procurement, health care management, and financial management. As we have previously reported, the DOD systems environment that supports these business functions is overly complex and error prone, and is characterized by (1) little standardization across the department, (2) multiple systems performing the same tasks, (3) the same data stored in multiple systems, and (4) the need for data to be entered manually into multiple systems.[Footnote 16] According to DOD's IT investment portfolio, for fiscal year 2012, the department spent approximately $35.0 billion to operate, maintain, and modernize its IT systems. We have designated DOD's business systems modernization program as high risk for the past 19 years, due to challenges in modernizing the department's business systems environment.[Footnote 17] DOD's Acquisition Guidance for MAIS Programs: Of the $35.0 billion spent on DOD IT investments for fiscal year 2012, according to DOD officials, at least $4.5 billion was for MAIS programs. The MAIS programs include a range of systems, such as communications systems, business systems (e.g., logistics management and financial management systems), and command and control systems, which are intended to provide department and component officials with easy access to information to effectively organize, plan, direct, and monitor mission operations. Prior to November 2013, MAIS programs were required to comply with one of two DOD acquisition frameworks.[Footnote 18] The first framework--referred to as the defense acquisition management system framework--applied to all DOD IT acquisition programs except business system modernization programs that exceeded $1 million in total costs.[Footnote 19] The second framework--referred to as the business capability life-cycle acquisition model--applied to all business system modernization programs with total costs that exceeded $1 million.[Footnote 20] The business system modernization programs were required to use this framework instead of the defense acquisition management system framework in an effort to address challenges previously experienced when implementing business systems, such as implementing solutions without fully understanding business needs. The 2008 defense acquisition management system framework established the steps that programs should take as they planned, designed, acquired, deployed, operated, and maintained their systems. This framework consisted of five program life-cycle phases and five related decision points, which are shown in figure 2 and described following the figure.[Footnote 21] The milestone decision authority for programs that complied with this framework was either the Under Secretary of Defense for Acquisition, Technology, and Logistics; the DOD component head; a component acquisition executive; or when authorized, a designee.
Figure 2: Defense Acquisition Management System Framework: [Refer to PDF for image: illustration] Materiel development decision; Materiel solution analysis; Milestone review: A; Technology development; Milestone review: B; Engineering and manufacturing development; Milestone review: C; Production and deployment: Full deployment decision; Operations and support. Source: GAO analysis based on DOD data. [End of figure] * Materiel solution analysis: Refine the initial system solution (concept) and create a strategy for acquiring the solution. A decision is made at the end of this phase to authorize entry into the technology development phase--referred to as milestone A. * Technology development: Determine the appropriate set of technologies to be integrated into the system solution while simultaneously refining user requirements. A decision is made at the end of this phase to authorize product development based on well-defined technology and a reasonable system design plan--referred to as milestone B. An APB is first established at the milestone B decision point.[Footnote 22] A program's first APB contains the original life-cycle cost estimate, schedule estimate, and performance parameters that were approved for that program by the milestone decision authority. The first APB is established after the program has assessed the viability of various technologies and refined user requirements to identify the most appropriate technology solution that demonstrates that it can meet users' needs. * Engineering and manufacturing development: Develop a system and demonstrate through developer testing that the system can function in its target environment. A decision is made at the end of this phase to authorize entry of the system into the production and deployment phase or into limited deployment in support of operational testing--referred to as milestone C. * Production and deployment: Achieve an operational capability that satisfies the mission needs, as verified through independent operational test and evaluation, and implement the system at all applicable locations. * Operations and support: Operationally sustain the system in the most cost-effective manner over its life cycle. In addition to the three milestone decision points included in this framework (milestones A, B, and C), the framework also included two other decision points: (1) materiel development decision, which authorized officials to conduct analyses to assess the potential solutions that can satisfy the program's requirements, and (2) full deployment decision, which authorized the system to be deployed to all remaining locations beyond limited fielding locations.[Footnote 23] In March 2009, the Defense Science Board reported that DOD's acquisition process for IT systems was too long and ineffective, and did not accommodate the rapid evolution of IT.[Footnote 24] As such, the Board recommended that DOD develop new acquisition and requirements development processes for IT systems that would be agile, incremental, and allow requirements to be prioritized based on need and technical readiness. Subsequently, DOD developed a new framework--the business capability life-cycle acquisition model--that outlined the key steps that programs should take through the life cycle of acquisition of each major business system.[Footnote 25] This framework was intended to allow for more flexible acquisition processes that may be tailored to specific programs.
Additionally, the framework was intended to address challenges that have previously impacted the delivery of IT business capabilities, such as programs lacking well-defined, strategically linked requirements, and transitioning too quickly from identifying a perceived business problem to implementing a specific solution. Specifically, this model consisted of seven program life-cycle phases and five milestone decision points, as shown in figure 3. The milestone decision authority for programs that were required to comply with this framework was either the Under Secretary of Defense for Acquisition, Technology, and Logistics; a component acquisition executive; or when authorized, a designee. Figure 3: Business Capability Life-cycle Acquisition Model: [Refer to PDF for image: illustration] Business capability definition; Materiel development decision; Investment management; Milestone review/milestone decision authority decision point: A; Prototyping; Milestone review/milestone decision authority decision point: B; Engineering development; Milestone review/milestone decision authority decision point: C; Limited fielding; Full deployment decision; Full deployment; Operations and support. Source: GAO analysis based on DOD data. [End of figure] Of these seven life-cycle phases, six were consistent with or similar to the five phases in the defense acquisition system framework (one of the phases in the defense acquisition system framework, production and deployment, corresponded to two phases in the business capability life-cycle model--limited fielding and full deployment). The seventh phase in the business capability life-cycle model was called business capability definition and occurred at the start of a program. The purpose of this phase was to analyze a perceived business problem or capability gap. This model also included the five decision points included in the defense acquisition management framework--milestones A, B, and C, materiel development decision, and full deployment decision. Statutory Requirements for MAIS Programs: MAIS programs must also comply with annual and quarterly reporting requirements identified in statute.[Footnote 26] In this regard, each calendar year, DOD must submit to Congress a report on each MAIS program, including information on the cost, schedule, and performance of the program. Specifically, DOD must report, among other things, on each program's development and implementation schedules and development and full life-cycle cost estimates; and provide a summary of the key performance parameters for each program. It must also provide a summary of any major changes for each MAIS program. Moreover, on a quarterly basis, the program manager for each MAIS program is required to provide the senior DOD official responsible for the program a report that identifies any variance in the program's cost, schedule, or performance. Depending on the determination after reviewing the variance identified in the quarterly report, the senior DOD official must notify the congressional defense committees of any programs that have experienced either a significant or critical change, as described below: * Significant change. A significant change must be declared if the program has experienced a schedule delay of more than 6 months but less than a year; estimated costs for the program have increased by at least 15 percent but less than 25 percent; or there has been a significant adverse change in the expected performance of the system.
If such an event occurs, the senior DOD official must notify the congressional defense committees in writing no later than 45 days after receiving the quarterly report from the program manager. * Critical change. A critical change must be declared if the program failed to achieve a full deployment decision within 5 years after the milestone A decision or, if there was no milestone A decision, the date when the preferred alternative was selected for the program; experienced a schedule delay of more than 1 year; experienced an estimated development or full life-cycle cost increase of 25 percent or more over the original estimate; or experienced a change in the expected performance of the system that will undermine the ability of the system to perform as intended. If such an event occurs, the senior DOD official must carry out an evaluation of the program and submit a report to the congressional defense committees no later than 60 days after receiving the quarterly report from the program manager. [Footnote 27] For programs that declare a critical change, the evaluation must assess the projected cost and schedule for completing the program if current requirements are not modified; assess the projected cost and schedule for completing the program based on a reasonable modification of requirements; and assess the rough order of magnitude of the cost and schedule for any reasonable alternative system or capability. Best Practices for Managing IT Acquisition Programs: Entities such as the Project Management Institute, the Software Engineering Institute at Carnegie Mellon University, and GAO have developed and identified best practices to help guide organizations to effectively plan and manage their acquisitions of major IT systems, such as the MAIS programs.[Footnote 28] Our prior reviews have shown that proper implementation of such practices can significantly increase the likelihood of delivering promised system capabilities on time and within budget.[Footnote 29] These practices include, but are not limited to: * Risk management: A process for anticipating problems and taking appropriate steps to mitigate risks and minimize their impact on program commitments. It involves identifying and documenting risks, categorizing them based on their estimated impact, prioritizing them, developing risk mitigation strategies, and tracking progress in executing the strategies. * Requirements management: Requirements establish what the system is to do, how well it is to do it, and how it is to interact with other systems. Effective management of requirements includes developing criteria for the evaluation and acceptance of requirements, obtaining commitments to requirements, and controlling requirements changes over the course of the program. It also ensures that requirements are validated against user needs and that each requirement traces back to the business need and forward to its design and testing. * Project monitoring and control: Provides an understanding of the project's progress, so that appropriate corrective actions can be taken if performance deviates from plans. Effective practices in this area include monitoring program performance against the program plan, monitoring stakeholder involvement throughout the life of the program, and managing corrective actions to closure. 
* Independent verification and validation: A process whereby organizations can reduce the risks inherent in system development and acquisition efforts by having a knowledgeable party who is independent of the developer determine that the system or product meets the users' needs and fulfills its intended purpose. GAO Previously Reported on DOD's Challenges in Implementing Certain MAIS Programs: We have previously reported and made recommendations on DOD's efforts to implement certain MAIS programs. * In July 2008, we reported that DOD had not effectively implemented key IT management controls on its Global Combat Support System-Marine Corps (GCSS-MC) program.[Footnote 30] For example, we reported that the program's schedule baseline was not reflective of certain important scheduling practices, such as conducting a schedule risk assessment and allocating schedule reserve. Additionally, we noted that not all program risks had been adequately managed, and certain risk mitigation strategies were either not fully implemented or the strategies did not mitigate the risks, resulting in risks becoming actual problems. As a result, we recommended that DOD, among other things, ensure that the GCSS-MC program office (1) performs a schedule risk analysis to determine the level of confidence in meeting the program's activities and completion date, (2) allocates schedule reserve for high-risk activities on the critical path, (3) tracks and evaluates the implementation of mitigation plans for all risks, and (4) discloses to appropriate program oversight and approval authorities whether mitigation plans have been fully executed and have produced the intended outcome(s). DOD concurred in full or in part with the recommendations in the report and took actions to implement nearly all of them. * In March 2011, we reported that the Navy did not sufficiently analyze alternative acquisition approaches for the Next Generation Enterprise Network (NGEN) program because the alternatives analysis contained key weaknesses in its cost estimates and analysis of operational effectiveness.[Footnote 31] We also found that the Navy did not have a reliable integrated master schedule, and that acquisition decisions were not always performance- and risk-based. We recommended, among other things, that DOD reconsider the selected acquisition approach, ensure that the NGEN integrated master schedule substantially reflects key scheduling practices, and that future acquisition reviews and decisions fully reflect the state of the program's performance and its exposure to risks. DOD concurred in full or in part with nearly all of the recommendations and took actions to implement some of them. Additionally, we reported in September 2012 that, while the Navy did not revisit the analysis of alternatives for NGEN, it had reconsidered and revised its acquisition approach to support program executability and reduce program risk.[Footnote 32] Implementation of this revised acquisition strategy is expected to save the program about $2.58 billion from fiscal year 2013 through fiscal year 2017. Finally, we also reported in September 2012 that the program's risks were not being adequately mitigated because not all risk mitigation plans were comprehensive and current, and we therefore recommended that DOD develop comprehensive risk mitigation plans. DOD concurred with our recommendation.
* In May 2011, we reported that the Air Force's Joint Space Operations Center Mission System (JMS) faced development challenges and risks, such as the use of immature technologies and planning to deliver all capabilities in a single, large increment, versus smaller and more manageable increments.[Footnote 33] We recommended, among other things, that DOD assure that key program risks have been fully assessed to help ensure cost, schedule, and performance goals will be met. We also noted in the report that implementing this recommendation may require dividing the program into separate increments. DOD agreed with this recommendation and noted that the requirement to assess key program risks to ensure cost, schedule, and performance goals is part of the milestone B review, approval, and certification process. * In March 2013, we reported that large variations existed in the extent to which 14 selected programs stayed within planned cost and schedule estimates and met system performance targets.[Footnote 34] We also noted that three selected programs--Air Force's Defense Enterprise Accounting and Management System, Army's Global Combat Support System-Army, and Navy's Consolidated Afloat Networks and Enterprise Services--demonstrated mixed results in effectively defining and managing risks of various levels, and in implementing key requirements management and project monitoring and control best practices. We made recommendations to the Army program to address weaknesses in its risk management and IV&V practices. DOD concurred with these recommendations and stated that it will comply with them. * We reported in February 2014 that the Department of Veterans Affairs (VA) and DOD had abandoned their plans to develop an integrated electronic health record (iEHR) system and were instead pursuing separate efforts to modernize or replace their existing systems in an attempt to create an interoperable electronic health record.[Footnote 35] We also noted that VA and DOD had not substantiated their claims that the revised approach will be less expensive and more timely than the single-system approach. We therefore recommended, among other things, that VA and DOD develop and compare the estimated cost and schedule of their current and previous approaches to creating an interoperable electronic health record and, if applicable, provide a rationale for pursuing a more costly or time-consuming approach. VA and DOD concurred with our recommendations and noted actions that were being taken to address them. Most Selected Programs Changed Their Planned Cost and Schedule Estimates, and Over Half Did Not Fully Meet System Performance Targets: Among the 15 MAIS programs selected for our study, there were large variations in the extent to which programs had changed their planned cost and schedule estimates and met system performance targets. Of the 15 selected MAIS programs, 13 had cost data available, 14 had schedule data available, and 11 had system performance data available. Of the 13 selected programs with cost data available, 11 programs experienced changes in their cost estimates--including 7 that had experienced increases and 4 that had experienced decreases; and 2 remained unchanged in their cost goals. Additionally, of the 14 programs with schedule data available, 13 programs experienced schedule changes-- including 12 that had slippages and 1 that had accelerated its schedule and met a milestone earlier than planned; and 1 that remained on schedule, as planned. 
Further, of the 11 programs that had system performance data available, 3 programs met their system performance targets, while 8 did not fully meet their targets. Program profiles with cost, schedule, and system performance details on each of the selected programs, including causes for cost and schedule changes, are included in appendix II. Table 1 provides a summary of the cost, schedule, and performance results for the 15 selected programs. Table 1: Summary of Cost, Schedule, and System Performance Results for the Selected Programs: Air Force: Component/program: Air Force Integrated Personnel and Pay System (AFIPPS)[A,B]; No change in cost estimate: [Empty]; Change in cost estimate: cost increase 8%; No change in schedule estimate: [Empty]; Change in schedule estimate: schedule slippage 1.5 years; Met system performance targets: [Empty]; Did not fully meet system performance targets: [Empty]. Component/program: AOC-WS Increment 10.2[B,C]; No change in cost estimate: [Check]; Change in cost estimate: [Empty]; No change in schedule estimate: [Check]; Change in schedule estimate: [Empty]; Met system performance targets: [Empty]; Did not fully meet system performance targets: [Empty]. Component/program: Base Information Transport Infrastructure (BITI) Wireless; No change in cost estimate: [Empty]; Change in cost estimate: cost increase 8%; No change in schedule estimate: [Empty]; Change in schedule estimate: schedule slippage 6 months; Met system performance targets: [Empty]; Did not fully meet system performance targets: [Check]. Component/program: Integrated Strategic Planning and Analysis Network (ISPAN) Increment 2; No change in cost estimate: [Empty]; Change in cost estimate: cost decrease 4%; No change in schedule estimate: [Empty]; Change in schedule estimate: schedule slippage 6 months; Met system performance targets: [Empty]; Did not fully meet system performance targets: [Check]. Component/program: JMS Increment 1; No change in cost estimate: [Check]; Change in cost estimate: [Empty]; No change in schedule estimate: [Empty]; Change in schedule estimate: schedule acceleration 2 months; Met system performance targets: [Check]; Did not fully meet system performance targets: [Empty]. Army: Component/program: Distributed Common Ground System - Army (DCGS-A) Increment 1; No change in cost estimate: [Empty]; Change in cost estimate: cost decrease 9%; No change in schedule estimate: [Empty]; Change in schedule estimate: schedule slippage 3 months; Met system performance targets: [Empty]; Did not fully meet system performance targets: [Check][D]. Component/program: Integrated Personnel and Pay System - Army (IPPS-A) Increment 1; No change in cost estimate: [Empty]; Change in cost estimate: cost increase 10%; No change in schedule estimate: [Empty]; Change in schedule estimate: schedule slippage 1 year; Met system performance targets: [Check]; Did not fully meet system performance targets: [Empty]. Component/program: JPI Version 2[B,E]; No change in cost estimate: [Empty]; Change in cost estimate: [Empty]; No change in schedule estimate: [Empty]; Change in schedule estimate: [Empty]; Met system performance targets: [Empty]; Did not fully meet system performance targets: [Empty]. Defense Health Agency (DHA):
Component/program: iEHR Increment 1[F]; No change in cost estimate: [Empty]; Change in cost estimate: [Empty]; No change in schedule estimate: [Empty]; Change in schedule estimate: schedule slippage 11 months; Met system performance targets: [Empty]; Did not fully meet system performance targets: [Check]. Component/program: TMIP-J Increment 2; No change in cost estimate: [Empty]; Change in cost estimate: cost increase 2,233%; No change in schedule estimate: [Empty]; Change in schedule estimate: schedule slippage 6 years; Met system performance targets: [Empty]; Did not fully meet system performance targets: [Check][D]. Defense Logistics Agency (DLA): Component/program: DAI[A]; No change in cost estimate: [Empty]; Change in cost estimate: cost increase 159%; No change in schedule estimate: [Empty]; Change in schedule estimate: schedule slippage 5 years; Met system performance targets: [Empty]; Did not fully meet system performance targets: [Check]. Component/program: EProcurement; No change in cost estimate: [Empty]; Change in cost estimate: cost increase 4%; No change in schedule estimate: [Empty]; Change in schedule estimate: schedule slippage 2 months; Met system performance targets: [Empty]; Did not fully meet system performance targets: [Check]. Navy: Component/program: Global Command and Control System - Maritime (GCCS-M) Increment 2; No change in cost estimate: [Empty]; Change in cost estimate: cost decrease 86%; No change in schedule estimate: [Empty]; Change in schedule estimate: schedule slippage 3.5 years; Met system performance targets: [Check]; Did not fully meet system performance targets: [Empty]. Component/program: GCSS-MC Increment 1; No change in cost estimate: [Empty]; Change in cost estimate: cost increase 302%; No change in schedule estimate: [Empty]; Change in schedule estimate: schedule slippage 6 years; Met system performance targets: [Empty]; Did not fully meet system performance targets: [Check]. Component/program: NGEN Increment 1[A,B]; No change in cost estimate: [Empty]; Change in cost estimate: cost decrease 15%; No change in schedule estimate: [Empty]; Change in schedule estimate: schedule slippage 2 years; Met system performance targets: [Empty]; Did not fully meet system performance targets: [Empty]. Component/program: Total; No change in cost estimate: 2; Change in cost estimate: 11; (7 cost increases; 4 cost decreases); No change in schedule estimate: 1; Change in schedule estimate: 13; (12 schedule slippages; 1 schedule acceleration); Met system performance targets: 3; Did not fully meet system performance targets: 8. Source: GAO analysis of data provided by DOD officials. [A] Three programs--AFIPPS, DAI, and NGEN Increment 1--had not yet established an APB. As such, we compared their latest cost and schedule estimates against initial estimates. These initial estimates were based on limited information about the program's requirements and the viability of technologies available to meet the program's needs. [B] System performance data for these programs were not available for two primary reasons: (1) programs were early in development or implementation of the systems and had not yet fully deployed any portion of the systems, or (2) the program was not under contract. [C] The AOC-WS Increment 10.2 program recently established its first APB in October 2013.
However, in December 2012, prior to establishing its first APB, the program declared a critical change as required by law because it had not achieved a full deployment decision within 5 years from the time the program selected the technology to be used. [D] While neither DCGS-A Increment 1 nor TMIP-J Increment 2 fully met performance measures, these programs deferred or removed the problematic capabilities from the programs and subsequent tests on the modified system releases showed that the systems performed acceptably once those capabilities were removed. [E] Cost and schedule data were not available for JPI Version 2 because, as of December 2013, program officials were working to revise JPI's requirements and expected to reach milestone B (the point at which an APB would be established) about 18 months after the program's requirements are approved. [F] As of January 2014, the iEHR program did not have a cost estimate available that was reflective of recent scope changes made to the program. iEHR officials stated that they plan to revise the cost estimate to reflect recent scope changes, and expected this update to occur at milestone C (authorizes a program to begin limited fielding of the system), which is currently planned for May 2014. [End of table] Most of the Selected Programs Experienced Changes in Their Planned Total Life-cycle Cost Estimates: Eleven of the 15 selected programs experienced changes in their planned total life-cycle cost estimates. Specifically, 4 programs had reduced their cost estimates and 7 had increased these estimates. Four programs experienced decreases in their planned total life-cycle cost estimates, ranging from about 4 percent to 86 percent. According to DOD officials, these decreases were due to cost savings from competitive contracting, implementing new technology that cost less, program budget cuts, or transferring costs to another DOD program. * The latest life-cycle cost estimate (as of December 2013) for Navy's NGEN Increment 1 program had decreased about 15 percent from its initial estimate--from $25.4 billion to $21.6 billion.[Footnote 36] Program officials attributed this decrease to cost savings from competitive contracting. * The latest life-cycle cost estimate for Navy's GCCS-M Increment 2 program had decreased about 86 percent from its first APB estimate--from $4.4 billion in February 2006 to approximately $641.8 million as of December 2013. GCCS-M officials attributed this decrease primarily to the Navy's transfer of certain hardware requirements--such as procuring and installing servers and other network equipment--to Navy's common computing environment that will be provided by Navy's Consolidated Afloat Networks and Enterprise Services program, and the removal of costs associated with operations and maintenance and military personnel that are now planned to be funded by sources outside of the program office. * The latest life-cycle cost estimate (as of March 2014) for Air Force's ISPAN Increment 2 program had decreased about 4 percent from its first APB estimate--from $152.5 million in November 2010 to $146.2 million.
According to program officials, this decrease was due to the program's switch from a systems architecture environment that included multiple hardware items, such as computers and servers, to a virtualized environment in which certain computers are software-based.[Footnote 37] * The latest life-cycle cost estimate for Army's DCGS-A Increment 1 program had decreased about 9 percent from its first APB estimate of $11.2 billion in March 2012 to approximately $10.2 billion as of December 2013. DCGS-A officials attributed this decrease, among other things, to a reduction in the number of brigade combat teams, which was due, in part, to the Budget Control Act of 2011 (which required DOD to reduce its future expenditures) and the drawdown of forces in Afghanistan and Iraq. Seven programs experienced increases in their planned total life-cycle cost estimates, ranging from 4 to 2,233 percent. * The latest life-cycle cost estimate for DHA's TMIP-J Increment 2 program increased approximately 2,233 percent from its first APB estimate of $67.7 million in November 2002 to $1.58 billion as of December 2013. Program officials attributed the cost increase to the addition of capabilities originally intended to be included in a future increment, new requirements necessary to meet the needs of the warfighter, and the inclusion of operations and maintenance costs that were not included in the first APB because it was initially thought that such costs would be paid by the military services. * The latest life-cycle cost estimate for Navy's GCSS-MC Increment 1 program (as of December 2013) had increased approximately 302 percent from its first APB estimate of $461.4 million in June 2007 to $1.86 billion. Program officials attributed the cost increase to technical challenges associated with developing the second release and extending the period of contractor maintenance to allow additional time for transferring post-deployment system support to the government. * The latest life-cycle cost estimate for DLA's DAI program increased about 159 percent from the program's initial estimate of approximately $209.2 million in March 2007 to $543.0 million as of December 2013. Program officials attributed this program's cost increases to adding 5 years to the estimate's life cycle, understating initial assumptions on licensing and hardware costs, and a need for additional change management efforts. * The latest life-cycle cost estimate (as of December 2013) for the Air Force's AFIPPS program had increased about 8 percent from the program's initial estimate of about $1.72 billion in July 2010 to $1.86 billion. Program officials reported that the increases in costs were primarily due to (1) an increase in contractor staff within the program office, (2) DOD's direction to the program to switch from using a development environment hosted at a contractor site to one hosted by DOD's Defense Information Systems Agency, and (3) the addition of new requirements, including supportability requirements for the network and training environment. * The Air Force's BITI Wireless program had an approximately 8 percent increase in its latest cost estimate compared to its first APB estimate--from $499.5 million in April 2010 to $541.4 million as of December 2013. Program officials reported that the cost increase was due to the need to maintain the program's wireless infrastructure longer than originally planned, which increased operations and maintenance costs.
* The latest life-cycle cost estimate (as of December 2013) for Army's IPPS-A Increment 1 program increased about 10 percent from its first APB estimate of $358.4 million in March 2012 to about $395.3 million. IPPS-A officials attributed this increase to significant program schedule slippages. * Further, DLA's EProcurement program had about a 4 percent increase in its latest life-cycle cost estimate compared to its first APB estimate--from $528.0 million in March 2012 to $549.7 million as of December 2013. According to program officials, this increase was due to a 2-month slip in the full deployment date, which shifted estimates into the next fiscal year and added inflation. As of December 2013, two programs had not experienced any cost increases or decreases in their planned total life-cycle cost estimates when compared to their first APB estimates. * The latest cost estimate (as of December 2013) for Air Force's JMS Increment 1 program was about $155.6 million, which is the same as the program's first APB, which was established in April 2013. * The Air Force's AOC-WS Increment 10.2 program's first APB cost estimate was established in October 2013. The program spent approximately $176 million and took 6 years (when compared to the program's 2007 initiation date) before it established this APB. Program officials attributed the delays in developing the APB, in part, to the Air Combat Command's determination that the original scope was unaffordable and a subsequent re-planning of the program to reduce its scope. One program did not have cost information available. * In February 2013, the Vice Chairman of the Joint Chiefs of Staff determined that the proposed 10-year funding profile for the Army's JPI Version 2 program was unaffordable. Subsequently, the program was working to revise JPI's requirements and reassess the functionality that will be included in the program. Program officials expect to reach milestone B (the point at which an APB would be established) about 18 months after the program's requirements are approved. Additionally, DHA's iEHR increment 1 program did not have a cost estimate available that was reflective of recent scope changes made to the program. Program officials stated that they plan to revise the cost estimate to reflect recent scope changes, and expected this update to occur at milestone C (authorizes a program to begin limited fielding of the system), which is currently planned for May 2014. Table 2 provides a summary of the percent of cost increase or decrease for each selected program's latest planned total life-cycle cost estimates. Table 2: Changes in Selected Programs' First Approved Baseline Estimates and Latest Planned Total Life-cycle Cost Estimates: Component: Air Force; Program: AFIPPS; Percent change in life-cycle cost estimate: increase 8%[A]. Program: AOC-WS Increment 10.2; Percent change in life-cycle cost estimate: 0. Program: BITI Wireless; Percent change in life-cycle cost estimate: increase 8%. Program: ISPAN Increment 2; Percent change in life-cycle cost estimate: decrease 4%. Program: JMS Increment 1; Percent change in life-cycle cost estimate: 0. Component: Army; Program: DCGS-A Increment 1; Percent change in life-cycle cost estimate: decrease 9%. Program: IPPS-A Increment 1; Percent change in life-cycle cost estimate: increase 10%. Program: JPI Version 2; Percent change in life-cycle cost estimate: n/a[B]. Component: DHA; Program: iEHR Increment 1; Percent change in life-cycle cost estimate: n/a[C]. 
Program: TMIP-J Increment 2; Percent change in life-cycle cost estimate: increase 2,233%. Component: DLA; Program: DAI; Percent change in life-cycle cost estimate: increase 159%[A]. Program: EProcurement; Percent change in life-cycle cost estimate: increase 4%. Component: Navy; Program: GCCS-M Increment 2; Percent change in life-cycle cost estimate: decrease 86%. Program: GCSS-MC Increment 1; Percent change in life-cycle cost estimate: increase 302%. Program: NGEN Increment 1; Percent change in life-cycle cost estimate: decrease 15%[A]. Source: GAO analysis of data provided by DOD officials. [A] AFIPPS, DAI, and NGEN Increment 1 had not yet established an APB. As such, we compared their latest cost estimates against initial estimates. These initial estimates were based on limited information about the program's requirements and the viability of technologies available to meet the program's needs. [B] As of December 2013, JPI Version 2 did not have cost information available. In February 2013, the Vice Chairman of the Joint Chiefs of Staff determined that the proposed 10-year funding profile for the program was unaffordable. As of December 2013, the program was working to revise JPI's requirements and reassess the functionality that will be included in the program. [C] As of January 2014, the iEHR program did not have a cost estimate available that was reflective of recent scope changes made to the program. iEHR officials stated that they plan to revise the cost estimate to reflect recent scope changes, and expected this update to occur at milestone C (authorizes a program to begin limited fielding of the system), which is currently planned for May 2014. [End of table] Most of the Selected Programs Experienced Schedule Estimate Changes: Thirteen of the 15 selected programs had experienced changes in their schedule estimates, including 1 program that accelerated its schedule and met its full deployment milestone earlier than planned and 12 programs that had experienced slippages. One of the selected programs did not experience a change in its schedule estimate. Specifically, the Air Force's AOC-WS Increment 10.2 program had just established its first APB schedule estimate in October 2013. One of the selected programs accelerated its schedule compared to its first APB schedule estimate. Specifically, Air Force's JMS Increment 1 program accelerated its full deployment date by about 2 months and was deemed fully deployed in April 2013. Program officials attributed the early completion of this milestone to a successful operational trial period and initial operational capability decision. Twelve of the 15 selected programs had experienced slippages in their planned schedule estimates, ranging from a few months to 6 years. Two of the 12 programs had experienced significant slippages in their schedules, having experienced 11-month and 12-month delays, respectively. Six programs had experienced critical slippages of more than 1 year. For example, * Compared to its first APB schedule, Navy's GCSS-MC Increment 1 program experienced a 6-year slippage in its full deployment date, currently scheduled for the fourth quarter of 2015. These delays were due to technical challenges in developing the second release of the system. * DHA's TMIP-J Increment 2 program experienced an over 6-year slippage in its full deployment date compared to its first APB schedule--from May 2009 to the first quarter of fiscal year 2016. 
Program officials attributed this delay primarily to an increase in requirements related to supporting warfighters, as well as configuration management and software usage problems experienced when preparing the first release for deployment. * Navy's GCCS-M Increment 2 program experienced a 3.5-year slippage in its full deployment decision date compared to its first APB schedule--from August 2007 to March 2011. Program officials attributed this slippage to schedule delays in the availability of ships for operational testing, and a program restructure in 2007, which required new work to be done to develop software in accordance with the newly proposed common infrastructure and software baseline. * DLA's DAI program experienced over a 5-year slip in its planned date to obtain approval to begin production and deployment of the system (referred to as milestone C) when compared to its initial schedule, which planned for it to occur in January 2009. In September 2013, the program was restructured into two increments, and DOD decided that increment 1 would not proceed to milestone C and would be placed into the operations and support phase. DOD also decided that additional milestones would be reached through the second increment. However, as of January 2014, program officials were uncertain when they would set a schedule for increment 2 and reach milestone C.[Footnote 38] DAI officials attributed this slippage to fluctuation in the number of agencies to deploy DAI and the agency deployment schedule; the change in program designation to pre-MAIS (meaning it was expected to meet MAIS thresholds), which resulted in additional oversight requirements; and the change in parent organization from the Business Transformation Agency to DLA. * Navy's NGEN Increment 1 program experienced an almost 2-year slippage in its planned date to obtain approval to begin production and deployment of the system when compared to its initial schedule--from August 2011 to June 2013. Program officials attributed the slippages to the need to conduct more detailed planning for acquiring NGEN services; delays in solicitation activities; the proposals received not being of the desired quality, which led to delays in contract award; and a contract award protest. * Air Force's AFIPPS program experienced about a 1.5-year schedule slip in its planned date for milestone B (authorizes a program to begin system development) when compared to its initial schedule--from the first quarter of fiscal year 2013 to June 2014. Program officials reported that the program decided to delay milestone B until after the development contract was awarded because, among other things, the work to be performed under the contract is expected to better define the program and provide additional details that can be used when developing the program's first APB. One program did not have schedule information available. * Army's JPI Version 2 program did not have schedule information available because, in February 2013, the Vice Chairman of the Joint Chiefs of Staff determined that the proposed 10-year funding profile for JPI Version 2 was unaffordable. Subsequently, the program has been working to revise JPI's requirements and reassess the functionality that will be included in the program. Program officials expect to reach milestone B (the point at which an APB would be established--including an approved schedule estimate) about 18 months after the program's requirements are approved.
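The slippages summarized in table 3 reflect straightforward comparisons of planned milestone dates. The following Python sketch illustrates the calculation using the TMIP-J full deployment dates cited above, treating the first quarter of fiscal year 2016 as beginning in October 2015; the variable names are ours, for illustration only:

    from datetime import date

    # Approximate schedule slip between a first APB milestone date and
    # the latest estimate (TMIP-J full deployment dates cited above).
    first_apb_date = date(2009, 5, 1)    # May 2009
    latest_estimate = date(2015, 10, 1)  # first quarter of fiscal year 2016
    slip_years = (latest_estimate - first_apb_date).days / 365.25
    print(f"{slip_years:.1f} years")     # 6.4 -- an over 6-year slippage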
Table 3 provides a summary of the slippages experienced by the selected MAIS programs. Table 3: Selected MAIS Programs' Schedule Slippages Compared to First Approved Baseline Schedules: Component: Air Force; Program: AFIPPS; Schedule slippage since first APB (slipped milestone): 1.5 years (milestone B)[A]. Program: AOC-WS Increment 10.2; Schedule slippage since first APB (slipped milestone): none. Program: BITI Wireless; Schedule slippage since first APB (slipped milestone): 6 months (full deployment). Program: ISPAN Increment 2; Schedule slippage since first APB (slipped milestone): 6 months (full deployment). Program: JMS Increment 1; Schedule slippage since first APB (slipped milestone): none. Component: Army; Program: DCGS-A Increment 1; Schedule slippage since first APB (slipped milestone): 3 months (full deployment decision). Program: IPPS-A Increment 1; Schedule slippage since first APB (slipped milestone): 1 year (milestone C). Program: JPI Version 2; Schedule slippage since first APB (slipped milestone): n/a[B]. Component: DHA; Program: iEHR Increment 1; Schedule slippage since first APB (slipped milestone): 11 months (full deployment decision). Program: TMIP-J Increment 2; Schedule slippage since first APB (slipped milestone): 6 years (full deployment). Component: DLA; Program: DAI; Schedule slippage since first APB (slipped milestone): 5 years (milestone C)[A]. Program: EProcurement; Schedule slippage since first APB (slipped milestone): 2 months (full deployment). Component: Navy; Program: GCCS-M Increment 2; Schedule slippage since first APB (slipped milestone): 3.5 years (full deployment decision). Program: GCSS-MC Increment 1; Schedule slippage since first APB (slipped milestone): 6 years (full deployment). Program: NGEN Increment 1; Schedule slippage since first APB (slipped milestone): 2 years (milestone C)[A]. Source: GAO analysis of data provided by DOD officials. [A] An APB was never established for AFIPPS, DAI, or NGEN Increment 1. As such, we compared their initial schedule estimates to their latest estimates. These initial estimates were based on limited information about the program's requirements and the viability of technologies available to meet the program's needs. [B] As of December 2013, JPI Version 2 did not have schedule information available and the program was working to revise its requirements and reassess the functionality that will be included in the program. [End of table] Program officials attributed the schedule slippages for the 12 programs to numerous causes, ranging from needing more time for planning and test activities to contractor performance problems. Table 4 provides examples of schedule slippage causes identified by the programs. Table 4: Causes for Schedule Slippages among 12 Selected Programs: Air Force: Component/program: AFIPPS; More time needed for planning activities: [Check]; System performance problems: [Empty]; Unanticipated requirements or unplanned work: [Empty]; Deployment-related issues: [Empty]; More time needed for test activities: [Empty]; Contract award delays: [Empty]; Contractor performance problems: [Empty]; Organizational restructure: [Empty]. Component/program: BITI Wireless; More time needed for planning activities: [Empty]; System performance problems: [Empty]; Unanticipated requirements or unplanned work: [Empty]; Deployment-related issues: [Check]; More time needed for test activities: [Empty]; Contract award delays: [Empty]; Contractor performance problems: [Empty]; Organizational restructure: [Empty]. 
Component/program: ISPAN Increment 2; More time needed for planning activities: [Check]; System performance problems: [Empty]; Unanticipated requirements or unplanned work: [Empty]; Deployment-related issues: [Empty]; More time needed for test activities: [Check]; Contract award delays: [Empty]; Contractor performance problems: [Empty]; Organizational restructure: [Empty]. Army: Component/program: DCGS-A Increment 1; More time needed for planning activities: [Empty]; System performance problems: [Check]; Unanticipated requirements or unplanned work: [Empty]; Deployment-related issues: [Empty]; More time needed for test activities: [Empty]; Contract award delays: [Empty]; Contractor performance problems: [Empty]; Organizational restructure: [Empty]. Component/program: IPPS-A Increment 1; More time needed for planning activities: [Empty]; System performance problems: [Empty]; Unanticipated requirements or unplanned work: [Check]; Deployment-related issues: [Empty]; More time needed for test activities: [Empty]; Contract award delays: [Check]; Contractor performance problems: [Check]; Organizational restructure: [Empty]. DHA: Component/program: iEHR Increment 1; More time needed for planning activities: [Check]; System performance problems: [Check]; Unanticipated requirements or unplanned work: [Empty]; Deployment-related issues: [Empty]; More time needed for test activities: [Check]; Contract award delays: [Empty]; Contractor performance problems: [Empty]; Organizational restructure: [Empty]. Component/program: TMIP-J Increment 2; More time needed for planning activities: [Empty]; System performance problems: [Check]; Unanticipated requirements or unplanned work: [Check]; Deployment-related issues: [Check]; More time needed for test activities: [Empty]; Contract award delays: [Empty]; Contractor performance problems: [Empty]; Organizational restructure: [Empty]. DLA: Component/program: DAI; More time needed for planning activities: [Check]; System performance problems: [Empty]; Unanticipated requirements or unplanned work: [Empty]; Deployment-related issues: [Empty]; More time needed for test activities: [Empty]; Contract award delays: [Empty]; Contractor performance problems: [Empty]; Organizational restructure: [Check]. Component/program: EProcurement; More time needed for planning activities: [Empty]; System performance problems: [Empty]; Unanticipated requirements or unplanned work: [Empty]; Deployment-related issues: [Check]; More time needed for test activities: [Empty]; Contract award delays: [Empty]; Contractor performance problems: [Empty]; Organizational restructure: [Empty]. Navy: Component/program: GCCS-M Increment 2; More time needed for planning activities: [Empty]; System performance problems: [Empty]; Unanticipated requirements or unplanned work: [Check]; Deployment-related issues: [Empty]; More time needed for test activities: [Check]; Contract award delays: [Empty]; Contractor performance problems: [Empty]; Organizational restructure: [Empty]. Component/program: GCSS-MC Increment 1; More time needed for planning activities: [Empty]; System performance problems: [Check]; Unanticipated requirements or unplanned work: [Empty]; Deployment-related issues: [Empty]; More time needed for test activities: [Empty]; Contract award delays: [Empty]; Contractor performance problems: [Empty]; Organizational restructure: [Empty]. 
Component/program: NGEN Increment 1; More time needed for planning activities: [Check]; System performance problems: [Empty]; Unanticipated requirements or unplanned work: [Empty]; Deployment-related issues: [Empty]; More time needed for test activities: [Empty]; Contract award delays: [Check]; Contractor performance problems: [Empty]; Organizational restructure: [Empty]. Component/program: Total; More time needed for planning activities: 5; System performance problems: 4; Unanticipated requirements or unplanned work: 3; Deployment-related issues: 3; More time needed for test activities: 3; Contract award delays: 2; Contractor performance problems: 1; Organizational restructure: 1. Source: GAO analysis of data provided by DOD officials. [End of table] Further discussion of the specific causes for schedule slippages among the 12 programs is included in appendix II. Three of the Selected Programs Reported Meeting System Performance Targets, While Eight Reported Not Fully Meeting Targets, and Four Did Not Have System Performance Data Available: Three of the 15 selected programs reported meeting their system performance targets. These programs were Air Force's JMS Increment 1 program, Army's IPPS-A Increment 1 program, and the Navy's GCCS-M Increment 2 program. For example, a system operational evaluation was completed on the JMS system in December 2012, which concluded that JMS was effective for the limited scope of operational capabilities delivered and that increment 1 met the targets for its two key performance parameters related to displaying the user-defined operational picture and supporting network-based military operations. Additionally, as of December 2013, the Navy had tested three of the four planned system configurations for GCCS-M Increment 2. The Navy determined that all three of the system configurations were operationally effective and suitable; however, it identified deficiencies with two configurations of the system. The program has since addressed those deficiencies. Four programs did not have system performance data available. Specifically, system performance data for Air Force's AFIPPS and AOC-WS Increment 10.2 programs and Army's JPI Version 2 program were not available because these programs were either early in the development or implementation stages and had not yet fully deployed any portion of the systems. Additionally, system performance data for Navy's NGEN Increment 1 program were not available because the program had not yet transitioned to the new NGEN contract and, as such, the Navy had not yet evaluated system performance targets for NGEN. On the other hand, 8 of the 15 selected programs reported experiencing system performance problems, which resulted in these systems not performing as intended and reduced their value. Specifically, each of these 8 programs had experienced numerous system deficiencies. For example, an initial operational test and evaluation of release 1 of the Army's DCGS-A Increment 1 system in 2012 determined that it was operationally effective with limitations. Specifically, the system was unable to meet certain requirements, such as synchronizing data passed between different classified network domains (e.g., between secret and top secret networks).
Additionally, in December 2012, the Navy's GCSS-MC Increment 1 program reported that the second system release was unable to successfully complete developmental and operational testing due to technical problems associated with certain capabilities in that release, including synchronizing remote computers to the primary system. Further, in September 2012, an operational evaluation of the Air Force's BITI Wireless system identified 14 deficiencies among three of the six critical operational areas that were evaluated, including network command and control. As of December 2013, five of the eight programs with system performance issues were still experiencing these issues, and the other three programs--Army's DCGS-A Increment 1, Navy's GCSS-MC Increment 1, and DHA's TMIP-J Increment 2--had removed or deferred the problematic capabilities from the scope of the programs, rather than correcting the issues. Selected Programs' Implementation of Key Risk Management Practices Varied in Effectiveness: According to CMMI-ACQ and PMBOK®, an effective risk management process identifies potential problems before they occur, so that risk-handling activities may be planned and invoked, as needed, across the life of the project in order to mitigate adverse impacts on achieving objectives. Specifically, key risk management practices include: * identifying risks, threats, and vulnerabilities that could negatively affect work efforts; * evaluating and categorizing each identified risk using defined risk categories and parameters, such as likelihood and consequence, and determining each risk's relative priority; * developing risk mitigation plans for selected risks to proactively reduce the potential impact of risk occurrence; and * monitoring the status of each risk periodically and implementing the risk mitigation plan as appropriate.
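To illustrate the second of these practices--evaluating and categorizing each identified risk using defined parameters--the following Python sketch scores a risk by likelihood and consequence and assigns a relative priority. The five-level scales and category thresholds are our assumptions for illustration; they are not values defined by DOD or the selected programs:

    # Illustrative risk evaluation using defined parameters (likelihood
    # and consequence rated on five-level scales). The category
    # thresholds below are assumptions for illustration only.
    def categorize_risk(likelihood: int, consequence: int) -> tuple[int, str]:
        """Return a risk exposure score and a low/medium/high category."""
        if not (1 <= likelihood <= 5 and 1 <= consequence <= 5):
            raise ValueError("likelihood and consequence must be rated 1-5")
        exposure = likelihood * consequence  # relative priority score
        if exposure >= 15:
            return exposure, "high"
        if exposure >= 6:
            return exposure, "medium"
        return exposure, "low"

    # Example: a risk rated 4 for likelihood and 4 for consequence is
    # categorized as high priority.
    print(categorize_risk(4, 4))  # (16, 'high')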
DLA's DAI Program Had Recently Taken Steps to Implement Risk Management Practices, but More Work Remains to Effectively Identify and Manage Risks: DLA's DAI program is intended to modernize the financial management processes of 21 defense agencies and components and is a key component of the DOD plan for achieving fully auditable financial statements by September 30, 2017. DAI began as a non-MAIS program in 2007 and was declared a MAIS program with two increments in September 2013. Increment 1 deployed five releases to 11 defense agencies and components from 2008 to 2012. Increment 2 was to provide new and enhanced capabilities and to deploy to the remaining 10 defense agencies and components. As of January 2014, DAI officials were uncertain when the program would begin system development of increment 2. DLA had taken steps to implement risk management practices for DAI, but it did not have a documented process for managing risks during the first 6 years of the program, and its recent efforts to implement key risk management practices still need improvement. Although it is uncertain how DAI was affected by the lack of a documented risk management process during the first 6 years of the program--during which it deployed five releases to 11 defense agencies and components--the program's costs increased by about 159 percent and its schedule slipped by over 5 years (as discussed in appendix II). * Identify risks, threats, and vulnerabilities that could negatively affect work efforts. The DAI program had not fully identified risks, threats, and vulnerabilities that could negatively affect work efforts, but recently took steps to do so. Specifically, the program did not document risks during the first 6 years of the program, and instead allowed risks to be handled at the team level. DAI established a risk management board in July 2013 to review and approve risks on a monthly basis, and began documenting risks in a log. However, the program office was still in the early stages of identifying and approving risks, and had not been accurately maintaining its risk log. Specifically, the July through October 2013 risk logs did not accurately capture the status of the potential risks identified by the program, and therefore it was not clear which risks had been approved by the risk management board. The program office recognized the lack of maturity in its risk management practices and, in September 2013, hired a risk management expert to help improve its risk management efforts. As a result, the November 2013 risk log accurately reflected risk management board decisions and appropriately identified the three risks that had been approved by the board. However, program officials stated that the three risks do not represent a comprehensive set of risks facing the program and added that they are still in the process of identifying and approving additional program risks. While these are positive steps, until the program routinely maintains a risk log that represents a complete set of risks facing the program, effective risk management of the DAI program will be limited. * Evaluate and categorize each identified risk using defined risk categories and parameters, such as likelihood and consequence, and determine each risk's relative priority. The DAI program had not consistently evaluated and categorized its identified risks, but recently took steps to do so. Specifically, as previously stated, the program office did not document risks during the first 6 years of the program, but recently began establishing a structured risk management process. As part of this effort, the program office finalized a risk management plan in June 2013 that included key risk management practices, such as processes for evaluating and categorizing identified risks using defined risk categories and parameters, including likelihood and consequence, and determining each risk's relative priority. However, as previously stated, the program office was still in the early stages of identifying and assessing risks, and the assessments in the early risk logs that were established under the new risk process--July through November 2013--did not consistently align with the defined parameters. For example, DAI did not appropriately assess the consequence to cost based on the program's five defined levels (i.e., percent of budget increase) for two of the three approved risks in the November 2013 risk log. As previously stated, officials told us that they were still learning the process and had hired an expert to help improve their risk management practices. Without conducting appropriate risk assessments, the program will lack assurance that it is prioritizing its resources for risk mitigation in the most effective manner. * Develop risk mitigation plans for selected risks to proactively reduce the potential impact of risk occurrence. The DAI program had not fully developed risk mitigation plans because it was still in the early stages of implementing its risk management practices and had only recently begun developing risk mitigation plans for identified risks.
In the absence of appropriate mitigation plans for all program risks, the program will lack assurance that it is reducing the likelihood that those risks will materialize into issues. * Monitor the status of each risk periodically and implement the risk mitigation plan as appropriate. The DAI program was still in the early stages of implementing its risk management practices and monitoring the status of its risks. Additionally, the program's discussion of potential risks in weekly program status reviews did not accurately reflect risk status and did not align with the risk log. For example, during an August 2013 status review, the program office rated certain program areas, such as configuration management and requirements, "green" (meaning no issues) even though the office had identified potential risks or issues in each of the areas. Officials recognized that the weekly program status reviews needed to better align with program risks. Subsequently, in January 2014, officials reported that they were using the risk log to inform the weekly reviews. Navy's GCSS-MC Program Had Recently Shown Progress in Implementing Key Risk Management Practices: GCSS-MC Increment 1 is intended to support logistics planners and operators worldwide in managing combat logistics, including planning, warehousing, distribution, depot maintenance, and asset visibility. In August 2013, following a critical change that was declared in December 2012 due to technical challenges associated with capabilities in GCSS-MC's second planned release, and a subsequent review of the program, the Navy reduced the scope of the program by removing one of the two planned releases from the first increment of GCSS-MC. In place of the second release, the Navy added an enhancement release to the first release that is intended to provide minimal functionality to users without network connectivity. Program officials reported that, by March 2013, DOD had fielded the first release--which provided logistics capabilities to users who had access to the system via the internet--to all intended users except those in Afghanistan because the Navy did not want to disrupt ongoing combat operations. As of December 2013, program officials expected to begin fielding the enhancement release in the second quarter of 2015. The Navy's GCSS-MC program's risk log was out of date and its risk management board did not conduct risk review meetings between March 2013 and July 2013; however, the Navy had recently taken steps to implement key risk management practices for GCSS-MC, including resuming monthly risk management board meetings and updating its risks and plans for mitigating them. * Identify risks, threats, and vulnerabilities that could negatively affect work efforts. GCSS-MC had identified risks, threats, and vulnerabilities that could negatively affect work efforts. However, between March and July 2013, the program's overall risk log was out of date and the program's risk management board did not conduct risk review meetings that assessed risks and associated mitigation plans on a monthly basis. Specifically, although program officials stated that they were managing risks at the product level on an ongoing basis, the program-level risk log provided in May 2013 showed that the status of risks had not been updated since February 2013, when the program stopped conducting monthly risk management board meetings.
The program office reported that these activities were halted during certain periods of the critical change review because the primary work originally planned to be performed during that time was the development of the second planned release, which was paused during the review. However, during this time, operations and maintenance activities for the first release were still ongoing, yet risks were not being tracked and monitored by the risk management board. It was not until August 2013--at the end of the critical change review--that the program resumed the risk management board meetings on a monthly basis. By November 2013, the program had updated its risks and its plans for mitigating them. Continued implementation of reinstituted risk management activities should help ensure that, moving forward, the program is properly identifying and managing its program risks. * Evaluate and categorize each identified risk using defined risk categories and parameters, such as likelihood and consequence, and determine each risk's relative priority. During the program's recent reassessment and validation of its risks, as discussed above, GCSS-MC had evaluated and categorized its risks. For example, as of November 2013, the program had categorized 3 of its 14 approved risks as low risk, 8 as medium risk, and 3 as high risk. * Develop risk mitigation plans for selected risks to proactively reduce the potential impact of risk occurrence. The program had developed mitigation plans for its identified risks. However, as stated previously, these risks and mitigation plans were out of date for approximately 5 months, and the program had only recently taken steps to update them. * Monitor the status of each risk periodically and implement the risk mitigation plan as appropriate. Prior to the conclusion of the program's latest critical change, the program had not periodically monitored the status of each program-level risk. Specifically, the program's risk management board had not conducted risk review meetings from March 2013 to July 2013. GCSS-MC officials stated that risk reviews were not held because the program was focused on addressing its critical change. Program officials also stated that they were regularly monitoring the status of product-level risks throughout the critical change, as discussed above. As stated previously, the program had recently taken steps to improve its risk management process, including monitoring and assessing program-level risks and associated mitigation plans on a monthly basis. By taking these actions, the GCSS-MC program had established and implemented key practices as part of its risk management process. Doing so should increase the likelihood that the program is positioned to mitigate negative impacts from potential problems before they occur. DHA's TMIP-J Program Had Implemented Key Risk Management Practices: TMIP-J Increment 2 is a set of applications that support warfighters and health care providers in military theater operations with patient, medical logistics, and medical command and control data. The program also integrates with medical systems at sites that support military bases. Increment 2 is intended to upgrade legacy systems and add significant new functionality to the first increment, including support for wounded warriors. TMIP-J Increment 2 is intended to be fielded in three releases. Program officials estimated that 80 percent of planned capabilities were met with the first installed release, which began fielding in 2008.
In December 2013, the program was granted a full deployment decision based on test results for the second release, and the third release was in the requirements development phase. The TMIP-J program had implemented key practices as part of its risk management process. * Identify risks, threats, and vulnerabilities that could negatively affect work efforts. The TMIP-J program had identified risks, threats, and vulnerabilities that could negatively affect work efforts. In particular, as of June 2013, key risks identified by the program office included (1) the possibility of a backup database not being synchronized with the production database if the size of the backup database is not increased, and (2) the possibility of the backup database needing to be re-synchronized to the production database if a certain requirement is implemented in the production database. * Evaluate and categorize each identified risk using defined risk categories and parameters, such as likelihood and consequence, and determine each risk's relative priority. The program had evaluated and categorized its risks based on probability and impact. For example, the program reported that the first aforementioned key risk had a "highly likely" probability (meaning a 60-79 percent probability of occurrence) and "significant" performance impact (meaning it may cause significant degradation in technical performance). * Develop risk mitigation plans for selected risks to proactively reduce the potential impact of risk occurrence. TMIP-J had developed mitigation plans to proactively reduce the potential impact of risk occurrence. For example, the risk mitigation plan for the first aforementioned key risk included planning for and requesting additional backup database space for the next 5 years. For the second aforementioned key risk, the program determined that no mitigation strategy was available; instead, it planned to accept the risk and resynchronize the backup database as soon as possible. * Monitor the status of each risk periodically and implement the risk mitigation plan as appropriate. The program monitored its risks and documented the status of risk mitigation actions that had been taken. For example, the program monitored its risks on a quarterly basis and tracked the number of open mitigation steps for each risk. In taking these actions, the TMIP-J program had established and utilized effective risk management practices. Doing so should better position the program to mitigate adverse impacts from potential problems before they occur. Mixed Progress in Applying Key IT Acquisition Best Practices: DLA's DAI, Navy's GCSS-MC, and DHA's TMIP-J programs demonstrated varied progress in implementing IT acquisition best practices for requirements management and project monitoring and control. Specifically, while GCSS-MC had fully implemented requirements management best practices and DAI had recently taken steps to do so, TMIP-J had neither clearly defined its capabilities nor maintained complete traceability between all of its requirements and work products. Regarding project monitoring and control, DAI, GCSS-MC, and TMIP-J had each implemented selected project monitoring and control best practices. However, DAI had not documented significant deviations in performance; GCSS-MC had not always taken corrective actions to address issues in a timely manner; and TMIP-J had not effectively determined its progress against the project plan nor appropriately communicated the status of the program to stakeholders.
Without fully implementing effective acquisition management practices, these programs may be at risk of not meeting planned cost and schedule milestones, and may implement systems that do not fully meet user needs. The Navy Program Had Fully Implemented Key Requirements Management Best Practices and the DLA Program Had Recently Taken Steps to Do So, While the DHA Program Had Implemented Many Key Practices but Could Not Always Trace Requirements to User Needs: According to requirements management best practices, effective requirements management involves the following:[Footnote 39] * establishing criteria for identifying appropriate requirements providers; * establishing objective criteria for the evaluation and acceptance of requirements; * assessing the impact of requirements on existing commitments; * reviewing project plans, activities, and work products to ensure that they are consistent with the defined requirements; and * ensuring traceability between the requirements and work products.
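The last of these practices--traceability--can be illustrated with a short sketch. The following Python example checks a requirements traceability matrix in both directions, flagging requirements with no source capability and capabilities with no implementing requirement--the two kinds of gaps discussed in the sections that follow. The identifiers are invented for illustration and do not correspond to any program's actual requirements:

    # Illustrative two-way traceability check over a requirements
    # traceability matrix. Identifiers are invented for illustration.
    capabilities = {"CAP-1", "CAP-2", "CAP-3"}
    # Maps each lower-level requirement to its source capability
    # (None models a requirement with no identified source).
    matrix = {"REQ-101": "CAP-1", "REQ-102": "CAP-1", "REQ-201": None}

    orphan_requirements = [req for req, cap in matrix.items() if cap not in capabilities]
    untraced_capabilities = capabilities - {cap for cap in matrix.values() if cap}

    print(orphan_requirements)            # ['REQ-201'] -- no source capability
    print(sorted(untraced_capabilities))  # ['CAP-2', 'CAP-3'] -- no requirements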
DLA's DAI Program Had Recently Taken Steps to Implement Key Requirements Management Best Practices: Prior to January 2014, the DAI program had not consistently ensured traceability between its requirements and work products, but the program had recently taken steps to implement key requirements management best practices and, as of January 2014, it was ensuring traceability between its requirements and work products. * Establish criteria for identifying appropriate requirements providers. DAI had established criteria for identifying appropriate requirements providers. Specifically, the program's requirements management plan identified roles and responsibilities for the entities that were to identify and maintain requirements. * Establish objective criteria for the evaluation and acceptance of requirements. DAI had established criteria for evaluating and accepting requirements. Specifically, the program required standard information to be submitted for system change requests, such as the rationale for the change and the level of effort needed to implement it. The program also had standard criteria for evaluating the requests, such as the feasibility of the request. * Assess the impact of requirements on existing commitments. DAI had assessed whether new requirements would impact existing commitments. For example, the program had assessed the impact that system change requests would have on DAI's business processes, the program's schedule, and the level of effort and resources needed. * Review project plans, activities, and work products to ensure that they are consistent with the defined requirements. The program had ensured consistency between its defined requirements and project plans. For example, following DAI's restructuring into two increments in September 2013, the program updated its integrated master schedule to reflect this change. * Ensure traceability between the requirements and work products. As of January 2014, the program was ensuring traceability between its requirements and work products. However, prior to this, the program had not consistently maintained this traceability. Specifically, the program had not maintained its requirements traceability matrix with complete and up-to-date information. For example, in a May 2013 version of the matrix, almost 3,000 out of over 4,300 requirements were missing implementation status information. In response to our audit finding, the program updated its matrix. Specifically, a matrix from September 2013 addressed the nearly 3,000 requirements that had been missing status information. Additionally, while DAI had mapped each of its higher-level capabilities to its associated lower-level requirements, the program had not completed its mapping of each of its lower-level requirements to a higher-level capability. For example, as of September 2013, nearly 600 requirements in the program's traceability matrix did not identify a source from which the requirements originated. Officials recognized that there were issues with the matrix and stated that it had been consolidated from multiple requirements matrices and that the program was in the process of reconciling the consolidated data. By January 2014, the program had reconciled the data in the traceability matrix. Program officials also stated that they were preparing to acquire a requirements management tool in the second quarter of fiscal year 2014 that is intended to help improve requirements management capabilities--including traceability--and interface with the program's existing configuration management tool. Continued implementation of these requirements management activities should increase the likelihood that the program develops a system that includes all intended functionality. As a result, the DAI program had established effective requirements management best practices, which should help ensure that the DAI system will be deployed with functionality that meets users' needs. Navy's GCSS-MC Program Had Fully Implemented Key Requirements Management Best Practices: GCSS-MC had fully implemented key best practices for its requirements management process. * Establish criteria for identifying appropriate requirements providers. The program had established criteria to identify appropriate requirements providers. Specifically, in November 2013, the Navy drafted a systems engineering plan that identified roles and responsibilities for the entities that were to identify and maintain requirements. * Establish objective criteria for the evaluation and acceptance of requirements. GCSS-MC had established criteria for evaluating and accepting new requirements. For example, according to the program's draft systems engineering plan, the engineering review team is expected to assess new requirements based on impacts to cost, schedule, and performance. * Assess the impact of requirements on existing commitments. The program had assessed the impact of requirements on existing commitments. For example, the program assessed the impact that system requirement change requests would have on users prior to approving the requests in December 2012. In addition, in November 2011, the program's requirements oversight council removed requirements from the scope of GCSS-MC Increment 1 in order to reduce cost and schedule risk. * Review project plans, activities, and work products to ensure that they are consistent with the defined requirements. GCSS-MC had ensured consistency between its requirements and project plans. For example, in its November 2013 draft systems engineering plan, the program had documented the Navy's August 2013 decision to remove certain capabilities that were originally planned for the second system release from the scope of Increment 1. * Ensure traceability between the requirements and work products. The program maintained traceability between its requirements and work products. Specifically, for the first system release, GCSS-MC had traced each requirement in its requirements traceability matrix to its associated higher-level capabilities, as well as to its test results.
Additionally, program officials reported in December 2013 that GCSS-MC was in the process of completing the design and test specifications for the program's recently established enhancement release, and the program expected to complete the traceability of this release's requirements by March 2014. As a result, the GCSS-MC program had established and utilized effective requirements management practices, which should increase the likelihood that the program delivers a system that meets users' needs. DHA's TMIP-J Program Had Implemented Many Requirements Management Best Practices, but More Work Remains to Ensure Traceability of Requirements to Work Products: TMIP-J had implemented many key requirements management best practices; however, consistency between requirements and project plans could not be determined because the program's scope was not clearly defined, and the program did not maintain complete traceability between its requirements and work products. * Establish criteria for identifying appropriate requirements providers. The program had established criteria for identifying requirements providers. For example, the program's requirements management plan identified roles and responsibilities for the entities that were to identify and maintain requirements. * Establish objective criteria for the evaluation and acceptance of requirements. The program had established criteria for the evaluation and acceptance of requirements. For example, TMIP-J had defined a list of documents that were required to be submitted with requests for new functional requirements, including an analysis of the impact on interfaces and a risk assessment. * Assess the impact of requirements on existing commitments. The program had assessed the impact of requirements on existing commitments. For example, the program had reviewed the cost of adding a requirement, and the risk of dropping a requirement, before approving these changes. * Review project plans, activities, and work products to ensure that they are consistent with the defined requirements. Consistency between requirements and project plans could not be determined because the program's scope was not clearly defined in the master program document establishing expected system capabilities--referred to as the capabilities baseline document (discussed in the following section). Without a clearly defined scope, TMIP-J is limited in its ability to develop project plans and work products that are consistent with the intended program scope. * Ensure traceability between the requirements and work products. The program did not maintain complete traceability between its requirements and work products. Specifically, the program had not updated its capabilities baseline document to include new program requirements that were added after the baseline was established in 2007, such as those for tracking concussions and equipment used during medical evacuations. While the program had documented these new requirements in its traceability matrix, these requirements were not traced to any higher-level capability in the capabilities baseline document. Program officials said that they did not update the capabilities baseline document because of time pressures resulting from an increase in program activity due to the escalation of the War on Terror, and because of changes in the program office structure.
Additionally, the program did not trace certain capabilities that were in its capabilities baseline document (such as those related to tracking exposure to occupational hazards) to its requirements traceability matrix. Officials stated that these capabilities are being met by applications that are the responsibility of programs outside TMIP-J; however, this information was not recorded in TMIP-J's requirements traceability matrix to show that these capabilities are being addressed elsewhere. Further, the program developed a system capability that did not trace to any program requirement. Specifically, the program documented in its traceability matrix a requirement to develop a capability to navigate among medical records and subsequently took steps to develop this capability, even though it was never an approved program requirement. According to TMIP-J officials, the documented requirement and the capability that was developed to meet it were both created in error. Program officials did not discover this error until the capability that was developed to meet the requirement failed system testing, after which the program removed it from the system. In the absence of updating the capabilities baseline document to reflect program scope changes and tracing all capabilities to their associated requirements and system components, stakeholders will lack assurance that the system will be deployed with all intended functionality to meet users' needs. The DHA, DLA, and Navy Programs Had Implemented Many Key Project Monitoring and Control Practices, but Lacked Others: According to project management best practices,[Footnote 40] an effective project monitoring and control process provides oversight of the program's performance, in order to allow appropriate corrective actions if actual performance deviates significantly from planned performance. Key activities in tracking the program's performance include: * determining progress against the project plan, * communicating to stakeholders the status of assigned activities, * documenting significant deviations in performance, and * taking corrective actions to address issues when necessary. Additionally, as we have previously reported, the implementation of independent verification and validation (IV&V) is a best practice for large and complex system development and acquisition programs, and can provide important information to help program officials monitor and control their programs.[Footnote 41] To be effective, IV&V activities should be performed by an entity that is independent of the management processes and products that are being reviewed.
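To illustrate the third of these activities--documenting significant deviations in performance--the following Python sketch compares actual cost and schedule figures against the plan and flags deviations that exceed defined thresholds. The threshold values and figures are our assumptions for illustration; a program would substitute its own planning parameters:

    # Illustrative check for significant deviations from project
    # planning parameters. Thresholds are assumptions for illustration.
    COST_GROWTH_THRESHOLD = 0.15  # flag cost growth above 15 percent
    SCHEDULE_SLIP_THRESHOLD = 6   # flag slips of more than 6 months

    def significant_deviations(planned_cost: float, actual_cost: float,
                               slip_months: int) -> list[str]:
        """Return descriptions of deviations that should be documented."""
        deviations = []
        growth = (actual_cost - planned_cost) / planned_cost
        if growth > COST_GROWTH_THRESHOLD:
            deviations.append(f"cost growth of {growth:.0%} exceeds threshold")
        if slip_months > SCHEDULE_SLIP_THRESHOLD:
            deviations.append(f"schedule slip of {slip_months} months exceeds threshold")
        return deviations

    # Example: a program 20 percent over planned cost and 9 months late.
    print(significant_deviations(planned_cost=100.0, actual_cost=120.0, slip_months=9))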
DLA's DAI Program Had Implemented Most of the Key Project Monitoring and Control Practices, but Had Not Documented Significant Deviations in Performance: DLA had implemented selected practices for DAI's project monitoring and control, but its ability to monitor and control the program was limited because it had not documented significant deviations in performance. * Determine progress against the project plan. DAI had determined progress against the project plan. For example, the program office monitored progress against the program's integrated master schedule. Additionally, officials told us that they tracked costs expended over time. Further, although two of DAI's major contracts did not exceed DOD's threshold for requiring the use of earned value management, DAI began requiring the use of earned value management for these contracts in October 2013.[Footnote 42] * Communicate to stakeholders the status of assigned activities. The program regularly communicated the status of assigned activities and work products to stakeholders. For instance, the program held weekly and monthly meetings with various stakeholders to review DAI's progress and performance. * Document significant deviations in performance. While the program documented system change requests and problem reports, officials told us that they did not track or document significant deviations in project planning parameters, such as cost and schedule. Officials attributed this to the program being managed differently when it was a non-MAIS program. Officials told us that once DAI became a pre-MAIS program in February 2011, they decided to put more structured processes in place. Officials also stated that they intended to begin tracking significant deviations once the second increment of DAI is baselined as a MAIS program, but as of January 2014, DAI officials were uncertain when that would occur. However, since the start of the program in 2007, DAI's planned total life-cycle costs increased about 159 percent and its schedule slipped by over 5 years. Without tracking and documenting such information, stakeholders will be limited in their knowledge of whether the program will be able to provide the intended functionality on time and within budget. * Take corrective actions to address issues. The program took corrective actions to address system issues identified by users through its configuration management process. Specifically, in response to issues identified based on system change requests, the program implemented changes as part of regular system releases. Additionally, in November 2013, it developed an issues log to track other program issues and their planned corrective actions. * Use an IV&V agent. The program had assigned IV&V agents to assess topics of concern that were identified by the DAI program office. For example, in June and November 2012, DAI used an IV&V agent to conduct assessments of the system's ability to exchange data with other systems. Additionally, as of September 2013, officials stated that an independent contractor was conducting assessments to determine DAI's compliance with standards related to financial management and information systems. Navy's GCSS-MC Program Had Implemented Most Key Project Monitoring and Control Practices, but Did Not Always Take Timely Corrective Actions to Address Issues: GCSS-MC had implemented nearly all key practices for project monitoring and control, but the program had not always taken corrective actions to address issues in a timely manner. * Determine progress against the project plan. The program office reported that it consistently monitored the progress of tasks against the project plan. Specifically, although GCSS-MC did not measure progress using its program-level integrated master schedule--which officials stated had been put on hold during the critical change--program officials reported that they maintained lower-level schedules at the product level. Following the critical change review, in December 2013, the program completed updating the program-level integrated master schedule to align with the results of the critical change, and officials reported that they are now using this schedule to monitor progress. Continued use of the product-level schedules and implementation of the updated program-level integrated master schedule should help ensure that the program is able to assess project progress. * Communicate to stakeholders the status of assigned activities.
The program regularly communicated the status of assigned activities to stakeholders. Specifically, although the program's oversight group did not conduct documented quarterly program reviews between September 2012 and June 2013, the program provided updates in other meetings that were held on at least a quarterly basis. * Document significant deviations in performance. The program had reported significant deviations from its project planning parameters. For instance, in August 2013, the program reported to Congress that it had experienced over a 1-year slip in achieving its full deployment decision (when compared to the planned date that the program submitted to Congress in 2010) due to technical challenges with the system. * Take corrective actions to address issues. GCSS-MC did not always identify or take timely corrective actions to address program issues, although it had consistently reported these issues. For example, as early as December 2008, the program was aware of technical complexities involved with delivering GCSS-MC capabilities to users who lacked internet connectivity. Given this, DOD decided to divide the program into two releases and assigned these capabilities to the second major release. During the first half of 2012, the program continued to report that the technical immaturity of the second system release was a problem. Additionally, status reviews from 2012 that discussed this issue did not identify corrective actions for it. However, it was not until August 2013--approximately 4.5 years after the program became aware of these technical complexities--that the program decided to discontinue developing the second release and remove it from the scope of increment 1. According to program officials' estimates, the program had spent approximately $48.4 million developing the second release prior to its removal. The program's August 2013 critical change report noted that the program had focused too much attention on technologies that could not meet the intended capabilities for the second release. Unless the program includes timely corrective actions and implementation time frames in its analyses of critical issues, and monitors actions taken against those time frames, it risks further delays and cost growth as it develops and implements its enhancement release. * Use an IV&V agent. The program used an IV&V agent to conduct operational testing on the system. For example, in 2011, the Marine Corps' independent test agency evaluated certain capabilities of the GCSS-MC system. DHA's TMIP-J Program Had Implemented Many Project Monitoring and Control Best Practices, but Did Not Effectively Monitor Project Progress or Communicate Program Status to Stakeholders: Similar to DAI and GCSS-MC, TMIP-J had implemented many best practices for project monitoring and control; however, the program had not effectively determined its progress against the project plan and had not appropriately communicated the status of the program to stakeholders. * Determine progress against the project plan. The program office maintained an integrated master schedule to track progress against scheduled activities. However, the program did not use earned value management to track contractor performance in meeting planned cost targets, even though these data were being collected and certain TMIP-J contracts met DOD's thresholds for its use. Instead, the program office tracked the rate of funds spent and funds remaining in each contract's performance period, which did not provide the insight into potential cost overruns that earned value data would have. TMIP-J officials said that the program did not use earned value because the program office staff's knowledge of how to conduct earned value analyses was limited. For example, TMIP-J officials reported that, as of September 2013, only three of nine senior program officials had taken required training in the use of earned value management. In the absence of earned value management and staff properly trained to use it, the program will not have the valuable insights into project performance that earned value management provides, and TMIP-J will be limited in its ability to effectively manage the program.
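Earned value management provides this insight by integrating cost, schedule, and work-scope data to show whether the funds spent have bought the work planned--visibility that tracking the rate of funds spent alone cannot provide. The following Python sketch illustrates the standard earned value calculations; the dollar figures are invented for illustration and are not drawn from TMIP-J contract data:

    # Standard earned value metrics. Figures (in $ millions) are
    # invented for illustration; they are not TMIP-J contract data.
    planned_value = 10.0  # budgeted cost of work scheduled to date
    earned_value = 8.0    # budgeted cost of work actually completed
    actual_cost = 9.5     # funds actually spent to date

    cost_variance = earned_value - actual_cost           # -1.5: over cost
    schedule_variance = earned_value - planned_value     # -2.0: behind schedule
    cost_performance_index = earned_value / actual_cost  # work bought per dollar

    # A spend-rate check alone (actual_cost vs. planned_value) would show
    # spending roughly on track while masking that only $8.0 million of
    # planned work was completed for the $9.5 million spent.
    print(cost_variance, schedule_variance, round(cost_performance_index, 2))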
Further, the program had not consistently documented the applications with which the program must interface and, as such, could not effectively monitor progress related to these items. In particular, the program did not have an authoritative listing of all applications with which TMIP-J must interface, and key program documents cited different numbers and names of such interfaces. Without effectively determining progress against project plans using earned value management, and having an authoritative listing of all interfaces currently included in the program, program officials will not have a complete understanding of the status and scope of the program, thus hindering their ability to fully monitor and control it. * Communicate to stakeholders the status of assigned activities. The program communicated the status of assigned activities to stakeholders during regular progress reviews; however, key information was lacking in those reviews. Specifically, although the program conducted quarterly progress reviews, it did not convey key program scope information regarding the number of current users and planned sites, which officials reported will vary based on the pace of operations. For example, officials estimated that the number of users at full deployment will range from 21,000 to 62,000 depending on operations, such as combat troop withdrawal from the Middle East. Officials said that they have not reported this information to stakeholders because the system must be maintained regardless of the number of users, and because it is the services' responsibility to determine user numbers. However, the TMIP-J program is responsible for the operations and maintenance of the system software, and knowing the number of users and sites would help the program plan for system maintenance. TMIP-J officials stated in December 2013 that they intended to determine the number of users and sites and report this information to DOD as part of a report required before the program's full deployment decision milestone, which occurred in late December 2013. Although this report included information on the types of units and sites that would use the system (e.g., certain medical support units), it did not identify the total number of users or sites. As such, TMIP-J was limited in its ability to accurately plan for potential system maintenance requests. In the absence of communicating the status of assigned activities, including key program scope information, to stakeholders, stakeholders will not be informed of changes to scope and will be unable to appropriately plan for system maintenance. * Document significant deviations in performance.
The program had reported significant deviations from its project planning parameters. For instance, in April 2009, the program reported to Congress a critical change in its cost and schedule estimates. * Take corrective actions to address issues. TMIP-J had taken corrective actions to address issues. For example, the program's April 2009 critical change report identified corrective actions that needed to be taken to improve program management. In 2012, DOD reported that these corrective actions had been substantially implemented. * Use an IV&V agent. TMIP-J used an IV&V agent to verify and validate testing. Specifically, an IV&V agent conducted system integration testing on TMIP-J in 2012. Conclusions: Of the 14 selected MAIS programs that had cost, schedule, and/or system performance data available, 12 had experienced cost increases, schedule slippages, and/or system performance problems, and 5 experienced all three conditions. As such, these 12 programs were costing more than planned, taking longer than planned to deliver, and/or delivering systems that had not performed as intended. While a number of best practices for risk management, requirements management, and project monitoring and control have been implemented for DAI, GCSS-MC, and TMIP-J, all three programs lacked key practices essential to effectively acquiring their systems. To its credit, the DAI program had recently taken steps to improve its risk management and IT acquisition practices. However, the program's lack of a complete and up-to-date log of risks with associated mitigation plans, as well as its inaccurate evaluation and categorization of its risks, reduce assurances that the program has appropriately mitigated all program risks. Additionally, the program has been limited in its ability to monitor project performance, and will remain so until program officials begin tracking significant cost and schedule deviations. In light of this, the program risks exceeding its planned cost and schedule and delivering a system that does not perform as expected. Regarding GCSS-MC, while the program recently updated its risks and mitigation plans and had implemented other key risk and requirements management practices, the program's inability to take timely corrective actions to address issues may jeopardize its chances of meeting its planned cost and schedule targets and deploying the system with all intended functionality. Additionally, to TMIP-J's credit, the program had fully implemented risk management best practices; however, the program's lack of a capabilities baseline document that reflects program scope changes and appropriately traces its capabilities to associated requirements and system components reduces assurances that the system will include all intended functionality for meeting users' needs. Further, by opting not to use earned value management to assess contractor performance or to train staff to analyze earned value data and trends, TMIP-J limited its ability to identify impending schedule delays and cost overruns, thus hindering its ability to effectively gauge program status. Moreover, TMIP-J's lack of an authoritative listing of all interfaces currently included in the scope of the program and the program's decision to not communicate key status information to stakeholders on the number of system users and sites limit the program's ability to fully monitor and control the program.
Therefore, the program is at risk of developing a system that does not meet all users' needs, and may not meet cost and schedule targets. Recommendations for Executive Action: To better ensure that DAI implements effective risk management and IT acquisition best practices, we are recommending that the Secretary of Defense direct the Director of the Defense Logistics Agency to direct the DAI program office to take the following two actions: * establish a comprehensive risk log that includes all up-to-date risks and associated mitigation plans, with evaluations and categorizations that comply with DLA's defined parameters; and: * identify and document significant cost and schedule deviations from the program's plan. To help ensure that GCSS-MC implements effective project monitoring and control best practices, we are recommending that the Secretary of Defense direct the Secretary of the Navy to direct the GCSS-MC program office to include corrective actions and time frames in future analyses of critical issues and monitor actions taken against those time frames. To improve the TMIP-J program's implementation of IT acquisition best practices, we are recommending that the Secretary of Defense direct the Director of the Defense Health Agency to direct the TMIP-J program office to take the following five actions: * update the program's capabilities baseline to reflect program scope changes, * trace all capabilities to their associated requirements in the requirements traceability matrix, * implement earned value management in accordance with DOD's policy and train management staff with project oversight responsibilities on the proper use of earned value management, * develop an authoritative listing of all interfaces currently included in the scope of the program, and: * report to stakeholders the number of current and planned system users and sites and provide updates as needed. Agency Comments and Our Evaluation: We received written comments on a draft of this report from DOD's Assistant Secretary of Defense (Acquisition). The comments are reprinted in appendix III. In its comments, the department concurred with six of our eight recommendations and partially concurred with the other two. Specifically, the department concurred with our recommendations that the DAI program establish a comprehensive risk log and associated mitigation plans, and identify and document significant cost and schedule deviations; that the GCSS-MC program include corrective actions and time frames in future analyses of critical issues and monitor those actions; and that the TMIP-J program implement earned value management, develop an authoritative listing of all interfaces currently included in the scope of the program, and report to stakeholders the number of current and planned system users and sites. The department partially concurred with our recommendation to update the TMIP-J program's capabilities baseline to reflect program scope changes. In this regard, the department stated that the TMIP-J program will ensure that all future program capabilities are traced to their associated requirements in the appropriate requirements traceability matrices and that the program office will update the program's capabilities baseline to reflect program scope changes. However, DOD also stated that there is no benefit in updating the requirements documentation for capabilities that have already been built, tested, and deployed. We disagree with that assertion.
Without an updated baseline that reflects all program capabilities, key agency governing boards and congressional committees will not have a clear picture of what capabilities were originally planned to be delivered, what capabilities were delivered, and whether there are any gaps between what was planned and delivered. The department also stated that the December 2011 acquisition strategy for TMIP-J addressed the change in current program scope and is mapped to the program's 2007 capabilities baseline document. However, that document is not sufficient because it did not clearly identify the scope change. Specifically, while the acquisition strategy identified certain capabilities that were not included in the program's capabilities baseline document, it did not distinguish these as new capabilities, nor were these new capabilities written at the same level of detail as those in the capabilities baseline document. As such, the total planned scope of the program was unclear. Therefore, we maintain that additional work is needed to ensure changes in scope have been adequately incorporated and documented. Without clearly identifying and documenting all program capabilities and changes to those capabilities, stakeholders will lack assurance that the system will be deployed with all intended functionality to meet users' needs. The department partially concurred with our recommendation to trace all capabilities to their associated requirements in the requirements traceability matrix. DOD stated that the TMIP-J program will ensure that all future program capabilities are traced to their associated requirements in the appropriate requirements traceability matrices. It also reiterated its assertion that there is no benefit in updating the requirements documentation for capabilities that have already been built, tested, and deployed. While TMIP-J has committed to tracing all future program capabilities, it had not documented in its requirements traceability matrix which capabilities from its baseline document were being addressed by programs outside of TMIP-J. We maintain that it is important to trace these capabilities. In the absence of such documentation, it is not clear how all planned capabilities will be addressed. In addition, we received technical comments from DOD's Assistant Secretary of Defense (Acquisition), which we have incorporated, as appropriate. We are sending copies of this report to the appropriate congressional committees; the Secretary of Defense; and other interested parties. This report is also available at no charge on the GAO website at [hyperlink, http://www.gao.gov]. Should you or your staffs have any questions about the information discussed in this report, please contact me at (202) 512-4456 or ChaC@gao.gov. Contact points for our Offices of Congressional Relations and Public Affairs may be found on the last page of this report. GAO staff who made major contributions to this report are listed in appendix IV. Signed by: Carol R. Cha: Director: Information Technology Acquisition Management Issues: List of Committees: The Honorable Carl Levin: Chairman: The Honorable James M. Inhofe: Ranking Member: Committee on Armed Services: United States Senate: The Honorable Thomas R. Carper: Chairman: The Honorable Tom Coburn: Ranking Member: Committee on Homeland Security and Governmental Affairs: United States Senate: The Honorable Richard J.
Durbin: Chairman: The Honorable Thad Cochran: Ranking Member: Subcommittee on Defense: Committee on Appropriations: United States Senate: The Honorable Howard P. "Buck" McKeon: Chairman: The Honorable Adam Smith: Ranking Member: Committee on Armed Services: House of Representatives: The Honorable Darrell E. Issa: Chairman: The Honorable Elijah Cummings: Ranking Member: Committee on Oversight and Government Reform: House of Representatives: The Honorable Rodney Frelinghuysen: Chairman: The Honorable Pete Visclosky: Ranking Member: Subcommittee on Defense: Committee on Appropriations: House of Representatives: [End of section] Appendix I: Objectives, Scope, and Methodology: The National Defense Authorization Act for Fiscal Year 2012 mandated that we select and assess Department of Defense (DOD) major automated information system (MAIS) programs annually through March 2018.[Footnote 43] This report is the second in our series of annual assessments. Our objectives were to (1) describe the extent to which selected MAIS programs have changed their planned cost and schedule estimates and met performance targets, (2) assess selected MAIS programs' actions to manage risks, and (3) assess the extent to which selected MAIS programs used key information technology (IT) acquisition best practices. To address the first objective, we established the following criteria for selecting a sample of the 42 DOD MAIS programs that were included in DOD's April 2012 MAIS oversight list: * an acquisition program baseline (APB) had been established, * the program was not fully implemented or recently approved for termination, * the program had not recently started, * the program was not included in the first MAIS annual review,[Footnote 44] * at least one enterprise resource planning system was included in our review,[Footnote 45] and: * the programs represented a variety of DOD components. Relying on these criteria, we made an initial selection of nine programs. Next, we selected five additional programs that had been without APBs for the longest periods of time.[Footnote 46] The criteria we used to select the final program were that, of the remaining MAIS programs, it (1) had not established an APB for the longest period of time, (2) was an enterprise resource planning system, and (3) had the largest planned life-cycle costs. The 15 selected programs were: * The Air Force's: - Air and Space Operations Center-Weapon System Increment 10.2; - Air Force Integrated Personnel and Pay System; - Base Information Transport Infrastructure Wireless; - Integrated Strategic Planning and Analysis Network Increment 2; and: - Joint Space Operations Center Mission System Increment 1. * The Army's: - Distributed Common Ground System-Army Increment 1, - Integrated Personnel and Pay System-Army Increment 1, and: - Joint Personnel Identification version 2. * The Defense Health Agency's: - Integrated Electronic Health Record Increment 1 and: - Theater Medical Information Program-Joint (TMIP-J) Increment 2. * The Defense Logistics Agency's: - Defense Agencies Initiative (DAI) and: - EProcurement. * The Navy's: - Global Combat Support System-Marine Corps (GCSS-MC) Increment 1, - Global Command and Control System - Maritime Increment 2, and: - Next Generation Enterprise Network Increment 1.
To address the first objective, we analyzed and compared each selected program's first APB objective cost estimate (in then-year dollars) to the latest life-cycle objective estimate to determine the extent to which planned program costs had changed.[Footnote 47] For the programs that had not established APB estimates, we compared these programs' initial life-cycle objective cost estimates to their latest objective cost estimates (in then-year dollars). Similarly, to determine the extent to which these programs had changed their planned schedule estimates, we compared each program's first APB schedule (or initial schedule, for the programs that had not established APBs) to the latest schedule. We did not compare the latest cost or schedule estimates to subsequent APBs established after the first APB. We relied on the thresholds established by statute to describe the amount of any deviation (i.e., significant or critical) that each program's latest life-cycle cost and schedule estimates experienced from the first APB.[Footnote 48] To develop the schedule graphics included in each program profile in appendix II, we used either the business capability life-cycle acquisition model or the defense acquisition management system framework that was established in December 2008 (which was the most up-to-date framework at the time of our review), depending on which framework was used by the program.[Footnote 49] To determine whether the selected programs met their performance targets, we compared program and system performance targets against actual performance data in test reports. We reviewed the results of operational assessments and program evaluations conducted on the systems. We also reviewed additional information on each program's cost, schedule, and performance, including program documentation, such as DOD's MAIS annual and quarterly reports; APBs; monthly status briefings; system test reports; and our prior reports. We also interviewed program officials from each of the selected MAIS programs to obtain additional information on cost, schedule, and performance. System performance targets were rated as "met" when (1) system tests were passed with no deficiencies or limitations, (2) the program met all of its key performance parameters, or (3) a program had addressed all deficiencies or limitations that were identified during system tests. System performance was rated as "not fully met" when a program either (1) did not fully pass system testing and was still in the process of addressing the deficiencies or limitations identified during system testing; or (2) did not pass system testing and subsequently removed the problematic functionality from the system in order to pass subsequent system tests, instead of fixing the problematic functionality and keeping it in the planned release of the system. We provided our assessments to the program management offices of each selected program for comment. We aggregated and summarized the results of these analyses across the programs and developed individual profiles for each program (see appendix II). To address the second and third objectives, we selected 3 of the 15 programs included in the first objective for an in-depth review. Specifically, we selected the 2 programs that had established APBs and had the highest planned total life-cycle costs--GCSS-MC and TMIP-J--and 1 program that had been without a baseline for the longest period of time--DAI.
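The cost comparisons just described reduce to a percent-change computation against each program's first APB (or, where no APB had been established, initial) estimate, read against the statutory deviation bands. The following sketch is illustrative only: the 15 and 25 percent cutoffs are an assumption based on the significant- and critical-change cost-growth thresholds commonly cited for MAIS programs (the statute we relied on is identified in footnote 48), and the input figures are taken from the DAI profile in appendix II:

def cost_change_pct(baseline, latest):
    # baseline: first APB objective estimate, or the initial estimate where
    # no APB had been established; latest: latest life-cycle objective
    # estimate (both in then-year dollars).
    return (latest - baseline) / baseline * 100

def classify(change_pct):
    # Assumed statutory bands: growth of at least 25 percent is a critical
    # change; at least 15 percent is a significant change.
    if change_pct >= 25:
        return "critical change"
    if change_pct >= 15:
        return "significant change"
    return "below statutory thresholds"

# DAI (appendix II): initial estimate $209.2 million, latest $543.0 million.
change = cost_change_pct(209.2, 543.0)
print(f"{int(change)} percent -> {classify(change)}")
# Prints "159 percent -> critical change", consistent with the 159 percent
# increase reported in the DAI profile.

The statute also addresses schedule deviations; for brevity, the sketch covers cost growth only.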
To address the second objective, we reviewed risk management documentation from the three selected programs and compared it to key risk management best practices, including the Software Engineering Institute's Capability Maturity Model® Integration for Acquisition (CMMI-ACQ) and Project Management Institute's Guide to the Project Management Body of Knowledge (PMBOK®).[Footnote 50] These key practices included: * identifying risks, threats, and vulnerabilities that could negatively affect work efforts; * evaluating and categorizing each identified risk using defined risk categories and parameters, such as likelihood and consequence, and determining each risk's relative priority (a generic illustration of this kind of scoring appears later in this appendix); * developing risk mitigation plans for selected risks to proactively reduce the potential impact of risk occurrence; and: * monitoring the status of each risk periodically and implementing the risk mitigation plan, as appropriate. Specifically, we analyzed program risk documentation, including monthly risk logs and reports, risk-level assignments, risk management plans, risk mitigation plans, and risk board meeting minutes. Additionally, we interviewed program officials to obtain additional information about their risks and risk management practices. To address the third objective, we analyzed each selected program's IT acquisition documentation and compared it to certain key requirements management and project monitoring and control best practices--including CMMI-ACQ and PMBOK® practices, and independent verification and validation practices--to determine the extent to which the programs were implementing these practices.[Footnote 51] In particular, the key requirements management best practices were: * establish criteria for identifying appropriate requirements providers; * establish objective criteria for the evaluation and acceptance of requirements; * assess the impact of requirements on existing commitments; * review project plans, activities, and work products to ensure that they are consistent with the defined requirements; and: * ensure traceability between the requirements and work products. Additionally, the key project monitoring and control best practices were: * determine progress against the project plan, * communicate to stakeholders the status of assigned activities, * document significant deviations in performance, * take corrective actions to address issues when necessary, and: * use an independent verification and validation agent. Specifically, we analyzed monthly program management review briefings, acquisition strategies, concepts of operations, milestone and baseline review documentation, independent verification and validation reports, significant and critical change documentation, system requirements documentation, requirements management plans, requirements change requests, and system test and defect reports. Further, we interviewed program officials to obtain additional information on each program's management processes in these key IT acquisition areas. An internal subject matter expert validated our assessments of the extent to which the three selected programs implemented key requirements management and project monitoring and control best practices.
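As context for the risk evaluation and categorization practice listed above under our second objective, the following sketch shows one common way such a practice is operationalized. It is illustrative only: the 5-point likelihood and consequence scales, the scoring, and the category cutoffs are generic assumptions of the kind CMMI-ACQ and PMBOK® describe, not DLA's or any selected program's defined parameters:

from dataclasses import dataclass

@dataclass
class Risk:
    title: str
    likelihood: int   # assumed scale: 1 (rare) through 5 (near certain)
    consequence: int  # assumed scale: 1 (negligible) through 5 (severe)
    mitigation: str   # associated mitigation plan

    def score(self):
        # Relative priority as the product of likelihood and consequence.
        return self.likelihood * self.consequence

    def category(self):
        # Assumed cutoffs for mapping scores to risk categories.
        if self.score() >= 15:
            return "high"
        if self.score() >= 8:
            return "moderate"
        return "low"

# A small risk log with hypothetical risks and mitigation plans.
log = [
    Risk("Interface partner slips delivery", 4, 4, "Stub the interface; hold monthly syncs"),
    Risk("Key staff turnover", 2, 3, "Cross-train backup staff"),
]
# Sort so a risk review board sees the highest-exposure items first.
for r in sorted(log, key=lambda r: r.score(), reverse=True):
    print(f"{r.category():>8} {r.score():>3} {r.title}")

To assess the reliability of the data that we used to support the findings in this report, we reviewed relevant program documentation to substantiate evidence obtained through interviews with agency officials.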
We determined that the data used in this report were sufficiently reliable, with the exception of selected risk and requirements data provided by DLA's DAI program, and selected requirements data provided by DHA's TMIP-J Increment 2 program. We discuss limitations with these data in the report. We have also made appropriate attribution indicating the sources of the data. We conducted this performance audit from April 2013 to March 2014 in accordance with generally accepted government auditing standards. Those standards require that we plan and perform the audit to obtain sufficient, appropriate evidence to provide a reasonable basis for our findings and conclusions based on our audit objectives. We believe that the evidence obtained provides a reasonable basis for our findings and conclusions based on our audit objectives. [End of section] Appendix II: Profiles of Selected DOD MAIS Programs: This section contains profiles of the 15 selected MAIS programs. Each profile presents data on the program's purpose and status, its latest cost and schedule estimates compared to the first APB (where established) or initial estimates (where an APB had not yet been established), as well as system performance data, where available.[Footnote 52] The first page of each two-page profile contains a description of the program's purpose and a figure that provides a comparison of the program's first APB (where established) or initial schedule to the program's latest schedule. The years depicted on the figure represent calendar years and the milestones represent the program's best estimates of dates for those milestones. The first page also provides (1) essential program details, such as the name of the prime contractor, as well as the total number of active contractors--which includes the prime contractor, as well as any other contractors (and in some cases subcontractors) supporting the program; (2) program costs (in then-year dollars), comparing the program's latest life-cycle cost estimate (separated into acquisition and operations and maintenance costs) to its first APB (where established) or initial estimate (subsequent APBs that may have been established are not identified);[Footnote 53] (3) deployment details, such as the number of expected users and locations to which the system will be deployed; and (4) a summary of the cost, schedule, and performance of each program, which is further discussed on the second page of the profile. The arrows included in the summary box on the first page of each two-page profile and in the headings on the second page indicate whether a program's cost estimate had increased or decreased, and whether the program's schedule estimate had slipped or been accelerated to meet milestones earlier than planned. The second page of each two-page profile provides detailed information on each program's status, costs, schedule, and performance. In the status section, we discuss recent and upcoming milestones and events for each program. In the cost section, we identify the extent to which the program's life-cycle cost estimate has changed from its first APB (where established) or initial estimate, as well as the causes for any changes identified. In the schedule section, we discuss the extent to which the program's schedule has changed from its first APB (where established) or initial estimate, and the causes for any schedule changes identified.
Finally, in the performance section, we identify the extent to which each program has met its established measures, and discuss the results of system performance tests. These performance ratings represent a point-in-time assessment as reported by the program. System performance targets were rated as "met" when: (1) system tests were passed with no deficiencies or limitations, (2) the program fully met all of its key performance parameters, or (3) a program had addressed all deficiencies or limitations that were identified during system tests. System performance was rated as "not fully met" when a program either (1) did not fully pass system testing and was still in the process of addressing the deficiencies or limitations identified during system testing; or (2) did not pass system testing and subsequently removed the problematic functionality from the system in order to pass subsequent system tests, instead of fixing the problematic functionality and keeping it in the planned release of the system. Profiles of Selected DOD MAIS Programs: Air and Space Operations Center-Weapon System (AOC-WS) Increment 10.2: The Air Force's AOC-WS Increment 10.2 program is intended to enable personnel at select air and space operations centers to plan, execute, and assess theaterwide air and space operations. Specifically, it is intended to replace the currently fielded AOC 10.1 system and provide additional capabilities, such as dynamic planning and execution; data management; information assurance; predictive battlespace awareness; and airspace management. First acquisition program baseline as of October 2013: Initiation: 2007; Materiel solution analysis: 2007-2013; Critical change: 2013; Milestone B: 2013; Engineering and manufacturing development: 2013-2015; Milestone C {planned}; Production and deployment: 2015-2016; Full deployment {planned}: 2016. Latest schedule as of December 2013: Initiation: 2007; Materiel solution analysis: 2007-2013; Critical change: 2013; Milestone B: 2013; Engineering and manufacturing development: 2013-2015; Milestone C {planned}; Production and deployment: 2015-2016; Full deployment {planned}: 2016. Source: GAO analysis of agency data. Program Essentials (as of December 2013): DOD component: Department of the Air Force; Program owner: Air Combat Command; Prime contractor: Northrop Grumman; Total number of contractors: 13; Fiscal year 2014 funding requested: $62.9 million. Program Costs (then-year dollars in millions): Life-cycle cost estimate: First APB (10/2013): $5,585.2; Latest estimate (12/2013): $5,585.2. Acquisition: First APB (10/2013): $462.7; Latest estimate (12/2013): $462.7. Operations and maintenance: First APB (10/2013): $5,122.5; Latest estimate (12/2013): $5,122.5. Amount spent to date (as of October 2013): $176.3. System Deployment Details (as of December 2013): Current number of total expected users: 0 of 2,449; Current number of total expected locations: sensitive data[A]; Legacy systems to be replaced: 10; Annual cost of legacy systems: $217.9 million; Number of expected system interfaces: 56. Cost, Schedule, and Performance Summary: * No change in cost estimate; * No change in schedule estimate; * Unavailable system performance data. Source: Data reported by DOD officials. [A] This is a weapons system; as such, deployment details are considered sensitive by DOD. AOC-WS Increment 10.2: Program Status: In August 2008, a reconciled cost estimate showed that the original program scope was unaffordable.
As a result, the Air Combat Command directed the program to reduce the scope of AOC-WS Increment 10.2. In October 2012, the program began piloting limited capabilities of the system in test environments. In December 2012, the program declared a critical change as required by law because it had not achieved a full deployment decision within 5 years from the time the program selected the technology to be used.[Footnote 1] In October 2013, the Air Force provided the critical change report to Congress. This report stated that the key factors causing the critical change were (1) a need to reduce the scope of the originally envisioned program, as previously mentioned; (2) changes in defense acquisition policies; and (3) a need to accomplish additional risk reduction work prior to moving into the development phase of the program. As a result, the scope of the program was reduced and the program reached milestone B (authorizes a program to begin system development) in October 2013. No Change in Cost Estimate: As of December 2013, the program had not experienced a change in its cost estimate since its first APB, which was recently established in October 2013. However, the program spent approximately $176 million and took 6 years (when compared to the program's 2007 initiation date) before establishing its first APB and developing a robust estimate of how much AOC-WS Increment 10.2 would ultimately cost. No Change in Schedule Estimate: AOC-WS Increment 10.2 had not experienced a schedule change since establishing its first APB, but it experienced a critical 5-year delay in establishing this APB (when compared against an initial 2008 estimate). Specifically, the program had planned to reach milestone B (at which point an APB would be established) in July 2008, but the first APB was not established until the program reached milestone B in October 2013. This delay was due, in part, to the re-planning of the program when it was determined that the original scope was unaffordable. Additionally, in June 2010, the milestone decision authority directed the program to delay milestone B until after (1) a modernization contract was awarded; (2) the modernization contractor performed and completed certain activities, including a design review; and (3) the program office revised its acquisition strategy. Unavailable System Performance Data: As previously mentioned, the program was in the early stages of development. Thus, system performance data were not available. [End of Air and Space Operations Center-Weapon System (AOC-WS) Increment 10.2 profile] Air Force Integrated Personnel and Pay System (AFIPPS):[1] AFIPPS is intended to provide a comprehensive, web-based solution that integrates personnel and pay processes from 30 of the Air Force's existing systems into one self-service system that can be accessed worldwide. Further, it is intended to support the Air Force's Regular, Reserve, and Air National Guard components. AFIPPS is to be implemented in five releases. The first four releases are to include the design, development, testing, and deployment of leave request management and personnel and pay capabilities, and the fifth release is to provide enhancements to the system. [1] DOD originally planned to develop the Defense Integrated Military Human Resources System, which was to provide a joint, integrated, standardized personnel and pay system for all military personnel departmentwide.
Following the cancellation of that program, each military service is now responsible for developing its own integrated personnel and pay system--including AFIPPS. Initial schedule as of May 2010: Prototyping: Initiation: 2009-2012; Materiel development decision {planned}: 2010; Milestone B {planned}: 2012; Engineering development: 2012-2017; Milestone C {planned}: 2017; Limited fielding: 2017; Full deployment {planned}: 2017; Full deployment decision and system fully deployed {planned}: 2018. Latest schedule as of December 2013: Prototyping: Initiation: 2009-2012; Materiel development decision {planned}: 2010; Milestone B {planned}: 2014; Engineering development: 2014-2015; Milestone C {planned}: 2015; Limited fielding: 2015-2018; Full deployment decision and system fully deployed {planned}: 2018. Source: GAO analysis of agency data. Program Essentials (as of December 2013): DOD component: Department of the Air Force; Program owner: Program Executive Office and Service Acquisition Executive, Air Force; Prime contractor: IBM; Total number of contractors: 9; Fiscal year 2014 funding requested: $90.1 million. Program Costs (then-year dollars in millions): Life-cycle cost estimate: Initial estimate (07/2010): $1,715.4; Latest estimate (12/2013): $1,857.8. Acquisition: Initial estimate (07/2010): $769.6; Latest estimate (12/2013): $824.6. Operations and maintenance: Initial estimate (07/2010): $945.8; Latest estimate (12/2013): $1,033.2. Amount spent to date (as of November 2013): $71.8. System Deployment Details (as of December 2013): Current number of total expected users: 0 of about 507,000; Current number of total expected locations: not applicable[A]; Legacy systems to be replaced: 22[B]; Annual cost of legacy systems: $120 million[B]; Number of expected system interfaces: 148. Cost, Schedule, and Performance Summary: * Change in cost estimate (increased); * Change in schedule estimate (slipped); * Unavailable system performance data. Source: Data reported by DOD officials. [A] AFIPPS is a web-based system that is available worldwide. [B] In addition to fully replacing 22 systems, AFIPPS is intended to provide a subset of capabilities from 8 additional legacy systems. According to program officials, the $120 million annual cost of the legacy systems includes the costs of the 22 systems to be fully replaced, as well as the costs for the subset of capabilities that will be replaced from the 8 additional systems. AFIPPS: Program Status: In March 2011, the Air Force completed an analysis of alternative approaches for developing the AFIPPS system. Based on the results of that analysis, in June 2011, DOD directed the Air Force to release a request for proposals that would allow potential contractors to choose between two approaches because neither was significantly better than the other. After reviewing contractor responses, in August 2013, the program selected its preferred development approach. Additionally, it awarded a contract for the analysis and decomposition of the Air Force's business processes and requirements for the first four releases of the AFIPPS system. The contract award was protested in September 2013, and the protest was dismissed in December 2013. AFIPPS officials reported in December 2013 that the program expected to begin developing the capabilities for the first release (leave request management) in May 2014 and deploying these capabilities in June 2015. Change in Cost Estimate (increased): The program's cost estimate increased by about 8 percent.
Specifically, while the program had not established an APB as of December 2013, the program's latest pre-APB life-cycle cost estimate was about $1.86 billion--an approximately 8 percent increase from the program's initial estimate of $1.72 billion. Program officials reported that the increases in costs were primarily due to (1) an increase in contractor staff within the program office, (2) DOD's direction to the program to switch from using a development environment hosted at a contractor site to one hosted by DOD's Defense Information Systems Agency, and (3) the addition of new requirements, including supportability requirements for the network and training environment. Change in Schedule Estimate (slipped): The program experienced about a 1.5-year schedule slip in the planned date for milestone B (authorizes a program to begin system development) when compared to its initial schedule--from the first quarter of fiscal year 2013 to June 2014. Program officials reported that the program decided to delay milestone B until after the development contract was awarded because, among other things, the work to be performed under the contract is expected to better define the program and provide additional details that can be used when developing the program's first APB. Unavailable System Performance Data: As of December 2013, the program did not expect to deploy any functionality until June 2015. As such, system performance data were not available. [End of Air Force Integrated Personnel and Pay System (AFIPPS) profile] Base Information Transport Infrastructure Wireless (BITI Wireless): The Air Force's BITI Wireless program provides a secure wireless infrastructure--including features such as intrusion detection, monitoring, and central administration, and incorporating high-availability and multitiered network administration--for wireless entry into local area networks at 97 Air Force bases worldwide. Prior to becoming a standalone MAIS program, the wireless infrastructure for 30 of the 97 bases had been provided by the Combat Information Transport System program, but in April 2009, this program was restructured into two smaller programs--BITI (formerly known as Information Transport Services) and Air Force Intranet[1]. [1] The Combat Information Transport System program portfolio was intended to provide the information infrastructure, network management, and network defense capabilities to meet the multimedia information transport needs of Air Force bases. First acquisition program baseline as of April 2010: Production and deployment: Installation: 2009; Full deployment decision: planned 2010; System fully deployed: planned 2012. Actual schedule as of March 2013: Production and deployment: Installation: 2009; Full deployment decision: actual 2011; System fully deployed: actual 2013. Source: GAO analysis of agency data. Program Essentials (as of December 2013): DOD component: Department of the Air Force; Program owner: Commander, Air Force Space Command; Prime contractors: General Dynamics; NCI, Inc.; and Telos Corporation[A]; Total number of contractors: 3; Fiscal year 2014 funding requested: $15.2 million. Program Costs (then-year dollars in millions): Life-cycle cost estimate: First APB (04/2010): $499.5; Latest estimate (12/2013): $541.4. Acquisition: First APB (04/2010): $348.9; Latest estimate (12/2013): $202.9. Operations and maintenance: First APB (04/2010): $150.6; Latest estimate (12/2013): $338.5. Amount spent to date: $202.9 (as of October 2013).
Cost, Schedule, and Performance Summary: * Change in cost estimate (increased); * Change in schedule estimate (slipped); * Did not fully meet system performance targets. System Deployment Details (as of December 2013): Current number of total expected users: about 500,000 of about 500,000; Current number of total expected locations: 97 of 97; Legacy systems replaced: not applicable[B]; Annual cost of legacy systems: not applicable[B]; Number of system interfaces: not applicable[B]. Source: Data reported by DOD officials. [A] For each Air Force base, one of these contractors was selected to complete the wireless infrastructure upgrade at that location. [B] This was a hardware replacement effort. BITI Wireless: Program Status: In June 2009, the BITI Wireless program first obligated funds. In March 2013, the Air Force completed its deployment of BITI Wireless at all 97 locations, and the program is currently in an operations and maintenance status. Also in March 2013, the program began replacing wireless equipment that is no longer supported or able to receive enhanced encryption capabilities and operating system updates to correct software problems. As of December 2013, the program was working to address deficiencies identified during a 2012 operational evaluation (see the performance discussion below). Change in Cost Estimate (increased): As of December 2013, the life-cycle cost estimate for BITI Wireless was about $541.4 million, which represented a cost increase of approximately 8 percent from the program's first APB cost estimate of about $499.5 million. Program officials reported that the increase in costs was due to the need to maintain the program's wireless infrastructure longer than originally planned, which increased operations and maintenance costs by about $187.9 million. Specifically, program officials reported that BITI Wireless extended its hardware refresh cycle from 6 years to 10 years when the program that was planned to succeed BITI Wireless was not funded. While the program's life-cycle cost estimate increased overall, BITI Wireless reduced its acquisition costs by $146 million in comparison to its first APB estimate. Program officials attributed these savings to the following actions: (1) the program office provided more detailed information in its request for contract proposals, which enabled contractors to more accurately assess equipment needs, crew sizes, and travel costs prior to contract award; and (2) the program re-competed the wireless infrastructure upgrade contracts for certain locations. Change in Schedule Estimate (slipped): BITI Wireless experienced a 6-month slippage in its full deployment date compared to its first APB schedule--from September 2012 to March 2013. Program officials attributed this delay to, among other things, problems and events that occurred at certain Air Force bases, such as schedule issues due to an increase in operational activity, and technical and power issues. For example, program officials reported that an increase in mission maintenance and flight operations impeded a base's ability to provide access and escort support to the BITI Wireless contractor installation teams. Did Not Fully Meet System Performance Targets: In September 2012, an operational evaluation identified 14 deficiencies among three of the six critical operational areas that were evaluated, including network command and control. Nevertheless, the testing team recommended that the program continue deploying BITI Wireless to the remaining bases.
In March 2013, nine of the deficiencies remained open; however, the program was deemed fully deployed because all of the critical operational areas were either partially or fully effective and suitable. As of August 2013, program officials reported that these nine deficiencies had not yet been addressed, but, as of December 2013, officials expected them to be addressed by May 2014. [End of Base Information Transport Infrastructure Wireless (BITI Wireless) profile] Defense Agencies Initiative (DAI): The DAI system is intended to modernize the financial management processes of 21[1] defense agencies by streamlining financial management capabilities and transforming the budget, finance, and accounting operations. When DAI is fully implemented, it is expected to have the capability to control and account for all appropriated, working capital, and revolving funds at each of the 21 agencies and components. DAI is also intended to be a key component of the DOD plan for achieving fully auditable financial statements by September 30, 2017, as required by the National Defense Authorization Act for Fiscal Year 2010. DAI is being deployed in two increments, with the first increment having deployed five releases. [1] The agencies include the Defense Technical Information Center, Missile Defense Agency, Uniformed Services University of the Health Sciences, Defense Health Agency, Defense Threat Reduction Agency, Defense Technology Security Administration, Defense Media Activity, Defense Security Service, Defense Advanced Research Projects Agency, Office of Economic Adjustment, Defense Prisoner of War/Missing Personnel Office, Defense Commissary Agency, DOD Education Activity, Defense Contract Management Agency, Defense Contract Audit Agency, Defense Acquisition University, Defense Security Cooperation Agency, Defense Human Resources Agency, DOD Inspector General, Defense Information Systems Agency - General Fund, and Defense Microelectronics Activity. Initial schedule as of May 2007: Technology development: Initiation and Milestone A: 2007; Engineering and manufacturing development: Milestone B - increment 1a: planned 2008; Production and deployment: Milestone C: planned 2009; Full deployment decision: planned 2009; System fully deployed: planned 2011. Latest schedule as of December 2013:[A] Technology development: Initiation and Milestone A: 2007; Engineering development: Milestone B - increment 1a: 2010; Program restructure[B]: through 2014. Source: GAO analysis of agency data. [A] In February 2011, the program began complying with the business capability life-cycle acquisition model. Prior to that change, the program was complying with the defense acquisition management system framework--which contains fewer life-cycle phases. [B] In September 2013, the program was restructured into two increments. Increment 1 was placed into the operations and support phase with no further milestones planned, and increment 2 was to proceed to a milestone B decision. As of January 2014, program officials were uncertain when the program would reach milestones B and C for increment 2. Program Essentials (as of December 2013): DOD component: Defense Logistics Agency; Program owner: Financial Management; Prime contractors: not applicable[A]; Total number of contractors: 6; Fiscal year 2014 funding requested: $91.2 million. Program Costs (then-year dollars in millions): Life-cycle cost estimate: Initial estimate (03/2007): $209.2; Latest estimate (12/2013): $543.0.
Acquisition: Initial estimate (03/2007): $114.5; Latest estimate (12/2013): $342.0. Operations and maintenance: Initial estimate (03/2007): $94.7; Latest estimate (12/2013): $201.0. Amount spent to date: $235.2 (as of October 2013). Cost, Schedule, and Performance Summary: * Change in cost estimate (increased); * Change in schedule estimate (slipped); * Did not fully meet system performance targets. System Deployment Details (as of December 2013): Current number of total expected users: 9,200 of 74,769; Current number of total expected locations: 11 of 21[B]; Legacy systems to be replaced: 10; Annual cost of legacy systems: $35 million; Number of expected system interfaces: 23 of 36. Source: Data reported by DOD officials. [A] The government is serving as the system integrator overseeing the contractor development teams. [B] This represents the number of defense agencies and components that were approved to implement the system. According to program officials, each agency may have one or more locations. DAI: Program Status: DAI began as a non-MAIS program under the Business Transformation Agency in January 2007 and reached milestone B in October 2010. In February 2011, the program was designated pre-MAIS (meaning it was expected to meet MAIS cost thresholds). In August 2011, responsibility for the program was transferred from the disestablished Business Transformation Agency to the Defense Logistics Agency. Between October 2008 and October 2012, the program developed and deployed five of six planned releases at 11 defense agencies and components, which provided capabilities such as cost accounting and time and labor. In September 2013, the program was declared a MAIS program and restructured into two increments. Increment 1 consisted of the five previously deployed releases and was placed into the operations and support phase. Increment 2 was initiated to provide new and enhanced capabilities to the 11 existing agencies and components and deploy to 10 new agencies and components. As of January 2014, program officials were uncertain when the program would begin system development of increment 2. Change in Cost Estimate (increased): As of December 2013, the latest cost estimate for DAI was about $543.0 million, which was a 159 percent increase from its initial estimate of about $209.2 million, established in March 2007. According to officials, estimated costs increased because 5 years were added to the estimate's life cycle, initial assumptions on licensing and hardware costs were understated, and additional change management efforts were needed. Additionally, according to a DOD Inspector General report, a software upgrade was not included in the initial estimate.[Footnote 1] As of December 2013, officials stated that they were preparing an APB that would include updated costs for increment 2 only. Change in Schedule Estimate (slipped): DLA's DAI program experienced over a 5-year slip in its planned date to obtain approval to begin production and deployment of the system (referred to as milestone C) when compared to its initial schedule, which called for this milestone to occur in January 2009. In September 2013, DOD decided that increment 1 would not proceed to milestone C and would be placed into the operations and support phase, and that additional milestones would be reached through increment 2. However, as of January 2014, program officials were uncertain when they would set a schedule for increment 2 and reach milestone C.
DAI officials attributed this slippage to fluctuation in the number of agencies to deploy DAI and in the agency deployment schedule; the change in program designation to pre-MAIS (meaning it was expected to meet MAIS thresholds), which resulted in additional oversight requirements; and the change in parent organization from the Business Transformation Agency to DLA (previously discussed). Did Not Fully Meet System Performance Targets: DAI experienced several system defects, some of which were still being addressed. Specifically, the DOD Inspector General reported that DAI did not fulfill functional capabilities needed to generate reliable financial data and recommended three actions for the DAI program office to address the issues identified.[Footnote 1] According to officials, as of November 2013, the program office had addressed all three recommendations. However, as of December 2013, DAI was still addressing two of five non-critical interfaces that either did not meet requirements or were not tested during a November 2012 interoperability assessment. According to officials, they were in the process of evaluating whether there was still a need for one of the interfaces, and the other interface had been deferred to a later release. Footnote: [1] DOD Inspector General, Status of Enterprise Resource Planning Systems' Cost, Schedule, and Management Actions Taken to Address Prior Recommendations, DODIG-2013-111 (Alexandria, Va.: Aug. 1, 2013). [End of Defense Agencies Initiative (DAI) profile] Distributed Common Ground System - Army (DCGS-A) Increment 1: DCGS-A is intended to be the Army's primary system for collecting, processing, integrating, and displaying intelligence, surveillance, and reconnaissance information about potential adversarial forces, the weather, and the terrain to Army commanders at all echelons. It is intended to acquire and synthesize data from multiple intelligence sources, such as human sources, geospatial information (e.g., imagery of earth's terrain), and information derived from electronic transmissions. DCGS-A Increment 1 is to include three software releases that will be integrated into commercial off-the-shelf laptops and servers. First acquisition program baseline as of March 2012: Technology development: Initiation: 2001; Engineering and manufacturing development: Milestone B[A]: 2006; Production and deployment: Milestone C: planned 2012; Full deployment decision: planned 2013 through 2015. Latest schedule as of December 2013: Technology development: Initiation: 2001; Engineering and manufacturing development: Milestone B[A]: 2006; Production and deployment: Milestone C: 2012; Full deployment decision: 2013; System fully deployed: planned 2019. [A] DCGS-A was authorized to proceed beyond milestone B prior to becoming a MAIS program. Program Essentials (as of December 2013): DOD component: Department of the Army; Program owner: U.S. Army Training and Doctrine Command Capability Manager - Sensor Processing; Prime contractors: Lockheed Martin and Booz Allen Hamilton; Total number of contractors: 67; Fiscal year 2014 funding requested: $295.13 million. Program Costs (then-year dollars in millions): Life-cycle cost estimate: First APB (03/2012): $11,234.4; Latest estimate (12/2013): $10,212.4. Acquisition: First APB (03/2012): $6,598.6; Latest estimate (12/2013): $5,642.9. Operations and maintenance: First APB (03/2012): $4,635.8; Latest estimate (12/2013): $4,569.5. Amount spent to date: $2,810.1 (as of September 2013).
Cost, Schedule, and Performance Summary: * Change in cost estimate (decreased); * Change in schedule estimate (slipped); * Did not fully meet system performance targets. System Deployment Details (as of December 2013): Current number of total expected users: 24,858 of 34,290; Current number of total expected locations: 1,722 of 2,175; Legacy systems to be replaced: 25; Annual cost of legacy systems: $255 million[A]; Number of expected system interfaces: 740. Source: Data reported by DOD officials. [A] Program officials reported that the program expects sustainment costs to decrease to approximately $170 million per year beginning in fiscal year 2016 due to the program's implementation of certain acquisition best practices, more efficient enterprise user agreements, and better coordination and accountability with sustainment providers. DCGS-A Increment 1: Program Status: The DCGS-A program was established in May 2001 to consolidate nine sets of intelligence systems. Subsequently, the Army made several significant changes to the program. Specifically, in 2004, DOD began requiring that each of the services' intelligence, surveillance, and reconnaissance systems become interoperable with each other as a family of systems--including DCGS-A.[Footnote 1] In 2005, the Army directed that a capability called the Joint Intelligence Operational Capability-Iraq be integrated into DCGS-A. Additionally, in 2007, DCGS-A was directed to build a hardware and software configuration of the system that was able to be mounted on vehicles used by brigade combat teams. In May 2011, DCGS-A was refocused to concentrate on developing software, and certain hardware requirements related to mounting the system on a specific type of vehicle were removed from the program. In December 2012, the program was granted a full deployment decision for a modified configuration of release 1 (discussed in the performance section below). As of December 2013, the program expected to deploy release 2 beginning in the second quarter of fiscal year 2015. Change in Cost Estimate (decreased): As of December 2013, DCGS-A's latest life-cycle cost estimate was approximately $10.2 billion, which was about a 9 percent decrease from its first APB estimate of $11.2 billion established in March 2012. Program officials attributed this decrease to, among other things, a reduction in the number of brigade combat teams, which was due, in part, to the Budget Control Act of 2011 (which required DOD to reduce its future expenditures) and the drawdown of forces in Afghanistan and Iraq. Program officials reported that the reduction in force decreased the amount of DCGS-A infrastructure needed and the amount of DCGS-A equipment that needs to be refreshed every 5 years in accordance with the program's refresh cycle. Change in Schedule Estimate (slipped): As of December 2013, DCGS-A was expected to be fully deployed in September 2019. The program experienced a nominal 3-month slippage in achieving its full deployment decision compared to its first APB schedule (established in March 2012)--from September 2012 to December 2012. DCGS-A officials attributed this slippage, in part, to problems experienced during operational testing of release 1 (discussed below). Officials also stated that, based on the results of these tests, the program required additional time to update and receive approval for regulatory and statutory documentation required to achieve the full deployment decision, including receiving the final operational test report from the testing agency.
Did Not Fully Meet System Performance Targets: From May 2012 through June 2012, the U.S. Army Test and Evaluation Command conducted an initial operational test and evaluation on release 1 of the DCGS-A system and determined that it was operationally effective with limitations, and was not operationally suitable or survivable. Specifically, the system was unable to meet certain requirements, such as synchronizing data passed between different classified network domains (e.g., between secret and top secret networks). Subsequently, the Army modified release 1 to defer the capability for users to access data in the top secret security domain until release 2. In October 2012, additional tests were conducted on the reduced scope of release 1, which showed that the system performed acceptably. However, until the top secret domain is deployed, the system is unable to fully meet two key performance parameters. Footnote: [1] The DCGS family of systems includes Air Force-DCGS, DCGS-A, DCGS-Marine Corps, DCGS-N, and DCGS-Special Operations Forces. [End of Distributed Common Ground System - Army (DCGS-A) Increment 1 profile] EProcurement: EProcurement is intended to provide enterprisewide procurement capabilities, such as managing purchase requests and contract awards, for Defense Logistics Agency acquisition and procurement users. The system is to replace multiple legacy procurement systems to reduce redundancy and cost, and to standardize a contract writing and administration methodology across the agency. EProcurement consists of three releases. Releases 1.0 and 1.1 were to demonstrate limited functionality for a limited user base and were focused on providing manual order processing capabilities. Release 1.2 is to provide full procurement capabilities to all users. First acquisition program baseline as of March 2012: Engineering and manufacturing development: Initiation: 2007; Production and deployment: 2012 through 2014; Milestone C: 2012; Full deployment decision: planned 2012. Latest schedule as of December 2013: Engineering and manufacturing development: Initiation: 2007; Production and deployment: 2012 through 2014; Milestone C: 2012; Full deployment decision: 2012; System fully deployed: 2014. Source: GAO analysis of agency data. Program Essentials (as of December 2013): DOD component: Defense Logistics Agency; Program owner: Acquisition Directorate; Prime contractor: Accenture; Total number of contractors: 6; Fiscal year 2014 funding requested: $16.5 million. Program Costs (then-year dollars in millions): Life-cycle cost estimate: First APB (03/2012): $528.0; Latest estimate (12/2013): $549.7. Acquisition: First APB (03/2012): $337.8; Latest estimate (12/2013): $354.2. Operations and maintenance: First APB (03/2012): $190.2; Latest estimate (12/2013): $195.5. Amount spent to date: $349.3 (as of October 2013). System Deployment Details (as of December 2013): Current number of total expected users: 2,700 of 4,000; Current number of total expected locations: not applicable[A]; Legacy systems to be replaced: 3; Annual cost of legacy systems: $14.5 million; Number of expected system interfaces: 62. Cost, Schedule, and Performance Summary: * Change in cost estimate (increased); * Change in schedule estimate (slipped); * Did not fully meet system performance targets. Source: Data reported by DOD officials. [A] EProcurement is a web-based system that is intended to be used by acquisition and procurement specialists worldwide.
EProcurement: Program Status: The EProcurement program has deployed two of the three planned releases of the system. The first release was fielded in November 2010 to approximately 50 acquisition and procurement users, and the second release was fielded in February 2011 to approximately 320 users. As of December 2013, the program was in the process of deploying the third release to all 4,000 planned EProcurement users, and officials expected the system to be fully deployed by February 2014. Change in Cost Estimate (increased): The program's life-cycle cost estimate had increased nominally, by about 4 percent, compared to the program's first APB cost estimate. Specifically, as of December 2013, the latest life-cycle cost estimate for EProcurement was about $549.7 million, and the program's first APB cost estimate, as of March 2012, was about $528.0 million. According to program officials, this increase was due to a 2-month slip in the full deployment date, which shifted estimated costs into the next fiscal year and added inflation. Change in Schedule Estimate (slipped): EProcurement had experienced an approximately 2-month slippage in its full deployment date compared to its first complete APB schedule. Specifically, program officials had planned to deploy the system by December 2013; however, as of December 2013, the program estimated that it would be fully deployed in February 2014. EProcurement officials stated that this delay was necessary to reduce program risks during system deployments and ensure that staff would be adequately trained. Additionally, EProcurement experienced a 3-month schedule slippage in the planned date for its full deployment decision compared to its first APB schedule--from May 2012 to August 2012. According to EProcurement officials, the slip was due to scheduling conflicts with the program's milestone decision authority. Did Not Fully Meet System Performance Targets: An initial operational test completed in June 2012 determined that release 1.2 of EProcurement was operationally effective and suitable, but had deficiencies in the areas of training, usability, help desk operations, and supportability. Consequently, the Director of Operational Test and Evaluation made seven recommendations to address these deficiencies, including that the program improve the quality of training for its users. According to program officials, as of December 2013, four of the recommendations had been addressed, and the remaining three recommendations were expected to be addressed in fiscal year 2014. Additionally, in December 2013, the program reported that it was meeting the targets for all of its performance metrics, such as net-readiness and the number of purchase requests created in a workday. [End of EProcurement profile] Global Combat Support System-Marine Corps (GCSS-MC) Increment 1: GCSS-MC is intended to be the primary technology enabler for the Marine Corps logistics modernization strategy and to provide the backbone for all logistics information required by the Marine Air Ground Task Force. GCSS-MC Increment 1 is intended to support logistics planners and operators worldwide in managing combat logistics, including planning, warehousing, distribution, depot maintenance, and asset visibility.
First acquisition program baseline as of June 2007: Initiation: 2003; Materiel solution analysis: 2003-2004; Milestone A: 2004; Technology development: 2004-2007; Milestone B: 2007; Engineering and manufacturing development: 2007-2008; Milestone C: planned 2008; Production and deployment: 2008-2010; Full deployment decision: planned 2009; System fully deployed: planned 2010. Latest schedule as of December 2013: Initiation: 2003; Materiel solution analysis: 2003-2004; Milestone A: 2004; Technology development: 2004-2007; Milestone B: 2007; Engineering and manufacturing development: 2007-2010; Critical change: 2008; Milestone C: 2010; Production and deployment: 2010-2016; Significant change: 2012; Critical change: 2013; Full deployment decision: planned 2014; System fully deployed: planned 2015. Source: GAO analysis of agency data. Program Essentials (as of December 2013): DOD component: Navy, United States Marine Corps; Program owner: Assistant Secretary for Research, Development, and Acquisition, Navy; Prime contractors: Accenture, Oracle Consulting, and Space and Naval Warfare Systems Center Atlantic[A]; Total number of active contractors: 3. Fiscal year 2014 funding requested: $72.8 million. Program Costs (then-year dollars in millions): Life-cycle cost estimate: First APB (06/2007): $461.4; Latest estimate (12/2013): $1,856.3. Acquisition: First APB (06/2007): $194.4; Latest estimate (12/2013): $389.6. Operations and maintenance: First APB (06/2007): $267.0; Latest estimate (12/2013): $1,466.7. Amount spent to date: $617.8 (as of December 2013). Cost, Schedule, and Performance Summary: * Change in cost estimate (increased); * Change in schedule estimate (slipped); * Did not fully meet system performance targets. System Deployment Details (as of December 2013): Current number of total expected users: 26,288 of 26,288[B]; Current number of total expected locations: 72 of 72; Legacy systems to be replaced: 4; Annual cost of legacy systems: $4 million; Number of expected system interfaces: 44. Source: Data reported by DOD officials. [A] The initial contract for system development was awarded to Accenture, but that contract was terminated in 2006. Oracle Consulting then became the prime contractor for system development of the first GCSS-MC release, which was fully deployed by March 2013. Space and Naval Warfare Systems Center Atlantic is a government entity that is responsible for the sustainment of the first release and development of an enhancement release (discussed on the next page). [B] Program officials reported that the program is authorized to have up to 36,000 users. GCSS-MC Increment 1: Program Status: In September 2008, the program declared its first critical change in its cost and schedule estimates due to technical challenges experienced during system development. As a result, Increment 1 was divided into two releases. The first release was to provide GCSS-MC capabilities to users that have access to the system via the Internet. The second release was to provide GCSS-MC to deployed users lacking Internet connectivity. According to program officials, DOD had fielded the first release to all intended users except those in Afghanistan. Officials stated that those users will not receive the first release to avoid disruption of combat operations. As a result, Afghanistan users continue to use legacy logistics systems, which has delayed the retirement of those systems until these users return to the United States.
The second release was not deployed due to technical challenges associated with capabilities in that release (see the performance section), which resulted in a critical change to the schedule that was reported in February 2013. In August 2013, based on a review of the critical change, the program removed the second release from the first increment of GCSS-MC. Additionally, the program added an enhancement release that is intended to provide minimal functionality to users without network connectivity, including capturing and storing logistics data on a laptop to enable the data to be later transferred to the network once the user is in a location with Internet connectivity. As of December 2013, program officials expected to begin fielding this enhancement release in the second quarter of 2015. Change in Cost Estimate (increased): As of December 2013, the latest life-cycle cost estimate for the reduced scope of GCSS-MC Increment 1 was about $1.86 billion, which is an increase of about 302 percent from the first APB estimate of $461.4 million in 2007. As previously discussed, in September 2008, the program declared a critical change in its cost estimate, which was primarily due to technical challenges associated with developing the second release. Program officials attributed subsequent cost increases to extending the period of contractor maintenance to allow additional time for transferring post-deployment system support to the government, and to the technical immaturity of the second release (see the performance section). Change in Schedule Estimate (slipped): As of December 2013, the reduced scope of the GCSS-MC program is expected to be fully deployed in the fourth quarter of fiscal year 2015, almost 6 years behind its first APB estimate of November 2009. As previously discussed, these delays were due to technical challenges in developing the second release. Did Not Fully Meet System Performance Targets: In 2011, the program reported that the first release of GCSS-MC Increment 1 met both of its key performance parameters related to net readiness and the time it takes for system transactions to be visible to users. However, in 2011, the program conducted developmental testing on the second release and identified significant system deficiencies. Moreover, in December 2012, the program reported that the second release was unable to successfully complete additional developmental and operational testing due to technical problems associated with certain capabilities in that release, including synchronizing remote computers to the primary system. As a result, that release was removed from the scope of Increment 1 (as previously discussed). [End of Global Combat Support System-Marine Corps (GCSS-MC) Increment 1 profile] Global Command and Control System-Maritime (GCCS-M) Increment 2: GCCS-M Increment 2 is intended to provide maritime commanders afloat and ashore at fixed command centers with a single system that integrates and displays available intelligence and environmental information on friendly, hostile, and neutral land, sea, and air forces. GCCS-M requirements were initially met in Increment 1, and Increment 2 is to include additional software capabilities to ensure synchronization and interoperability with the rest of the GCCS family of command and control systems, which includes GCCS-Army, GCCS-Air Force, and GCCS-Joint. GCCS-M Increment 2 is being deployed in four different configurations, based on ship types and sizes.
First acquisition program baseline as of February 2006: Initiation and Milestone B: 2005; Engineering and manufacturing development: 2005-2006; Production and deployment: 2006-2011; Milestone C: planned 2006; Full deployment decision: planned 2007. Latest schedule as of December 2013: Initiation and Milestone B: 2005; Engineering and manufacturing development: 2005-2010; Program restructure: 2008; Milestone C: 2010; Production and deployment: 2010-2014; Full deployment decision: 2011. Source: GAO analysis of agency data. Program Essentials (as of December 2013): DOD component: Department of the Navy; Program owner: Program Executive Office for Command, Control, Communications, Computers and Intelligence; Prime contractor: Science Applications International Corporation; Total number of contractors: 6; Fiscal year 2014 funding requested: $35.6 million. Program Costs (then-year dollars in millions): Life-cycle cost estimate: First APB (02/2006): $4,442.0; Latest estimate (12/2013): $641.8. Acquisition: First APB (02/2006): $1,388.0; Latest estimate (12/2013): $351.9. Operations and maintenance: First APB (02/2006): $3,054.0; Latest estimate (12/2013): $289.9. Amount spent to date: $193.9 (as of December 2013). Cost, Schedule, and Performance Summary: * Change in cost estimate (decreased); * Change in schedule estimate (slipped); * Met system performance targets. System Deployment Details (as of December 2013): Current number of total expected users: 210 of 1,870; Current number of total expected locations: 24 of 269; Legacy systems to be replaced: 1; Annual cost of legacy systems: $12.2 million; Number of expected system interfaces: 69. Source: Data reported by DOD officials. GCCS-M Increment 2: Program Status: In April 2007, GCCS-M Increment 2 was directed to cease all development work and restructure the program to align with the Navy's new common computing environment, to be provided by Navy's Consolidated Afloat Networks and Enterprise Services (CANES) program, and the GCCS-Joint software baseline of the GCCS family of systems.[Footnote 1] This restructure resulted in the removal of hardware requirements from GCCS-M, making it a software-only program. In September 2011, the Navy fielded three of the four planned system configurations. According to program officials, as of December 2013, the Navy was testing the remaining configuration and expected deployment to begin in fiscal year 2014. Change in Cost Estimate (decreased): The program's cost estimate decreased by about 86 percent. Specifically, the first APB cost estimate for GCCS-M Increment 2, established in February 2006, was approximately $4.4 billion and the latest life-cycle cost estimate, as of December 2013, was approximately $641.8 million. According to program officials, this decrease was primarily due to the transfer of hardware requirements--such as procuring and installing servers and other network equipment--to Navy's common computing environment (previously discussed), and the removal of costs associated with operations and maintenance and military personnel that are now planned to be funded by sources outside of the program office. Change in Schedule Estimate (slipped): GCCS-M Increment 2 had experienced a 3.5-year schedule slippage in the program's full deployment decision date compared to its first APB schedule estimate--from August 2007 to March 2011.
Program officials attributed this slippage to new software development work that was needed when the program was directed to be restructured to align with a new common infrastructure and software baseline (previously discussed), as well as schedule delays in the availability of ships for operational testing. As of December 2013, program officials expected the system to be fully deployed in December 2014. Met System Performance Targets: As of December 2013, the Navy had tested three of the four planned system configurations for GCCS-M Increment 2, and was in the process of testing the fourth configuration. Specifically, in November 2010, the Navy completed an initial operational test on the first configuration, which determined that the system was operationally effective and suitable, except for a deficiency in logistic supportability. This deficiency was subsequently addressed in June 2011. In May 2011, the Navy completed an initial operational test on the second configuration, which determined that the system was operationally effective and suitable, except for a limitation in training. This deficiency was subsequently addressed in September 2011. The Navy then completed an initial operational test on the third configuration in July 2011, which determined that the system was operationally effective and suitable. In May 2013, system integration testing was completed on the fourth configuration, which determined that the system could integrate into the CANES network environment with no major issues. According to officials, an initial operational test is planned for the fourth configuration during the fourth quarter of fiscal year 2014. Additionally, in August 2013, the program office reported that GCCS-M Increment 2 was meeting the targets for seven of eight key performance metrics, such as operational availability and net readiness. The GCCS-M office reported that the program did not measure performance for the remaining performance metric--equipment survivability--because it was to be met by the network infrastructure provider. Footnote: [1] The CANES program is intended to, among other things, reduce and eliminate existing standalone afloat (i.e., surface ships and submarines) networks and reduce the hardware footprint on 259 afloat and maritime operations center platforms. [End of Global Command and Control System-Maritime (GCCS-M) Increment 2 profile] Integrated Electronic Health Record (iEHR) Increment 1: In February 2013, the Secretaries of DOD and the Department of Veterans Affairs (VA) changed the scope of the iEHR program--which is to include multiple increments--from an effort to develop a single, integrated DOD and VA health record system, to an effort to achieve interoperability between separate existing DOD and VA health record systems. The revised iEHR program is intended to provide the infrastructure and services for standardizing and integrating electronic healthcare data between DOD's and VA's systems. Specifically, increment 1 of iEHR is intended to provide DOD with seven capabilities: (1) enhance user sign-in, (2) enhance medical record views among multiple systems, (3) allow users to roam among multiple devices, (4) upgrade a DOD medical record database, (5) deploy a testing facility for DOD and VA electronic record integration, (6) develop a pilot to consolidate data centers, and (7) develop a pilot graphical user interface.
First acquisition program baseline as of February 2013: Prototyping: 2011-2012; Initiation: 2011; Engineering development: 2012-2013; Milestone B: 2012; Limited fielding: 2013-2014; Milestone C: planned 2013; Full deployment: 2014; Full deployment decision: planned 2014. Latest schedule as of January 2014: Prototyping: 2011-2012; Initiation: 2011; Engineering development: 2012-2014; Milestone B: 2012; Limited fielding: 2014; Milestone C: planned 2014; Full deployment: 2014; Full deployment decision: planned 2014. Source: GAO analysis of agency data. Program Essentials (as of December 2013): DOD component: Defense Health Agency. Program owner: DOD/VA Interagency Program Office. Major contractors: Planned Systems International, ICS-Nett, General Dynamics One Source, SAIC. Total number of contractors: 12. Fiscal year 2014 funding requested: $66.2 million. Program Costs (then-year dollars in millions): Life-cycle cost estimate: First APB (02/2013): $1,025.9; Latest estimate (01/2014): TBD[A]. Acquisition: First APB (02/2013): $366.3; Latest estimate (01/2014): TBD. Operations and maintenance: First APB (02/2013): $659.6; Latest estimate (01/2014): TBD. Amount spent to date: $199.4 (as of February 2014). Cost, Schedule, and Performance Summary: * Cost estimate to be determined; * Change in schedule estimate (slipped); * Did not fully meet system performance targets. System Deployment Details (as of December 2013): Current number of total expected users: 0 of 102,000; Current number of total expected locations: 1 of 20; Legacy systems to be replaced: 0; Annual cost of legacy systems: not applicable; Number of expected system interfaces: 27. Source: Data reported by DOD officials. [A] As of January 2014, program officials stated that they plan to revise iEHR's cost estimate to reflect recent scope changes (discussed in the following section) at milestone C (authorizes a program to begin limited fielding of the system), which is currently planned for May 2014. iEHR Increment 1: Program Status: As discussed previously, in February 2013, the Secretaries of DOD and VA changed the scope of the iEHR program to an effort to achieve interoperability between separate existing DOD and VA health record systems. In the summer of 2013, DOD officials recommended removing three capabilities from the scope of increment 1. Those capabilities were (1) enhancing user sign-in, (2) enhancing views of medical records among multiple systems, and (3) allowing users to roam among multiple devices. However, in January 2014, DOD officials directed the program to implement these three efforts by May 2014. Additionally, DOD delayed deployment of the testing facility for DOD and VA electronic record integration from December 2013 to May 2014. DOD also removed the regional data center and the graphical user interface from the scope of increment 1. Officials reported that the upgrades to the DOD medical record database were implemented in the third quarter of fiscal year 2013. Cost Estimate To Be Determined: As discussed previously, as of January 2014, program officials stated that they plan to revise the cost estimate to reflect the scope changes discussed above, and expected this update to occur at milestone C (authorizes a program to begin limited fielding of the system), which is currently planned for May 2014.
Change in Schedule Estimate (slipped): As of January 2014, the program had experienced an 11-month slip in its planned date for Milestone C--currently planned for May 2014--compared to its first APB schedule estimate. Additionally, the program had experienced an 11-month slip in its planned date for full deployment decision--from December 2013 to November 2014. Officials attributed these slippages to needing additional time to prepare for operational testing of the capabilities that DOD directed to remain in the program in January 2014 (as discussed above). Did Not Fully Meet System Performance Targets: DOD reported that four developmental tests were conducted in fiscal year 2013 and revealed 32 system defects--10 of which remained open as of January 2014. Program officials stated that they plan to resolve these defects by the end of February 2014. As of January 2014, the program planned to assess the performance of the following capabilities in an operational environment by May 2014: (1) enhancing user sign-in, (2) enhancing views of medical records among multiple systems, and (3) allowing users to roam among multiple devices. Additionally, officials stated that they intend to validate the performance of the test center in May 2014. [End of Integrated Electronic Health Record (iEHR) Increment 1 profile] Integrated Personnel and Pay System - Army (IPPS-A) Increment 1:[1] IPPS-A is intended to provide a 24-hour, web-based, integrated human resources system to soldiers, human resources professionals, combatant commanders, personnel and pay managers, and other authorized Army users. Specifically, IPPS-A Increment 1 is to include one release, which is intended to provide a consolidated, foundational database of trusted personnel data that is extracted from 15 existing human resources systems (additional functionality is intended to be part of a different MAIS program--IPPS-A Increment 2). [1] DOD originally planned to develop the Defense Integrated Military Human Resources System, which was to provide a joint, integrated, standardized personnel and pay system for all military personnel departmentwide. Following the cancellation of that program, each military service is now responsible for developing its own integrated personnel and pay system--including IPPS-A. First acquisition program baseline as of March 2012: Initiation: 2009; Engineering and manufacturing development: 2009-2013; Milestone C: planned 2013; Full deployment decision: planned 2013; Production and deployment: 2013-2014. Latest schedule as of December 2013: Initiation: 2009; Engineering and manufacturing development: 2009-2013; Milestone C: planned 2014; Full deployment decision: planned 2014; Production and deployment: 2014. Source: GAO analysis of agency data. Program Essentials (as of December 2013): DOD component: Department of the Army; Program owner: Department of the Army, Deputy Chief of Staff G-1; Prime contractor: EDC Consulting, LLC; Total number of contractors: 56; Fiscal year 2014 funding requested: $13.4 million[A]. Program Costs (then-year dollars in millions): Life-cycle cost estimate: First APB (03/2012): $358.4; Latest estimate (12/2013): $395.3. Acquisition: First APB (03/2012): $157.0; Latest estimate (12/2013): $193.9. Operations and maintenance: First APB (03/2012): $201.4; Latest estimate (12/2013): $201.4. Amount spent to date: $162.3 (as of November 2013).
System Deployment Details (as of December 2013): Current number of total expected users: 0 of about 1.2 million; Current number of total expected locations: n/a[B]; Legacy systems to be replaced: 0[C]; Annual cost of legacy systems: $0; Number of expected system interfaces: 15. Cost, Schedule, and Performance Summary: * Change in cost estimate (increased); * Change in schedule estimate (slipped); * Fully met system performance targets. Source: Data reported by DOD officials. [A] Although the program requested about $13.4 million in fiscal year 2014 funding, IPPS-A officials reported in November 2013 that the fiscal year 2014 funding required by the program is $51 million. Officials attributed this increase to schedule slippages experienced by the program (discussed on the next page). [B] IPPS-A Increment 1 is intended to be a web-based database that will be available worldwide. [C] Officials stated that IPPS-A Increment 1 will not replace any legacy systems, but 49 systems are planned to be replaced by IPPS-A Increment 2. IPPS-A Increment 1: Program Status: According to program officials, in June 2013, the Army halted the development of IPPS-A Increment 1. Program officials attributed this to significant contractor performance issues that were experienced when implementing the design of the database (see the performance section). As a result of these issues, the Army Acquisition Executive directed the IPPS-A program to task an independent organization to conduct a review of the database design to determine its adequacy and completeness, and whether it would meet the program's requirements. The independent review was completed in August 2013 and found that the current design of IPPS-A Increment 1 was at low risk of not meeting the program's requirements, although the system design was not complete. Based on the results of the independent review, DOD plans to implement a 3-wave deployment approach and decide in February 2014 whether IPPS-A Increment 1 can proceed into production and deployment. Change in Cost Estimate (increased): As of December 2013, the program's latest life-cycle cost estimate was approximately $395.3 million, which was about a 10 percent increase from the program's first APB estimate of $358.4 million. Program officials attributed this increase to significant schedule slippages experienced by the program (discussed in the schedule section). Change in Schedule Estimate (slipped): As of December 2013, the program had experienced a 1-year slip in the planned date for milestone C (authorizes a program to begin production and deployment) compared to its first APB schedule estimate of February 2013. Additionally, the planned date for full deployment decision slipped 1 year--from April 2013 to April 2014. IPPS-A officials attributed these slippages to significant performance issues experienced by the prime contractor (discussed in the performance section), a 90-day delay in awarding the IPPS-A Increment 1 contract, and the addition of new requirements that were added late in the design phase. Fully Met System Performance Targets: In March 2013, the program's prime contractor conducted data validation tests to determine whether the data contained in the IPPS-A Increment 1 database were (1) accurately received from external systems, (2) accurately mapped to the appropriate field in the IPPS-A database, and (3) consistent with the data values received from the external system.
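As a minimal sketch of the three kinds of checks just described--accurate receipt, accurate mapping, and value consistency--consider the following Python fragment; the record layout, field names, and mapping are hypothetical and are not drawn from IPPS-A documentation:

# Hypothetical validation of one record extracted from an external
# human resources system into a consolidated database.
FIELD_MAP = {"SSN": "soldier_id", "RANK": "grade"}  # source field -> target field

def validate_record(source_record, target_record):
    """Return a list of validation failures for one extracted record."""
    failures = []
    for src_field, tgt_field in FIELD_MAP.items():
        # (1) Accurately received: the expected source field is present.
        if src_field not in source_record:
            failures.append("missing source field " + src_field)
        # (2) Accurately mapped: the source field landed in the right target field.
        elif tgt_field not in target_record:
            failures.append("unmapped target field " + tgt_field)
        # (3) Consistent: the stored value matches the value received.
        elif source_record[src_field] != target_record[tgt_field]:
            failures.append("value mismatch for " + src_field)
    return failures

# A correctly received, mapped, and consistent record yields no failures.
print(validate_record({"SSN": "123-45-6789", "RANK": "SGT"},
                      {"soldier_id": "123-45-6789", "grade": "SGT"}))  # prints []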
According to program officials, the contractor was planning to conduct 39 data validation tests; however, the system failed the first 6 tests and, consequently, the remaining data validation tests were discontinued. As a result, the Army Acquisition Executive directed the program to review the system's design, which was completed in August 2013 (as previously discussed). As of January 2014, of the 39 originally planned tests, the program had re-tested the 6 previously failed tests, and conducted 19 more tests. The program successfully passed these 25 tests. According to DOD officials, the program will conduct the remaining 14 tests during wave 2 and 3 deployment activities. [End of Integrated Personnel and Pay System – Army (IPPS-A) Increment 1 profile] Integrated Strategic Planning and Analysis Network (ISPAN) Increment 2: ISPAN Increment 2 provides additional capabilities for DOD's Global Adaptive Planning Collaborative Information Environment, which is a web-enabled environment that allows worldwide users to collaborate online while providing planning and analyses to senior decision makers. ISPAN Increment 2 included two major releases and two enhancement efforts. First acquisition program baseline as of November 2010: Initiation: 2009; Materiel solution analysis: 2009-2010; Milestone B: 2010; Engineering and manufacturing development: 2010-2012; Full deployment decision: planned 2012; Production and deployment: 2012-2013; System fully deployed: planned 2013. Actual schedule as of November 2013: Initiation: 2009; Materiel solution analysis: 2009-2010; Milestone B: 2010; Engineering and manufacturing development: 2010-2012; Full deployment decision: 2012; Production and deployment: 2012-2013; System fully deployed: 2013. Source: GAO analysis of agency data. Program Essentials (as of December 2013): DOD component: Department of the Air Force; Program owner: U.S. Strategic Command; Prime contractor: Lockheed Martin; Total number of contractors: 10; Fiscal year 2014 funding requested: $8.5 million. Program Costs (then-year dollars in millions): Life-cycle cost estimate: First APB (11/2010): $152.5; Latest estimate (12/2013): $144.2. Acquisition: First APB (11/2010): $47.6; Latest estimate (12/2013): $40.2. Operations and maintenance: First APB (11/2010): $104.9; Latest estimate (12/2013): $104.0. Amount spent to date: $49.0 (as of October 2013). Cost, Schedule, and Performance Summary: * Change in cost estimate (decreased); * Change in schedule estimate (slipped); * Did not fully meet system performance targets. System Deployment Details (as of December 2013): Current number of total expected users: 8,209 of 8,209[A]; Current number of total expected locations: not applicable[B]; Legacy systems to be replaced: 0; Annual cost of legacy systems: not applicable; Number of expected system interfaces: 14. Source: Data reported by DOD officials. [A] This represents the total number of registered user accounts that have access to ISPAN. The network is able to accommodate 500 concurrent users. [B] ISPAN Increment 2 is a web-based system that is available worldwide. ISPAN Increment 2: Program Status: The ISPAN Increment 2 program deployed two major releases in December 2011 and September 2012. Subsequently, the program deployed two system enhancements in March 2013 and August 2013. Following the testing and evaluation of the second enhancement, DOD achieved full deployment in November 2013 and is currently in an operations and maintenance status.
As discussed in the following performance section, the program is currently working to address deficiencies identified in 2012. Change in Cost Estimate (decreased): The program had experienced a 5 percent decrease in its cost estimate. Specifically, as of December 2013, the latest life-cycle cost estimate for ISPAN Increment 2 was about $144.2 million, which is approximately $8.3 million less than the program's first APB estimate of $152.5 million. Program officials attributed this to a decrease in acquisition-related costs that occurred when the program switched from a systems architecture environment that included multiple hardware items, such as computers and servers, to a virtualized environment in which certain computers are software-based. Further, officials stated that this change resulted in reduced costs associated with procurement; hardware and software installation; and software licensing. Change in Schedule Estimate (slipped): The program had experienced a 6-month slippage in its date for full deployment when compared to its first APB schedule--from May 2013 to November 2013. Program officials reported that the schedule delay was due to needing additional time to (1) define stable requirements for the first major capability release and (2) prepare for the initial operational test and evaluation, and the activities associated with the test event. Did Not Fully Meet System Performance Targets: In summer 2012, the Air Force's Operational Test and Evaluation Center conducted an operational test and evaluation on ISPAN Increment 2 capabilities and ultimately determined that the system was effective, suitable, and mission capable because it met its requirements. However, the Air Force's Operational Test and Evaluation Center reported that at the start of the test, there were 184 deficiencies with the system, which required workarounds in order to complete certain functions. As of August 2013, program officials reported that they had addressed 148 of those deficiencies and, as of December 2013, planned to address the remaining deficiencies by April 2014. [End of Integrated Strategic Planning and Analysis Network (ISPAN) Increment 2 profile] Joint Personnel Identification (JPI) - Version 2: JPI version 2 is intended to provide an Army tactical biometric collection capability to capture an adversary or neutral person's biometric data (e.g., fingerprint, iris image, and facial image) and enroll that person into DOD's enterprise authoritative biometric database to positively identify and verify the identity of actual or potential adversaries. JPI version 2 is intended to be used in any theater where military forces are deployed. Initial schedule as of December 2013: Initiation: 2011; Technology development: 2011-2014. Source: GAO analysis of agency data. Program Essentials (as of December 2013): DOD component: Department of the Army; Program owner: Assistant Secretary of the Army for Acquisition, Logistics, and Technology; Prime contractors: CACI and SciTech; Total number of contractors: 22; Fiscal year 2014 funding requested: $12.45 million. Program Costs (then-year dollars in millions): Life-cycle cost estimate: Initial estimate (12/2013): TBD[A]. Acquisition: Initial estimate (12/2013): TBD[A]. Operations and maintenance: Initial estimate (12/2013): TBD[A]. Amount spent to date: $31.2 (as of October 2013).
System Deployment Details (as of December 2013): Current number of total expected users: 0 of unknown[B]; Current number of total expected locations: 0 of unknown[B]; Legacy systems to be replaced: unknown[B]; Annual cost of legacy systems: unknown[B]; Number of expected system interfaces: unknown[B]. Cost, Schedule, and Performance Summary: * Unavailable Cost Data; * Unavailable Schedule Data; * Unavailable System Performance Data. Source: Data reported by DOD officials. [A] As discussed in the next section, in February 2013, the Vice Chairman of the Joint Chiefs of Staff determined that the program's proposed 10-year funding profile was unaffordable. Program officials expected to reach milestone B (the point at which an APB would be established--including an approved cost estimate) about 18 months after the program's requirements are approved. [B] JPI version 2 officials have not yet determined the appropriate system solution, nor the expected number of users, system locations, legacy systems to be replaced, and system interfaces. JPI Version 2: Program Status: In January 2011, funds were first obligated for JPI version 2. According to program officials, since that time the program has, among other things, conducted a requirements analysis to determine program requirements, and developed draft system performance specifications and an economic analysis. In February 2013, the Vice Chairman of the Joint Chiefs of Staff determined that the proposed 10-year funding profile for the program was unaffordable. Subsequently, program officials stated that the program has been working to revise JPI's requirements and reassess the functionality that will be included in the program. JPI officials reported that, in October 2013, the Army recommended that the JPI program make certain changes to its baseline requirements document. JPI officials reported that they presented this recommendation to the chair of a DOD requirements oversight board, but as of December 2013, the chair had not yet made a decision. Program officials stated that they expect to reach milestone B (the point at which an APB would be established) about 18 months after the program's requirements are approved. Unavailable Cost Data: As of December 2013, an APB--including an approved cost estimate--had not been established for JPI version 2. As previously discussed, program officials expect to reach milestone B (the point at which an APB would be established) about 18 months after the program's requirements are approved. Unavailable Schedule Data: As of December 2013, an APB--including an approved schedule estimate--had not been established for JPI version 2. As previously stated, program officials expect to reach milestone B (the point at which an APB would be established) about 18 months after the program's requirements are approved. Unavailable System Performance Data: As of December 2013, JPI version 2 was in the early stages of design and no part of the system had been implemented. As such, system performance data were not available. [End of Joint Personnel Identification (JPI) – Version 2 profile] Joint Space Operations Center Mission System (JMS) Increment 1: JMS is intended to provide an integrated, network-based, space situational awareness and command and control capability for the Joint Force Component Commander for Space at the Joint Space Operations Center near Lompoc, California.
Specifically, JMS increment 1 established the foundational capabilities for future JMS increments, including deploying an initial set of operator mission tools, such as providing automated links to existing data sources and a user-defined operational picture to integrate and display information. First acquisition program baseline as of April 2013: Initiation: 2009; Materiel solution analysis: 2009-2013; Program restructure: 2012; Milestone C and Full deployment decision: planned 2013; Production and deployment: 2013; System fully deployed: planned 2013. Actual schedule as of April 2013: Initiation: 2009; Materiel solution analysis: 2009-2013; Program restructure: 2012; Milestone C and Full deployment decision: 2013; Production and deployment: 2013; System fully deployed: 2013. Program Essentials (as of December 2013): DOD component: Department of the Air Force; Program owner: Air Force Space Command; Prime contractor: not applicable[A]; Total number of contractors: 5; Fiscal year 2014 funding requested: $4.5 million. Program Costs (then-year dollars in millions): Life-cycle cost estimate: First APB (04/2013): $155.6; Latest estimate (12/2013): $155.6. Acquisition: First APB (04/2013): $146.6; Latest estimate (12/2013): $146.6. Operations and maintenance: First APB (04/2013): $9.0; Latest estimate (12/2013): $9.0. Amount spent to date: $138.6[B] (as of October 2013). System Deployment Details (as of December 2013): Current number of total expected users: 260 of 260; Current number of total expected locations: 1 of 1; Legacy systems to be replaced: 0[C]; Annual cost of legacy systems: not applicable; Number of expected system interfaces: 15. Cost, Schedule, and Performance Summary: * No change in cost estimate; * Change in schedule estimate (improved); * Met system performance targets. Source: Data reported by DOD officials. [A] The government is directly managing the integration of government- and commercially developed software onto commercial-off-the-shelf hardware. [B] Officials stated that the remaining planned acquisition expenditures are to be used for future contract support costs during the integration of increment 1 into increment 2 in fiscal year 2014. [C] Not directly applicable for JMS increment 1. Future JMS increments will replace legacy systems. JMS Increment 1: Program Status: In 2011, the JMS program was restructured to address programmatic challenges that were identified in a February 2011 independent program assessment. Specifically, the assessment found that, among other things, the program was not ready for a milestone B decision and would have to declare a critical change as required by law because it would not be able to achieve full deployment decision within 5 years.[Footnote 1] Additionally, the assessment found that the program's plan to implement the program in a single increment increased program risk and did not optimally support the warfighter. As part of the program restructure, the program was split from a single increment into a multi-increment approach and the government assumed the role of lead integrator. This split was intended to allow JMS to reduce program costs and accelerate the transition from its legacy system by leveraging existing government prototypes that could be used while additional JMS capabilities were developed.
Consistent with this approach, the Air Force identified an Air Force Research Lab prototype that already had significant operational capability, met some of the key requirements for JMS, and could immediately be used by the warfighter. In 2011, the JMS program was directed to manage development and testing of this prototype and it was designated JMS increment 1. In December 2012, the Air Force completed testing on JMS increment 1 and, in April 2013, the program was deemed fully deployed. No Change in Cost Estimate: JMS increment 1 did not experience a change in its cost estimate from its first APB, which was established in April 2013. The program had been underway for approximately 4 years before it established its first APB, which was approved about 2 weeks prior to full deployment of increment 1 (see schedule section). As a result, about 90 percent of the estimated costs in the program's first APB represented sunk costs. Change in Schedule Estimate (improved): JMS increment 1 accelerated its full deployment date by about 2 months. Specifically, when the first APB was established on April 2, 2013, the program planned to have increment 1 fully deployed by June 2013; however, increment 1 was deemed fully deployed on April 15, 2013. Officials attributed the early completion of this milestone to a successful operational trial period and initial operational capability decision. Met System Performance Targets: An operational utility evaluation was completed in December 2012, which concluded that the system was effective for the limited scope of operational capabilities delivered and that increment 1 met the targets for its two key performance parameters related to displaying the user-defined operational picture and supporting network-based military operations. Additionally, as of December 2013, officials reported that JMS increment 1 had continued to meet its performance targets. Footnote: [1] 10 U.S.C. § 2445c(d). [End of Joint Space Operations Center Mission System (JMS) Increment 1 profile] Next Generation Enterprise Network (NGEN) Increment 1: NGEN is intended to replace and improve the enterprise network and services (e.g., data storage, e-mail, and video teleconferencing) that were provided by the Navy Marine Corps Intranet through a departmentwide network services contract to Navy and Marine Corps personnel worldwide.[1] NGEN Increment 1 will transition the service provider, while maintaining the same network infrastructure and services that were provided by the Navy Marine Corps Intranet. Increment 1 is also intended to form the foundation for the Navy's future networking environment. [1] To bridge the time between the end of the Navy Marine Corps Intranet contract in 2010 and full transition to NGEN, Navy awarded a $4.9 billion continuity of services contract to the existing provider--Hewlett-Packard--which is scheduled to end in April 2014. Initial schedule as of May 2010: Initiation[A]: 2007; Materiel solution analysis: 2007-2011; Milestone C: planned 2011; Production and deployment: 2011-2014; Full deployment decision[B]: planned 2012; System fully deployed[B]: planned 2014. Latest schedule as of December 2013: Initiation[A]: 2007; Materiel solution analysis: 2007-2013; Milestone C: 2011; Production and deployment: 2013-2014; Full deployment decision[B]: planned 2014; System fully deployed[B]: planned 2014. Source: GAO analysis of agency data.
[A] According to officials, although the program office was established in July 2007, the program's funds first obligated date was August 2009. Between program initiation and funds first obligated, NGEN program office activities were funded by its predecessor program--Navy Marine Corps Intranet. [B] NGEN refers to these milestones as final transition decision and final transition complete because the program is transitioning to the new service contract, while maintaining the same fully-deployed network. Program Essentials (as of December 2013): DOD component: Department of the Navy; Program owner: Assistant Secretary of the Navy, Research, Development and Acquisition; Prime contractor: Hewlett Packard Enterprise Services; Total number of contractors: 5; Fiscal year 2014 funding requested: $1,791.7 million. Program Costs (then-year dollars in millions): Life-cycle cost estimate: Initial estimate (11/2011): $25,447.7; Latest estimate (12/2013): $21,587.1. Acquisition: Initial estimate (11/2011): $926.3; Latest estimate (12/2013): $926.3. Operations and maintenance: Initial estimate (11/2011): $24,521.4; Latest estimate (12/2013): $20,660.8. Amount spent to date: $2,817.7 (as of December 2013). Cost, Schedule, and Performance Summary: * Change in cost estimate (decreased); * Change in schedule estimate (slipped); * Unavailable system performance data. System Deployment Details (as of December 2013): Current number of total expected users: 0 of 800,000[A]; Current number of total expected locations: 0 of 2,500[A]; Legacy systems to be replaced: not yet determined[B]; Annual cost of legacy systems: not yet determined; Number of expected system interfaces: not applicable. Source: Data reported by DOD officials. [A] According to officials, the number of users and locations will increase gradually as the network transitions to the new contract. [B] Officials stated that the total number of legacy systems is being determined through an ongoing effort by the Navy Enterprise Information Governance Board to review legacy networks and decide whether they should be consolidated into NGEN. NGEN Increment 1: Program Status: Navy released a request for proposals for NGEN in May 2012 and awarded the contract to Hewlett Packard Enterprise Services in June 2013. Subsequently, a bid protest was filed in July 2013 that was denied in October 2013, and officials stated that they have since begun performance of the contract. Since the Navy Marine Corps Intranet contract ended in 2010, Navy has been operating under the continuity of services contract, which continues to deliver network services until Navy fully transitions to NGEN. Change in Cost Estimate (decreased): NGEN's planned total life-cycle cost estimate had decreased by about 15 percent. Specifically, while the program had not established an APB as of December 2013, the program's initial cost estimate of about $25.4 billion, established in November 2011, had decreased to about $21.6 billion in its latest estimate, as of December 2013.
This was due to a $3.9 billion decrease in operations and maintenance costs, which officials told us was the result of a shift in the contracting approach to allow for either separate or bundled services, thus providing increased competition and potentially greater cost savings.[Footnote 1] Change in Schedule Estimate (slipped): NGEN had experienced an almost 2-year slip in milestone C (authorizes a program to begin production and deployment, and authorized award of the NGEN contract) compared to its initial schedule estimate--from August 2011 to June 2013.[Footnote 2] NGEN had also experienced about a 7-month slip in its full deployment date (which the program calls final transition complete), from May 2014 to December 2014. Officials attributed the slippages to the need to conduct more detailed planning for acquiring NGEN services; delays in solicitation activities; the proposals received not being of the desired quality, which led to delays in contract award; and the contract award protest (previously discussed). As a result of the delays, Navy officials stated that they are planning to extend the continuity of services contract. Unavailable System Performance Data: Navy had not yet evaluated system performance targets for NGEN because it had not yet transitioned to the new NGEN contract. Thus, system performance data were not available. Navy plans to conduct performance assessments to ensure that the network meets performance targets under the NGEN contract, with the first assessment planned to occur in the third quarter of fiscal year 2014. According to officials, until the NGEN performance measurements are fully implemented, the network will be subject to the performance targets of the current network operating under the continuity of services contract, which had been meeting its performance targets, as of December 2013. Footnotes: [1] In March 2011, we recommended that Navy reconsider its acquisition approach for NGEN (GAO, Information Technology: Better Informed Decision Making Needed on Navy's Next Generation Enterprise Network Acquisition, GAO-11-150 (Washington, D.C.: Mar. 11, 2011)). In September 2012, we reported that Navy had reconsidered and made changes to how NGEN services were to be acquired (GAO, Next Generation Enterprise Network: Navy Implementing Revised Approach, but Improvement Needed in Mitigating Risks, GAO-12-956 (Washington, D.C.: Sept. 19, 2012)). [2] We previously reported that Navy was experiencing delays in the NGEN program and that these delays had compressed the timeline for, and increased the risks associated with, transitioning to the new network before the end of the continuity of services contract (GAO-11-150 and GAO-12-956). [End of Next Generation Enterprise Network (NGEN) Increment 1 profile] Theater Medical Information Program-Joint (TMIP-J) Increment 2: First acquisition program baseline as of November 2002: Initiation and Milestone B: 2002; Engineering and manufacturing development: 2002-2004; Milestone C: planned 2004; Production and deployment: 2004-2013. Latest schedule as of December 2013: Initiation and Milestone B: 2003; Engineering and manufacturing development: 2002-2007; Milestone C: 2008; Critical change: 2009; Production and deployment: 2008-2015; Full deployment decision: 2014; System fully deployed: planned 2015.
TMIP-J Increment 2 is a set of applications that support warfighters and health care providers in areas of military operations (called theaters) by maintaining patient electronic health records, integrating medical logistics information, tracking patient movement, and providing medical command and control data. The program also integrates with medical systems at sites that support theaters, called sustaining bases. TMIP-J Increment 2 is intended to upgrade legacy systems and add significant new functionality to the first increment, including increased support for wounded warriors. This increment is intended to be fielded in three releases. Program Essentials (as of December 2013): DOD component: Defense Health Agency; Program owner: Health Affairs Force Health Protection and Readiness; Prime contractors: Evolvent, Data Networks Corporation, KRATOS, LEIDOS, Base Tech, Space and Naval Warfare Systems Center Atlantic, Deloitte, MSGI; Total number of contractors: 17; Fiscal year 2014 funding requested: $71.8 million. Program Costs (then-year dollars in millions): Life-cycle cost estimate: First APB (11/2002): $67.7; Latest estimate (12/2013): $1,579.2. Acquisition: First APB (11/2002): $67.7; Latest estimate (12/2013): $826.6. Operations and maintenance: First APB (11/2002): $0.0; Latest estimate (12/2013): $752.6. Amount spent to date: $535.0 (as of October 2013). Cost, Schedule, and Performance Summary: * Change in cost estimate (increased); * Change in schedule estimate (slipped); * Did not fully meet system performance targets. System Deployment Details (as of December 2013): Current number of total expected users: 30,000 of 21,000 to 62,000[A]; Current number of total expected locations: 509 to 1,273 of 743 to 1,856[A]; Legacy systems to be replaced: 2; Annual cost of legacy systems: $430,000[B]; Number of expected system interfaces: 114. Source: Data reported by DOD officials. [A] According to Defense Health Agency officials, the number of expected users and expected locations varies based on current demand for military operations. [B] This is the annual cost for the one remaining legacy system; the other was replaced in February 2009. TMIP-J Increment 2: Program Status: In 2008, the first of three TMIP-J Increment 2 releases was approved for limited fielding. Program officials estimated that this first release delivered 80 percent of the capabilities planned for the increment. Also in 2008, the program reported a critical change in its cost and schedule estimates due to an increase in new requirements to support the influx of wounded warfighter initiatives, which increased the program's scope. For example, capabilities were added to track warfighter concussions. The second and third releases are intended to enhance some of the applications that were deployed as part of the first release, as well as address new requirements, such as the previously mentioned concussion tracking. In September 2012, a capability to facilitate viewing the patient record, called context management, was dropped from the scope of this increment. In December 2013, the program achieved full deployment decision based on test results for the second release (discussed in the performance section) and the third release was in the requirements development phase.
Change in Cost Estimate (increased): As of December 2013, the latest life-cycle cost estimate for TMIP-J Increment 2 was about $1.58 billion, which represented an approximately 2,233 percent increase from the program's first APB estimate of $67.7 million, established in November 2002. The program reported that key reasons for this increase were (1) the addition of capabilities originally intended to be included in a future increment, including new warfighter requirements (previously discussed); and (2) operations and maintenance costs. Program officials stated that operations and maintenance costs were not included in the first APB estimate because it was initially thought that such costs would be paid by the individual military services. Change in Schedule Estimate (slipped): As of December 2013, TMIP-J Increment 2 is expected to be fully delivered to the Services in the first quarter of fiscal year 2016--more than 6 years later than the May 2009 delivery date estimated in the first complete APB. Program officials reported that key reasons for this schedule change were, as previously discussed, the addition of new warfighter requirements and capabilities originally intended for another increment, and the Services' request to delay the third release deployment due to their budget and schedule constraints. Program officials also reported that other delays were due, in part, to configuration management and software usage problems experienced when preparing the first release for deployment. Did Not Fully Meet System Performance Targets: The first release of TMIP-J Increment 2 was tested in 2008 and found to be operationally effective, suitable, and survivable, with limitations, which did not prevent fielding. In August 2012, the second release did not fully pass system qualification testing due to defects with the previously mentioned context management capability. Subsequently, the program removed this capability from the increment and the test director recommended that the second release proceed to system acceptance testing. In December 2013, based on tests that were conducted primarily in a simulated environment, a multiservice operational test and evaluation determined that the second release was operationally effective and suitable for the Army, Air Force, Marine Corps, and Navy; and survivable for each of those services except the Navy, which must correct a major defect related to backup and restoration before the system can be fully deployed to the Navy. Additionally, the evaluation showed that 8 of 46 critical and external interfaces passed testing, while the remaining 38 interfaces could not be evaluated until they are fielded. [End of Theater Medical Information Program-Joint (TMIP-J) Increment 2 profile] [End of Appendix II] Appendix III: Comments from the Department of Defense: Assistant Secretary of Defense: 3015 Defense Pentagon: Washington, DC 20301-3015: March 13, 2014: Ms. Carol R. Cha: Director, Information Technology Acquisition Management Issues: U.S. Government Accountability Office: 441 G Street, N.W. Washington, DC 20548: Dear Ms. Cha: This is the Department of Defense (DoD) response to the General Accounting Office (GAO) Draft Report, GAO-14-309, "Major Automated Information Systems: Selected Defense Programs Need To Implement Key Acquisition Practices" dated February 11, 2014. Detailed comments on the report recommendations are enclosed. Sincerely, Signed by: Ms. Katrina McFarland: Assistant Secretary of Defense (Acquisition): Enclosure: As stated.
GAO DRAFT REPORT DATED FEBRUARY 11, 2014: GAO-14-309 (GAO CODE 311602): "Major Automated Information Systems: Selected Defense Programs Need To Implement Key Acquisition Practices" Department Of Defense Comments To The GAO Recommendations: Recommendation 1: To better ensure that Defense Agencies Initiative (DAI) implements effective risk management and IT acquisition best practices, GAO is recommending that the Secretary of Defense direct the Director of the Defense Logistics Agency (DLA) to direct the DAI program office to establish a comprehensive risk log that includes all up-to-date risks with evaluations and categorizations that comply with DLA's defined parameters and associated mitigation plans. DoD Response: Concur. DAI is maintaining a comprehensive risk log that includes all up-to-date risks with evaluations and categorizations that comply with DLA's defined parameters and associated mitigation plans. The risks are evaluated and categorized consistent with DLA Instruction 6601, Program Level Risk Management within the Acquisition Life Cycle, dated May 7, 2013. The risks are reviewed in weekly meetings and are documented in a risk log that represents a comprehensive list of all up-to-date risks, consequences and associated mitigation plans. Recommendation 2: To better ensure that Defense Agencies Initiative (DAI) implements effective risk management and IT acquisition best practices, GAO is recommending that the Secretary of Defense direct the Director of the Defense Logistics Agency to direct the DAI program office to identify and document significant cost and schedule deviations from the program's plan. DoD Response: Concur. During weekly meetings, the DAI program reviews the integrated master schedule and cost information to identify and document significant cost and schedule deviations from the DAI program plan. Recommendation 3: To help ensure that Global Combat Support System-Marine Corps (GCSS-MC) implements effective project monitoring and control best practices, GAO is recommending that the Secretary of Defense direct the Secretary of the Navy to direct the GCSS-MC program office to include corrective actions and time frames in future analyses of critical issues and monitor actions taken against those time frames. DoD Response: Concur. The GCSS-MC program office will include corrective actions and time frames in future analyses of critical issues and monitor actions taken against those time frames. Recommendation 4: To improve the Theater Medical Information Program-Joint (TMIP-J) program's implementation of IT best practices, GAO is recommending that the Secretary of Defense direct the Director of the Defense Health Agency to direct the TMIP-J program office to update the program's capabilities baseline to reflect program scope changes. DoD Response: Partially concur. The approved December 2011 TMIP-J Acquisition Strategy addressed the change in current program scope and is mapped to the 2007 TMIP-J Capability Production Document. While there is no benefit in updating the requirements documentation for capabilities that have already been built, tested, and deployed, the TMIP-J program will ensure that all future program capabilities are traced to their associated requirements in the appropriate requirements traceability matrices and the program office will update the program's capabilities baseline to reflect program scope changes.
Recommendation 5: To improve the TMIP-J program's implementation of IT best practices, GAO is recommending that the Secretary of Defense direct the Director of the Defense Health Agency to direct the TMIP-J program office to trace all capabilities to their associated requirements in the requirements traceability matrix. DoD Response: Partially concur. While there is no benefit in updating the requirements documentation for capabilities that have already been built, tested, and deployed, the TMIP-J program will ensure that all future program capabilities are traced to their associated requirements in the appropriate requirements traceability matrices. Recommendation 6: To improve the TMIP-J program's implementation of IT best practices, GAO is recommending that the Secretary of Defense direct the Director of the Defense Health Agency to direct the TMIP-J program office to implement earned value management in accordance with DoD's policy and train management staff with project oversight responsibilities on the proper use of earned value management. DoD Response: Concur. The TMIP-J program plans to increase the breadth of its existing Earned Value Management (EVM) program through analysis of applicable contracts, through increasing the EVM expertise of government or contractor staff, and through tracking the completion of training to ensure the program office completes required EVM training in a timely manner. One hundred percent of the senior program officials had completed the first of two required EVM courses by September 2013, and one-third of the senior staff has completed both required EVM courses. The remaining individuals, half of whom are new to the program since August 2013, are on track to complete the second EVM course within the required time frame. Recommendation 7: To improve the TMIP-J program's implementation of IT best practices, GAO is recommending that the Secretary of Defense direct the Director of the Defense Health Agency to direct the TMIP-J program office to develop an authoritative listing of all interfaces currently included in the scope of the program. DoD Response: Concur. The TMIP-J program has consistently maintained an authoritative list of all interfaces currently included in the scope of the program. The TMIP-J Increment 2, Release 2, System View 6 schematic reflects a total of 112 interfaces and, given that it is a living document, the number of interfaces is subject to change. Recommendation 8: To improve the TMIP-J program's implementation of IT best practices, GAO is recommending that the Secretary of Defense direct the Director of the Defense Health Agency to direct the TMIP-J program office to report to stakeholders the number of current and planned system users and sites and provide updates as needed. DoD Response: Concur. The TMIP-J program will report to stakeholders the number of current and planned system users and sites, as identified by the military services, and will provide updates as needed. However, the ability of the TMIP-J program to accurately capture this information is dependent upon close collaboration with the military services, since obtaining the number of users and sites in a dynamic theater environment is challenging. [End of section]
Appendix IV: GAO Contact and Staff Acknowledgments: GAO Contact: Carol R. Cha at (202) 512-4456 or ChaC@gao.gov: Staff Acknowledgments: In addition to the contact named above, the following staff also made key contributions to this report: Shannin O'Neill, Assistant Director; Milton Clement; Rebecca Eyler; Claudia Fletcher; Javier Irizarry; Emily Kuhn; Madhav Panwar; and Jeanne Sung. [End of section] Footnotes: [1] DOD's IT investment portfolio identifies all of its IT investments and associated costs within the department and its components. [2] The $4.5 billion represents the amount that DOD officials reported they spent in fiscal year 2012 for 41 of the 42 MAIS programs. Budget information was not available for the remaining program. [3] 10 U.S.C. § 2445a(a). [4] Pub. L. No. 112-81, § 1078 (Dec. 31, 2011) requires that we report on these assessments no later than March 30 of each year from 2013 through 2018. [5] The 15 MAIS programs included in our review were: Air Force's Air and Space Operations Center-Weapon System (AOC-WS) Increment 10.2, Air Force Integrated Personnel and Pay System (AFIPPS), Base Information Transport Infrastructure Wireless (BITI Wireless), Integrated Strategic Planning and Analysis Network (ISPAN) Increment 2, and Joint Space Operations Center Mission System (JMS) Increment 1; Army's Distributed Common Ground System-Army (DCGS-A) Increment 1, Integrated Personnel and Pay System-Army (IPPS-A) Increment 1, and Joint Personnel Identification (JPI) version 2; Defense Health Agency's (DHA) integrated Electronic Health Record (iEHR) Increment 1 and Theater Medical Information Program-Joint (TMIP-J) Increment 2; Defense Logistics Agency's (DLA) Defense Agencies Initiative (DAI) and EProcurement; and Navy's Global Combat Support System-Marine Corps (GCSS-MC) Increment 1, Global Command and Control System-Maritime (GCCS-M) Increment 2, and Next Generation Enterprise Network (NGEN) Increment 1. [6] A program's APB contains the life-cycle cost estimate, schedule estimate, and performance parameters that were approved for that program by the milestone decision authority. [7] The 14 MAIS programs included in our first review were: Air Force's Defense Enterprise Accounting and Management System Increment 1, Expeditionary Combat Support System Increment 1, Financial Information Resource System, Information Transport Services Increment 1, and Mission Planning Systems Increment 4; Army's Global Combat Support System - Army, Global Command and Control System - Army Block 4, and Tactical Mission Command; Defense Information Systems Agency's Global Combat Support System - Joint Increment 7 and Teleport Generation 3; and Navy's Common Aviation Command and Control System Increment 1, Consolidated Afloat Networks and Enterprise Services, Distributed Common Ground System - Navy Increment 1, and Navy Enterprise Resource Planning. GAO, Major Automated Information Systems: Selected Defense Programs Need to Implement Key Acquisition Practices, [hyperlink, http://www.gao.gov/products/GAO-13-311] (Washington, D.C.: Mar. 28, 2013). [8] During our review, one of these programs--Air Force's AOC-WS Increment 10.2--established an APB. [9] An estimate in then-year dollars includes the effects of economic inflation. The first APB is established after the program has assessed the viability of various technologies and refined user requirements to identify the most appropriate technology solution that demonstrates that it can meet users' needs. DOD guidance refers to a program's best cost and schedule estimates as objective estimates.
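To make the then-year convention in footnote 9 concrete, here is a simplified illustration (the dollar amount and inflation rates are hypothetical, not drawn from this report). An amount expressed in base-year (constant) dollars is converted to then-year dollars by applying the cumulative inflation between the base year and the year of expenditure:

\[
\text{TY\$} = \text{BY\$} \times \prod_{i=1}^{n} (1 + r_i)
\]

For example, $100 million in base-year dollars spent two years later, under 2 percent annual inflation, corresponds to \(100 \times 1.02^{2} \approx 104.0\) million in then-year dollars.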
[10] Prior to establishing an APB, programs establish initial cost and schedule estimates. These estimates are based on limited information about the program's requirements and the viability of technologies available to meet the program's needs. [11] 10 U.S.C. § 2445c. [12] Prior to the November 2013 DOD interim policy discussed in footnote 18, a milestone A decision either authorized entry into the technology development phase (for programs following DOD's 2008 defense acquisition management system framework, discussed later) or the prototyping phase (for programs following DOD's business capability life-cycle acquisition model, also discussed later). [13] The three selected MAIS programs are DAI, GCSS-MC Increment 1, and TMIP-J Increment 2. [14] Software Engineering Institute, Capability Maturity Model® Integration for Acquisition (CMMI-ACQ), Version 1.3 (Pittsburgh, Pa.: November 2010); Project Management Institute, A Guide to the Project Management Body of Knowledge (PMBOK® Guide), Fifth Edition (Newtown Square, Pa.: 2013). "PMBOK" is a trademark of the Project Management Institute, Inc. [15] GAO, Information Technology: DHS Needs to Improve Its Independent Acquisition Reviews, [hyperlink, http://www.gao.gov/products/GAO-11-581] (Washington, D.C.: July 28, 2011). [16] GAO, Business Systems Modernization: DOD Continues to Improve Institutional Approach, but Further Steps Needed, [hyperlink, http://www.gao.gov/products/GAO-06-658] (Washington, D.C.: May 15, 2006) and DOD Financial Management: Implementation Weaknesses in Army and Air Force Business Systems Could Jeopardize DOD's Auditability Goals, [hyperlink, http://www.gao.gov/products/GAO-12-134] (Washington, D.C.: Feb. 28, 2012). [17] GAO, High-Risk Series: An Update, [hyperlink, http://www.gao.gov/products/GAO-13-283] (Washington, D.C.: February 2013). [18] In November 2013, DOD issued an interim policy that updated the first acquisition framework--the defense acquisition management system framework--and incorporated guidance from the second framework--the business capability life-cycle acquisition model. This interim policy is also intended to supersede the business capability life-cycle acquisition model and includes additional decision points and guidance for applying the framework to multiple acquisition models. As of November 2013, this interim policy represents the only framework for acquiring MAIS programs. This updated framework was not used during this review. Deputy Secretary of Defense Memorandum, Defense Acquisition, November 26, 2013. [19] DOD Instruction 5000.02, Operation of the Defense Acquisition System (Dec. 8, 2008). [20] Directive-Type Memorandum 11-009, Acquisition Policy for Defense Business Systems (DBS) (June 23, 2011). [21] The November 2013 interim policy that updates the defense acquisition management system framework includes additional decision points and guidance for applying the framework to multiple types of acquisitions, such as software-intensive systems and incrementally acquired systems. This interim policy replaced the 2008 defense acquisition management system framework policy and applies to all DOD acquisition programs. [22] An APB reflects the threshold and objective values for the minimum number of cost, schedule, and performance attributes that describe the program over its life cycle. [23] Limited fielding was the deployment of a capability to a limited number of users to test the capability in an operational environment.
[24] Defense Science Board, Report of the Defense Science Board Task Force on Department of Defense Policies and Procedures for the Acquisition of Information Technology (Washington, D.C.: March 2009). [25] As discussed earlier, DOD's recent November 2013 interim policy that updates the defense acquisition management system framework also incorporated guidance from and replaced the business capability life-cycle acquisition model. This updated framework was not used during this review. [26] 10 U.S.C. §§ 2445b and 2445c. [27] In certain cases, DOD does not need to carry out an evaluation and submit a report. Specifically, if the senior DOD official with milestone decision authority determines that a critical change is primarily due to an extension of a program and involves minimal developmental risk, the official may instead submit to the congressional defense committees a certification that the official has made those determinations. This certification must be submitted within 45 days after receiving the quarterly report. [28] PMBOK®; CMMI-ACQ; and GAO, Executive Guide: Information Technology Investment Management, A Framework for Assessing and Improving Process Maturity, [hyperlink, http://www.gao.gov/products/GAO-04-394G] (Washington, D.C.: March 2004). [29] See, for example, GAO, Information Technology: Foundational Steps Being Taken to Make Needed FBI Systems Modernization Management Improvements, [hyperlink, http://www.gao.gov/products/GAO-04-842] (Washington, D.C.: Sept. 10, 2004) and Information Technology: FBI Is Implementing Key Acquisition Methods on Its New Case Management System, but Related Agencywide Guidance Needs to Be Improved, [hyperlink, http://www.gao.gov/products/GAO-08-1014] (Washington, D.C.: Sept. 23, 2008). [30] GAO, DOD Business Systems Modernization: Key Marine Corps System Acquisition Needs to Be Better Justified, Defined, and Managed, [hyperlink, http://www.gao.gov/products/GAO-08-822] (Washington, D.C.: July 28, 2008). [31] GAO, Information Technology: Better Informed Decision Making Needed on Navy's Next Generation Enterprise Network Acquisition, [hyperlink, http://www.gao.gov/products/GAO-11-150] (Washington, D.C.: Mar. 11, 2011). [32] GAO, Next Generation Enterprise Network: Navy Implementing Revised Approach, but Improvement Needed in Mitigating Risks, [hyperlink, http://www.gao.gov/products/GAO-12-956] (Washington, D.C.: Sept. 19, 2012). [33] GAO, Space Acquisitions: Development and Oversight Challenges in Delivering Improved Space Situational Awareness Capabilities, [hyperlink, http://www.gao.gov/products/GAO-11-545] (Washington, D.C.: May 27, 2011). [34] [hyperlink, http://www.gao.gov/products/GAO-13-311]. [35] GAO, Electronic Health Records: VA and DOD Need to Support Cost and Schedule Claims, Develop Interoperability Plans, and Improve Collaboration, [hyperlink, http://www.gao.gov/products/GAO-14-302] (Washington, D.C.: Feb. 27, 2014). [36] Initial cost estimates are based on limited information about the program's requirements and the viability of technologies available to meet the program's needs. [37] Virtualization is a technology that allows multiple, software-based machines to run in isolation, side-by-side, on the same physical machine. Virtual machines can be stored as files, making it possible to save a virtual machine and move it from one physical server to another. [38] Initial schedule estimates are based on limited information about the program's requirements and the viability of technologies available to meet the program's needs.
[39] CMMI-ACQ, Version 1.3 (November 2010), and PMBOK® Guide, Fifth Edition (2013). [40] CMMI-ACQ, Version 1.3 (November 2010), and PMBOK® Guide, Fifth Edition (2013). [41] GAO, Homeland Security: U.S. Visitor and Immigrant Status Indicator Technology Program Planning and Execution Improvements Needed, [hyperlink, http://www.gao.gov/products/GAO-09-96] (Washington, D.C.: Dec. 12, 2008) and Information Technology: Actions Needed to Fully Establish Program Management Capability for VA's Financial and Logistics Initiative, [hyperlink, http://www.gao.gov/products/GAO-10-40] (Washington, D.C.: Oct. 26, 2009). [42] Earned value management is a project management tool that, when properly used, can provide accurate assessments of project progress, produce early warning signs of impending schedule delays and cost overruns, and provide unbiased estimates of anticipated costs at completion. [43] Pub. L. No. 112-81, § 1078 (Dec. 31, 2011). [44] The 14 MAIS programs included in our first review were: Air Force's Defense Enterprise Accounting and Management System Increment 1, Expeditionary Combat Support System Increment 1, Financial Information Resource System, Information Transport Services Increment 1, and Mission Planning Systems Increment 4; Army's Global Combat Support System - Army, Global Command and Control System - Army Block 4, and Tactical Mission Command; Defense Information Systems Agency's Global Combat Support System - Joint Increment 7 and Teleport Generation 3; and Navy's Common Aviation Command and Control System Increment 1, Consolidated Afloat Networks and Enterprise Services, Distributed Common Ground System - Navy Increment 1, and Navy Enterprise Resource Planning. GAO, Major Automated Information Systems: Selected Defense Programs Need to Implement Key Acquisition Practices, [hyperlink, http://www.gao.gov/products/GAO-13-311] (Washington, D.C.: Mar. 28, 2013). [45] An enterprise resource planning system is an automated system using commercial off-the-shelf software consisting of multiple, integrated functional modules that perform a variety of business-related tasks, such as general ledger accounting, payroll, and supply chain management. [46] During our review, one of these programs--Air Force's Air and Space Operations Center-Weapon System Increment 10.2--established an APB. [47] DOD guidance refers to a program's best cost and schedule estimates as objective estimates.
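As general context for the earned value management tool described in footnote 42, the standard relationships among planned value (PV), earned value (EV), actual cost (AC), and budget at completion (BAC) are shown below. This is a textbook summary, not a description of DOD's or any program's specific implementation:

\[
CV = EV - AC, \qquad SV = EV - PV, \qquad CPI = \frac{EV}{AC}, \qquad SPI = \frac{EV}{PV}, \qquad EAC = \frac{BAC}{CPI}
\]

(The estimate at completion shown is one common form.) A CPI below 1 indicates that completed work is costing more than planned, and an SPI below 1 indicates that less work has been completed than scheduled--the early warning signs footnote 42 describes.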
[48] 10 U.S.C. § 2445c(c), (d). With regard to schedule and cost deviations, a program is considered to have undergone a "significant" change when it has (1) experienced a schedule change that will cause a delay of more than 6 months but less than a year; (2) estimated its life-cycle costs to have increased by at least 15 percent, but less than 25 percent, over the original estimate; or (3) experienced a significant, adverse change in the expected performance of the system. A program is considered to have undergone a "critical" change when it has (1) experienced a schedule change that will cause a delay of 1 year or more; (2) estimated its life-cycle costs to have increased by 25 percent or more over the original estimate; (3) failed to achieve a full deployment decision within 5 years after the milestone A decision for the program or, if there was no milestone A decision, the date when the preferred alternative is selected for the program; or (4) experienced a change in the expected performance of the system or major IT investment to be acquired under the program that will undermine the ability of the system to perform the functions anticipated. [49] At the end of our review, DOD issued an interim policy in November 2013 that updated the defense acquisition management system framework and incorporated guidance from and replaced the business capability life-cycle acquisition model. This updated framework was not used during this review. [50] Software Engineering Institute, Capability Maturity Model® Integration for Acquisition (CMMI-ACQ), Version 1.3 (November 2010); Project Management Institute, A Guide to the Project Management Body of Knowledge (PMBOK® Guide), Fifth Edition (Newtown Square, Pa.: 2013). "PMBOK" is a trademark of the Project Management Institute, Inc. [51] CMMI-ACQ, PMBOK®, and GAO, Information Technology: DHS Needs to Improve Its Independent Acquisition Reviews, [hyperlink, http://www.gao.gov/products/GAO-11-581] (Washington, D.C.: July 28, 2011). [52] A program's first APB contains the original life-cycle cost estimate, schedule estimate, and performance parameters that were approved for that program by the milestone decision authority. The first APB is established after the program has assessed the viability of various technologies and refined user requirements to identify the most appropriate technology solution that demonstrates that it can meet users' needs. [53] An estimate in then-year dollars includes the effects of economic inflation.
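To make the statutory thresholds in footnote 48 concrete, the following minimal sketch expresses the classification logic in Python. The function name, parameter names, and the reduction of the statute's performance criteria to simple boolean flags are illustrative assumptions, not part of 10 U.S.C. § 2445c or of this report:

```python
# A minimal sketch of the "significant" vs. "critical" change thresholds
# summarized in footnote 48 (10 U.S.C. § 2445c). The names and the boolean
# treatment of the performance criteria are illustrative assumptions.

from typing import Optional

def classify_program_change(delay_months: float,
                            cost_growth_percent: float,
                            adverse_performance_change: bool = False,
                            performance_undermines_functions: bool = False,
                            years_since_milestone_a: Optional[float] = None,
                            full_deployment_decided: bool = True) -> str:
    """Classify a MAIS program change as 'critical', 'significant', or 'neither'."""
    # Critical change: a delay of 1 year or more, life-cycle cost growth of
    # 25 percent or more, no full deployment decision within 5 years of the
    # milestone A decision (or the preferred-alternative date), or a
    # performance change that undermines the system's anticipated functions.
    if (delay_months >= 12
            or cost_growth_percent >= 25
            or (not full_deployment_decided
                and years_since_milestone_a is not None
                and years_since_milestone_a > 5)
            or performance_undermines_functions):
        return "critical"
    # Significant change: a delay of more than 6 months but less than a year,
    # cost growth of at least 15 but less than 25 percent, or a significant,
    # adverse change in the system's expected performance. (Changes above the
    # critical thresholds were already returned above.)
    if delay_months > 6 or cost_growth_percent >= 15 or adverse_performance_change:
        return "significant"
    return "neither"

# Example: a 7-month slip with 10 percent cost growth is "significant," while
# the roughly 2,233 percent cost growth reported for TMIP-J Increment 2 would
# be "critical" on the cost criterion alone.
print(classify_program_change(delay_months=7, cost_growth_percent=10))    # significant
print(classify_program_change(delay_months=0, cost_growth_percent=2233))  # critical
```

The critical criteria are tested first so that a change meeting both sets of thresholds resolves to the more severe category, mirroring how the statute's tiers nest.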
[End of section] GAO's Mission: The Government Accountability Office, the audit, evaluation, and investigative arm of Congress, exists to support Congress in meeting its constitutional responsibilities and to help improve the performance and accountability of the federal government for the American people. GAO examines the use of public funds; evaluates federal programs and policies; and provides analyses, recommendations, and other assistance to help Congress make informed oversight, policy, and funding decisions. GAO's commitment to good government is reflected in its core values of accountability, integrity, and reliability. Obtaining Copies of GAO Reports and Testimony: The fastest and easiest way to obtain copies of GAO documents at no cost is through GAO's website [hyperlink, http://www.gao.gov]. Each weekday afternoon, GAO posts on its website newly released reports, testimony, and correspondence. To have GAO e-mail you a list of newly posted products, go to [hyperlink, http://www.gao.gov] and select "E-mail Updates." Order by Phone: The price of each GAO publication reflects GAO's actual cost of production and distribution and depends on the number of pages in the publication and whether the publication is printed in color or black and white. Pricing and ordering information is posted on GAO's website, [hyperlink, http://www.gao.gov/ordering.htm]. Place orders by calling (202) 512-6000, toll free (866) 801-7077, or TDD (202) 512-2537. Orders may be paid for using American Express, Discover Card, MasterCard, Visa, check, or money order. Call for additional information. Connect with GAO: Connect with GAO on facebook, flickr, twitter, and YouTube. Subscribe to our RSS Feeds or E-mail Updates. Listen to our Podcasts. Visit GAO on the web at [hyperlink, http://www.gao.gov]. To Report Fraud, Waste, and Abuse in Federal Programs: Contact: Website: [hyperlink, http://www.gao.gov/fraudnet/fraudnet.htm]; E-mail: fraudnet@gao.gov; Automated answering system: (800) 424-5454 or (202) 512-7470. Congressional Relations: Katherine Siggerud, Managing Director, siggerudk@gao.gov: (202) 512-4400: U.S. Government Accountability Office: 441 G Street NW, Room 7125: Washington, DC 20548. Public Affairs: Chuck Young, Managing Director, youngc1@gao.gov: (202) 512-4800: U.S. Government Accountability Office: 441 G Street NW, Room 7149: Washington, DC 20548. [End of document]