This is the accessible text file for GAO report number GAO-10-522 
entitled 'Defense Acquisitions: Strong Leadership Is Key to Planning 
and Executing Stable Weapon Programs' which was released on May 6, 
2010. 

This text file was formatted by the U.S. Government Accountability 
Office (GAO) to be accessible to users with visual impairments, as 
part of a longer term project to improve GAO products' accessibility. 
Every attempt has been made to maintain the structural and data 
integrity of the original printed product. Accessibility features, 
such as text descriptions of tables, consecutively numbered footnotes 
placed at the end of the file, and the text of agency comment letters, 
are provided but may not exactly duplicate the presentation or format 
of the printed version. The portable document format (PDF) file is an 
exact electronic replica of the printed version. We welcome your 
feedback. Please E-mail your comments regarding the contents or 
accessibility features of this document to Webmaster@gao.gov. 

This is a work of the U.S. government and is not subject to copyright 
protection in the United States. It may be reproduced and distributed 
in its entirety without further permission from GAO. Because this work 
may contain copyrighted images or other material, permission from the 
copyright holder may be necessary if you wish to reproduce this 
material separately. 

Report to the Committee on Armed Services, U.S. Senate: 

United States Government Accountability Office: 
GAO: 

May 2010: 

Defense Acquisitions: 

Strong Leadership Is Key to Planning and Executing Stable Weapon 
Programs: 

GAO-10-522: 

GAO Highlights: 

Highlights of GAO-10-522, a report to the Committee on Armed Services, 
U.S. Senate. 

Why GAO Did This Study: 

For several decades, Congress and the Department of Defense (DOD) have 
explored ways to improve the acquisition of major weapon systems, yet 
poor program outcomes and their underlying causes have proven 
resistant to change. Last year, we reported that the cumulative cost 
growth in DOD’s portfolio of major programs was $296 billion. The 
opportunity to 
achieve meaningful improvements may now be at hand with the recent 
introduction of major reforms to the acquisition process. 

In response to a mandate from this Committee, GAO has issued several 
reports about DOD’s budget and requirements processes to support 
weapon program stability. This follow-on report focuses on (1) 
identifying weapon programs that are achieving good outcomes, (2) the 
factors that enable some programs to succeed, and (3) lessons to be 
learned from these programs to guide implementation of recent reforms. 
GAO analyzed DOD’s portfolio of major defense programs and conducted 
case study reviews of five programs. 

What GAO Found: 

While GAO’s work has revealed significant aggregate cost and schedule 
growth in DOD’s portfolio of major defense acquisition programs, 
individual programs within the portfolio vary greatly in terms of cost 
growth and schedule delays. Our analysis of individual program 
performance found that 21 percent of programs in DOD’s 2008 major 
defense acquisition portfolio appeared to be stable and on track with 
original cost and schedule goals. These programs tended to represent 
relatively small investments, accounting for just under 9 percent of 
total portfolio dollars. Programs that appeared to be on track were 
markedly newer and had development cycles that were shorter than 
those of highly unstable programs. 

The stable programs we studied were supported by senior leadership, 
run by disciplined program managers, and had solid business cases 
that were well executed. These programs benefited from strong 
leadership 
support, in some cases because the programs were perceived as having 
an immediate need and, therefore, were viewed as a higher priority by 
senior leaders. Their program managers tended to share key attributes 
such as experience, leadership continuity, and communication skills 
that facilitated open and honest decision making. As a result, these 
programs established sound, knowledge-based business plans before 
starting development and then executed those plans using disciplined 
approaches. They pursued evolutionary or incremental acquisition 
strategies, leveraged mature technologies, and established realistic 
cost and schedule estimates that accounted for risk. They were able to 
invest in early planning and systems engineering, and made trade-offs 
to close gaps between customer needs and available resources to arrive 
at a set of requirements that could be developed within cost and 
schedule targets. After approval, the programs resisted new 
requirements and maintained stable funding. These practices are in 
contrast to prevailing pressures to force programs to compete for 
funds by exaggerating achievable capabilities, underestimating costs, 
and assuming optimistic delivery dates. 

Congress and DOD have taken major steps toward reforming the defense 
acquisition system that may increase the likelihood that weapon 
programs will meet their planned cost and schedule objectives. Many of 
these steps are consistent with key elements in our case study 
analysis. In particular, the new DOD policy and legislative provisions 
place greater emphasis on front-end planning and establishing sound 
business cases for starting programs. For example, the provisions 
strengthen systems engineering and cost estimating, and require early 
milestone reviews, prototyping, and preliminary designs. They are 
intended to enable programs to refine a weapon system concept and make 
cost, schedule, and performance trade-offs before significant 
commitments are made. Fundamentally, the provisions should help 
programs replace risk with knowledge, and set up more executable 
programs. If reform is to succeed, however, programs that present 
realistic strategies and resource estimates must succeed in winning 
approval and funding. 

What GAO Recommends: 

While no new recommendations are being made, previous GAO 
recommendations have been incorporated into recent reforms. In this 
report, we present lessons learned to help effectively implement these 
reforms. In written comments, DOD noted that it has recently 
instituted several major changes to acquisition policy that are aimed 
at starting programs right. 

View [hyperlink, http://www.gao.gov/products/GAO-10-522] or key 
components. For more information, contact Michael J. Sullivan at (202) 
512-4841 or sullivanm@gao.gov. 

[End of section] 

Contents: 

Letter: 

Background: 

Some 2008 Major Defense Acquisition Programs Appeared to Be on Track 
to Meet Their Original Cost and Schedule Projections: 

Stable Programs Had Strong Leadership Support, Disciplined Program 
Managers, and Executable Business Cases: 

Recent Policy and Legislative Reform Initiatives Reflect Key 
Characteristics of Successful Programs: 

Conclusions: 

Agency Comments and Our Evaluation: 

Appendix I: Objectives, Scope, and Methodology: 

Appendix II: Comments from the Department of Defense: 

Appendix III: GAO Contact and Staff Acknowledgments: 

Tables: 

Table 1: Stable Programs: 

Table 2: Case Study Programs: 

Table 3: Comparison of Factors Contributing to Stable Programs and 
Recent Acquisition Reform Initiatives: 

Table 4: Point Scores for Stability Assessment: 

Figures: 

Figure 1: Distribution of DOD Acquisition Programs by Number of 
Programs and Dollar Value: 

Figure 2: Average Length of Development Cycle for 36 Programs in 
Production (in months): 

Figure 3: Distribution of Stable Programs by Years Since Development 
Start: 

Figure 4: Key Factors That Enable Program Success: 

Figure 5: Comparison of Original and 2007 Development Funding 
Estimates for P-8A: 

Figure 6: Comparison of Original and 2007 Development Funding 
Estimates for SM-6: 

Figure 7: Comparison of Original and 2007 Development Funding 
Estimates for the ARH: 

Figure 8: Comparison of Original and 2007 Development Funding 
Estimates for Global Hawk: 

Figure 9: P-8A Annual Development Funding Requested and Received: 

Abbreviations: 

ACTD: Advanced Concept Technology Demonstration: 

ARH: Armed Reconnaissance Helicopter: 

DAMIR: Defense Acquisition Management Information Retrieval: 

DOD: Department of Defense: 

FCS: Future Combat Systems: 

GPS: Global Positioning System: 

HIMARS: High Mobility Artillery Rocket System: 

IOC: Initial Operational Capability: 

JDAM: Joint Direct Attack Munition: 

MDAP: Major defense acquisition program: 

PAUC: Program Acquisition Unit Cost: 

RDT&E: Research, development, test and evaluation: 

SAR: Selected Acquisition Report: 

SDB: Small Diameter Bomb: 

SM: STANDARD Missile: 

[End of section] 

United States Government Accountability Office:
Washington, DC 20548: 

May 6, 2010: 

The Honorable Carl Levin: 
Chairman: 
The Honorable John McCain: 
Ranking Member: 
Committee on Armed Services: 
United States Senate: 

For several decades, Congress and the Department of Defense (DOD) have 
been exploring ways to improve the acquisition of major weapon 
systems, yet poor program outcomes and their underlying causes have 
proven resistant to change. Last year, we reported that the cumulative 
cost growth in DOD's portfolio of 96 major defense acquisition 
programs was $296 billion from first estimates, and the average delay 
in delivering promised capabilities to the warfighter was 22 months. 
[Footnote 1] However, in recent years Congress and DOD have made 
changes to policies which have led to improvements in some aspects of 
the acquisition process. For example, our 2010 assessment of major 
weapon system acquisition programs found that programs started since 
2006 have higher levels of technology maturity when they begin system 
development.[Footnote 2] More importantly, the opportunity to achieve 
widespread and meaningful improvements in weapon system programs may 
now be at hand, as both Congress and DOD have recently introduced 
major reforms to the defense acquisition process, including changes 
to improve the department's ability to balance requirements with 
resources, establish a stronger foundation for starting programs, and 
execute programs more effectively. While these changes are promising, 
the challenge to achieving better program outcomes will be not only to 
ensure that they are consistently put into practice, but also to 
confront the environment in DOD that has made the weapon acquisition 
area resistant to reform. 

In the Senate Armed Services Committee report for the 2006 National 
Defense Authorization Act, the committee directed us to review DOD's 
budget and requirements processes and assess how these processes can 
better support program stability in major weapon system acquisition. 
In response to this mandate, we have issued several reports that 
highlight weaknesses in these processes.[Footnote 3] This report, the 
last in response to this mandate, identifies practices already present 
in the acquisition environment that are contributing to stable 
acquisition programs and yielding good outcomes. Lessons learned from 
successful weapon programs may provide additional guidance for 
programs implementing recent acquisition reforms. Specifically, we (1) 
identified and described programs within DOD's major weapon system 
acquisition program portfolio that were stable and on track to meet 
cost and schedule targets outlined at program development start; (2) 
determined what factors enabled some stable programs to achieve these 
cost and schedule targets; and (3) analyzed recent acquisition reform 
initiatives to determine how lessons learned from these stable 
programs can be of use as DOD implements acquisition reform. 

To conduct our work, we identified stable programs in DOD's portfolio 
of major weapon system acquisition programs by analyzing data from the 
December 2007 Selected Acquisition Reports (SAR), the most recent full 
reports issued by DOD as of the time we conducted our work, as well as 
other program data.[Footnote 4] We determined each program's growth in 
development cost, unit cost, and schedule from its original program 
baseline estimate and categorized the program as "stable," "moderately 
unstable," or "highly unstable."[Footnote 5] From among the more 
stable programs, and other programs identified as successful by 
acquisition experts, we selected the following five programs as case 
studies: the Army's High Mobility Artillery Rocket System (HIMARS); 
the Air Force's Joint Direct Attack Munition (JDAM) and Small Diameter 
Bomb (SDB); and the Navy's Poseidon Multi-mission Maritime Aircraft 
(P-8A) and 
STANDARD Missile-6 (SM-6). For each case study, we reviewed key 
documents, and interviewed past and present program officials to 
identify key factors contributing to the program's stability. We also 
leveraged prior GAO work where we had identified enablers of stability 
in other programs, such as the Navy's F/A-18E/F Super Hornet and the 
Air Force's F-16 Fighting Falcon, as well as causes of instability in 
unstable programs, such as the Air Force's F-22 Raptor and Global Hawk 
programs. We reviewed recent legislative and policy changes relating 
to defense acquisitions. More information about our scope and 
methodology is provided in appendix I. We conducted this performance 
audit from November 2008 to May 2010, in accordance with generally 
accepted government auditing standards. Those standards require that 
we plan and perform the audit to obtain sufficient, appropriate 
evidence to provide a reasonable basis for our findings and 
conclusions based on our audit objectives. We believe that the 
evidence obtained provides a reasonable basis for our findings and 
conclusions based on our audit objectives. 
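The three-way categorization described above can be sketched in code. 
The actual point-score thresholds GAO used appear in table 4 of the 
report and are not reproduced in this section, so the cutoffs in this 
illustration are assumptions only; the sample inputs are December 2007 
SAR figures from table 1. 

```python
# Hypothetical sketch of the three-way stability categorization.
# The cutoffs below are illustrative assumptions, not GAO's actual
# point-score thresholds (see table 4 of the report for those).

def categorize(dev_cost_growth_pct, unit_cost_growth_pct, delay_months):
    """Assign a program to a stability band based on its growth from
    the original Milestone B baseline estimate."""
    # Small movement on every metric: assume the program is stable.
    if (abs(dev_cost_growth_pct) <= 10
            and abs(unit_cost_growth_pct) <= 10
            and delay_months <= 6):
        return "stable"
    # Large growth on any metric: assume the program is highly unstable.
    if (dev_cost_growth_pct > 30 or unit_cost_growth_pct > 30
            or delay_months > 24):
        return "highly unstable"
    return "moderately unstable"

# December 2007 SAR figures for two of the programs in table 1.
print(categorize(3.8, 4.7, 0))    # EA-18G Growler
print(categorize(-7.7, -3.7, 0))  # STANDARD Missile-6
```

Under these illustrative cutoffs, both sample programs fall in the 
stable band, consistent with their placement in table 1. 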

Background: 

Over the last several years, our work has highlighted a number of 
underlying systemic causes for cost growth and schedule delays in 
weapon programs. At the strategic level, DOD's processes for 
identifying warfighter needs, allocating resources, and managing 
acquisitions, which together define its weapon system investment 
strategy, do not work together effectively. As a result, the 
department often fails to balance the competing needs of the 
warfighter and commits to more programs than available resources can 
support. At the program level, DOD's culture and environment often 
allow programs to start with too many unknowns. In other words, 
programs enter the acquisition process without a full understanding of 
requirements; with cost and schedule estimates based on overly 
optimistic assumptions; and with insufficient knowledge about 
technology, design, and manufacturing. 

Prior GAO work on best practices has emphasized the importance of 
having a sound business case when starting major defense programs. The 
business case in its simplest form is demonstrated evidence that (1) 
the identified needs are real and necessary and can best be met with 
the chosen weapon system concept and (2) the chosen concept can be 
developed and produced within existing resources--funding, time, 
technologies, and people. In the DOD weapon system acquisition 
environment, a business case is typically established at Milestone B 
when significant resources are committed. Programs are then measured 
against the business case established at Milestone B. 

A primary reason for cost and schedule problems is the encouragement 
within the acquisition environment of overly ambitious and lengthy 
product developments--sometimes referred to as "revolutionary" or "big 
bang" acquisition programs--that embody too many technical unknowns 
and insufficient knowledge about performance and production risks. The 
knowledge gaps are largely the result of a lack of early and 
disciplined systems engineering analysis of a weapon system's 
requirements prior to beginning system development; it is this 
analysis that translates customer needs into a producible weapon 
system. If this early systems engineering is not performed, as has 
often been the case with DOD's major acquisitions in the past, 
significant cost increases can occur as the system's requirements 
become better understood by the government and contractor. 

With high levels of uncertainty about requirements, technologies, and 
design, program cost estimates and their related funding needs are 
often understated, effectively setting programs up for cost and 
schedule growth. We recently assessed both service and independent 
cost estimates for 20 major weapon system programs and found that 
while the independent estimates were somewhat higher, both estimates 
were too low in most cases.[Footnote 6] In some programs, cost 
estimates were off by billions of dollars. Estimates this inaccurate 
do not provide the necessary foundation for sufficient funding 
commitments. The programs we reviewed frequently lacked sufficient 
knowledge and detail to develop sound cost estimates. Without this 
knowledge, cost estimators may fail to adequately account for risk and 
uncertainty because they are relying on overly optimistic assumptions. 

Recognizing the need for more discipline in weapon system acquisition, 
Congress and the department have been making changes to policies in 
recent years and we have seen improvements in performance across some 
aspects of the acquisition process. Most recently, the acquisition 
reforms enacted in 2009 signal both Congress's desire for improvement 
and the department's willingness to change its culture, achieve 
better outcomes, and address some of the sources of instability that 
we have identified in our prior work. 

Some 2008 Major Defense Acquisition Programs Appeared to Be on Track 
to Meet Their Original Cost and Schedule Projections: 

While our work has revealed significant aggregate cost and schedule 
growth in DOD's major defense acquisition program (MDAP) portfolio, 
individual programs within the portfolio vary greatly in the extent to 
which they experience cost and schedule growth. In our analysis of 63 
individual programs and subprograms in DOD's 2008 MDAP portfolio, we 
found that 21 percent appeared to be stable and on track to meet 
original cost and schedule projections. These stable programs entailed 
relatively small investments and had shorter development cycles than 
programs reporting substantial cost growth and schedule slip. They 
were also significantly newer than less stable programs. 

Approximately 21 percent, or 13, of the 63 MDAPs that filed 
December 2007 SARs and were at least 3 years into development at the 
time[Footnote 7] were stable, meaning they appeared to be on track to 
end close to the cost and schedule estimates established at 
development start.[Footnote 8] These 13 programs are shown in table 1 
below. 

Table 1: Stable Programs: 

Programs in Production as of December 2007: 

Program: EA-18G Growler electronic warfare aircraft; 
Service: Navy; 
Development cost growth (percent): 3.8; 
Unit cost growth (percent): 4.7; 
Delay in initial capability (months): 0. 

Program: Minuteman III Propulsion Replacement Program; 
Service: Air Force; 
Development cost growth (percent): -6.1; 
Unit cost growth (percent): 6.0; 
Delay in initial capability (months): 0. 

Program: National Airspace System; 
Service: Air Force; 
Development cost growth (percent): 5.4; 
Unit cost growth (percent): 8.8; 
Delay in initial capability (months): 5. 

Program: Small Diameter Bomb Increment I; 
Service: Air Force; 
Development cost growth (percent): -4.9; 
Unit cost growth (percent): -14.3; 
Delay in initial capability (months): -1. 

Programs in Development as of December 2007: 

Program: AGM-88E Advanced Anti-Radiation Guided Missile; 
Service: Navy; 
Development cost growth (percent): 6.2; 
Unit cost growth (percent): 0.0; 
Delay in initial capability (months): 4. 

Program: E-2D Advanced Hawkeye surveillance aircraft; 
Service: Navy; 
Development cost growth (percent): 5.8; 
Unit cost growth (percent): 9.6; 
Delay in initial capability (months): 0. 

Program: Family of Advanced Beyond Line-of-Sight Terminals; 
Service: Air Force; 
Development cost growth (percent): 0.9; 
Unit cost growth (percent): 9.0; 
Delay in initial capability (months): 0. 

Program: Mobile User Objective System; 
Service: Navy; 
Development cost growth (percent): 6.5; 
Unit cost growth (percent): -1.2; 
Delay in initial capability (months): 6. 

Program: Navy Multiband Terminals; 
Service: Navy; 
Development cost growth (percent): -1.6; 
Unit cost growth (percent): -4.8; 
Delay in initial capability (months): 0. 

Program: P-8A Multi-mission Maritime Aircraft; 
Service: Navy; 
Development cost growth (percent): -3.9; 
Unit cost growth (percent): 0.6; 
Delay in initial capability (months): 0. 

Program: PATRIOT/Medium Extended Air Defense System Combined Aggregate 
Program Fire Unit; 
Service: Army; 
Development cost growth (percent): -5.6; 
Unit cost growth (percent): -4.4; 
Delay in initial capability (months): 0. 

Program: PATRIOT/Medium Extended Air Defense System Combined Aggregate 
Program Missile; 
Service: Army; 
Development cost growth (percent): -10.2; 
Unit cost growth (percent): -3.3; 
Delay in initial capability (months): 0. 

Program: STANDARD Missile-6; 
Service: Navy; 
Development cost growth (percent): -7.7; 
Unit cost growth (percent): -3.7; 
Delay in initial capability (months): 0. 

Source: GAO analysis of DOD data. 

[End of table] 

We assessed another 24 of these 63 programs as moderately unstable, 
and the remaining 26 as highly unstable. 

The 13 programs that we assessed as stable tended to be smaller, 
representing just under 9 percent (or $103 billion in fiscal year 
2009 dollars) of the portfolio's $1.15 trillion total estimated cost. 
(See figure 1.) 

Figure 1: Distribution of DOD Acquisition Programs by Number of 
Programs and Dollar Value: 

[Refer to PDF for image: 2 pie-charts] 

By number of programs: 
Highly unstable: 41%; 
Moderately unstable: 38%; 
Stable: 21%. 

By dollar value of programs: 
Highly unstable: 55%; 
Moderately unstable: 36%; 
Stable: 9%. 

Source: GAO analysis of DOD data. 

[End of figure] 

The average total program cost of the stable programs was $7.9 
billion, while the averages were $17.2 billion and $24.5 billion for 
moderately and highly unstable programs, respectively. 
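As a rough arithmetic check using the rounded figures reported above: 
13 stable programs at an average total cost of $7.9 billion comes to 
about $103 billion, just under 9 percent of the $1.15 trillion 
portfolio. 

```python
# Consistency check using the rounded figures reported above
# (amounts in billions of fiscal year 2009 dollars).
stable_count = 13
stable_avg_cost = 7.9        # average total cost of a stable program
portfolio_total = 1150.0     # $1.15 trillion MDAP portfolio

stable_total = stable_count * stable_avg_cost       # roughly $103 billion
stable_share = 100 * stable_total / portfolio_total # just under 9 percent

print(f"${stable_total:.0f} billion, {stable_share:.1f} percent")
```

This prints "$103 billion, 8.9 percent", matching the report's 
figures. 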

Thirty-eight of the programs we analyzed had completed their 
development cycles and started production by December 2007. The four 
stable programs in this group tended to have shorter development 
cycles, on average more than 2 years shorter than the average for 
highly unstable programs. (See figure 2.) 

Figure 2: Average Length of Development Cycle for 36 Programs in 
Production (in months): 

[Refer to PDF for image: vertical bar graph] 

Stable (4 programs): 
Average months in development: 66.8. 

Moderately unstable (16 programs): 
Average months in development: 78.2. 

Highly unstable (16 programs): 
Average months in development: 92.7. 

Source: GAO analysis of DOD data. 

Note: This analysis excludes two satellite programs, because those 
acquisitions have been conducted under DOD's space acquisition process 
rather than its weapons acquisition process, and therefore the 
Milestone C dates listed for these programs are not necessarily 
indicative of the start of production, as they are for other programs. 

[End of figure] 

The programs that we assessed as stable were also markedly newer than 
the less stable programs. Ten--or 77 percent--of the 13 stable 
programs had started development just 3 to 5 years prior to the 
December 2007 reporting date, compared to only 25 percent of the 
moderately unstable programs and 4 percent of the highly unstable ones 
that started in the same timeframe. (See figure 3 for breakdown of 
stable programs' years since development start.) 

Figure 3: Distribution of Stable Programs by Years Since Development 
Start: 

[Refer to PDF for image: vertical bar graph] 

Years since Milestone B: 3 to 5; 
Number of programs: 10. 

Years since Milestone B: 5 to 10; 
Number of programs: 1. 

Years since Milestone B: 10 or more; 
Number of programs: 2. 

Source: GAO analysis of DOD data. 

[End of figure] 

Stable Programs Had Strong Leadership Support, Disciplined Program 
Managers, and Executable Business Cases: 

The stable programs we studied had strong senior leadership support, 
disciplined program managers, and solid business plans that were 
well executed. (Figure 4 presents a notional illustration of the 
relationship of these key factors.) These programs benefited from 
strong leadership support, in some cases because the programs were 
perceived as having an immediate need and, therefore, were viewed as a 
higher priority by senior leaders in the services and DOD. Their 
program managers tended to share key attributes such as experience, 
leadership continuity, and communication skills that facilitated open 
and honest decision making. As a result, these programs established 
sound, knowledge-based business plans before starting development and 
then executed those plans using disciplined approaches. They pursued 
incremental acquisition strategies, leveraged mature technologies, and 
established realistic cost and schedule estimates that accounted for 
risk. They were able to invest in early planning and systems 
engineering, and made trade-offs to close gaps between customer needs 
and available resources to arrive at a set of requirements that could 
be developed within cost and schedule targets. After approval, the 
programs resisted new requirements and maintained stable funding. 
These practices are in contrast to prevailing pressures in DOD that 
force programs to compete for funds by exaggerating achievable 
capabilities, underestimating costs, and assuming optimistic delivery 
dates. 

Figure 4: Key Factors That Enable Program Success: 

[Refer to PDF for image: illustration] 

The image is a pyramid, depicted as follows: 

Base: 
Sound business case; 

Middle section: 
Disciplined execution; 

Peak: 
Desired outcomes. 

From base to peak: 
Senior leadership support; 
Strong program managers. 

Source: GAO. 

[End of figure] 

While we found 13 programs that were able to maintain stability and 
stay on track toward good outcomes, we focused in depth on the details 
of 5 programs to identify reasons for their success. These were the 
Air Force's Small Diameter Bomb and Joint Direct Attack Munition 
programs, the Navy's STANDARD Missile-6 and Poseidon Multi-mission 
Maritime Aircraft (P-8A) programs, and the Army's High Mobility 
Artillery 
Rocket System program.[Footnote 9] Table 2 summarizes cost and 
schedule outcomes of each of our case study programs. 

Table 2: Case Study Programs: 

Program: Small Diameter Bomb (SDB) Increment 1; 
Service: Air Force; 
Success indicators as of December 2007: SDB reduced development costs 
by almost 5 percent and unit costs by more than 14 percent, and made 
the system available one month earlier than anticipated. 

Program: Joint Direct Attack Munition (JDAM); 
Service: Air Force; 
Success indicators as of December 2007: JDAM reduced unit cost by 25 
percent which, in part, enabled the Air Force to purchase more than 
twice the number of units originally expected. 

Program: STANDARD Missile-6 (SM-6) Extended Range Active Missile; 
Service: Navy; 
Success indicators as of December 2007: The SM-6 program was on track 
to reduce expected development costs by more than 7 percent and unit 
costs by almost 4 percent, and expected to deliver initial capability 
on schedule. 

Program: P-8A Multi-mission Maritime Aircraft; 
Service: Navy; 
Success indicators as of December 2007: P-8A was on track to reduce 
estimated development costs by almost 4 percent with less than 1 
percent increase in unit cost and was scheduled to deliver initial 
capability on-time. 

Program: High Mobility Artillery Rocket System (HIMARS); 
Service: Army; 
Success indicators as of December 2007: HIMARS delivered initial 
capability on time in 2005. While development cost for the program 
grew about 20 percent from original estimates, this largely reflects a 
subsequent need to up-armor the vehicle in order to face new threats 
in the wars in Iraq and Afghanistan. 

Source: GAO analysis of DOD data. 

[End of table] 

Strong Leadership Support and Unique Circumstances Were Key to 
Achieving Stable Programs: 

In the stable programs we studied, we found that strong, consistent 
support from DOD and service leadership fostered the planning and 
execution of solid business plans while helping program managers to 
adapt to the inevitable program perturbations. Program managers of 
stable and successful programs were able to make knowledge-based, 
disciplined decisions from the start and resist pressure to overreach 
or add requirements because of this strong institutional support. In 
some cases, high-level support may have been especially strong due to 
the particular circumstances of the programs, such as being designated 
as a high priority. For example: 

* Early in the planning for SDB, the Air Force Chief of Staff 
established a clear written "Commander's Intent" stating that his 
priority was to have the weapon system available for use by the end of 
2006. According to program officials, having this priority from the 
top was invaluable in helping the program establish and maintain an 
effective business plan for acquiring the system. For example, as the 
SDB operational requirements were going through the DOD requirements 
determination process, the Joint Staff attempted to add more 
requirements. In particular, the Army wanted SDB to have additional 
protections from nuclear attack; however, the program office got 
support from the user community, which, buttressed by support from the 
Office of the Secretary of Defense, managed to keep nuclear hardening 
from becoming a requirement by arguing effectively that it did not 
make sense for the weapon. In addition, SDB was designated as the Air 
Force's highest priority program at Eglin Air Force Base, and program 
officials told us SDB was given preference for resources above all 
other programs on-base. Furthermore, SDB was designated as a 
"pathfinder program"--a group of programs established to pioneer ways 
to gain speed and credibility in acquisition. 

* According to program officials, the termination of the STANDARD 
Missile-2 (SM-2) Block IVA program due to cost, schedule, and 
performance problems prompted the Navy to modify the next in its 
series of planned standard missile programs. Initially, the Navy 
proposed the STANDARD Missile-5 (SM-5) program, which was intended to 
develop sophisticated targeting capabilities. However, strong support 
from senior acquisition leaders allowed program officials to advocate 
for a more achievable and affordable "80 percent solution" which 
resulted in the SM-6 program. According to an early SM-6 program 
manager, the urgent need for a successful program helped set the stage 
for conducting thorough and detailed planning work prior to Milestone 
B. In addition, the program received strong support from the Assistant 
Secretary of the Navy for Research, Development and Acquisition in 
obtaining full funding for a realistic, risk-based cost estimate. 

* According to an early program manager, although a mission need for 
HIMARS was identified in the early 1990s, the program was not started 
right away due to a lack of funding. However, prototypes of the system 
were developed through an Advanced Concept Technology Demonstration 
(ACTD) project--a DOD process to get new technologies that meet 
critical military needs into the hands of users faster and at less 
cost--and successfully demonstrated by the Army. This early program 
manager told us HIMARS had been dubbed "a 70 kilometer sniper rifle" 
by troops who have used the system in theater, because of its range 
and accuracy. Based on this success and the contemporaneous 
cancellation of the Army's Crusader artillery program, HIMARS took on 
greater importance for the Army. 

* JDAM was conceived as a program that would integrate Global 
Positioning System (GPS) technologies into a precision strike guidance 
system. JDAM was designated as one of five high-profile programs to be 
executed under the Defense Acquisition Pilot Project, which gave JDAM 
the authority to implement certain acquisition reform provisions under 
the Federal Acquisition Streamlining Act of 1994 before they were 
published in regulation. According to studies of the JDAM program, 
advocates for the acquisition reform movement wanted quick, highly 
visible wins from these pilot programs to demonstrate the benefits of 
the legislation's strategy. JDAM program officials have stated that 
this sense of urgency and the increased flexibility provided to these 
pilot programs allowed officials to move quickly to achieve doable 
requirements. For instance, JDAM officials were able to streamline the 
program's milestone review process and reporting procedures. In 
addition, according to program officials, the program allowed the 
contractor to maintain configuration control, use commercial products, 
and employ various commercial acquisition procedures that were 
uncommon at the time, with the goal of lowering costs. Also, 
the program was freed to structure innovative solutions such as a 20-
year warranty included in the unit price. Furthermore, according to an 
early program manager for JDAM, the program was given very specific 
direction by the Air Force Chief of Staff to develop and produce the 
bomb at a low unit cost. This support allowed the program to make 
cost, schedule, and performance trade-offs and arrive at a solid 
business case for delivering the weapon system. According to the early 
program manager, one year into the program, a senior Air Force general 
pushed JDAM to accelerate its delivery schedule. However, when the 
program manager asked the general if this had been cleared with the 
Air Force Chief of Staff, the subject was dropped. 

* The primary goal for the P-8A program was to replace the 
capabilities of the P-3C Orion, an aging Navy patrol aircraft that is 
critical to maritime security objectives but is experiencing 
significant structural fatigue problems. According to program 
officials, this sense of immediacy has 
heightened the P-8A's priority and has forced the Navy leadership to 
set realistic requirements for quick delivery. However, an early 
program manager said that, rather than rushing into system 
development, the leadership permitted the program to conduct a 
detailed planning phase and establish a well-balanced package of cost, 
schedule, and performance. 

Beyond our case studies, we have also seen examples in the past where 
strong support from high level officials and unique circumstances 
contributed to program success. DOD acquisition officials, for 
example, noted that senior Navy leadership may have made the decision 
to pursue a cautious, more evolutionary approach for development of 
the F/A-18E/F Super Hornet tactical aircraft, in part because of (1) 
embarrassment over the much publicized failure of the A-12 stealth 
attack aircraft program which was terminated after considerable 
investment when the Navy determined that the contractor was unable to 
develop and deliver an aircraft that met requirements; and (2) 
diminishing capacity of the aircraft's previous generation--the F/A-18 
C/D Hornet. In contrast, an official observed that the F-22 Raptor 
fighter aircraft program took a significantly more revolutionary, 
risky approach, likely due, in part, to the fact that the legacy F-15 
Eagle fighter aircraft were still performing so well. 

Stable Programs Had Strong Program Managers Who Shared Key Attributes: 

In addition to support from the top, program managers from successful 
programs tended to share key attributes for success, such as 
experience, leadership continuity, and communication skills that 
facilitated open and honest decision making. These program managers 
were empowered to make good decisions, allowing them to be accountable 
for the success or failure of the program. We found that successful 
managers took proactive measures to ensure the stability of their 
programs, including reaching out to stakeholders in the user and 
testing communities to facilitate their collaboration. For example, 
one program manager was described in a lessons learned memo developed 
by program officials as "part technical expert, part bulldog, and part 
diplomat. Steeped in the technical details of weapon development and 
aircraft integration, he sniffed out and pre-empted technical risks, 
made quick decisions, and aptly convinced stakeholders to support his 
positions." 

Officials from our case study programs indicated that prior experience 
gives a program manager the knowledge to recognize and mitigate risks, 
and effectively respond to unanticipated problems that arise. For 
example, an early program manager for SM-6 told us that he and other 
key staff had been involved in the previous STANDARD Missile program 
(SM-2 Block IVA) that had been terminated. He explained that they were 
therefore very clear on what the potential problems were and were 
highly motivated to try to avoid them if possible. In addition, the 
current program manager of SM-6 said he has been in the STANDARD 
Missile program office since the mid-1980s and that he had worked in 
the program in various capacities before becoming program manager. 

In addition, our case study programs experienced continuity among 
program managers and other key staff, which supported knowledge-based 
decision making. We have previously reported that frequent program manager 
turnover may promote shortsightedness, challenge continuity, and 
reduce accountability for poor outcomes. In 2007, we reported that, 
for programs started since March 2001, the average program manager 
tenure was only 1.4 years.[Footnote 10] In contrast, managers for our 
five case study programs served an average of 2.4 years. For instance, 
the SDB program had one program manager who served from before 
Milestone B through the low-rate production decision, providing 
leadership continuity for the entire system development phase. Also, 
officials from many of our case study programs told us that other key staff, such as 
senior engineers, served many years in the program office and provided 
continuity and information necessary for knowledge-based decision 
making. JDAM officials also noted that the continuity of key civil 
service and contractor personnel has proven very beneficial because 
several other personnel have left the program due to military 
deployments and reassignments. Specifically, this included the chief 
engineer, who has been with the program from the beginning, and the 
director of production and delivery, who has been called upon to 
perform as the JDAM program manager to cover 6- to 9-month deployments 
of several past military program managers. Several support contractors 
also have been with the JDAM program for many years. 

While the admission of program difficulties may be seen as detrimental 
to a program manager's career, leaders of our case study programs 
understood that direct and candid communication is essential to 
program success. Program officials observed that, by fostering a 
reputation for honesty and leadership, they were able to develop 
credibility with stakeholders, including contractors, and make 
compelling cases for what was needed. They emphasized the importance 
of including stakeholders early on in the process of solving problems 
so they are invested in the solutions being developed. For example, 
one early program manager for P-8A explained that his cooperative 
relationship with the requirements community enabled them to speak 
with a united voice about the need to keep requirements achievable. He 
described their approach as "starting slow to go fast" as opposed to 
the more common approach of "rushing to failure." In addition, he 
noted that candid communication and a welcoming attitude toward scrutiny 
fostered top cover support from senior leadership in the Pentagon. 
Together, support from senior leadership and the requirements 
community for an extended concept development phase permitted the 
program to invest in success upfront, and this support carried 
throughout the execution of the program. Also, SDB officials explained 
that an early program manager proactively established collaborative 
relationships with organizations that the program relied on. For 
example, the program manager sought early buy-in from the testing 
community on the program's test and evaluation plans, which helped 
keep the testing schedule on track later on. The program 
office also cultivated very effective communication with its 
contractor, assigning program office representatives to work as 
facilitators with competing contractors during pre-Milestone-B 
prototyping work. Similarly, the SM-6 program manager said being on a 
first-name basis with the contractor vice-presidents helps him to 
manage the program effectively. SM-6 program officials also stated 
that the contractor program manager meets weekly with staff to ensure 
the best possible talent is working on the program. 

Stable Programs Established Sound, Knowledge-Based Business Plans at 
Program Start and Executed with Discipline: 

The stable programs we studied exhibited the key elements of a sound 
knowledge-based business plan at program development start. These 
programs pursued capabilities through evolutionary or incremental 
acquisition strategies, had clear and well-defined requirements, 
leveraged mature technologies and production techniques, and 
established realistic cost and schedule estimates that accounted for 
risk. They then executed their business plans in a disciplined manner, 
resisting pressures for new requirements and maintaining stable 
funding. 

Stable Programs Had Achievable Increments Based on Well-Defined 
Requirements: 

The programs we reviewed typically took an evolutionary acquisition 
approach, addressing capability needs in achievable increments that 
were based on well-defined requirements. To determine what was 
achievable, the programs invested in systems engineering resources 
early on and generally worked closely with industry to ensure that 
requirements were clearly defined. Performing this up-front 
requirements analysis provided the knowledge for making trade-offs and 
resolving performance and resource gaps by either reducing the 
proposed requirements or deferring them to the future. The programs 
were also grounded in well-understood concepts of how the weapon 
systems would be used. For example: 

* According to program officials, SDB was designed to meet a pressing 
Air Force need for a low-collateral-damage weapon which was small 
enough to maximize the number that can be carried aboard a single 
aircraft. Although the Air Force initially wanted SDB to have 
the capability to hit both fixed and mobile targets, a decision was 
made early on to defer the more difficult mobile target capability to 
a later program (SDB II). According to the Air Force's Deputy for 
Acquisition, the program worked early on with the warfighter community 
to focus on the "art of the possible," enabling them to proceed with 
an evolutionary acquisition approach. An analysis of alternatives was 
conducted that considered a broad range of options and 
adequately assessed their risks. According to program officials, prior 
to Milestone B, the contractor submitted system performance 
specifications which were incorporated into the development contract. 
Once requirements were finalized at Milestone B, they were limited, 
and the program had a firm understanding of what compromises had been 
made and what capability had been deferred. Program officials told us 
they did not want to take on extra cost and schedule risk to try to 
achieve more than was possible. 

* According to program officials, in planning the P-8A program, the 
Navy limited requirements to capabilities "as good as" those of the 
P-3C and deferred additional capabilities to later increments, adopting 
an "open architecture" approach to enable this acquisition strategy. 
The program also began with a robust analysis of alternatives which 
included a commercial derivative approach and helped the Navy 
recognize that an unmanned aircraft platform could perform some of the 
mission, thus decreasing P-8A's requirements. The program received 
feedback from competing contractors on proposed requirements to make 
sure the requirements were well-understood before Milestone B. This 
feedback resulted in some requirement modifications before the award 
of the development contract. 

* HIMARS was designed to be a wheeled version of a heavier tracked 
missile launcher--the M270--and lightweight enough to be transported 
on a C-130 aircraft. The design and requirements were well-understood 
and realistic from the outset, with rapid transportability of the 
platform a key goal and weight reduction a key challenge, according to 
program officials. Program officials also stated that there was a 
drive to use as much existing hardware as possible on the program, and 
maximize commonality with the most recent variant of the M270 launcher. 

* The JDAM program was initially laid out in three phases to increase 
capabilities, including an adverse weather precision strike capability 
based on a need that was demonstrated during Operation Desert Storm. 
Phase I had clear, well-defined requirements--to add existing GPS 
technology to an existing inventory of "dumb" warheads. The final 
phase was based on technologies to be developed over time in the 
science and technology environment before they were integrated into 
the acquisition program. According to JDAM program officials, 
communication with the user community while they were in the planning 
phase allowed trade-offs which resulted in considerable cost 
avoidance. The program had an extensive user trade-off program during 
the initial requirements development in which the users and 
contractors adjusted their requirements and designs to accommodate 
each other. The program treated cost as a key performance parameter--
the iterative planning process cut unit costs by more than half. 

* SM-6 is the next generation in the STANDARD Missile program and has 
added extended range and active missile seeker homing capabilities for 
improved flight responsiveness and guidance over previous generations. 
According to program officials, it is designed to adapt to various 
threats, rather than designed to address a specific threat. The 
STANDARD Missile program has been developing ship-based air defense 
missiles for decades, so there was an established program office that 
invested time in pre-Milestone-B planning and coordination with 
stakeholders. According to program officials, the original plan for 
the next generation missile was a more aggressive, costly solution 
dubbed SM-5. After thoroughly considering alternatives, however, the 
Navy decided to take the more cost-conscious, incremental approach of 
the SM-6 which program officials said addressed 80 percent of their 
capability needs for half the cost of the SM-5. 

Beyond our case studies, we have seen other successful programs in the 
past, including the F-16 Fighting Falcon fighter aircraft program, 
that also took more incremental acquisition approaches based on well-
defined requirements. For instance, the F-16 program successfully 
evolved capabilities over the span of about 30 years. Using an 
evolutionary approach to develop the aircraft allowed the program to 
quickly deliver new and improved capabilities to the warfighter, and 
to increase the aircraft's capability as new technologies were matured 
and added to the aircraft. The first increment, developed during the 
1970s, provided a "day fighter" aircraft with basic air-to-air and air-
to-ground capabilities. This allowed the developer to deliver new and 
useful military capability to the warfighter in less than 4 years. 
With each subsequent increment, new technology was used to improve the 
engine, radar, structure, avionics, and other systems that allow the 
aircraft today to perform close air support, ground attack, air 
defense, and suppression of enemy defense missions. The evolutionary 
approach also strengthened the industrial base by extending the life 
of the production line across the successive increments. 

In contrast, we have previously reported on many acquisition programs, 
including the Future Combat Systems (FCS), the F-22 aircraft, and 
Joint Tactical Radio System, that have proposed unrealistic and poorly 
understood requirements and pursued revolutionary, exotic system 
solutions which were ultimately extremely difficult or impossible to 
achieve. For example, the FCS program--which was composed of 14 
integrated weapon systems and an advanced information network--was to 
be the centerpiece of the Army's effort to transition to a lighter, 
more agile and capable combat force. However, the Army started this 
ambitious program in May 2003 before defining what the systems would 
be required to do and how they would interact. It did not expect to 
complete defining requirements until at least 2009, 6 years after 
program initiation. The program's failure to adequately define 
requirements early on resulted in design changes as well as 
significant cost and schedule growth. The FCS program has recently had 
elements canceled and some of the remaining elements restructured into 
other programs. 

Stable Programs Leveraged Mature Technologies and Production 
Techniques: 

The stable programs we reviewed also leveraged mature technologies and 
production techniques, anticipated system integration challenges, and 
demonstrated the feasibility of proposed weapon system concepts by 
completing prototypes before awarding development contracts. In these 
programs, the technologies typically needed to meet the essential 
system requirements had been demonstrated to work in relevant or 
realistic environments. Technologies that were immature generally 
either were not considered or were deferred to later program 
increments. Some of the programs also used established production 
lines, fostering stable production capabilities by limiting unknowns, 
achieving cost efficiencies, and accelerating the learning process 
necessary to develop effective production methods. Furthermore, these 
programs understood the challenge of integrating existing 
technologies, particularly software. Three of our five case 
study programs--HIMARS, JDAM, and SDB--developed prototypes before 
Milestone B, which allowed for the assessment of technologies, ensured 
that integration complexities were understood, and demonstrated the 
feasibility of the proposed system concept. For example: 

* HIMARS's system feasibility was demonstrated through an ACTD project 
prior to Milestone B. According to an early program manager, although 
different technological solutions were utilized during formal system 
development, the early prototypes developed for this project proved 
that the system was feasible. In addition, HIMARS leveraged two 
technologies already in production: the chassis from the Family of 
Medium Tactical Vehicles program and the rocket pod from the Multiple 
Launch Rocket System. 

* The P-8A airframe was developed from the Boeing 737 commercial 
aircraft, avoiding some time and risk inherent in developing a 
completely new airframe. In addition, according to program officials, 
the P-8A airframe is being produced on the same production line as the 
737, which provides cost efficiencies and a decreased production 
learning curve. The program did not have fully mature technologies at 
development start, but identified existing technologies as back-ups. 
P-8A program officials also understood that software integration would 
likely be more complex than the contractor predicted and allocated 
resources accordingly. 

* Previous iterations of the STANDARD Missile allowed significant 
maturity in the SM-6 program and, therefore, program officials said 
they focused development on integration and software challenges. SM-6 
was designed based on the legacy STANDARD Missile airframe and 
propulsion systems and the Air Force's Advanced Medium Range Air-to-
Air Missile active guidance system--both of which were also produced 
by Raytheon. According to program officials, this has allowed the 
program to use existing technologies and to share production 
facilities, which has in turn produced cost efficiencies for the 
program. 

* SDB's acquisition strategy was focused from the beginning on 
utilization of mature technologies. Program officials noted that, in 
their experience, SDB had an unprecedented level of design maturity 
and production readiness prior to Milestone B--the SDB guidance 
system, warhead, and link kit were all developed prior to program 
start. According to the Air Force's Deputy for Acquisition, SDB 
developed competitive prototypes, giving equal funding to two 
contractors with the goal of demonstrating maturity of their concepts 
prior to Milestone B. During source selection, contractors were only 
given credit for demonstrated performance of their prototypes--not for 
performance promised for the future. Program officials told us that 
this meant that the program entered the system development phase with 
production representative hardware that met the requirements--building 
more units, readying the factory for low rate production, and 
completing tests were the only jobs remaining for the then sole-source 
contractor. The program also conducted a critical design review with 
each contractor prior to Milestone B. 

* Program officials told us that technologies for JDAM tail-kits were 
developed and demonstrated in a research environment several years 
before the program began. Similar to SDB, program officials told us 
that JDAM utilized competitive prototyping to develop and test 
proposed technologies. This allowed the program to incentivize 
contractors to achieve a prototype with a low unit cost prior to 
awarding the system development contract. 

In contrast, we have previously reported that many programs rely on 
immature technologies and do not invest in prototypes before starting 
development. For instance, the Armed Reconnaissance Helicopter (ARH) 
program failed in large part because officials misunderstood the level 
of development required to integrate a commercial solution. In 
addition, the F-22 was 
based on advanced technologies and experienced considerable problems 
during development because it lacked an industrial and supplier base 
experienced in working together to fabricate, assemble, and produce 
the high-technology components necessary for the aircraft. 

Stable Programs Set Realistic Cost and Schedule Estimates: 

It is well recognized that realistic cost and schedule estimates that 
account for program risks are imperative to establishing a sound basis 
for acquiring new weapon systems. The foundation of a realistic 
estimate is a high degree of knowledge about program requirements, 
technology, design, and manufacturing. The stable programs we reviewed 
had realistic cost and schedule estimates at Milestone B because they 
had a good understanding of what was needed to develop and produce the 
proposed systems. Since Milestone B, our case study programs have 
generally tracked to their initial development funding profiles. 
Specific examples of good cost estimating and risk analysis from our 
case studies include: 

* The P-8A program was funded to an independent cost estimate which 
was about 14 percent higher than the service cost estimate. According 
to program officials, the independent cost estimators deemed the 
program's estimate for software development to be insufficient, and 
included additional funding in the cost estimate for this effort. 
After contract award, the Navy added $500 million to the contract to 
ensure adequate early effort for software development. The December 
2007 development cost estimate was about 4 percent lower than the 
original 2004 estimate. (See figure 5.) 

Figure 5: Comparison of Original and 2007 Development Funding 
Estimates for P-8A: 

[Refer to PDF for image: multiple line graph] 

Fiscal year 2004 dollars in millions: 

Year: 2002; 
6/2004 SRA: $37.6; 
12/2007 SAR: $37.8. 

Year: 2003; 
6/2004 SRA: $65.9; 
12/2007 SAR: $65.8. 

Year: 2004; 
6/2004 SRA: $69.3; 
12/2007 SAR: $65. 

Year: 2005; 
6/2004 SRA: $484.9; 
12/2007 SAR: $449.6. 

Year: 2006; 
6/2004 SRA: $924.3; 
12/2007 SAR: $858.9. 

Year: 2007; 
6/2004 SRA: $1,068.3; 
12/2007 SAR: $996.7. 

Year: 2008; 
6/2004 SRA: $816.8; 
12/2007 SAR: $766.4. 

Year: 2009; 
6/2004 SRA: $1,001.6; 
12/2007 SAR: $986.3. 

Year: 2010; 
6/2004 SRA: $968.3; 
12/2007 SAR: $970.7. 

Year: 2011; 
6/2004 SRA: $651.7; 
12/2007 SAR: $666.6. 

Year: 2012; 
6/2004 SRA: $254.1; 
12/2007 SAR: $257.7. 

Year: 2013; 
6/2004 SRA: $86.8; 
12/2007 SAR: $43.8. 

Year: 2014; 
12/2007 SAR: $12.5. 

Year: 2015; 
12/2007 SAR: $1.2. 

Source: DOD reported 2004 and 2007 SAR data. 

[End of figure] 
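
The "about 4 percent lower" comparison can be checked directly against 
the year-by-year values reported in the figure. The following is a 
minimal Python sketch offered for illustration only; it uses no data 
beyond what the figure reports, and it is not part of GAO's methodology: 

```python
# Year-by-year P-8A development funding estimates from the figure
# (fiscal year 2004 dollars in millions).
sra_2004 = [37.6, 65.9, 69.3, 484.9, 924.3, 1068.3, 816.8,
            1001.6, 968.3, 651.7, 254.1, 86.8]
sar_2007 = [37.8, 65.8, 65.0, 449.6, 858.9, 996.7, 766.4,
            986.3, 970.7, 666.6, 257.7, 43.8, 12.5, 1.2]

total_sra = sum(sra_2004)   # original June 2004 estimate
total_sar = sum(sar_2007)   # December 2007 estimate
change = (total_sar - total_sra) / total_sra * 100

print(f"2004 total: ${total_sra:,.1f}M; 2007 total: ${total_sar:,.1f}M")
print(f"Change: {change:.1f} percent")  # about -3.9 percent
```

Totaling the two profiles gives roughly $6,430 million versus $6,179 
million, a decline of about 3.9 percent, consistent with the text. 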

* The SM-6 program also effectively estimated costs--according to an 
early program manager, the program insisted on capturing all related 
costs in its estimate, including field activity costs and the 
program's share of department-wide overhead and expenses. The program 
also allocated risk across the whole program, building in margin for 
each step. Because the program made a point to develop doable 
requirements, it had the requisite knowledge about technologies and 
design to make an accurate estimate and establish a realistic funding 
profile. (See figure 6.) According to an early program manager, to 
realistically estimate schedule the program conducted a comparative 
study of major missile development programs. From this study they 
concluded that all of these programs had taken between 9 and 12 years 
to get from Milestone B to initial capability. They used these 
historical numbers as the basis for the SM-6 schedule estimate. 

Figure 6: Comparison of Original and 2007 Development Funding 
Estimates for SM-6: 

[Refer to PDF for image: multiple line graph] 

Fiscal year 2004 dollars in millions: 

Year: 2004; 
7/2004 SRA: $25.3; 
12/2007 SAR: $25. 

Year: 2005; 
7/2004 SRA: $85.6; 
12/2007 SAR: $80. 

Year: 2006; 
7/2004 SRA: $121.7; 
12/2007 SAR: $106.4. 

Year: 2007; 
7/2004 SRA: $152.7; 
12/2007 SAR: $135.9. 

Year: 2008; 
7/2004 SRA: $165.1; 
12/2007 SAR: $153.2. 

Year: 2009; 
7/2004 SRA: $180.8; 
12/2007 SAR: $173.1. 

Year: 2010; 
7/2004 SRA: $123; 
12/2007 SAR: $115.5. 

Year: 2011; 
7/2004 SRA: $62.5; 
12/2007 SAR: $57.4. 

Source: DOD reported 2004 and 2007 SAR data. 

[End of figure] 

Some of the stable programs we reviewed also included additional 
margin upfront for risky activities when constructing schedule 
estimates. A former P-8A program official stated that the program's 
development contract included 12 months of margin for risk; and a 
former SM-6 official said the program specifically included an 
assumption of two flight failures during testing--each entailing 2 to 
3 months' time to address--explaining that no program realistically 
goes through all of its tests without a failure. In addition, this 
official told us that, instead of trying to eliminate risk by simply 
adding on a single schedule "cushion" at the end of the development 
cycle, they allocated schedule risk margin more strategically. First, 
the program identified the specific tasks that were sources of the 
greatest schedule risk and then they added a margin to the scheduled 
time allocated for each task. This decreased the chance that 
contractor personnel would be brought on board (and paid) to work on 
any specific task prematurely, which in turn helped to decrease the 
cost consequences of schedule delays. 
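
The margin-allocation strategy described above can be sketched in 
simple terms. In the following Python illustration, the task names, 
durations, and six months of total margin are hypothetical and not 
drawn from the SM-6 program; the sketch only contrasts a single 
end-of-cycle cushion with the same margin spread across the riskiest 
tasks: 

```python
# Hypothetical development tasks: (name, planned months, high risk?).
tasks = [
    ("design integration",   10, False),
    ("software build",       14, True),
    ("flight test",          12, True),
    ("production readiness",  6, False),
]
TOTAL_MARGIN = 6  # months of schedule margin to distribute

# Strategy 1: a single cushion appended at the end of development.
cushioned = [(name, months) for name, months, _ in tasks]
cushioned.append(("end-of-cycle cushion", TOTAL_MARGIN))

# Strategy 2: spread the same margin across the high-risk tasks, so
# staff for later tasks are not brought on board (and paid) early.
risky_count = sum(1 for t in tasks if t[2])
per_task = TOTAL_MARGIN / risky_count
allocated = [(name, months + (per_task if risk else 0))
             for name, months, risk in tasks]

# Both strategies plan the same total duration; only the placement
# of the margin differs.
assert sum(m for _, m in cushioned) == sum(m for _, m in allocated)
for name, months in allocated:
    print(f"{name}: {months} months")
```

The point of the second strategy is that delay absorbed inside a risky 
task does not ripple into the start dates of everything after it. 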

In contrast, programs that began with overly optimistic cost and 
funding assumptions have required much more funding per year than first 
requested. For example, the Armed Reconnaissance Helicopter (ARH) 
program rushed through the planning process, skipping key systems 
engineering steps in a drive to obligate remaining funding from its 
predecessor program, the terminated Comanche reconnaissance 
helicopter. In 2009, we found that the analysis of alternatives for 
ARH looked at only two options--improvement of the existing system or 
procurement of nondevelopmental helicopters. We also found that the 
program did not adequately assess risk for these two alternatives. 
[Footnote 11] According to program officials, the plan the program 
chose did not have room to trade off cost, schedule, or performance. 
Schedule estimates were driven by a desired fielding date and cost was 
determined primarily by multiplying the desired unit cost by the 
number of desired aircraft. These cost and schedule requirements--
which program officials said were directed by Army leadership--were 
developed without an understanding of the issues or a thorough vetting 
with relevant industry stakeholders. As a result, within 2 years of 
Milestone B, actual and estimated development costs had quickly 
escalated and the development schedule had been extended. Ultimately, 
it was determined that the strategy was not executable and the program 
was terminated in October 2008. Figure 7 shows the program's original 
funding estimate from 2005, based on a lack of knowledge about the 
weapon system's requirements and the resources it would take to 
deliver it, and its funding estimate prior to termination. We note 
that the estimated development funds required more than doubled--an 
increase of almost $365 million (fiscal year 2005 dollars)--and the 
development cycle time increased by 27 months between the program's 
start and December 2007. 

Figure 7: Comparison of Original and 2007 Development Funding 
Estimates for the ARH: 

[Refer to PDF for image: multiple line graph] 

Fiscal year 2005 dollars in millions: 

Year: 2004; 
7/2005 SRA: $1.6; 
12/2007 SAR: $35.8. 

Year: 2005; 
7/2005 SRA: $42.6; 
12/2007 SAR: $42.3. 

Year: 2006; 
7/2005 SRA: $113.8; 
12/2007 SAR: $84.1. 

Year: 2007; 
7/2005 SRA: $101.9; 
12/2007 SAR: $174.8. 

Year: 2008; 
7/2005 SRA: $86.8; 
12/2007 SAR: $173.4. 

Year: 2009; 
7/2005 SRA: $12.1; 
12/2007 SAR: $120.9. 

Year: 2010; 
12/2007 SAR: $92.1. 

Source: DOD reported 2005 and 2007 SAR data. 

[End of figure] 
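
The increase described in the text can be totaled directly from the 
figure's year-by-year values. This is an illustrative Python sketch 
using only the data reported above, not part of GAO's methodology: 

```python
# Year-by-year ARH development funding estimates from the figure
# (fiscal year 2005 dollars in millions).
sra_2005 = [1.6, 42.6, 113.8, 101.9, 86.8, 12.1]
sar_2007 = [35.8, 42.3, 84.1, 174.8, 173.4, 120.9, 92.1]

total_sra = sum(sra_2005)         # original July 2005 estimate
total_sar = sum(sar_2007)         # December 2007 estimate
increase = total_sar - total_sra  # almost $365 million
ratio = total_sar / total_sra     # more than doubled

print(f"Increase: ${increase:,.1f}M (ratio: {ratio:.2f}x)")
```

The totals come to roughly $359 million versus $723 million, an 
increase of about $365 million, or slightly more than double the 
original estimate, consistent with the text. 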

The Air Force's Global Hawk unmanned aerial vehicle was approved for a 
simultaneous development start and low rate production in 2001. DOD 
restructured the Global Hawk acquisition strategy in March 2002 to 
include a second Global Hawk model. The program office estimated the 
cost, time, and funding required to develop this new, bigger model 
with more stringent requirements without a sufficient understanding of 
the new model. DOD then restructured the program again in December 
2002, changing the capabilities required in the new variant. As a 
result, the initial cost estimates had become outdated within 2 years 
of development start, and estimated total development costs had more 
than doubled. Figure 8 shows the program's original funding estimate 
from 2001 and its funding estimate as of December 2007. Estimated 
development funding has more than tripled, and the development program 
has been extended by about 7 years. 

Figure 8: Comparison of Original and 2007 Development Funding 
Estimates for Global Hawk: 

[Refer to PDF for image: multiple line graph] 

Fiscal year 2000 dollars in millions: 

Year: 2001; 
3/2001 SAR: $135.7; 
12/2007 SAR: $126.8. 

Year: 2002; 
3/2001 SAR: $92.9; 
12/2007 SAR: $206. 

Year: 2003; 
3/2001 SAR: $97.5; 
12/2007 SAR: $319.6. 

Year: 2004; 
3/2001 SAR: $164.7; 
12/2007 SAR: $331.7. 

Year: 2005; 
3/2001 SAR: $161.2; 
12/2007 SAR: $342.1. 

Year: 2006; 
3/2001 SAR: $117.2; 
12/2007 SAR: $227. 

Year: 2007; 
3/2001 SAR: $69.2; 
12/2007 SAR: $192.5. 

Year: 2008; 
3/2001 SAR: $2; 
12/2007 SAR: $231.6. 

Year: 2009; 
12/2007 SAR: $235. 

Year: 2010; 
12/2007 SAR: $197.6. 

Year: 2011; 
12/2007 SAR: $155.6. 

Year: 2012; 
12/2007 SAR: $131.4. 

Year: 2013; 
12/2007 SAR: $130.3. 

Year: 2014; 
12/2007 SAR: $123.4. 

Year: 2015; 
12/2007 SAR: $104.5. 

Source: DOD reported 2001 and 2007 SAR data. 

[End of figure] 

Once Under Way, Programs Resisted Adding New Requirements: 

After starting development, our case study programs resisted adding 
new requirements by keeping stakeholders focused on the importance of 
adhering to cost and schedule, as well as performance commitments. For 
example, P-8A officials related that one reason for cost and schedule 
stability was the program office's willingness to limit capability and 
requirements changes proposed by the P-3C user community, and in 
particular, to find workable solutions to user-preferred "gadgets." 
For instance, to detect submarines, the P-3C used a technology that 
measured shifts in the earth's magnetic field. The user community 
insisted that P-8A use this technology, even though contractor 
engineers determined that it would require the development of very 
expensive software to account for differences in the airframes' 
structures. The P-8A program office successfully worked with the user 
community to gain acceptance of an alternative technology which 
provided the same submarine detection capability while keeping the 
program on cost. 

In the JDAM program, some new requirements were added but according to 
program officials they did not alter the basic performance parameters 
of the program. One JDAM capability enhancement--a laser sensor--was 
developed by the contractor with its own resources. In contrast, 
unstable programs sometimes chase performance, with less concern for 
cost implications. Global Hawk, for example, added new, major, and 
unplanned requirements after Milestone B, increasing cost and schedule 
significantly. Rather than establishing separate program increments, 
Global Hawk restructured its original program to add a new aircraft 
variant with enhanced capabilities. 

Programs Maintained Stable Funding During Execution: 

In addition to starting with annual funding baselines based on 
realistic cost estimates, as discussed above, the stable programs we 
reviewed typically received annual development appropriations close to 
their full funding requests. For example, the P-8A program received 96 
percent of its requested development funding for the years 2005 to 
2007. (See figure 9.) 

Figure 9: P-8A Annual Development Funding Requested and Received: 

[Refer to PDF for image: vertical bar graph] 

Dollars in then-year millions: 

Year: 2005; 
Requested: $496.0; 
Received: $470.9. 

Year: 2006; 
Requested: $964.1; 
Received: $927.0. 

Year: 2007; 
Requested: $1,131.67; 
Received: $1,100. 

Source: Navy budget justification documents. 

[End of figure] 
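The 96 percent figure cited above can be checked directly against the 
amounts reported in figure 9. A minimal arithmetic sketch (the amounts 
are the then-year millions from the figure):

```python
# P-8A development funding requested and received, then-year dollars in
# millions, fiscal years 2005-2007 (as reported in figure 9).
requested = {2005: 496.0, 2006: 964.1, 2007: 1131.67}
received = {2005: 470.9, 2006: 927.0, 2007: 1100.0}

total_requested = sum(requested.values())  # 2,591.77
total_received = sum(received.values())    # 2,497.9

share = total_received / total_requested
print(f"{share:.0%}")  # 96%
```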

Although these data show the amount received in each year was slightly 
less than requested, P-8A program officials stated that overall the 
program has had very stable funding. They attributed the stability of 
funding to factors including (1) the acute need for the P-8A to be 
procured quickly and (2) the steady development of the aircraft 
accompanied by remaining on track with initial cost and schedule goals 
from the outset of the program. 

Officials from the other stable programs echoed these themes-- 
particularly that good program execution inspired confidence in the 
program--when asked to explain relatively stable funding. In addition, 
HIMARS program officials told us they maintained good communication 
with their liaison to the funding community, ensuring that this 
liaison was fully informed of the status and progress of the program. 
They felt that this was extremely important, because a thorough and 
current knowledge of the program is what allows a liaison to advocate 
for the program and protect against cuts in funding. In addition, 
officials from the SM-6 program discussed proactively anticipating and 
responding to funding cuts. Program officials track what kinds of 
funding cuts they could absorb and what the effects might be. Program 
officials stated that when there is a request to take funding from the 
program, they always take the opportunity to respond and justify why 
the program cannot spare the money. Often that justification is 
accepted. 

Funding stability is an essential ingredient to a successful program. 
However, in view of the many pressures that characterize the 
acquisition culture, stable funding and support alone will not prevent 
other acquisition problems, such as problems stemming from unrealistic 
performance requirements, immature technologies, and highly concurrent 
schedules. Funding instability has often been pointed to by program 
managers as a factor contributing to program instability. Given that 
there are too many programs for available resources and many programs 
encounter cost, schedule, and performance problems, it is not 
unexpected that some programs experience funding instability. However, 
we have also seen that funding instability can be the result, not the 
cause, of performance problems. For example, in 2002, we found that, 
while the F-22 program office attributed some of its production cost 
increases to a reduction in quantities, the program had been 
significantly affected by design and manufacturing problems that 
started during development.[Footnote 12] In 1997, for instance, an 
independent review team determined that the product development effort 
had been underestimated. In short, successful programs enjoy funding 
stability, but funding stability does not ensure program success. 

Recent Policy and Legislative Reform Initiatives Reflect Key 
Characteristics of Successful Programs: 

Recently, Congress and DOD have taken major steps towards reforming 
the defense acquisition system in ways that may increase the 
likelihood that weapon programs succeed in meeting planned cost and 
schedule objectives.[Footnote 13] Many of these steps are consistent 
with key elements we found in our five stable case study programs. In 
particular, the new DOD policy and legislative provisions place 
greater emphasis on front-end planning and establishing sound business 
cases for starting programs. For example, the provisions strengthen 
systems engineering and cost estimating, and require early milestone 
reviews, prototyping, and preliminary designs. They are intended to 
enable programs to refine a weapon system concept and make cost, 
schedule, and performance trade-offs before significant commitments 
are made. Fundamentally, the provisions should help programs replace 
risk with knowledge, and set up more executable programs. Key DOD and 
legislative provisions compared with factors we identified in stable 
programs are summarized in table 3. 

Table 3: Comparison of Factors Contributing to Stable Programs and 
Recent Acquisition Reform Initiatives: 

Stability factors: Establish a sound, executable business case; 
Recent acquisition reform initiatives: Overall, strong emphasis on 
front-end planning (pre-systems acquisition). 

Stability factors: Establish a sound, executable business case; 
* incremental approach to acquiring capabilities; 
Recent acquisition reform initiatives: 
* incremental development emphasized, with each increment that 
provides a significant increase in capability to be managed separately. 

Stability factors: Establish a sound, executable business case; 
* clear, well defined requirements; 
Recent acquisition reform initiatives: 
* early reviews to be conducted prior to start of development 
(Milestone B); 
* enhanced requirements for Analysis of Alternatives; 
* new leadership positions established to enhance systems engineering 
and developmental testing. 

Stability factors: Establish a sound, executable business case; 
* leverage mature technologies; 
Recent acquisition reform initiatives: 
* independent review of technology maturity and integration risk prior 
to Milestone B; 
* competitive prototypes; 
* Preliminary Design Review to be conducted earlier, prior to 
Milestone B. 

Stability factors: Establish a sound, executable business case; 
* establish realistic cost and schedule estimates; 
Recent acquisition reform initiatives: 
* new position and organization established to review and conduct 
independent cost estimates for MDAPs and provide cost estimating 
guidance DOD-wide; 
* early cost estimate required for Milestone A; 
* confidence level for cost estimates to be reported. 

Stability factors: Execute business case in disciplined manner; 
* resist new requirements; 
Recent acquisition reform initiatives: 
* configuration steering boards established to stabilize requirements; 
* post-Critical Design Review assessment required to review progress. 

Stability factors: Execute business case in disciplined manner; 
* stable funding; 
Recent acquisition reform initiatives: [Empty]. 

Source: GAO analysis of the Weapon Systems Acquisition Reform Act of 
2009, Pub. L. No. 111-23, and Department of Defense Instruction 
5000.02 (December 8, 2008). 

[End of table] 

While it is too soon to determine if Congress and DOD's reform efforts 
will improve weapon program outcomes, we have seen evidence that DOD 
is taking steps to implement the provisions. For example, in December 
2009, the department issued a new implementation policy, which 
identifies roles and responsibilities and institutionalizes many of 
the requirements of the Weapon Systems Acquisition Reform Act of 2009. 
It has also filled several key leadership positions created by the 
legislation, including the Directors for Cost Analysis and Program 
Evaluation, Developmental Test & Evaluation, and Performance 
Assessments and Root Cause Analyses. To increase oversight, the 
department has embarked on a 5-year effort to grow the acquisition 
workforce by up to 20,000 personnel by 2015. Furthermore, 
the department has begun applying the acquisition reform provisions to 
some new programs currently in the planning pipeline. For example, 
many of the pre-Milestone-B programs we reviewed this year as part of 
our annual assessment of selected weapon programs plan to develop 
competitive prototypes and conduct preliminary design reviews before 
going to Milestone B.[Footnote 14] In the Joint Air-to-Ground Missile 
program, the Army recently awarded two contracts for a 27-month 
technology development phase which will culminate in test flights of 
competing prototypes prior to Milestone B. 

The success of DOD's efforts, however, will depend in part on how 
consistently the new provisions are implemented and reflected in 
decisions on individual programs. In the past, inconsistent 
implementation of existing policy hindered DOD's ability to plan and 
execute programs effectively. Inconsistent implementation occurred in 
part because decision makers were not held accountable for program 
outcomes and there were few, if any, consequences when programs ran 
into problems. Furthermore, cultural and environmental forces at DOD 
work against sound management practices. These forces encourage 
programs to pursue overly ambitious requirements and lengthy 
development efforts, and to move forward with risky and unexecutable 
acquisition strategies. We have found too often that program sponsors 
overpromise capabilities and underestimate costs in order to capture 
the funding needed to start and sustain development programs. For 
acquisition reforms to be effective, they will have to address these 
forces as well. For example, while acquisition reform provisions are 
intended to make cost estimates more reliable and realistic, the 
provisions may be compromised by the competition for funding that 
encourages programs to appear affordable when they are not. 
Furthermore, when program sponsors present a program not merely as a 
weapon system but as essential to new fighting concepts, 
pressures exist to accept less-than-rigorous cost estimates. If reform 
is to succeed, then programs that present realistic strategies and 
resource estimates must succeed in winning approval and funding, as 
was the case with the stable programs we reviewed. Those programs that 
continue past practices of pushing unexecutable business cases must be 
denied funding before they begin. 

DOD will also need to ensure that adequate resources--funding and 
workforce capacity--are available to support the front-end planning 
activities now required for new weapon programs. In the past, budget 
realities within DOD have made it more advantageous to fund technology 
development in acquisition programs because most of the department's 
research and development funding goes to existing programs of record. 
Weapon system programs have historically received about 80 percent of 
the department's research and development budget whereas science and 
technology activities have received about 20 percent of the budget. 
The money going toward science and technology is spread over several 
thousand projects, while the money going toward weapon programs is 
spread out over considerably fewer projects. This "distribution of 
wealth" makes it easier to finance technology development within an 
acquisition program. Once initiated, a program is in a more 
competitive position to attract funding support from within the 
department. With competition for funding intense, due to the high 
demand for weapon systems and other needs in the department, freeing 
up funds for pre-systems acquisition activities may be challenging. 
Yet, as we have seen in stable programs, strong leadership support 
ensured that there was sufficient funding and other resources to 
effectively plan up front. 

Because programs in the past tended to proceed to Milestone B too 
quickly, DOD may have limited experience in conducting many of the 
program planning activities now required by the acquisition reform 
initiatives. Lessons learned from stable programs, such as the case 
study programs we reviewed, could serve as useful tools for the 
successful implementation of the reform initiatives. For example, 
these programs had the following: effective and consistent leadership 
in place early on; program managers who were empowered to plan and 
establish a sound business case for starting a program; resources 
available for conducting front-end systems engineering and planning; 
established mechanisms to engage industry early on and to contract for 
prototypes; flexibility to make cost, schedule, and performance trade- 
offs before committing to a business case; cost and schedule baselines 
that realistically accounted for risks; and evolutionary acquisition 
approaches that address capability needs in achievable increments 
based on well-defined requirements. 

Conclusions: 

Although most DOD programs fail to meet their intended cost and 
schedule objectives, some programs have still been successful. No one 
factor will ensure success; instead, many things must go right in 
both planning and implementing a program. However, it is critical 
to get the systems engineering and planning phase right, because 
extraordinary implementation cannot save a program with a business 
case that was flawed from the beginning. In our case studies, we found 
that stable programs established sound, knowledge-based business cases 
before moving forward and then executed them in a disciplined manner. 
Their ability to do so was largely attributable to strong leadership 
support and proactive program managers who knew how to get results. 
Getting the right people in place at the right time and 
supporting them with the requisite resources is critical. However, 
programs are also more likely to succeed when there is a sense of 
urgency to deliver a needed capability and senior leadership views the 
program as a high priority. In addition, programs benefit from having 
experienced program managers who provide consistent leadership through 
major phases of a program. 

Relying on strong leadership or treating each program as a priority is 
not scalable across DOD's broad portfolio of weapon programs. Rather, 
good program outcomes ought to occur normally as an outgrowth of 
effective policy, processes, priorities, oversight, and leadership. 
Acquisition policy and processes establish rules and mechanisms that 
facilitate good program decisions. However, rules and mechanisms are 
only as good as the people who implement them. The new round of DOD 
and congressional acquisition reforms creates a renewed opportunity to 
improve acquisition outcomes, but only if it is accompanied by the 
appropriate leadership support that allowed the stable programs we 
reviewed to establish reasonable business cases and execute them with 
confidence. If reform is to succeed, programs that present realistic 
strategies and resource estimates must succeed in winning approval and 
funding. Those programs that continue past practices of pushing 
unexecutable strategies must be denied funding before they begin. 

Agency Comments and Our Evaluation: 

In written comments on a draft of this report, DOD stated that it was 
encouraged that the report cites progress made over the past several 
years to improve acquisition processes and reduce cost and schedule 
growth. DOD's response is reprinted in appendix II. DOD noted that it 
has recently instituted several major changes to acquisition policy 
that are aimed at starting programs right by using early planning and 
systems engineering, joint analysis teams, competitive prototyping, 
configuration steering boards, credible cost estimates, and program 
manager service agreements. DOD anticipates improvements in program 
performance in the ensuing years due to the recent acquisition reform 
initiatives. 

DOD agreed with the reasons we found for program success, such as 
strong leadership, disciplined program managers, executable business 
cases, and achievable increments based on well-defined requirements. 
However, DOD pointed out that our findings are based upon small, less 
costly, and less complicated programs which may not be readily 
scalable to large, software intensive systems such as satellites or 
the Joint Strike Fighter. While we agree that more complex weapon 
system programs present greater challenges, we have previously 
reported that DOD should consider increasing the number of programs 
which provide incremental improvements in capability to the warfighter 
in a timely way. Complex programs that we cite in our report, such as 
the F/A-18E/F and the F-16, were able to balance requirements with 
available resources and produce cutting-edge weapon systems within 
cost and schedule targets. Revolutionary efforts that rely on unproven 
technological breakthroughs should be the exception rather than the 
rule. In addition, regardless of program complexity, the decision to 
begin a weapon program should be knowledge-based and decision makers 
need to be fully informed of the risks and scope of the effort to be 
undertaken. In the past, the decisions to enter into revolutionary 
acquisition programs such as the Joint Strike Fighter were based on 
overly optimistic assumptions of the cost and time involved to acquire 
these systems. 

DOD also noted that although the measures we used to assess program 
performance are useful--change in development cost, unit cost, and 
schedule from the original program baseline--they are only valid when 
operational requirements remain static throughout the program 
lifecycle. According to DOD, in some cases, the warfighter's needs 
change and the department must adapt by enhancing weapon system 
functionality. We agree that the needs of the warfighter are 
paramount; however, DOD would be in a better position to adapt to the 
changing needs of the warfighter by reducing the time it takes to 
field new systems and enhancing weapon system functionality in future 
increments. According to DOD acquisition policy, incremental 
development is the preferred approach and each increment that provides 
a significant increase in capability should be managed separately. 

DOD agreed that program manager tenure is a contributing factor in 
program stability, but thought that our characterization of tenure as 
an average is misleading because it does not reflect the total time 
program managers serve in their positions. The baseline average we 
calculated is based on data collected as part of our 2007 report on 
program manager empowerment and accountability.[Footnote 15] Our work 
has shown that rather than having lengthy assignment periods between 
key milestones as suggested by best practices, many programs we have 
reviewed had multiple program managers within the same milestone. 
Furthermore, the key point we are making in the report is that program 
manager tenure in our case study programs, calculated using the same 
methodology used in our previous report, was longer than what we have 
seen in other programs (2.4 versus 1.4 years), which was a 
contributing factor to their relative success. 

DOD also questioned our criteria for defining "stable" and "unstable" 
programs and thought that our evaluation criteria were subjective and 
should have been based on accepted standards, legislation, and 
regulations. However, aside from the Nunn-McCurdy breach criteria 
established by Congress for "significant" and "critical" unit cost 
growth in major weapon programs--30 and 50 percent growth respectively 
from the original baseline--there are no standard criteria established 
for assessing weapon program stability. We are not suggesting that the 
criteria used in our report are the only way to measure program 
stability; however, we believe it is important to examine programs 
from several perspectives--growth in development and unit costs, and 
delay in achieving initial operational capability. We selected 
thresholds for these three indicators based on historical cost and 
schedule growth in major defense acquisition programs and our judgment 
based on many years of conducting DOD program reviews. 

DOD also provided technical comments which we incorporated where 
appropriate. 

We are sending copies of this report to the Secretary of Defense and 
interested congressional committees. In addition, this report will be 
made available at no charge on the GAO Web site at [hyperlink, 
http://www.gao.gov]. If you or your staff have any questions about this 
report or need additional information, please contact me at (202) 512-
4841 or sullivanm@gao.gov. Contact points for our Offices of 
Congressional Relations and Public Affairs may be found on the last 
page of this report. GAO staff who made major contributions to this 
report are listed in appendix III. 

Signed by: 

Michael J. Sullivan, Director: 
Acquisition and Sourcing Management: 

[End of section] 

Appendix I: Objectives, Scope, and Methodology: 

This report identifies and examines some major defense acquisition 
programs that have stayed relatively close to the cost and schedule 
estimates established when they started development, in order to 
identify useful lessons that can be implemented with the current 
acquisition reforms. Specifically, our objectives were to (1) identify 
and describe programs within the Department of Defense's (DOD) 2008 
major weapon system acquisition program portfolio that were stable and 
on track to meet cost and schedule targets outlined at program 
development start; (2) determine what factors enabled some stable 
programs to achieve these cost and schedule targets; and (3) analyze 
recent acquisition reform initiatives to determine how lessons learned 
from these stable programs can be of use as DOD implements acquisition 
reform. 

To identify and describe programs that were stable, we analyzed data 
from DOD's Selected Acquisition Reports (SAR), as well as other 
program data. DOD typically submits SARs on current major defense 
acquisition programs to Congress at the end of the first quarter of 
each fiscal year, which provide a basis to determine each program's 
cost and schedule performance. We first obtained the list of the 
programs that published SARs in December 2007 (the last year in which 
full SARs were published, as of the time we conducted our work) from 
the December 2007 SAR summary tables posted on DOD's public 
acquisition Web site. We excluded programs for which December 2007 was 
the first SAR or the termination SAR, as we could not make valid 
baseline comparisons for these programs. We also made other minor 
adjustments to the program list to enable valid baseline comparisons, 
such as separating subprograms that were listed as one in the 2007 SAR 
table but reported the relevant data separately. We were left with a 
list of 95 programs, for which we obtained SAR data and other 
information through the Defense Acquisition Management Information 
Retrieval (DAMIR) Purview system.[Footnote 16] We also analyzed data 
submitted to us by program offices as part of our annual review of 
selected weapon systems. We then excluded from our analysis programs 
for which not all data necessary for our baseline comparisons were 
published, as well as those which passed Milestone B less than 3 years 
prior to the December 2007 SAR report.[Footnote 17] 

We analyzed each of the remaining 63 programs based on data we 
obtained from DAMIR and from program offices. We retrieved data from 
each program's original development baseline report that showed 
estimated total amount of research, development, test, and evaluation 
(RDT&E) costs, total program acquisition unit costs (PAUC), and date 
of initial operational capability for the program, as of the start of 
development. We then obtained comparable data on each program from its 
December 2007 SAR. We converted all cost information to fiscal year 
2009 dollars using conversion factors from the DOD Comptroller's 
National Defense Budget Estimates for Fiscal Year 2009 (Table 5-9). 
Through discussions with DOD officials responsible for the DAMIR 
database and confirming selected data with program offices, we had 
previously determined that the SAR data and the information retrieved 
from DAMIR were sufficiently reliable for our purposes. 
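The conversion described above amounts to dividing each then-year 
amount by that year's deflator, indexed so that fiscal year 2009 equals 
1.0. A minimal sketch of the approach; the deflator values below are 
illustrative placeholders, not the actual factors from the 
Comptroller's Table 5-9:

```python
# Convert then-year dollars to constant FY2009 dollars using deflators
# indexed so that FY2009 = 1.0. These factors are illustrative
# placeholders, not the actual DOD Comptroller Table 5-9 values.
deflators = {2005: 0.90, 2006: 0.93, 2007: 0.95, 2008: 0.97, 2009: 1.00}

def to_fy2009_dollars(amount_then_year: float, fiscal_year: int) -> float:
    """Express a then-year amount in constant FY2009 dollars."""
    return amount_then_year / deflators[fiscal_year]

# Example: $100M in FY2005 then-year dollars, under these placeholder
# factors, is about $111.1M in FY2009 dollars.
print(round(to_fy2009_dollars(100.0, 2005), 1))  # 111.1
```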

Using these data, we assessed program stability for each program as 
described below. We defined "program stability" to mean minimal change 
from first full cost and schedule estimates to the December 2007 
estimates. Because it was not feasible to compare original baseline 
versus current program performance on all relevant parameters, we 
chose three that best provided an indication of whether a program had 
remained close to its cost and schedule estimates. These three 
stability indicators included: 

A. Total RDT&E (or "development") costs: This metric represents the 
estimated cost of developing a system from the beginning of 
development to the point at which it is ready for low-rate production. 

B. Program Acquisition Unit Costs (PAUC): This measure represents the 
expected acquisition cost for each unit procured, as determined by 
dividing the sum of a program's estimated total program development, 
procurement, and military construction costs by the number of units to 
be procured. 

C. Initial Operational Capability (IOC) or equivalent date: IOC is 
generally achieved when some units or organizations that are scheduled 
to receive a system have received it and have the ability to employ 
and maintain it. Where programs did not report expected IOC dates, we 
substituted equivalent dates, such as the reported "First Unit 
Equipped" or "Required Assets Available" dates. 

We compared each program's initial estimate for each of these three 
indicators with the actual/estimated values reported in its December 
2007 SAR. We assigned a point score for change on each indicator based 
on the following thresholds:[Footnote 18] 

A. Development (RDT&E) cost growth: 

* Programs that reported less than 10 percent development cost growth 
received 10 points. 

* Programs that reported at least 10 percent, but less than 35 percent 
estimated development cost growth received 5 points. 

* Programs that reported at least 35 percent development cost growth 
received 0 points. 

B. Expected unit cost (PAUC) growth: 

* Programs that reported less than 10 percent growth in expected unit 
costs received 10 points. 

* Programs that reported at least 10 percent, but less than 30 percent 
growth in expected unit costs received 5 points. 

* Programs that reported at least 30 percent growth in expected unit 
costs received 0 points. 

C. Expected initial capability schedule slip: 

* Programs that reported less than 6 months slip in expected IOC date 
received 10 points. 

* Programs that reported at least 6 months, but less than 12 months 
slip in expected IOC date received 5 points. 

* Programs that reported at least 12 months slip in expected IOC date 
received 0 points. 

We summed the scores for each program across the three indicators, to 
arrive at a total point score representing our assessment of the 
program's overall stability. We categorized each program as "stable," 
"moderately unstable," or "highly unstable" as described in table 4. 

Table 4: Point Scores for Stability Assessment: 

Total point score: 25 or 30 points; 
GAO assessment of stability: Stable. 

Total point score: 10, 15, or 20 points; 
GAO assessment of stability: Moderately unstable. 

Total point score: 0 or 5 points; 
GAO assessment of stability: Highly unstable. 

Source: GAO. 

[End of table] 
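The scoring scheme and categorization above can be sketched as a small 
function; the thresholds and point values follow the lists and table 4 
exactly:

```python
def score_indicator(value: float, mid: float, high: float) -> int:
    """Score one stability indicator: 10 points below the first
    threshold, 5 between thresholds, 0 at or above the second."""
    if value < mid:
        return 10
    if value < high:
        return 5
    return 0

def assess_stability(rdte_growth_pct: float, pauc_growth_pct: float,
                     ioc_slip_months: float) -> str:
    """Sum the three indicator scores and map the total to a
    stability category per table 4."""
    total = (score_indicator(rdte_growth_pct, 10, 35)    # RDT&E cost growth
             + score_indicator(pauc_growth_pct, 10, 30)  # unit cost growth
             + score_indicator(ioc_slip_months, 6, 12))  # IOC slip
    if total >= 25:
        return "Stable"
    if total >= 10:
        return "Moderately unstable"
    return "Highly unstable"

# Example: 8% development growth, 12% unit cost growth, 4 months slip
# -> 10 + 5 + 10 = 25 points -> "Stable".
print(assess_stability(8, 12, 4))
```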

We then analyzed the distribution of stable, moderately unstable, and 
highly unstable programs, examining number of programs, average 
program cost, and total acquisition dollar value (current estimates) 
represented by programs in each category. We also summarized the 
distribution of programs by age (measured as years into development) 
among stable, moderately unstable, and highly unstable programs. 
Finally, based on programs' reported Milestone B and C dates, we 
reported development cycle time data for a subset of our programs. We 
limited this inquiry to those programs that had entered production, so 
that data on actual development cycle time were available. We excluded 
ship and satellite systems from this analysis, as system development 
start and end points are defined differently for these systems. 

To more closely examine factors that enhance program stability, we 
chose a selection of five example programs for in-depth study. We 
identified case study programs based on data from a variety of 
sources, including our analysis of programs in the 2008 MDAP 
portfolio, our review of the literature on weapons system 
acquisitions, including work conducted by RAND and the Defense 
Acquisition University, and prior GAO work on programs that have 
demonstrated best practice approaches. We also interviewed defense 
acquisition experts to learn which programs are seen as role models 
among acquisition programs. We selected the five case study examples 
using a criteria-based, nongeneralizable sample in order to achieve 
representation across the military services as well as a variety of 
weapon platforms. The following five programs were selected as case 
studies: the Army's High Mobility Artillery Rocket System; the Air 
Force's Joint Direct Attack Munition and Small Diameter Bomb; and the 
Navy's P-8A Poseidon Multi-mission Maritime Aircraft and STANDARD Missile-6. 
For each case study, we reviewed key documents and information from 
program offices on program managers, and we interviewed past and 
present program officials to identify key factors contributing to the 
program's stability. We also met with former senior DOD acquisition 
officials to further understand factors that stabilize programs. In 
addition, we met with program officials and reviewed prior GAO work on 
the Air Force's F-22 Raptor and Global Hawk programs to better 
understand the factors that influenced these unstable programs. We 
also reviewed prior GAO work where we had identified enablers of 
stability in other programs, including the Navy's F/A-18E/F Super 
Hornet and the Air Force's F-16 Fighting Falcon. To assess information 
about programs' cost estimates, we compared original development 
funding estimates from programs' baseline SARs to development funding 
estimates from the most recent December 2007 SARs. In addition, to 
illustrate funding stability for the P-8A program, we compared 
requested and received budget amounts from budget justification 
documents. 

To determine how lessons learned from stable programs can be of use as 
DOD implements acquisition reform, we reviewed recent legislative and 
policy changes relating to defense acquisitions and compared these 
initiatives to our findings regarding the factors that enable program 
stability. 

We conducted this performance audit from November 2008 to May 2010 in 
accordance with generally accepted government auditing standards. 
Those standards require that we plan and perform the audit to obtain 
sufficient, appropriate evidence to provide a reasonable basis for our 
findings and conclusions based on our audit objectives. We believe 
that the evidence obtained provides a reasonable basis for our 
findings and conclusions based on our audit objectives. 

[End of section] 

Appendix II: Comments from the Department of Defense: 

Office Of The Under Secretary Of Defense: 
Acquisition, Technology and Logistics: 
3000 Defense Pentagon: 
Washington, DC 20301-3000: 
	
April 30, 2010: 

Mr. Michael J. Sullivan: 
Director, Acquisition and Sourcing Management: 
U.S. Government Accountability Office: 
441 G Street, NW: 
Washington, DC 20548: 

Dear Mr. Sullivan: 

This is the Department of Defense (DoD) response to the GAO Draft 
Report, GAO-10-522, "Defense Acquisitions: Strong Leadership is Key to 
Planning and Executing Stable Weapon Programs," dated March 23, 2010 
(GAO Code 120794). 

The Department is encouraged that the draft report cites the progress 
we have made over the past several years in our efforts to improve 
acquisition processes and reduce cost and schedule growth. We are 
proud that many of our acquisition programs are performing well, and 
we appreciate that the GAO has formally recognized several of them. We 
have instituted several major changes that are beginning to show 
results. As noted in the draft report, recent acquisition policy 
revisions and the Weapon Systems Acquisition Reform Act (WSARA) of 
2009 are aimed at starting programs out right by using early planning 
and systems engineering, joint analysis teams, competitive 
prototyping, configuration steering boards, credible cost estimates, 
and program manager service agreements. In the ensuing years, we 
anticipate improvements in program performance measures due to our 
recent acquisition reform initiatives. While the draft report does 
recognize the gains due to the Department's initiatives, the full 
breadth of those benefits was not emphasized in the text. 

The Department agrees with the reasons for program success, such as 
strong leadership, disciplined program managers, executable business 
cases, and achievable increments based on well-defined requirements. 
While many of the insights in the draft report are informative, they 
are based upon small, less costly and less complicated programs and 
may not readily scale to large, software intensive systems such as 
satellites, the Joint Strike Fighter, or other highly complex weapon 
systems. Moreover, many of the programs are being penalized for 
carrying cost growth and schedule slips that occurred many years ago; 
their original program cost and schedule estimates are very old and 
were overly optimistic. The Department can now report that these 
programs have turned the corner and are now stable and performing well. 

The Department agrees that the use of development cost, unit cost and 
delay in IOC are useful measures of program performance; however, they 
alone are not the only measures of stability. The draft report defines 
stable and unstable programs not in terms of historical outcomes, but 
rather as "whether programs are relatively on-track to meet cost and 
schedule targets as defined at program development start." This 
definition is valid when operational requirements remain static 
throughout the program lifecycle. In some cases, the Warfighter's 
needs change through time, and DoD has an obligation to provide the 
best weapon systems to achieve U.S. military objectives. The 
Department owes it to the nation to be able to adapt to a changing 
international threat environment. This strategic agility and its 
associated enhanced system functionality do impact cost and schedule. 
Managing programs on the cutting edge of science, technology, and 
engineering does carry risk with it. For these reasons, it is not 
always useful to assess acquisition programs relative to the initial 
conditions at program inception. 

I thank you and your staff for working with the Department to improve 
the information flow between our organizations and to develop more 
meaningful metrics in this area. I look forward to continuing to 
improve the acquisition process to more effectively and efficiently 
deliver products to our customers, and we need to continue to develop 
better metrics. The Department looks forward to working with the GAO 
in both of these important endeavors. 

The Department appreciates the opportunity to comment on the draft 
report. Additional comments are provided as an enclosure to this 
letter. My point of contact for this effort is Mr. Joseph Alfano, 703-
697-3343. 

Sincerely, 

Signed by: 

Dr. Nancy L. Spruill: 
Director: 
Acquisition Resources & Analysis: 

Enclosure: As stated: 

[End of letter] 

Enclosure: 

GAO Draft Report Dated March 23, 2010: 
GAO-10-522 (GAO Code 120794): 

"Defense Acquisitions: Strong Leadership Is Key To Planning And 
Executing Stable Weapon Programs" 

Department Of Defense Comments On The Draft Report: 

Applications of Lessons Learned: 

* The draft report does not emphasize recent initiatives that the 
Department undertook to enhance program acquisition management. For 
instance, page 4 of the draft report states "...programs enter the 
acquisition process without a full understanding of requirements..." 
which reflects the "sins of the past." The programs that failed to 
properly understand requirements were among the oldest in the 
portfolio. The Department has since placed a much greater emphasis on 
managing requirements. Additionally, given the uncertain and evolving 
military threats that confront our nation, such a high standard for 
requirements may be unachievable. Because the nature of the threats is 
not static throughout the life of a weapon system, neither are the 
program requirements. The Department agrees that resisting new 
requirements is important, and we have implemented configuration 
steering boards to manage requirements growth; however, the Department 
must also be responsive to the Warfighters' future needs. 

* The draft report implies that PM tenure is a significant factor in 
program stability. While the Department agrees that it is a 
contributing factor, characterizing PM tenure as an average is 
misleading. Specifically, on page 16, the draft report states that the 
average program manager (PM) tenure is 1.4 years. By design, PMs serve 
in their positions for three years, so at any point in time the 
population of PMs should be halfway through their three-year tenure: 
statistically 1.5 years. The draft report implies that this "snapshot 
in time" average PM tenure of 1.4 years represents the total time PMs 
serve in their positions. Since the full tenure period for a PM is 
three years, this is clearly not the case. Since this is not the first 
time we have disagreed on this statistic, I would offer to collaborate 
with you to sample the PMs and have a shared database from which to 
work. 

Assessment of Programs vis-a-vis cost and schedule growth: 

* The evaluation criteria are subjective. Page 3 of the draft report 
contains the evaluation criteria and concedes that the criteria are 
not traceable to legislation or regulations used to report cost and 
schedule growth. The draft report's evaluation criteria are ad hoc and 
internally inconsistent: the upper cut-off for RDT&E cost growth is 
35%, while the equivalent threshold for program acquisition unit cost 
(PAUC) growth is 30%. When the study's criteria are applied to the 
five programs identified as exemplars, two of them (HIMARS and JDAM) 
do not rank as stable programs. 

* The study used nonstandard cost and schedule data. The Objectives, 
Scope and Methodology section states that "other program data" 
supported the study. The Department has been unable to reproduce the 
study results using Selected Acquisition Report (SAR) data. 

* The Department recommends using evaluation criteria based on 
accepted standards, legislation, and regulations. The table below 
relates the DoD growth parameters to enacted legislation. For IOC 
slip, the cutoffs are 12 months and 24 months total growth beginning 
with the IOC defined at Milestone B. 

Table: 

Growth Parameter: RDT&E; 
DoD Criteria (moderate/high growth): +30%/+50%; 
U.S. Code (Title 10): N/A. 

Growth Parameter: PAUC; 
DoD Criteria (moderate/high growth): +30%/+50%; 
U.S. Code (Title 10): +30%/+50% (Nunn-McCurdy). 

Growth Parameter: IOC; 
DoD Criteria (moderate/high growth): +12 months/+24 months since 
milestone B; 
U.S. Code (Title 10): +6 months since prior quarterly SAR
(10 USC 2432). 

[End of table] 

* Using the Dec 2007 SAR data and the above evaluation criteria, the 
portfolio growth profile appears in the following table. 

Table: 

Category: Low Growth; 
Programs by Category: 40%; 
Total Acquisition Cost by Category: 15%. 

Category: Moderate Growth; 
Programs by Category: 47%; 
Total Acquisition Cost by Category: 72%. 

Category: High Growth; 
Programs by Category: 13%; 
Total Acquisition Cost by Category: 13%. 

[End of table] 

[End of section] 

Appendix III: GAO Contact and Staff Acknowledgments: 

GAO Contact: 

Michael J. Sullivan, (202) 512-4841 or sullivanm@gao.gov: 

Staff Acknowledgments: 

In addition to the contact named above, the following individuals made 
key contributions to this report: John Oppenheim (Assistant Director), 
Noah Bleicher, Alexandra Dew, Margaret Holmes, David Messman, Susan 
Neill, and Ann Marie Udale. 

[End of section] 

Footnotes: 

[1] GAO, Defense Acquisitions: Assessments of Selected Weapon 
Programs, [hyperlink, http://www.gao.gov/products/GAO-09-326SP] 
(Washington, D.C.: March 30, 2009). 

[2] GAO, Defense Acquisitions: Assessments of Selected Weapon 
Programs, [hyperlink, http://www.gao.gov/products/GAO-10-388SP] 
(Washington, D.C.: March 30, 2010). 

[3] In March 2007, we reported that DOD lacks an effective, integrated 
approach to balancing its weapon system investments with available 
resources. In July 2008, we reported that a knowledge-based funding 
approach could improve major weapon system program outcomes. In 
September 2008, we reported that DOD's requirements determination 
process had not been effective in prioritizing joint capabilities. See 
GAO, Best Practices: An Integrated Portfolio Management Approach to 
Weapon System Investments Could Improve DOD's Acquisition Outcomes, 
[hyperlink, http://www.gao.gov/products/GAO-07-388] (Washington, D.C.: 
March 30, 2007); Defense Acquisitions: A Knowledge-Based Funding 
Approach Could Improve Major Weapon System Program Outcomes, 
[hyperlink, http://www.gao.gov/products/GAO-08-619] (Washington, D.C.: 
July 2, 2008); and Defense Acquisitions: DOD's Requirements 
Determination Process Has Not Been Effective in Prioritizing Joint 
Capabilities, [hyperlink, http://www.gao.gov/products/GAO-08-1060] 
(Washington, D.C.: September 25, 2008). 

[4] DOD did not issue complete Selected Acquisition Reports for its 
major defense acquisition programs in 2009, so program costs had not 
been updated since the December 2007 reports, as of the time we 
conducted our work. 

[5] We classified programs as "stable," "moderately unstable," or 
"highly unstable" based on growth in research, development, test and 
evaluation (RDT&E) and program acquisition unit cost (PAUC) estimates, 
as well as schedule slip, from their Milestone B estimates. Programs 
were assessed based on whether their RDT&E growth was less than 10 
percent, between 10 and 35 percent, or 35 percent or more; whether 
their PAUC growth was less than 10 percent, between 10 and 30 percent, 
or 30 percent or more; and whether their initial capability schedule 
slip was less than 6 months, between 6 and 12 months, or 12 months or 
longer. We classified a program as stable if it was below the lowest 
threshold in at least two of our cost and schedule measures and no 
higher than the middle threshold range in the third measure. 
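The three-tier rule described in this footnote can be sketched as a 
short function. This is an illustrative sketch only: the thresholds 
and the "stable" rule come from the footnote, but the function name, 
the convention of expressing growth as fractions, and the split 
between the two unstable tiers are assumptions, since the footnote 
does not spell out how moderately and highly unstable programs are 
distinguished. 

```python
def classify_stability(rdte_growth, pauc_growth, ioc_slip_months):
    """Sketch of the footnote's three-tier stability classification.

    Growth values are fractions (0.10 means 10 percent); IOC slip is
    in months. The split between "moderately unstable" and "highly
    unstable" below is an assumption, not stated in the footnote.
    """
    # Tier for each measure: 0 = below the lowest threshold,
    # 1 = middle range, 2 = at or above the upper threshold.
    def tier(value, low, high):
        if value < low:
            return 0
        if value < high:
            return 1
        return 2

    tiers = [
        tier(rdte_growth, 0.10, 0.35),   # RDT&E: <10%, 10-35%, >=35%
        tier(pauc_growth, 0.10, 0.30),   # PAUC: <10%, 10-30%, >=30%
        tier(ioc_slip_months, 6, 12),    # IOC slip: <6, 6-12, >=12 mo.
    ]
    # Stable: below the lowest threshold in at least two measures and
    # no higher than the middle range in the third.
    if tiers.count(0) >= 2 and max(tiers) <= 1:
        return "stable"
    # Assumed split: any measure at or above its upper threshold
    # marks the program highly unstable.
    if max(tiers) == 2:
        return "highly unstable"
    return "moderately unstable"
```

For example, a program with 5 percent RDT&E growth, 20 percent PAUC 
growth, and a 3-month IOC slip meets the footnote's stable rule (two 
measures below the lowest threshold, one in the middle range). 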

[6] [hyperlink, http://www.gao.gov/products/GAO-08-619]. 

[7] We excluded from our analysis 23 programs for which development 
cost, unit cost, or initial operational capability data were not 
available. We also excluded 9 MDAPs that were less than 3 years into 
development as of December 2007. We have previously found that a 
significant portion of cost increases often do not occur until after a 
program is approximately halfway through its development cycle. The 
average projected development cycle time for the programs in the 2008 
MDAP portfolio was approximately 7 years. We therefore determined that 
including in our analysis programs that were less than 3 years into 
development might artificially overstate stability in the portfolio. 
Because we did not conduct any analysis of the excluded programs, we 
do not have any basis on which to comment upon the stability of the 
excluded programs. 

[8] While we attempted to account for our previous finding that cost 
increases often occur late in development by excluding younger 
programs from our analysis, it is likely that some programs that 
appeared stable according to this data may have since exhibited 
instability. For instance, the E-2D Advanced Hawkeye aircraft program, 
which appeared to be on track to meet original cost and schedule 
estimates as of December 2007 (54 months into its development), 
reported a critical cost breach in 2009. 

[9] Three of our five case studies are among the 13 stable programs we 
identified; however, the other two had moderate cost and/or schedule 
growth but were selected because they had other attributes that were 
valuable to study. In particular, we selected JDAM as a case study 
because its unit costs were lower than originally estimated and it is 
widely recognized within the defense acquisition community as a 
successful program. HIMARS delivered capability on time and was one 
of the Army's most stable programs in the portfolio. 

[10] GAO, Defense Acquisitions: Department of Defense Actions on 
Program Manager Empowerment and Accountability, [hyperlink, 
http://www.gao.gov/products/GAO-08-62R] (Washington, D.C.: November 9, 
2007). 

[11] GAO, Defense Acquisitions: Many Analyses of Alternatives Have Not 
Provided a Robust Assessment of Weapon System Options, [hyperlink, 
http://www.gao.gov/products/GAO-09-665] (Washington, D.C.: September 
24, 2009). 

[12] GAO, Best Practices: Capturing Design and Manufacturing Knowledge 
Early Improves Acquisition Outcomes, [hyperlink, 
http://www.gao.gov/products/GAO-02-701] (Washington, D.C.: July 15, 
2002). 

[13] The Weapon Systems Acquisition Reform Act of 2009, Pub. L. No. 
111-23, was enacted May 22, 2009. In December 2008, DOD revised its 
acquisition instruction--Department of Defense Instruction 5000.02, 
Operation of the Defense Acquisition System. 

[14] [hyperlink, http://www.gao.gov/products/GAO-10-388SP]. 

[15] [hyperlink, http://www.gao.gov/products/GAO-08-62R]. 

[16] DAMIR Purview is an executive information system operated by the 
Office of the Under Secretary of Defense for Acquisition, Technology 
and Logistics/Acquisition Resources and Analysis. 

[17] We started with 95 programs, then excluded 23 programs for which 
we were lacking data on development costs, unit costs, or initial 
operational capability (IOC) date. We then excluded nine programs 
that were less than 3 years into development as of December 2007. 

[18] These thresholds were determined through a consideration of 
criteria such as thresholds for required reporting of cost and 
schedule growth under law and regulation, historical average cost 
growth for weapons systems programs, and judgment based on GAO 
experience. These thresholds were not intended to directly represent 
any thresholds for allowable cost growth under law or DOD regulation. 

[End of section] 

GAO's Mission: 

The Government Accountability Office, the audit, evaluation and 
investigative arm of Congress, exists to support Congress in meeting 
its constitutional responsibilities and to help improve the performance 
and accountability of the federal government for the American people. 
GAO examines the use of public funds; evaluates federal programs and 
policies; and provides analyses, recommendations, and other assistance 
to help Congress make informed oversight, policy, and funding 
decisions. GAO's commitment to good government is reflected in its core 
values of accountability, integrity, and reliability. 

Obtaining Copies of GAO Reports and Testimony: 

The fastest and easiest way to obtain copies of GAO documents at no 
cost is through GAO's Web site [hyperlink, http://www.gao.gov]. Each 
weekday, GAO posts newly released reports, testimony, and 
correspondence on its Web site. To have GAO e-mail you a list of newly 
posted products every afternoon, go to [hyperlink, http://www.gao.gov] 
and select "E-mail Updates." 

Order by Phone: 

The price of each GAO publication reflects GAO's actual cost of 
production and distribution and depends on the number of pages in the 
publication and whether the publication is printed in color or black 
and white. Pricing and ordering information is posted on GAO's Web 
site, 
[hyperlink, http://www.gao.gov/ordering.htm]. 

Place orders by calling (202) 512-6000, toll free (866) 801-7077, or
TDD (202) 512-2537. 

Orders may be paid for using American Express, Discover Card,
MasterCard, Visa, check, or money order. Call for additional 
information. 

To Report Fraud, Waste, and Abuse in Federal Programs: 

Contact: 

Web site: [hyperlink, http://www.gao.gov/fraudnet/fraudnet.htm]: 
E-mail: fraudnet@gao.gov: 
Automated answering system: (800) 424-5454 or (202) 512-7470: 

Congressional Relations: 

Ralph Dawn, Managing Director, dawnr@gao.gov: 
(202) 512-4400: 
U.S. Government Accountability Office: 
441 G Street NW, Room 7125: 
Washington, D.C. 20548: 

Public Affairs: 

Chuck Young, Managing Director, youngc1@gao.gov: 
(202) 512-4800: 
U.S. Government Accountability Office: 
441 G Street NW, Room 7149: 
Washington, D.C. 20548: